
App Service Hybrid connection offers a quick and uncomplicated way to reach your on-premises services in scenarios where other networking solutions such as VPN or ExpressRoute are not available. Normally you don't even need to open any firewall ports in your on-premises environment, because it only requires an outbound connection to Azure over port 443. Behind the scenes, it is a TCP relay proxied over WebSockets. It only works for services that use TCP, not UDP.


Therefore, it might be a good fit if you are planning to migrate your application(s) to Azure App Service but the app depends on on-premises databases or APIs and your networking team is not yet ready to set up a VPN/ExpressRoute connection between the environments. The migration can be unblocked by using Hybrid connections to reach these external dependencies, with no code changes in your app.


However, what should you expect in terms of performance? Apart from the pure network latency of having an App Service connect back to an on-premises service, will the Hybrid connection itself introduce extra latency on top of the network? And what about the different scenarios:



  • Reaching on-premises HTTP APIs;

  • Reaching on-premises databases;

  • Downloading large files from on-premises over HTTP.


 


In this article we will run benchmarks on all the scenarios above and compare them with and without a Hybrid connection. The goal here is not to show how to configure such a connection; that is already well described in the official documentation.


 


The test setup


 


An App Service Hybrid connection relies on a service called Azure Relay (which itself is built on the Azure Service Bus platform). This is what the architecture looks like:


 

[Image: AndreDewes_MSFT_3-1664899839679.png, Hybrid connection architecture diagram]


Now, let me explain how the test setup maps onto the diagram above:



  • App Service: a small PremiumV2 .NET 6 app running in Brazil South;

  • Azure Relay: if you don't already have an Azure Relay, the App Service Hybrid connection setup will ask you to create one. Here, I created one in the Brazil South region;

  • On-premises: to simulate an on-premises environment, I used a physical computer with fast, modern hardware (Ryzen 5 5600H, 16 GB RAM, 512 GB SSD) connected to a stable 600 Mbps fiber connection. This system has an average latency of 12 ms to Azure and vice-versa. It also runs a SQL Server 2019 Express database, a .NET 6 API that simulates on-premises services for these tests, and the Hybrid Connection Manager (HCM) required for this setup.


Now, we want to compare the Hybrid connection overhead against the raw network connection. So, for each test in this article, we will configure the App Service to reach the services via the Hybrid connection endpoints, and then run the same test going directly to the public IP of the “on-premises” server, skipping the relay completely (a minimal sketch of how the target can be switched follows below).
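
The article does not show how this switch was implemented, so here is only a minimal sketch under assumptions: the TARGET_BASE_URL setting name and the example IP are hypothetical, and “testerelay:5001” is the Hybrid connection endpoint described later in the article. The benchmark code stays identical; only the base address changes.

```csharp
// Minimal sketch (.NET 6): swap the target between the Hybrid connection
// endpoint and the server's public address via a hypothetical app setting.
using System;
using System.Net.Http;

var target = Environment.GetEnvironmentVariable("TARGET_BASE_URL")
             ?? "https://testerelay:5001";   // Hybrid connection endpoint

// For the "direct" runs, TARGET_BASE_URL would be set to the public address
// of the on-premises server, e.g. "https://203.0.113.10:5001", bypassing the relay.
var client = new HttpClient { BaseAddress = new Uri(target) };
```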


Here’s the configuration in the Portal:


 


[Image: AndreDewes_MSFT_4-1664901074611.png, Hybrid connection configuration in the Azure Portal]


 


Scenario 1: HTTP requests


 


Let's assume you have on-premises HTTP services to reach from an App Service via a Hybrid connection. In the configuration picture above, that endpoint is named “andre-api” and points to the on-premises DNS name “testerelay” on port 5001. That is the .NET API running on the on-premises computer. This API has a REST endpoint that returns random strings of around ~8 KB in size (a minimal sketch of such an endpoint follows).
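
The article does not include the API code itself, so the following is only a minimal sketch of what such an endpoint could look like as a .NET 6 minimal API; the /random route name and the payload generation are assumptions.

```csharp
// Minimal .NET 6 API sketch: returns a random string of roughly 8 KB per request.
// The /random route name is an assumption for illustration purposes.
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

app.MapGet("/random", () =>
{
    const string chars = "abcdefghijklmnopqrstuvwxyz0123456789";
    var payload = string.Create(8 * 1024, new Random(), (span, rng) =>
    {
        for (var i = 0; i < span.Length; i++)
            span[i] = chars[rng.Next(chars.Length)];
    });
    return Results.Text(payload);
});

// Listens on the port registered in the Hybrid connection (5001 in this setup).
app.Run();
```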


On the App Service side, another .NET API calls the previous endpoint in three different ways:



  • Single request: App Service calls the on-premises API once

  • Sequentially: App Service calls the on-premises API 50 times in a row. When the previous request finishes, the next goes ahead and so on… until we reach 50 requests;

  • Parallel: App Service calls the on-premises API 50 times at the same time. This is accomplished by making use of .NET tasks


The intention here is to verify how well the relay handles a typical real-world scenario where many requests arrive in parallel at a given time. All requests use the HTTP/2 protocol; a minimal sketch of the client code is shown below.
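
The exact benchmark code is not published with the article, so this is a minimal sketch under assumptions: the /random route is hypothetical, and “testerelay:5001” is the Hybrid connection endpoint from the configuration above.

```csharp
// Minimal sketch of the three call patterns (.NET 6, implicit usings assumed).
using System.Net;

var client = new HttpClient
{
    BaseAddress = new Uri("https://testerelay:5001"),
    // Request HTTP/2 up front; this made a big difference in these tests
    // (see the note after the results table).
    DefaultRequestVersion = HttpVersion.Version20,
    DefaultVersionPolicy = HttpVersionPolicy.RequestVersionOrHigher
};

// 1) Single request
await client.GetStringAsync("/random");

// 2) Sequential: 50 requests, each waiting for the previous one to finish
for (var i = 0; i < 50; i++)
    await client.GetStringAsync("/random");

// 3) Parallel: 50 requests started at once and awaited together
var tasks = Enumerable.Range(0, 50).Select(_ => client.GetStringAsync("/random"));
await Task.WhenAll(tasks);
```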


Check out the results table (average response time per HTTP request):

|                 | Direct | Hybrid connection | Difference |
|-----------------|--------|-------------------|------------|
| Single request  | 13 ms  | 24 ms             | +84%       |
| Sequential (50) | 13 ms  | 34 ms             | +161%      |
| Parallel (50)   | 50 ms  | 60 ms             | +20%       |



 


Important note


Having the App Service .NET API call the relay with HttpClient forced to use HTTP/2 by default made a big positive difference in the test results. HTTP/1.1 was much worse, especially in the parallel requests test.
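
For reference, the HTTP version can also be pinned per request instead of on the HttpClient (as in the earlier sketch); a minimal variant, reusing the `client` from that sketch:

```csharp
// Per-request variant: ask for HTTP/2 on an individual request.
using System.Net;

var request = new HttpRequestMessage(HttpMethod.Get, "/random")
{
    Version = HttpVersion.Version20,
    VersionPolicy = HttpVersionPolicy.RequestVersionOrHigher
};
using var response = await client.SendAsync(request);
response.EnsureSuccessStatusCode();
```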


 


Conclusion for HTTP tests


If we look at the differences in percentage terms, the Hybrid connection seems to add a huge overhead, but in absolute numbers it does not. In the most realistic test of this setup, the parallel HTTP simulation, only 10 ms is added compared to a direct connection, which is negligible for most applications. Another point to keep in mind is that we are comparing the Hybrid connection to a direct connection back to on-premises; in reality there would be a VPN or another appliance, which might add some extra delay too.


 


Scenario 2: database connections


 


Another very common use case is the need to fetch data from an on-premises database that could not be migrated to Azure at the same time as the application. Here the App Service .NET API will query the on-premises SQL Server through the relay connection and then directly. Each call returns around ~8 KB of data from the database. Like the HTTP tests, there are three different scenarios (a minimal sketch of the query code follows the list):



  • Single query: App Service queries the database once

  • Sequentially: App Service queries the database 50 times in a row. When the previous query finishes, the next goes ahead and so on… until we reach 50 queries;

  • Parallel: App Service queries the on-premises database 50 times at the same time. This is accomplished by making use of .NET tasks
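
The actual query and connection string are not shown in the article, so this is only a minimal sketch under assumptions: Microsoft.Data.SqlClient is used, the table name, credentials and the “testerelay,1433” endpoint are hypothetical, with the idea that the connection string points at the Hybrid connection endpoint host and port just like the HTTP case.

```csharp
// Minimal sketch (.NET 6 top-level program, implicit usings assumed).
using Microsoft.Data.SqlClient;   // NuGet package: Microsoft.Data.SqlClient

const string connectionString =
    "Server=testerelay,1433;Database=TestDb;User Id=benchmark;Password=...;" +
    "TrustServerCertificate=True";

async Task<int> QueryOnceAsync()
{
    await using var connection = new SqlConnection(connectionString);
    await connection.OpenAsync();
    await using var command = new SqlCommand("SELECT COUNT(*) FROM dbo.SampleData", connection);
    return Convert.ToInt32(await command.ExecuteScalarAsync());
}

// Single query
await QueryOnceAsync();

// Sequential: 50 queries, one after another
for (var i = 0; i < 50; i++)
    await QueryOnceAsync();

// Parallel: 50 queries at the same time (each task uses its own pooled connection)
await Task.WhenAll(Enumerable.Range(0, 50).Select(_ => QueryOnceAsync()));
```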


Average response time per SQL query:

|                 | Direct | Hybrid connection | Difference |
|-----------------|--------|-------------------|------------|
| Single query    | 13 ms  | 13 ms             | 0%         |
| Sequential (50) | 13 ms  | 27 ms             | +107%      |
| Parallel (50)   | 13 ms  | 30 ms             | +130%      |



 


Conclusion for database tests


Compared to the HTTP tests, the database queries show less overhead because of the raw TCP nature of the connections. While the direct connection had no extra overhead even with 50 queries in parallel, the Hybrid counterpart added some, but not a significant amount when looking at absolute numbers rather than just percentages.


 


Scenario 3: large file downloads


 


Now let's benchmark something less usual: using the Hybrid connection to stream a 1 GB file (a Linux ISO) from an on-premises REST API over HTTP. Here I expected more overhead, because the underlying WebSocket protocol that Azure Relay uses is not really meant for this kind of workload. But anyway, a minimal sketch of the download code is shown below, followed by the results:
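
The download code is not included in the article, so this is only a minimal sketch: the file route and destination path are hypothetical, and the response is streamed with ResponseHeadersRead so the 1 GB body is copied to disk without being buffered in memory first.

```csharp
// Minimal sketch (.NET 6, implicit usings assumed): stream a large download to disk.
using var client = new HttpClient { Timeout = TimeSpan.FromMinutes(30) };

using var response = await client.GetAsync(
    "https://testerelay:5001/files/linux.iso",
    HttpCompletionOption.ResponseHeadersRead);   // don't buffer the whole body
response.EnsureSuccessStatusCode();

await using var source = await response.Content.ReadAsStreamAsync();
await using var destination = File.Create(Path.Combine(Path.GetTempPath(), "linux.iso"));
await source.CopyToAsync(destination);
```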


 


REST API HTTP download speed:

| Direct  | Hybrid connection | Difference |
|---------|-------------------|------------|
| 27 MB/s | 20 MB/s           | 35%        |



 


Conclusion for file download test


I was expecting a much worse result, but the Hybrid connection was a pleasant surprise here. I still wouldn't recommend it for streaming large files, but this test shows that it is possible if really needed.


 


Overall conclusion


 


These benchmarks do not cover every possible Hybrid connection scenario, but they certainly give an idea of what to expect. Generally speaking, it is a solid alternative, and I would recommend it for scenarios where a VPN or ExpressRoute connection is not possible. The biggest advantage is certainly the ease of use: setting up your own environment to run similar tests takes a couple of hours at most.


 


If you would like me to run additional benchmarks and scenarios, please let me know in the comments!


 


 
