SSL Passthrough vs SSL Offloading: Know the Difference

With Secure Sockets Layer (SSL) passthrough, encrypted traffic from clients is passed on to web servers without undergoing decryption at a load balancer or proxy server located between client and server. Instead, data packets are decrypted only when they reach the web server itself. SSL passthrough is ideal for secure data transfers, as traffic remains encrypted, and thus protected from interception, until it reaches its destination.

In contrast, SSL offloading decrypts the data at the load balancer, which then forwards the decrypted packets to the web server. This process is inherently less secure, since the decrypted packets can be intercepted or tampered with on their way to the destination web server. This article discusses SSL passthrough and SSL offloading in more detail, and explains how these processes are configured in Parallels® Remote Application Server (RAS).

What Is SSL Passthrough?

In the early days of the internet, all web traffic was carried over the Hypertext Transfer Protocol (HTTP). Because HTTP was unencrypted, it was inherently insecure. Hypertext Transfer Protocol Secure (HTTPS) came about to address this issue.

HTTPS originally secured web traffic using SSL. While the more secure Transport Layer Security (TLS) protocol has since superseded SSL, the older acronym remains in wide use today, as seen in the terms SSL passthrough and SSL offloading.

SSL passthrough passes encrypted HTTPS traffic from clients to web servers, and back again, without the requests undergoing decryption at a load balancer or proxy server along the way. Since requests are decrypted only on the web server, SSL passthrough is ideal for scenarios that require strict data security.

With SSL passthrough, there is little chance of man-in-the-middle attacks targeting the traffic between load balancer and server, since the traffic remains encrypted throughout the connection and is decrypted only when it reaches its destination. In addition, because the load balancer never decrypts or re-encrypts the traffic passing between client and server, it carries relatively little processing overhead and can forward connections faster.
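At the transport level, a passthrough balancer is essentially a byte-for-byte TCP splice. The sketch below is a minimal illustration of that idea (not the Parallels RAS implementation): it relays data between the client-facing and server-facing sockets without ever touching the TLS layer, so the handshake and decryption happen end to end between client and web server.

```python
import socket
import threading

def relay(src: socket.socket, dst: socket.socket) -> None:
    """Copy bytes verbatim; the proxy never inspects or decrypts them."""
    while data := src.recv(4096):
        dst.sendall(data)

def splice(client_side: socket.socket, backend_side: socket.socket) -> None:
    """Run both relay directions so the proxy acts as a transparent TCP pipe.
    TLS ciphertext (e.g. the ClientHello) passes through untouched.
    Connection teardown is omitted for brevity."""
    threading.Thread(target=relay, args=(backend_side, client_side),
                     daemon=True).start()
    relay(client_side, backend_side)
```

Because the relayed bytes are opaque ciphertext, a proxy built this way cannot apply access rules, redirects, or cookie-based session stickiness, which is exactly the trade-off described above.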

However, SSL passthrough does require more central processing unit (CPU) cycles on the web servers, since they must handle decryption themselves, making it more expensive in terms of operational costs. It also prevents the load balancer from inspecting requests or acting on the traffic, meaning you cannot use access rules, redirects, or cookie-based sticky sessions with SSL passthrough. This makes SSL passthrough suitable only for small deployments. If you have more stringent requirements for your websites, you may need to consider the alternatives.

What Is SSL Offloading?

SSL offloading is an alternative way of handling HTTPS traffic. With SSL offloading, a load balancer or proxy server located between the clients and servers is tasked with decrypting the traffic from clients before forwarding it to the web servers, and with encrypting the responses sent from the web servers back to the clients.

By letting load balancers or proxy servers take care of decrypting and encrypting web traffic, the web servers are relieved of this computation-heavy task. This lets them focus on their primary job of serving web pages to requesting clients as quickly as possible.
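The decrypt-then-forward flow, and the security trade-off it creates, can be sketched with a toy XOR "cipher" standing in for TLS. This is purely illustrative: real offloading performs a full TLS handshake at the balancer, but the toy version makes the unencrypted backend hop easy to see.

```python
import socket

XOR_KEY = 0x5A  # toy key; real offloading uses negotiated TLS session keys

def toy_crypt(data: bytes) -> bytes:
    """XOR stand-in for TLS encryption/decryption (NOT real cryptography)."""
    return bytes(b ^ XOR_KEY for b in data)

def offload_proxy(client_side: socket.socket,
                  backend_side: socket.socket) -> None:
    """Decrypt at the balancer, then forward plaintext to the web server.
    The unencrypted hop to the backend is exactly what passthrough avoids."""
    while data := client_side.recv(4096):
        backend_side.sendall(toy_crypt(data))  # plaintext leaves the balancer
```

Anyone able to observe the balancer-to-server segment sees the request in the clear, which is why the next paragraph's warnings apply.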

However, since incoming traffic from load balancers to web servers is already unencrypted, SSL offloading may leave your network vulnerable to man-in-the-middle attacks and data theft. The sharing of encryption and decryption keys between network instances can compound the problem. To offset these potential disadvantages, you may need to beef up your IT team’s data and network security capabilities.

Due to the security challenges of SSL offloading, it is best used where secure network traffic is not of paramount importance.

What Is High Availability Load Balancing?

High availability means ensuring that your systems and processes are operational continuously. When applied to IT infrastructure, high availability means adding a layer of redundancy to your setup so that when a system component fails, another component with the same function takes over. This helps your organization avoid potentially costly downtime.

Load balancing distributes the workload among several servers, allowing systems to handle more network traffic. The typical load balancing setup comprises multiple resources, with load balancers located between clients and servers. When client requests come in, the load balancers direct them to the servers best able to handle them. This maximizes throughput and yields more reliable, efficient response times.

You can use either resource-based or round-robin load balancing with Parallels RAS. Resource-based load balancing distributes traffic based on server availability, so incoming requests always go to the least busy server. Round-robin load balancing, on the other hand, redirects traffic in sequential order: traffic from Client A goes to Server 1, Client B to Server 2, and so on until the rotation circles back to Server 1.
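The two methods can be sketched in a few lines. This is a simplified illustration, not the Parallels RAS implementation; the `load` score here is a hypothetical busyness metric (in Parallels RAS, resource-based balancing weighs user sessions, memory, and CPU).

```python
from itertools import cycle

SERVERS = ["Server1", "Server2", "Server3"]

# Round-robin: hand out servers in strict sequential order, wrapping around.
_rotation = cycle(SERVERS)

def round_robin() -> str:
    return next(_rotation)

def resource_based(load: dict) -> str:
    """Pick the least busy server; `load` maps each server name to a
    hypothetical busyness score (lower means more available)."""
    return min(load, key=load.get)
```

Round-robin ignores how busy each server actually is, which is why resource-based balancing is the better default when server workloads are uneven.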

Parallels RAS includes a High Availability Load Balancing (HALB) feature that distributes incoming connections based on workload and directs traffic dynamically to healthy gateways, with no limit on the number of gateways supported. Parallels RAS HALB also allows many HALB appliances to run simultaneously, reducing the possibility of downtime and ensuring the high availability of your applications.

Parallels RAS HALB: Take Control of Your SSL Connections

From the Parallels RAS Console, you can configure Parallels RAS HALB to perform effective load balancing for your network. Parallels RAS HALB is flexible: you can set it up to add redundancy to your network by routing traffic to available gateways, or to bypass decryption using SSL passthrough.

Parallels RAS uses resource-based load balancing by default when there is more than one available server in your network. You can set up resource-based load balancing to redirect users based on the number of user sessions, available memory, and CPU usage of the servers on your network. If you want to use round-robin load balancing instead, simply select this method from the Load Balancing section in the Parallels RAS console.

Download the trial to see how you can use Parallels RAS HALB to manage your SSL connections.