
The pros and cons of offloading TLS/SSL encryption and decryption to your ADC


TLS encryption (formerly known as SSL encryption) is used to improve the safety of data exchanged over a network. But where should it sit in your network architecture? Some might suggest offloading this from the backend servers to the load balancer. However, as I will explain, the pros and cons of this course of action should first be carefully considered.

Why use TLS/SSL to encrypt your data?

In 2023, no application should be communicating over the public Internet in plain text. Data needs to be encrypted, transforming it into a form that cannot easily be deciphered or accessed by unauthorized users. In particular, passwords, credit card numbers, health records, and personal information should unquestionably be encrypted to keep them secure while they are being transmitted electronically.

Furthermore, some form of encryption is often required to meet national or international laws or industry regulations, such as the EU's General Data Protection Regulation (GDPR) and the Payment Card Industry Data Security Standard (PCI DSS).

How can you use TLS/SSL to encrypt your data?

Best practice today is to use the latest version of Transport Layer Security (TLS), the encryption protocol that replaced the earlier and more widely known Secure Sockets Layer (SSL) protocol.

SSL offloading is the process of moving TLS/SSL decryption and encryption away from your web servers onto a centralised device, whether that is a load balancer or dedicated SSL offloading hardware.

There are two main phases in SSL/TLS:

  1. The handshake, where the client and server agree a protocol version and cipher and establish session keys (the computationally expensive part)
  2. Data exchange, where application traffic is encrypted and decrypted using the negotiated session keys

However, old legacy applications may not incorporate TLS or SSL cryptography at all, or they may use older cryptographic standards that are no longer considered secure. Cryptographic failures can therefore occur when the algorithms or protocols in use become outdated, or when vulnerabilities are discovered in their implementation, making them easier to break.

When should you offload TLS/SSL encryption and decryption?

For applications that either cannot perform encryption or use outdated encryption methods, ADCs can be used to encrypt outgoing data and decrypt incoming data using the latest standards and protocols, effectively taking over responsibility for cryptography from the backend servers.
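As a rough illustration of what this looks like under the hood, here is a minimal sketch in raw HAProxy configuration terms (the file paths, names and addresses are hypothetical): the ADC terminates TLS using a modern minimum protocol version, while the legacy application behind it continues to speak plain HTTP.

    # Illustrative sketch only: the ADC terminates TLS with modern settings
    # on behalf of a legacy application that cannot do so itself.
    frontend legacy_app_https
        mode http
        # Terminate TLS here, refusing anything older than TLS 1.2
        bind :443 ssl crt /etc/haproxy/certs/legacy-app.pem ssl-min-ver TLSv1.2
        default_backend legacy_app_servers

    backend legacy_app_servers
        mode http
        balance roundrobin
        # The legacy servers only speak plain HTTP on port 80
        server app1 192.168.10.11:80 check
        server app2 192.168.10.12:80 check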

The advantages of TLS/SSL offloading

TLS offloading on the load balancer has the potential to offer a number of advantages. Here are the main benefits:

  1. Increased server capacity — Backend servers are relieved of the computational load associated with TLS/SSL operations when encryption and decryption are handled by the load balancer. This in turn increases the capacity of the backend servers, allowing them to serve more client requests.
  2. Simplified certificate management — Offloading TLS/SSL certificates to the load balancer allows them to be managed from a single location, simplifying renewal and monitoring and ensuring the certificates presented to clients are always current.
  3. Improved security — Terminating the traffic on the load balancer/ADC introduces options such as mTLS, where the users of the service/application can be authenticated based on client certificates (see the sketch after this list). This can provide a robust additional layer of security control and protection for the service.
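As a minimal sketch of the mTLS option in point 3, again in raw HAProxy configuration terms and with hypothetical file paths: the termination only accepts clients that present a certificate signed by a trusted client CA.

    # Illustrative sketch: require client certificates (mTLS) at the termination
    frontend secure_service
        mode http
        # verify required: clients must present a certificate signed by client-ca.pem
        bind :443 ssl crt /etc/haproxy/certs/service.pem ca-file /etc/haproxy/certs/client-ca.pem verify required
        default_backend service_servers

    backend service_servers
        mode http
        server app1 192.168.10.21:80 check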

The disadvantages of TLS/SSL offloading

While TLS/SSL offloading to the ADC might seem appealing, the reality is that there are some significant disadvantages to consider:

  1. Reduced load balancer capacity — Depending on the volume of traffic, SSL offloading can double your load and halve your speed: if you re-encrypt traffic to the backend servers, the ADC is doing the cryptographic work twice. Bearing in mind that the load balancer is intended to facilitate high availability and performance, turning the ADC into a bottleneck may not make a lot of sense.

  2. Higher latency for geographically dispersed workloads — Related to the above is the impact of this additional workload on network latency. In certain scenarios (depending on where the client is located relative to the server), the processing requirements of TLS/SSL encryption and decryption may result in a noticeable delay.

  3. Inhibited scalability — Moving SSL termination from multiple backend servers to just the load balancer concentrates the cryptographic work on a single device (or pair of devices), which can limit how far the deployment can scale compared with spreading that work across the application cluster.

  4. Broader attack surface — By terminating SSL/TLS at the ADC, it becomes a critical security endpoint. If, for any reason, the load balancer is hijacked by a malicious actor, this would give the attacker access to decrypted traffic, private keys, and other sensitive information.

  5. End-to-end encryption limitations — Related to the above is the loss of end-to-end encryption when offloading on the ADC: communication between the load balancer and the backend servers is typically unencrypted, meaning additional security measures, such as network segmentation or re-encrypting traffic to the backend servers (see the sketch after this list), may need to be taken to avoid introducing a potential vulnerability if the internal network is compromised.
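Where end-to-end encryption is a requirement, one common mitigation for point 5 is to re-encrypt traffic between the load balancer and the backend servers, at the cost of the extra cryptographic work described in point 1. A minimal sketch in raw HAProxy terms, with hypothetical addresses and paths:

    # Illustrative sketch: terminate TLS on the ADC, then re-encrypt to the backends
    backend app_servers
        mode http
        balance roundrobin
        # Re-encrypt the internal hop and verify the backend certificates
        server app1 192.168.10.31:443 ssl verify required ca-file /etc/haproxy/certs/internal-ca.pem check
        server app2 192.168.10.32:443 ssl verify required ca-file /etc/haproxy/certs/internal-ca.pem check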

It is therefore important to carefully examine the use case being considered, as well as the specifics of the application and supporting infrastructure involved, to determine the appropriate course of action.

Using SSL offloading responsibly

Some organizations elect to offload TLS/SSL encryption to load balancers, even when their web application servers could securely handle this function. Indeed, some ADC vendors actively promote this approach because it increases the computational load on their load balancers, allowing them to sell ever more powerful (and more expensive) load balancers with bigger licenses.

Loadbalancer.org has always recommended that you use the application cluster to scale your SSL horizontally, because it can be more cost-effective and scalable for organizations to continue handling TLS/SSL encryption on their backend servers, if their applications have this capability.

However, TLS/SSL termination on the load balancer is definitely required if you need to use load balancer-based cookie persistence instead of just source IP persistence, plus a few other things that you really should not be doing anyway, such as directing requests based on the contents of URLs (see the sketch below).
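As a sketch of why termination is needed for these features (raw HAProxy terms again, with hypothetical names and addresses): both inserting a persistence cookie and routing on the URL require the load balancer to see the decrypted HTTP request, which is only possible once TLS has been terminated on the ADC.

    # Illustrative sketch: features that require the ADC to see decrypted HTTP
    frontend app_https
        mode http
        bind :443 ssl crt /etc/haproxy/certs/app.pem
        # Content-based routing: only possible once the request is decrypted
        use_backend api_servers if { path_beg /api }
        default_backend web_servers

    backend web_servers
        mode http
        # Cookie-based persistence: the ADC inserts a SERVERID cookie into responses
        cookie SERVERID insert indirect nocache
        server web1 192.168.10.41:80 check cookie web1
        server web2 192.168.10.42:80 check cookie web2

    backend api_servers
        mode http
        server api1 192.168.10.51:80 check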

TLS/SSL offloading should, therefore, only be considered if the application cannot perform cryptography securely itself, or if there is another compelling business or technical reason to do so.

What are the implications of using a WAF for TLS/SSL offloading?

It’s often desirable to use a web application firewall (WAF) to ensure that potentially harmful requests (e.g. SQL injection, remote command execution, etc.) aren’t passed on to the real servers. Think of it as a sieve for HTTP traffic, which is fully tuneable to the individual requirements of a deployment. To make this possible we have a WAF built right into our appliance as standard, configurable via the WebUI. The important part to appreciate here is that the WAF can’t directly read encrypted traffic...

For the WAF to process the traffic, we must first decrypt the HTTPS traffic so it can be handled in ‘clear text’ - importantly, this can be restricted to clear text within the load balancer only, if desired - then pass it through the WAF before passing this onwards toward the real server.

Decrypting the traffic is done by configuring an ‘SSL termination’, using an appropriate certificate that would be recognised and trusted by the connecting browser - otherwise you’ll see nasty redness in the address bar and your users will get warning messages. This is placed before the WAF.

So, now we can catch the encrypted traffic at the SSL termination, decrypt it and pass it through the WAF, before it makes its way to the real server. The WAF itself doesn’t perform the load balancing directly; it hands this over to HAProxy, the Layer 7 load balancing service, which makes the final connection to the real server - a connection that can itself be encrypted, if desired.
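Conceptually, the chain looks something like the sketch below, expressed in raw HAProxy terms. This is an illustration only: the WAF address and ports are hypothetical, and on our appliance the whole chain is wired up for you through the WebUI.

    # Conceptual sketch only; the WAF service address and ports are hypothetical.
    frontend ssl_termination
        mode http
        bind :443 ssl crt /etc/haproxy/certs/site.pem
        default_backend waf

    backend waf
        mode http
        # Hand the decrypted traffic to the WAF for inspection
        server waf1 127.0.0.1:8080

    frontend after_waf
        mode http
        # The WAF passes clean traffic back in on this local port
        bind 127.0.0.1:8081
        default_backend real_servers

    backend real_servers
        mode http
        balance roundrobin
        # Optionally re-encrypt the final hop to the real servers
        server web1 192.168.10.61:443 ssl verify required ca-file /etc/haproxy/certs/internal-ca.pem check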

In the event that you need to mandate that all user requests arrive on the secure port (i.e. HTTPS rather than HTTP), this is also possible with the load balancer. Force to HTTPS can be enabled, and this makes sure requests to the insecure HTTP service on port 80 are redirected to HTTPS on port 443 (or whichever port you want).

The redirection capability lies within the Layer 7 service, which issues a redirect code to the client. This is known as ‘Force to HTTPS’ within our WebGUI and is configured within a Layer 7 virtual service.
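In raw HAProxy terms, the redirect itself is a one-liner (illustrative only, with hypothetical names and paths; on the appliance this is simply the ‘Force to HTTPS’ option in the WebGUI):

    # Illustrative sketch: force any plain HTTP request over to HTTPS
    frontend app_combined
        mode http
        bind :80
        bind :443 ssl crt /etc/haproxy/certs/site.pem
        # Issue a 301 redirect to HTTPS for any request that did not arrive over TLS
        http-request redirect scheme https code 301 unless { ssl_fc }
        default_backend real_servers

    backend real_servers
        mode http
        server web1 192.168.10.61:80 check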

So, we have:

  • An HTTP mode Layer 7 virtual service accepting traffic on port 80 in clear text, telling any clients to reconnect on HTTPS
  • HTTPS connections are picked up by the SSL termination and passed into that same Layer 7 virtual service, which detects that the traffic arrived via the SSL termination and passes it onward to the WAF
  • The WAF processes the traffic and hands it over to a Layer 7 virtual service, where load balancing algorithms are applied, and finally onward to the real servers, encrypted if so chosen
  • The response traffic takes the reverse path through the ‘chain’ on its route back to the client machine.

Conclusion

For applications that either cannot perform encryption or use outdated encryption methods, ADCs can be used to encrypt outgoing data and decrypt incoming data. In general, though, offloading TLS/SSL encryption and decryption to the load balancer is less commonly applied in practice, because the disadvantages outweigh the advantages.

Want more on security?

The OWASP Top 10 and role of ADCs