An Introduction to Application Delivery Controllers
Application delivery makes online applications, such as websites, SaaS apps, or e-commerce sites, available, secure, and reliable. Application delivery solutions ensure access to online services, with the application delivery controller (ADC) as the core component of an application delivery model.
An ADC (application delivery controller) is a network appliance that optimizes and manages how enterprise and web application servers connect with client machines. Traditionally these appliances have been hardware-based, but software-defined ADCs are increasingly common. As purpose-built networking components, application delivery controllers improve the performance, resiliency, and security of applications delivered online.
In this context, the term controller refers to managing the data flow between computing systems. The two systems transferring the information might both be servers, as when an application accesses sensitive information. The systems might also be a client and server, as when a user buys items on an e-commerce website through their web browser.
As it manages internet traffic, an ADC directs data packets as they move in and out of the server environment. The primary goals for application delivery solutions are to ensure that applications are delivered successfully, quickly, and securely.
Today, application delivery includes cloud and mobile access. Applications must function outside a physical work location and across all types of networks. Business applications at the enterprise level have transitioned away from locally installed, desktop-bound software accessed by users across the LAN.
Crucial to this transition are application delivery controllers, which assist applications in adapting to current protocols and networks. ADCs also ensure that applications are always available, perform optimally, and don’t present security risks to the business or to users.
Application Delivery Controller Definition
Often part of an application delivery network (ADN), an application delivery controller is a network device or application that assists with common tasks such as load balancing and SSL termination. For example, many act as web accelerators, offloading work from the web servers themselves, or provide load balancing. Application delivery controllers are generally located in the DMZ, between a web farm and the outer router or firewall.
Application delivery controllers include traditional load balancing capabilities that automatically distribute communications and processing evenly across computer servers and networks to prevent any single device from becoming overwhelmed. ADCs have additional features such as application access management (AAM), automatic application health checks, RAM caching, proxy and reverse proxy capabilities, SSL offload, TCP reuse, web application firewalls (WAF) and DNS application firewalls (DAF).
Together these ADC features prevent performance degradation and potential downtime, ensuring that an enterprise’s networks and applications remain accelerated, highly available, and secure.
How Do Application Delivery Controllers Work?
An application delivery controller distributes inbound application traffic using algorithms and policies. Basic round robin load balancing forwards client requests to each server in turn, but it does not account for responsiveness or health; it assumes all servers are equivalent. In contrast, an administrator can direct an ADC via additional policies to determine which servers should receive which inbound requests based on a number of criteria. For example, the application delivery controller can inspect packet headers and route requests based on requested file types or keywords.
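As a minimal sketch of this kind of policy-driven routing, the snippet below picks a backend pool from the request path and headers. The pool names, suffix list, and rules are hypothetical illustrations, not taken from any particular ADC product:

```python
# Sketch of ADC-style content-based routing. Pool names, suffixes,
# and the request shape are illustrative assumptions.

STATIC_SUFFIXES = (".jpg", ".png", ".css", ".js")

def route(path: str, headers: dict) -> str:
    """Pick a backend pool from the request path and headers."""
    if path.endswith(STATIC_SUFFIXES):
        return "static-pool"          # cacheable assets
    if headers.get("Content-Type", "").startswith("application/json"):
        return "api-pool"             # API traffic
    return "web-pool"                 # default: dynamic pages

print(route("/img/logo.png", {}))                              # static-pool
print(route("/orders", {"Content-Type": "application/json"}))  # api-pool
```

A real ADC evaluates such rules in its data plane at wire speed; the point here is only the decision logic.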
In a typical configuration, the ADC appears as a single virtual server from the perspective of an end user, sitting in front of a cluster of application and web servers and mediating requests and responses between them and clients. An application delivery controller employs various methods to improve web application performance.
What Does an Application Delivery Controller Do?
An application delivery controller typically uses a number of techniques, described below, to enhance performance.
Administrators rely on application delivery controllers for their monitoring capabilities, which extend to confirming a server’s operability and health far past the standard ping. The ADC avoids potential disruptions by routing traffic to alternate servers when monitoring indicates that specific health criteria for server responsiveness are not being met, or when a server is experiencing an issue.
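The health-based routing described above can be sketched as follows. The probe interface, latency budget, and server names are assumptions for illustration; a real deployment would probe something like an HTTP health endpoint rather than a lookup table:

```python
# Hedged sketch of health-based routing: servers that fail a probe or
# exceed a latency budget are removed from rotation. The probe is
# injected so a real system could use an HTTP check instead.

def healthy_servers(servers, probe, max_latency=0.25):
    """Return servers whose probe succeeds within the latency budget."""
    alive = []
    for s in servers:
        ok, latency = probe(s)        # e.g. a GET to the server's /healthz
        if ok and latency <= max_latency:
            alive.append(s)
    return alive

# Simulated probe results: "b" is down, "c" is up but too slow.
results = {"a": (True, 0.05), "b": (False, 0.0), "c": (True, 0.90)}
print(healthy_servers(["a", "b", "c"], lambda s: results[s]))  # ['a']
```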
Application delivery controllers also offer both historical and real-time analysis of all user and network traffic, with metrics for bandwidth usage, round-trip times, and WAN and datacenter latency. This data can minimize the time help desk staff spends identifying the source of problems and provide quick solutions.
ADCs sit in the data path between applications and clients, giving them the opportunity to offer visibility into application behavior and performance. Application delivery controller systems monitor, analyze, and log incoming client requests and responses from the application, offering real-time insight into bad server and application behaviors. Traditional ADC appliances, however, focus on delivering packets efficiently and offer only limited application insight.
Load balancing means using algorithms to distribute incoming requests across servers. Simpler round robin algorithms send requests sequentially, while more sophisticated algorithms weigh factors such as client location, server capacity, fields in the HTTP header, type of content requested, and more.
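The two ends of that spectrum can be sketched in a few lines. The server names and connection counts below are made up for illustration; real ADCs track these values from live traffic:

```python
import itertools

# Sketch of two load-balancing strategies: plain round robin and
# least-connections. Server names and counts are illustrative.

servers = ["s1", "s2", "s3"]
rr = itertools.cycle(servers)        # round robin: rotate through servers

def least_connections(conns: dict) -> str:
    """Pick the server currently handling the fewest connections."""
    return min(conns, key=conns.get)

print([next(rr) for _ in range(4)])                     # ['s1', 's2', 's3', 's1']
print(least_connections({"s1": 12, "s2": 3, "s3": 7}))  # s2
```

Weighted variants simply multiply a server's score by its capacity before comparing.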
An application delivery controller is also capable of global server load balancing, redirecting traffic to server clusters located in a separate datacenter. Another ADC may work in tandem with the first application delivery controller, front-ending those servers at the offsite datacenter. These sites may be configured in active-active mode, in which both sites support inbound traffic actively, or in active-passive mode. Each ADC routes client requests to the nearest datacenter for a given user, minimizing round trip times and latency and ensuring a better experience.
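At its core, the GSLB decision is a comparison of per-site metrics for a given client. The sketch below picks the datacenter with the lowest measured round-trip time; the site names and RTT values are invented for illustration:

```python
# Illustrative GSLB decision: send the client to the datacenter with
# the lowest measured round-trip time. Names and values are made up.

def nearest_datacenter(rtts_ms: dict) -> str:
    """Return the site with the smallest RTT for this client."""
    return min(rtts_ms, key=rtts_ms.get)

print(nearest_datacenter({"us-east": 12.0, "eu-west": 85.0}))  # us-east
```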
In the event of an outage, this configuration also supports business continuity, allowing the ADC to divert traffic from the affected area to an available application delivery controller at a co-located site. The new ADC then directs requests to viable resources. Acting as a reverse proxy, the application delivery controller interacts with application servers for the client, performing application monitoring, management, acceleration, and security services right in the data path.
Caching reduces server load and speeds delivery by storing content locally on the ADC rather than fetching it from the backend servers again each time a client requests it.
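The effect is easy to see in a minimal time-to-live cache, sketched below. The key, TTL, and fetch function are hypothetical; a real ADC also honors HTTP cache-control headers:

```python
import time

# Minimal TTL cache of the kind an ADC uses to answer repeat requests
# without touching the backend. Keys and TTLs are illustrative.

class TTLCache:
    def __init__(self, ttl=60.0):
        self.ttl, self.store = ttl, {}

    def get(self, key, fetch):
        entry = self.store.get(key)
        if entry and time.monotonic() - entry[1] < self.ttl:
            return entry[0]                   # cache hit: no backend call
        value = fetch(key)                    # cache miss: go to origin
        self.store[key] = (value, time.monotonic())
        return value

calls = []                                    # records trips to the origin
cache = TTLCache()
fetch = lambda k: calls.append(k) or f"body-of-{k}"
cache.get("/index.html", fetch)
cache.get("/index.html", fetch)
print(len(calls))   # 1 -- the second request was served from cache
```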
Offloading SSL processing allows the ADC to decrypt requests and encrypt responses in place of the servers, essentially replacing the backend servers as the SSL endpoint for connections with clients. By handling these computationally intensive operations itself, the ADC speeds content delivery and frees servers to do the tasks they are designed for. In this way, server load balancer systems provide application availability, performance, and scalability.
Application delivery controller platforms guard against and mitigate DDoS attacks. With SSL offloading capabilities enabled, the ADC can detect and block DDoS attacks that use SSL tunneled traffic without exposing servers and applications.
Web Application Firewall
Certain application delivery controller platforms include a built-in web application firewall (WAF). A WAF helps prevent web attacks and protects applications from security vulnerabilities such as cookie poisoning, cross-site scripting (XSS), form-field overruns, SQL injection, and malformed HTTP packets.
Application delivery controllers may serve as central authentication points which verify authentication and authorization for clients. In this way, application access management (AAM) systems and ADC systems can interface, offloading the task of processing central authentication services away from the application servers and reducing the complexity of the application environment.
Multi-tenancy designs support multiple application tenants such as engineering, marketing, and sales, or multiple application teams such as customer services, inventory, and payments. Better utilization of underlying infrastructure, and the cost benefits that follow, drive these designs.
Organizations can achieve agility and savings in administration, acquisition, and ongoing support costs by consolidating more services onto fewer devices, whether virtual or physical. Time for provisioning and de-provisioning new services falls dramatically once the existing infrastructure can be used to provision them without creating or purchasing new components. An expandable pool of resources can service new business units, customers, or projects as it supplies the required layer 4–7 services for those applications.
This rapid infrastructure provisioning not only adds value to service consumers, but also facilitates monitoring and control for the IT department. IT can allot infrastructure costs to specific tenants by using the services fabric to create standardized services and assigning resource controls to the tenants.
Application Delivery Controller Explained
Who Uses Application Delivery Controllers?
Application delivery controllers are used by almost any enterprise that operates large-scale CDNs or needs to provide fast software and web application services, ensuring the availability and security of high-traffic websites. ADCs are frequently used as reverse proxy servers, and to ensure high availability for a seamless end-user experience.
As more companies pursue digital transformation and update their network architectures, the DevOps engineers who build the applications that depend on application delivery controllers are increasingly responsible for managing and analyzing ADC performance and optimizing for application acceleration. However, in many organizations, IT and network teams still manage application delivery, especially where legacy or hardware application delivery controllers are deployed.
What Is an Application Delivery Controller in Networking?
Application delivery networking refers to creating a technological framework to provide appropriate levels of acceleration, security, availability, and visibility to networked applications. Typically, application delivery controller technologies reside at network endpoints and feature their own proprietary CPUs. They deploy a range of optimization techniques to enhance application delivery across the internet and use real-time data to prioritize access and applications.
The technologies, once deployed, form an application delivery network (ADN). Application delivery controllers and WAN optimization controllers (WOCs) are the two components that comprise application delivery networking.
The ADC sits at the datacenter end of the application delivery network. The WOC is located both near the endpoint and in the data center. It focuses on latency optimization and improving application performance, and also handles compression, caching, protocol spoofing, and de-duplication.
Although an ADN is sometimes called a content delivery network (CDN), ADNs optimize the acceleration of dynamic content while CDNs focus on static content.
What is an Application Delivery Platform?
An application delivery platform (ADP) is a suite of application delivery management technologies that handles services such as security controls, load balancing, and traffic management in cloud environments and data centers. The role of the application delivery platform is to securely and reliably deliver applications to end users.
What is Application Delivery Management?
Application delivery management (ADM) is the implementation of best practices in analytics, application security, throughput optimization, and troubleshooting to achieve fast, secure, responsive, reliable user access to applications.
Web Application Security
Online application delivery generates unique vulnerabilities that are distinct from those that arise with traditional LAN-bound applications. Stricter safeguards against data leakage and external attacks are critical as workers require remote access to data and applications.
Application delivery controllers function as the natural gateway to the network, authenticating each user's attempt to access an application. The ADC can use an on-premises Active Directory data store to validate user identity for SaaS-based applications. This eliminates the need to store credentials in the cloud, providing single sign-on capabilities and enhancing the user experience as well as improving web application security.
The XML-based protocol SAML is widely used to simplify the application login process. An application delivery controller can act as a SAML agent, authorizing users against data stores and confirming their identities. For example, much as platforms such as Twitter or Facebook allow applications to validate identity with existing credentials, an ADC can serve as the SAML identity provider or service provider.
The application delivery controller can guard against targeted DDoS attacks by applying rate-limiting measures that protect internal server resources. Once such measures are implemented, the ADC can suppress unusually large surges of inbound requests and minimize how much available bandwidth they consume, or reject such requests completely.
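A common way to implement such rate limiting is a token bucket, sketched below. The capacity and refill rate are illustrative values, not defaults from any particular ADC:

```python
# Token-bucket sketch of rate limiting: a burst beyond the bucket's
# capacity is rejected, and tokens refill over time. Capacity and
# refill rate are illustrative assumptions.

class TokenBucket:
    def __init__(self, capacity, refill_per_sec):
        self.capacity = self.tokens = capacity
        self.rate = refill_per_sec
        self.last = 0.0

    def allow(self, now: float) -> bool:
        """Admit one request at time `now` if a token is available."""
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(capacity=3, refill_per_sec=1)
results = [bucket.allow(0.0) for _ in range(5)]  # burst of 5 at t=0
print(results)   # [True, True, True, False, False]
```

After the burst is rejected, waiting lets tokens refill, so legitimate steady traffic is unaffected.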
Traditionally, advanced Layer 7 protection and load balancing were only available as standalone solutions. Application delivery controllers have brought those capabilities together, creating application firewalls that detect malicious scripts or suspicious content in data packet headers that the network firewall may miss.
ADCs support both positive and negative security models. They can integrate with third-party security providers to employ signature-based protection. An application delivery controller can also determine usage patterns in “learning” mode, analyzing which traffic patterns signify normal behavior. Based on that learning, the ADC will automatically flag and block any malicious inbound requests sent by attackers using cross-site scripting or SQL injection. Together, these techniques enable the ADC to generate a comprehensive, hybrid positive and negative security model for users and applications.
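The negative (signature-based) half of that model amounts to matching requests against known attack patterns. The signatures below are deliberately crude illustrations, far simpler than real WAF rule sets:

```python
import re

# Hedged sketch of signature-based inspection. These two patterns are
# simplified illustrations of XSS and SQL injection markers, not
# production WAF rules.

SIGNATURES = [
    re.compile(r"<script\b", re.I),           # crude XSS marker
    re.compile(r"\bunion\b.+\bselect\b", re.I),  # crude SQL injection marker
]

def looks_malicious(payload: str) -> bool:
    """Flag a payload that matches any known attack signature."""
    return any(sig.search(payload) for sig in SIGNATURES)

print(looks_malicious("name=alice"))                            # False
print(looks_malicious("q=1 UNION SELECT password FROM users"))  # True
```

The positive half works the other way around: only traffic matching learned normal patterns is admitted.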
Application Delivery Controller Benefits
Enhanced application performance is central to the use of application delivery controllers. Particularly over high-latency and mobile networks, an application delivery controller can employ various techniques to expand application services that improve application performance.
SSL and TLS are essential to e-commerce, but managing traffic encrypted with newer ciphers is very CPU-intensive. Application delivery controllers decrypt traffic before it reaches the server, manage certificates, and can handle massive volumes of both encrypted and unencrypted traffic.
Application delivery controllers also engage in TCP multiplexing to more effectively manage high volumes of inbound server requests. The ADC routes requests using open channels as traffic arrives, eliminating the inefficient open and close overhead for every transaction.
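The saving from connection reuse can be sketched with a toy pool in which connections are counters rather than real sockets; the pool size is an illustrative assumption:

```python
# Sketch of TCP connection reuse: many client requests are multiplexed
# onto a small pool of persistent backend connections instead of
# opening and closing one per transaction. Connections are just
# strings here, not real sockets.

class ConnectionPool:
    def __init__(self, size):
        self.free = [f"conn-{i}" for i in range(size)]
        self.opened = size            # total connections ever opened

    def send(self, request):
        conn = self.free.pop() if self.free else self._open()
        # ... write the request on conn, read the response ...
        self.free.append(conn)        # return to the pool, don't close
        return conn

    def _open(self):
        self.opened += 1
        return f"conn-{self.opened - 1}"

pool = ConnectionPool(size=2)
for r in range(100):
    pool.send(r)
print(pool.opened)   # 2 -- 100 requests, only 2 backend connections
```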
Application delivery controllers can also optimize performance across mobile networks using domain sharding, breaking down content on each page into a sequence of subdomains that enable many channels to open at once. This improves performance and decreases page load time.
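Sharding only works if each asset maps to the same subdomain every time, so browsers can cache it. The sketch below derives a stable shard from the asset path; the domain name and shard count are hypothetical:

```python
# Illustrative domain sharding: assets are spread deterministically
# across N subdomains so the browser opens parallel connections.
# The example.com domain and shard count of 4 are assumptions.

def shard_url(path: str, domain: str = "example.com", shards: int = 4) -> str:
    """Map an asset path to a stable staticN subdomain."""
    n = sum(path.encode()) % shards      # simple, dependency-free hash
    return f"https://static{n}.{domain}{path}"

print(shard_url("/img/logo.png"))
```

Note that with HTTP/2 multiplexing, sharding is largely obsolete; it mainly helped older HTTP/1.1 clients.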
Application delivery controllers offer visibility into delivered content, and can further optimize delivery of image-heavy web pages by converting GIF files into PNG formats. ADCs can also reduce download times by compressing other large components of web pages including cascading style sheet (CSS) files by removing white space and unnecessary characters.
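The CSS minification mentioned above can be approximated in a few regular expressions. This is a deliberately crude sketch; production minifiers handle strings, urls, and edge cases far more carefully:

```python
import re

# Crude sketch of CSS minification: drop comments, collapse
# whitespace, and trim space around punctuation. Real minifiers
# are much more careful.

def minify_css(css: str) -> str:
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)   # strip comments
    css = re.sub(r"\s+", " ", css)                    # collapse whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)      # trim around symbols
    return css.strip()

print(minify_css("body {\n  color: red;  /* brand */\n}"))  # body{color:red;}
```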
Application availability is critical to consumers, and application environments should be built for fault tolerance as much as possible. Typical failover strategies include deploying additional servers at a co-located site or in the datacenter. ADCs can help provide seamless failover, ensuring high availability of applications, by balancing application workloads across multiple active servers and sites.
Application Delivery Controller vs Load Balancer
It is a common misconception that an application delivery controller is just a next-generation load balancer.
While an application delivery controller is a network component that provides various OSI layer 4-7 services, including load balancing, it also offers many other features. Most ADCs also include GSLB, CGNAT, DNS and IPAM, proxy/reverse proxy, SSL offload, traffic chaining/steering, traffic optimization, and web application firewall services. Many also provide more advanced features such as server health monitoring, content redirection, and application performance monitoring services.
Virtual Application Delivery Controllers
All application delivery controllers function as single points of control. They are the location where an application’s security needs, authorization, authentication, and accounting can all take place.
Virtual application delivery controllers offer these same application delivery controller services, but specifically tailored for cloud computing systems and virtualized data centers. Virtual ADCs are similar in architecture and function to hardware ADCs but offer lower performance since they don’t have the hardware acceleration of customized computer chips (ASICs) or specialized network interfaces.
What is Virtual Application Delivery?
Virtual application delivery controllers (vADCs), or virtual load balancers, provide application management and load balancing capabilities. Typically this is the same software that runs within a hardware appliance, virtualized to run in a cloud environment. In contrast to a hardware-based ADC, virtual application delivery can run on any infrastructure, including the public cloud. However, virtual ADCs are still architected for monolithic applications and would normally benefit from the appliance's hardware acceleration, so they are not as performant on standard cloud infrastructure. This hybrid application delivery controller architecture might suit organizations wanting to maintain a familiar hardware control interface, but it lacks the full potential of a cloud-native platform.
With the adoption of cloud computing, many public cloud providers recognized the shortcomings of virtual application delivery and developed their own cloud-native solutions. Systems such as Amazon ELB (Elastic Load Balancer) were built for microservices applications and can scale up as demand increases, unlike a virtualized version of a traditional ADC. Tools like Amazon ELB offer the basic features to enable organizations to run their applications in the cloud, but do not provide the advanced capabilities, such as intelligent autoscaling, service discovery, and automation, provided by software-defined, cloud-native application delivery platforms.
Some vendors also offer the application delivery controller as a service. This is a SaaS-based approach to hosting the control application with all the benefits in terms of uptime, maintenance, patching, and scaling which come with on-demand applications. The data plane or agents must still be deployed locally to the applications in this architecture.
Benefits of Virtual/Cloud-based Application Delivery
There are several important benefits inherent to cloud-based application delivery:
Cloud-based application delivery replaces hardware-based solutions; a public cloud service is better suited to deliver consistent quality while scaling globally.
A cloud-based application delivery process reduces hardware acquisition and maintenance costs. Businesses also spend less on support as application performance improves, and the user experience along with it.
Cloud-based application delivery enables faster performance from cloud-based applications, and users can immediately access services and information anywhere, from any device.
History of Application Delivery Controllers
Beginning in around 2004, first-generation ADCs offered simple load balancing and application acceleration. By 2006, application delivery controllers began featuring advanced application services such as caching, compression, traffic shaping, connection multiplexing, SSL offload, application layer security, and content switching. Together with existing services in an integrated framework, application delivery controllers optimized and secured application flows that were mission-critical.
By 2007, many companies offered application acceleration products. Cisco Systems offered ADCs until 2012; Citrix, F5 Networks, and Radware maintained the application delivery controller market in its absence.
The application delivery controller market divided into two areas: general network optimization and framework- or application-specific optimization. Both types of ADC improve performance, but framework- or application-specific optimization focuses on strategies for particular application frameworks, such as optimization for AJAX or ASP.NET applications.
In 2012, Avi Networks introduced a software-defined networking architecture: a cloud-native approach that separates the central control plane from the data plane of load balancers. This enabled centralized configuration, management, governance, and monitoring of load balancers across a distributed infrastructure fabric spanning on-premises, hybrid, and public cloud environments. The architecture enables intelligent automation, autoscaling, centralized policy management, and observability of applications from a central console, no matter where they run.
As of 2016, Webscale launched a cloud-based application delivery platform, and today cloud-based application delivery solutions are common.