Introduction
In today’s digital landscape, ensuring the security and performance of web applications is paramount. To achieve optimal protection against cyber threats, organizations deploy web application and API protection (WAAP) platforms like Wallarm. To get the most out of Wallarm, however, organizations can deploy filtering nodes as close to the client as possible using Amazon’s global infrastructure (EC2 instances, Route 53, CloudFront, and Lambda functions), which significantly enhances performance. In this blog post, we’ll explore the performance advantages of this strategic deployment approach and delve into how it leverages Amazon’s robust services. While this article focuses on Amazon, the same design principles can be applied to any cloud provider.
Deployment Architecture
Efficient Traffic Routing with Route 53
Amazon Route 53, Amazon’s scalable and highly available DNS service, plays a crucial role in optimizing the performance of Wallarm filtering nodes. By leveraging Route 53’s intelligent traffic routing capabilities, organizations can direct client requests to the nearest Wallarm filtering node. This intelligent routing reduces latency and ensures efficient utilization of resources, resulting in enhanced performance.
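Conceptually, latency-based routing selects the endpoint with the lowest measured latency for each client. The sketch below models that selection client-side in plain Python; the per-region latency values are hypothetical placeholders, not measurements, and the actual selection is performed by Route 53 on the server side.

```python
# Minimal sketch of latency-based endpoint selection, analogous to
# Route 53's latency routing policy. Latency values are hypothetical.

MEASURED_LATENCY_MS = {
    "us-east-1": 82.0,       # filtering node in N. Virginia
    "eu-west-1": 14.0,       # filtering node in Ireland
    "ap-southeast-1": 210.0, # filtering node in Singapore
}

def nearest_node(latencies: dict) -> str:
    """Return the region whose filtering node has the lowest observed latency."""
    return min(latencies, key=latencies.get)

print(nearest_node(MEASURED_LATENCY_MS))  # prints "eu-west-1"
```

In production, Route 53 maintains these latency measurements itself; you simply create latency-based record sets pointing at the filtering node in each region.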
Accelerated Content Delivery with CloudFront
Amazon CloudFront, a global content delivery network (CDN), enhances the performance and availability of web applications. By integrating Wallarm filtering nodes with CloudFront, organizations can distribute content and filter traffic closer to end-users worldwide. This integration enables faster response times, reduced load on the origin servers, and improved overall application performance.
Scalable Security with Lambda Functions
AWS Lambda, a serverless computing service, provides a powerful tool for enhancing Wallarm’s performance. By leveraging Lambda functions, organizations can dynamically scale Wallarm filtering capacity based on demand. This elasticity ensures that filtering capacity aligns with the application’s needs, optimizing performance during peak traffic periods while minimizing costs during low-traffic times.
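One common way to wire CloudFront into a filtering tier is a Lambda@Edge function on the request path that rewrites the request toward a filtering node. The sketch below shows the mechanism only; the filtering node hostname is a hypothetical placeholder, and a real Wallarm deployment would follow Wallarm’s own integration guidance rather than this exact handler.

```python
# Sketch of a Lambda@Edge request handler that routes traffic through a
# filtering tier by rewriting the Host header. The domain below is a
# hypothetical placeholder, not a real Wallarm endpoint.

FILTERING_NODE_DOMAIN = "filter.example.internal"  # assumption: your node's hostname

def handler(event, context):
    request = event["Records"][0]["cf"]["request"]

    # Preserve the originally requested host so the filtering node can
    # forward the inspected traffic to the correct upstream application.
    original_host = request["headers"]["host"][0]["value"]
    request["headers"]["x-forwarded-host"] = [
        {"key": "X-Forwarded-Host", "value": original_host}
    ]

    # Point the request at the filtering node instead of the default origin.
    request["headers"]["host"] = [
        {"key": "Host", "value": FILTERING_NODE_DOMAIN}
    ]
    return request
```

Because Lambda@Edge replicates the function to CloudFront edge locations automatically, this path scales with traffic without any capacity planning on your part.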
Global Coverage for Enhanced Security
Deploying Wallarm filtering nodes across Amazon’s global infrastructure provides organizations with extensive coverage for web application security. With a presence in multiple regions, organizations can achieve a distributed and redundant architecture, safeguarding against single points of failure and ensuring uninterrupted protection against evolving threats. This global coverage also contributes to improved performance by reducing the distance between clients and filtering nodes.
Performance Implications
Latency Reduction with CloudFront and Localized EC2 Instances
Amazon Elastic Compute Cloud (EC2) instances provide scalable compute resources around the world. By deploying Wallarm filtering nodes in various Amazon regions, organizations can significantly reduce latency without compromising security. This approach ensures that traffic is processed and filtered closest to the client, minimizing the round-trip time and improving overall application performance.
In the ideal baseline, the client connection is in the same region as the entire application or API packet route.
Your CDN solution, in this case CloudFront, optimizes connections and caches frequently accessed content, improving the overall client experience. However, the compute for applications and APIs is typically not located in the same region as the client, so latency is often roughly doubled, even when the filtering node is deployed in the same infrastructure as the application or API. Because packets must be mirrored for analysis, where that mirrored traffic is sent directly affects the total packet transit time.
A Wallarm Security Edge deployment brings the initial traffic analysis and real-time protection closest to the client in the same way your CDN solution does.
When end-to-end encryption is required, the added latency of the TLS operations must also be considered. We typically see an added latency of 8-10 ms when TLS is applied to all connections.
The table below outlines the testing results we observed in the scenarios outlined above within Amazon’s Global Infrastructure.
| Regional Location | Traffic Route | Avg Latency |
| --- | --- | --- |
| Application and client in the same region | Baseline without CloudFront Lambda@Edge | 5-6 ms |
| Filtering nodes in region with application and client | Wallarm with CloudFront Lambda@Edge | +5-10 ms |
| Filtering nodes in region with application and client + TLS | Wallarm with CloudFront Lambda@Edge | +13-20 ms |
| Application out of region from client | Baseline without CloudFront Lambda@Edge | 50-60 ms |
| Filtering nodes and application out of region from client | Wallarm with CloudFront Lambda@Edge | +50-60 ms |
| Filtering nodes and application out of region from client + TLS | Wallarm with CloudFront Lambda@Edge | +58-70 ms |
| Filtering nodes in region with client | Wallarm with CloudFront Lambda@Edge | +5-10 ms |
| Filtering nodes in region with client + TLS | Wallarm with CloudFront Lambda@Edge | +13-20 ms |
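The overheads in these measurements compose additively: a regional baseline, plus the filtering hop, plus TLS when end-to-end encryption is required. A minimal sketch of that budget, using illustrative midpoint values from the ranges above:

```python
# Rough latency budget model for one request, consistent with the
# measurement ranges discussed above. All values are illustrative
# milliseconds, not guarantees.

def latency_budget(baseline_ms, filtering_overhead_ms=0.0, tls_overhead_ms=0.0):
    """Total expected round-trip latency: baseline + filtering hop + TLS."""
    return baseline_ms + filtering_overhead_ms + tls_overhead_ms

# In-region baseline of ~5 ms, edge filtering adding ~7.5 ms (midpoint
# of the +5-10 ms range), and TLS adding ~9 ms (midpoint of +8-10 ms).
in_region = latency_budget(5, filtering_overhead_ms=7.5)
in_region_tls = latency_budget(5, filtering_overhead_ms=7.5, tls_overhead_ms=9)
print(in_region, in_region_tls)  # prints "12.5 21.5"
```

The key takeaway is that the filtering and TLS overheads are roughly constant, so placing the filtering node in-region with the client keeps the dominant baseline term small.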
Conclusion
In today’s fast-paced digital world, optimizing web application performance while ensuring robust security is critical. By deploying Wallarm filtering nodes closest to the client using Amazon’s global infrastructure, organizations can achieve significant performance advantages. Leveraging EC2 instances, Route 53, CloudFront, and Lambda functions enables reduced latency, efficient traffic routing, accelerated content delivery, and scalable security. This strategic approach not only enhances application performance but also provides organizations with comprehensive security coverage in a rapidly evolving threat landscape. By harnessing the power of Wallarm and Amazon’s services, organizations can achieve a secure and high-performing web application ecosystem.
The post Maximizing Performance with Wallarm Filtering Nodes in Amazon’s Global Infrastructure appeared first on Wallarm.