
Answer-first summary for fast verification
Answer: Add an Amazon CloudFront distribution for the static content. Add an Amazon Simple Queue Service (Amazon SQS) queue to receive requests from the website for later processing by the EC2 instances.
## Explanation

**Correct Answer: D**

**Why Option D is correct:**

1. **Amazon CloudFront for static content**: The website is hosted on Amazon S3, which serves static content. CloudFront is a content delivery network (CDN) that caches static content at edge locations, reducing latency and offloading traffic from the origin. This is the standard use of CloudFront for static content.
2. **Amazon SQS for asynchronous processing**: The scenario states that the API has "backend workers that process sales requests asynchronously," which is a natural fit for Amazon SQS (Simple Queue Service). During traffic spikes, sales requests are placed in an SQS queue, and the EC2 instances process them at their own pace. This decouples front-end request handling from backend processing, so no request is lost during a traffic surge.

**Why the other options are incorrect:**

**Option A**:
- CloudFront caches content; it does little for dynamic, per-request sales traffic, which must still reach the API.
- Manually increasing the number of EC2 instances does not provide automatic scaling for sudden spikes, and it still does not guarantee that every request is processed.

**Option B**:
- Correct about CloudFront for static content, but an Auto Scaling group triggered by network traffic reacts only after load has arrived; requests received during the scale-out lag can still be dropped.
- It does not provide the decoupling needed for asynchronous request processing.
- Network traffic metrics may not accurately reflect the actual processing load.

**Option C**:
- CloudFront does not reduce the dynamic API traffic in this scenario.
- ElastiCache is an in-memory cache for database queries and session data, not a traffic-reduction layer for an API; placing it "in front of the ALB" is not a valid architecture.

**Key architectural principles demonstrated:**
- **Decoupling**: SQS separates request submission from request processing.
- **Scalability**: CloudFront absorbs static-content traffic; SQS buffers request spikes.
- **Resilience**: No requests are lost during traffic spikes because they are durably stored in SQS.
- **Cost optimization**: Resources scale with actual processing needs rather than peak traffic.

This solution ensures that all sales requests are received and queued for processing, even during sudden traffic increases, without overwhelming the backend systems.
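The decoupling pattern behind Option D can be sketched locally. The following Python snippet is a minimal illustration that uses the standard-library `queue.Queue` as a stand-in for an SQS queue (in a real deployment the website would call SQS via an SDK such as boto3): the front end enqueues each sales request immediately, and worker threads, mirroring the three EC2 instances, drain the queue at their own pace, so a sudden burst never drops a request.

```python
import queue
import threading

# Stand-in for an SQS queue; in production this would be an
# actual SQS queue accessed through an AWS SDK.
sales_queue = queue.Queue()
processed = []
lock = threading.Lock()

def handle_sales_request(request_id):
    """Front end: accept the request instantly by enqueueing it."""
    sales_queue.put(request_id)

def worker():
    """Backend worker: drain requests at its own pace."""
    while True:
        request_id = sales_queue.get()
        if request_id is None:            # sentinel: shut down
            sales_queue.task_done()
            break
        with lock:
            processed.append(request_id)  # simulate order processing
        sales_queue.task_done()

# Simulate a sudden burst of 1,000 sales requests (a product launch).
for i in range(1000):
    handle_sales_request(i)

# Three workers, mirroring the three EC2 instances in the scenario.
workers = [threading.Thread(target=worker) for _ in range(3)]
for t in workers:
    t.start()

sales_queue.join()          # wait until every request has been handled
for _ in workers:
    sales_queue.put(None)   # stop the workers
for t in workers:
    t.join()

print(len(processed))       # every request survives the burst → 1000
```

The key property the sketch demonstrates is that accepting a request (a fast enqueue) is separated from processing it (slow, capacity-limited work), which is exactly what SQS provides between the website and the EC2 workers.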
Author: LeetQuiz Editorial Team
A company is hosting a three-tier ecommerce application in the AWS Cloud. The company hosts the website on Amazon S3 and integrates the website with an API that handles sales requests. The company hosts the API on three Amazon EC2 instances behind an Application Load Balancer (ALB). The API consists of static and dynamic front-end content along with backend workers that process sales requests asynchronously.
The company is expecting a significant and sudden increase in the number of sales requests during events for the launch of new products.
What should a solutions architect recommend to ensure that all the requests are processed successfully?
A
Add an Amazon CloudFront distribution for the dynamic content. Increase the number of EC2 instances to handle the increase in traffic.
B
Add an Amazon CloudFront distribution for the static content. Place the EC2 instances in an Auto Scaling group to launch new instances based on network traffic.
C
Add an Amazon CloudFront distribution for the dynamic content. Add an Amazon ElastiCache instance in front of the ALB to reduce traffic for the API to handle.
D
Add an Amazon CloudFront distribution for the static content. Add an Amazon Simple Queue Service (Amazon SQS) queue to receive requests from the website for later processing by the EC2 instances.