
Answer-first summary for fast verification
Answer: Store forecast locations in an Amazon ES cluster. Use an Amazon CloudFront distribution targeting an API Gateway endpoint with AWS Lambda functions responding to queries as the origin. Create an Amazon Lambda@Edge function that caches the data locally at edge locations for 15 minutes.
Option C is the best choice for the following reasons:

1. **Amazon ES Cluster for Storage**: Amazon Elasticsearch Service (ES) allows for efficient search, analysis, and visualization of the forecast data. It is designed to handle large datasets with low-latency access, which is critical for an application needing quick retrieval of weather data.
2. **AWS Lambda for Query Responses**: AWS Lambda functions provide scalable, on-demand compute to handle incoming queries. This simplifies management compared to maintaining an EC2 fleet, since there are no instances to provision or scale.
3. **Amazon API Gateway**: API Gateway is a managed service for creating, publishing, maintaining, monitoring, and securing RESTful APIs. It integrates Lambda functions with HTTP endpoints efficiently, handling the distribution and scaling of API requests.
4. **Amazon CloudFront**: CloudFront, as a CDN, caches the forecast data and serves it from locations closer to end users, reducing latency and improving access times around the globe.
5. **Lambda@Edge for Caching**: Lambda@Edge extends AWS Lambda to CloudFront, executing functions at edge locations. Caching the forecast data locally at edge locations for 15 minutes means repeated requests are served from the cache, significantly reducing load on the origin and improving response times during high-traffic periods.

The combination of these services addresses both the high request rate and the stringent response time requirement by leveraging effective caching, scalable compute, and efficient data retrieval.
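To illustrate point 5, here is a minimal sketch of what the edge-caching function could look like. It assumes a Python Lambda@Edge handler attached to the CloudFront origin-response trigger; the handler name and the exact event shape shown are illustrative, not taken from the question:

```python
def handler(event, context):
    """Origin-response trigger: stamp a 15-minute cache lifetime on forecast responses.

    CloudFront honors the Cache-Control max-age when deciding how long to keep
    the object at the edge, so subsequent requests within 900 seconds are served
    from the edge cache without reaching the API Gateway/Lambda origin.
    """
    # CloudFront delivers the origin's response inside the first event record.
    response = event["Records"][0]["cf"]["response"]
    headers = response["headers"]

    # 15 minutes = 900 seconds, matching the forecast update interval.
    headers["cache-control"] = [
        {"key": "Cache-Control", "value": "public, max-age=900"}
    ]
    return response
```

Because each forecast update overwrites the previous data on a 15-minute cycle, a 900-second `max-age` keeps edge caches in step with the update interval: users never see data older than one forecast cycle, while the origin absorbs only a small fraction of the 1,400+ requests per second.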
Author: LeetQuiz Editorial Team
A business has an application that produces a weather forecast every 15 minutes at a resolution of 1 billion distinct locations, each around 20 bytes in size (about 20 GB per forecast). The forecast data is accessed worldwide roughly 5 million times per hour (about 1,400 requests per second), and up to ten times more often during severe weather events. Each update overwrites the previous forecast data. Users of the current weather forecast application expect responses to queries in less than two seconds. Which architecture satisfies the specified request rate and response time requirements?
A
Store forecast locations in an Amazon ES cluster. Use an Amazon CloudFront distribution targeting an Amazon API Gateway endpoint with AWS Lambda functions responding to queries as the origin. Enable API caching on the API Gateway stage with a cache-control timeout set for 15 minutes.
B
Store forecast locations in an Amazon EFS volume. Create an Amazon CloudFront distribution that targets an Elastic Load Balancing group of an Auto Scaling fleet of Amazon EC2 instances that have mounted the Amazon EFS volume. Set the cache-control timeout for 15 minutes in the CloudFront distribution.
C
Store forecast locations in an Amazon ES cluster. Use an Amazon CloudFront distribution targeting an API Gateway endpoint with AWS Lambda functions responding to queries as the origin. Create an Amazon Lambda@Edge function that caches the data locally at edge locations for 15 minutes.
D
Store forecast locations in Amazon S3 as individual objects. Create an Amazon CloudFront distribution targeting an Elastic Load Balancing group of an Auto Scaling fleet of EC2 instances, querying the origin of the S3 object. Set the cache-control timeout for 15 minutes in the CloudFront distribution.