
## Answer

**A.** Create an Amazon Kinesis Data Firehose delivery stream to ingest the alerts. Configure the delivery stream to deliver the alerts to an Amazon S3 bucket. Set up an S3 Lifecycle configuration to transition data to Amazon S3 Glacier after 14 days.
## Detailed Explanation

**Why Option A is correct:**

1. **Operational efficiency**: Kinesis Data Firehose is a fully managed service that automatically scales to the data volume (1 TB/day) with no infrastructure to manage.
2. **High availability**: Kinesis Data Firehose is inherently highly available and durable.
3. **Cost optimization**: There are no EC2 instances to run, which reduces both operational overhead and cost.
4. **Storage strategy**: S3 provides durable, scalable storage with immediate availability for the first 14 days.
5. **Automated archiving**: An S3 Lifecycle rule automatically transitions data to S3 Glacier after 14 days, meeting the archival requirement with no custom code.
6. **Data size**: With 2 KB alerts, Kinesis Data Firehose can efficiently batch records before delivering them to S3.

**Why Option B is incorrect:**

- Requires managing EC2 instances, a load balancer, and custom scripts
- Higher operational overhead and management complexity
- Not serverless; the infrastructure must be patched and maintained
- Less cost-effective because of the EC2 instance costs

**Why Option C is incorrect:**

- OpenSearch Service storage is expensive compared to S3
- Requires manual snapshot management
- Not designed for long-term storage and archival
- Higher operational complexity

**Why Option D is incorrect:**

- The 14-day maximum message retention of an SQS standard queue happens to match the requirement, but everything else works against this design
- Requires custom consumer applications to move data out of the queue
- Higher operational complexity because those consumers must be managed
- Not efficient for large-scale ingestion: at 1 TB/day of 2 KB alerts, consumers would have to process about 500 million messages per day (1 TB ÷ 2 KB)

**Key AWS services used:**

- **Amazon Kinesis Data Firehose**: Serverless data ingestion service
- **Amazon S3**: Durable object storage
- **S3 Lifecycle policies**: Automated data tiering to Glacier
- **Amazon S3 Glacier**: Low-cost archival storage

This solution provides the best balance of operational efficiency and cost optimization, and it meets all requirements without infrastructure management.
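The lifecycle rule and the ingest-rate arithmetic above can be sketched as follows. This is a minimal illustration, not a complete deployment: the rule ID, key prefix, and bucket name are hypothetical, and the payload shape follows the one boto3's `put_bucket_lifecycle_configuration` accepts.

```python
# Sketch of the Option A lifecycle rule: transition objects to Glacier
# after 14 days. Rule ID and key prefix are illustrative assumptions.
lifecycle_configuration = {
    "Rules": [
        {
            "ID": "archive-alerts-after-14-days",  # hypothetical rule name
            "Status": "Enabled",
            "Filter": {"Prefix": "alerts/"},       # assumed key prefix
            "Transitions": [
                {"Days": 14, "StorageClass": "GLACIER"}
            ],
        }
    ]
}

# Back-of-the-envelope ingest rate from the question's numbers:
# 1 TB/day of 2 KB alerts.
alerts_per_day = (1 * 10**12) // (2 * 10**3)   # 500,000,000 alerts per day
alerts_per_second = alerts_per_day / 86_400    # roughly 5,800 alerts per second

# Applying the rule for real would require AWS credentials, e.g.:
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="edge-alerts-bucket",  # hypothetical bucket name
#     LifecycleConfiguration=lifecycle_configuration,
# )
```

The arithmetic makes the Option D objection concrete: a consumer fleet would have to keep up with thousands of messages per second around the clock, whereas Firehose batches the same stream into S3 with no consumers to operate.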
Author: LeetQuiz Editorial Team
## Question

A company has thousands of edge devices that collectively generate 1 TB of status alerts each day. Each alert is approximately 2 KB in size. A solutions architect needs to implement a solution to ingest and store the alerts for future analysis.
The company wants a highly available solution. However, the company needs to minimize costs and does not want to manage additional infrastructure. Additionally, the company wants to keep 14 days of data available for immediate analysis and archive any data older than 14 days.
What is the MOST operationally efficient solution that meets these requirements?
**A.** Create an Amazon Kinesis Data Firehose delivery stream to ingest the alerts. Configure the Kinesis Data Firehose stream to deliver the alerts to an Amazon S3 bucket. Set up an S3 Lifecycle configuration to transition data to Amazon S3 Glacier after 14 days.

**B.** Launch Amazon EC2 instances across two Availability Zones and place them behind an Elastic Load Balancer to ingest the alerts. Create a script on the EC2 instances that will store the alerts in an Amazon S3 bucket. Set up an S3 Lifecycle configuration to transition data to Amazon S3 Glacier after 14 days.

**C.** Create an Amazon Kinesis Data Firehose delivery stream to ingest the alerts. Configure the Kinesis Data Firehose stream to deliver the alerts to an Amazon OpenSearch Service (Amazon Elasticsearch Service) cluster. Set up the Amazon OpenSearch Service (Amazon Elasticsearch Service) cluster to take manual snapshots every day and delete data from the cluster that is older than 14 days.

**D.** Create an Amazon Simple Queue Service (Amazon SQS) standard queue to ingest the alerts, and set the message retention period to 14 days. Configure consumers to poll the SQS queue, check the age of the message, and analyze the message data as needed. If the message is 14 days old, the consumer should copy the message to an Amazon S3 bucket and delete the message from the SQS queue.