
**Answer-first summary for fast verification**
Answer: Configure Amazon S3 to send an event notification to an Amazon Simple Queue Service (Amazon SQS) queue. Use an AWS Lambda function to read from the queue and process the data. Store the resulting JSON file in Amazon DynamoDB.
## Explanation

**Option C is the correct answer** because it meets all the requirements with the LEAST operational overhead.

### Key Requirements Analysis

1. **Small files uploaded to S3** - only simple processing is needed
2. **Process as quickly as possible** - an event-driven architecture is needed
3. **Variable demand** - automatic scaling is needed
4. **LEAST operational overhead** - serverless services are preferred

### Why Option C Is Best

- **AWS Lambda**: serverless compute that scales automatically with demand (many files on some days, few on others)
- **S3 event notifications + SQS**: reliable event-driven processing, with the queue providing durability
- **DynamoDB**: serverless NoSQL database well suited to storing JSON items
- **Minimal operational overhead**: no servers to manage and no capacity planning

### Why the Other Options Are Not Optimal

**Option A (Amazon EMR)**
- EMR is designed for big data processing, not simple per-file transformations
- Requires cluster management and scaling
- High operational overhead for a simple task
- Not event-driven; it would require polling or scheduled jobs

**Option B (EC2 instances)**
- Requires managing EC2 instances (scaling, patching, monitoring)
- Higher operational overhead than serverless
- Auto scaling logic must be implemented and tuned

**Option D (EventBridge + Kinesis + Lambda + Aurora)**
- More complex than needed; EventBridge and Kinesis add unnecessary moving parts
- An Aurora DB cluster has higher operational overhead than DynamoDB
- Kinesis is overkill for simple file processing

### Architecture Benefits of Option C

1. **Event-driven**: S3 publishes a notification to SQS immediately after each upload, and SQS triggers Lambda
2. **Auto-scaling**: Lambda scales automatically with file upload volume
3. **Cost-effective**: pay only for actual processing time
4. **Reliable**: SQS provides message durability and retry capability
5. **Simple**: minimal components, easy to maintain

This solution balances speed, scalability, and minimal operational overhead.
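As a sketch of how the processing step in Option C might look, the Lambda handler below reads the S3 event notifications that SQS delivers in batches, fetches each uploaded object, transforms it, and writes the result to DynamoDB. The `transform` logic, the `ProcessedFiles` table name, and the `pk` key attribute are illustrative assumptions, not part of the question.

```python
import json
import urllib.parse


def transform(body: str) -> dict:
    """Hypothetical one-time transformation: summarize the raw text as JSON fields."""
    lines = body.splitlines()
    return {"line_count": len(lines), "content": lines}


def lambda_handler(event, context):
    # boto3 is imported lazily so transform() stays testable outside AWS.
    import boto3

    s3 = boto3.client("s3")
    table = boto3.resource("dynamodb").Table("ProcessedFiles")  # assumed table name

    for record in event["Records"]:  # one SQS message per element of the batch
        s3_event = json.loads(record["body"])  # message body is the S3 event JSON
        # s3:TestEvent messages have no "Records" key, so default to an empty list.
        for s3_record in s3_event.get("Records", []):
            bucket = s3_record["s3"]["bucket"]["name"]
            # Object keys arrive URL-encoded in event notifications.
            key = urllib.parse.unquote_plus(s3_record["s3"]["object"]["key"])
            obj = s3.get_object(Bucket=bucket, Key=key)
            item = transform(obj["Body"].read().decode("utf-8"))
            item["pk"] = key  # assumed partition key attribute
            table.put_item(Item=item)
```

Because the transformation is a pure function separate from the AWS calls, it can be unit-tested locally without any AWS credentials.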
Author: LeetQuiz Editorial Team
A company is designing an application where users upload small files into Amazon S3. After a user uploads a file, the file requires one-time simple processing to transform the data and save the data in JSON format for later analysis.
Each file must be processed as quickly as possible after it is uploaded. Demand will vary. On some days, users will upload a high number of files. On other days, users will upload a few files or no files.
Which solution meets these requirements with the LEAST operational overhead?
**A.** Configure Amazon EMR to read text files from Amazon S3. Run processing scripts to transform the data. Store the resulting JSON file in an Amazon Aurora DB cluster.

**B.** Configure Amazon S3 to send an event notification to an Amazon Simple Queue Service (Amazon SQS) queue. Use Amazon EC2 instances to read from the queue and process the data. Store the resulting JSON file in Amazon DynamoDB.

**C.** Configure Amazon S3 to send an event notification to an Amazon Simple Queue Service (Amazon SQS) queue. Use an AWS Lambda function to read from the queue and process the data. Store the resulting JSON file in Amazon DynamoDB.

**D.** Configure Amazon EventBridge (Amazon CloudWatch Events) to send an event to Amazon Kinesis Data Streams when a new file is uploaded. Use an AWS Lambda function to consume the event from the stream and process the data. Store the resulting JSON file in an Amazon Aurora DB cluster.
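For reference, the S3-to-SQS wiring in Option C comes down to a bucket notification configuration like the one below; the queue ARN and the `.txt` suffix filter are illustrative assumptions. The queue's access policy must also allow `s3.amazonaws.com` to send messages to it.

```json
{
  "QueueConfigurations": [
    {
      "QueueArn": "arn:aws:sqs:us-east-1:123456789012:file-processing-queue",
      "Events": ["s3:ObjectCreated:*"],
      "Filter": {
        "Key": {
          "FilterRules": [
            { "Name": "suffix", "Value": ".txt" }
          ]
        }
      }
    }
  ]
}
```

With this in place, an SQS event source mapping on the Lambda function completes the pipeline: Lambda polls the queue on your behalf and invokes the function with batches of messages, so no custom polling code is needed.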