
Answer-first summary for fast verification
Answer: Use Amazon Kinesis Data Streams for data collection, Kinesis Data Firehose for data transmission to S3, and Redshift for data analysis.
Option D is the correct answer. Amazon Kinesis Data Streams is built for large-scale, real-time ingestion, making it well suited to collecting clickstream data from more than 300 sites at over 30 TB per day. Amazon Kinesis Data Firehose then delivers the stream to an Amazon S3 data lake automatically, with no custom transmission code to manage. Once the data lands in S3, Amazon Redshift can load it (for example with the COPY command) and run analytics at scale. Because all three services are fully managed, the solution scales with load while keeping operational overhead and cost in check. The other options fall short: AWS Data Pipeline with EMR (A) is batch-oriented rather than real-time, an Auto Scaling group of EC2 instances (B) adds undifferentiated server management, and CloudFront (C) is a content delivery network, not a streaming ingestion service.
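As a rough sketch of the producer side (not part of the original question), clickstream events can be batched into Kinesis `put_records` entries like this. The stream name `clickstream` and the event fields are hypothetical, and the actual boto3 call is shown commented out since it requires AWS credentials:

```python
import json


def to_kinesis_records(events):
    """Convert clickstream events into Kinesis put_records entries.

    Partitioning by user_id spreads load across shards while keeping
    each user's events ordered within a single shard.
    """
    return [
        {
            "Data": json.dumps(event).encode("utf-8"),
            "PartitionKey": str(event["user_id"]),  # hypothetical field
        }
        for event in events
    ]


events = [
    {"user_id": 42, "page": "/home", "ts": "2024-01-01T00:00:00Z"},
    {"user_id": 7, "page": "/cart", "ts": "2024-01-01T00:00:01Z"},
]
records = to_kinesis_records(events)

# With boto3 (assumed stream name "clickstream"):
# import boto3
# kinesis = boto3.client("kinesis")
# kinesis.put_records(StreamName="clickstream", Records=records)
```

From there, Firehose (reading the stream as its source) handles buffering and delivery to S3 without further producer-side code.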
Author: LeetQuiz Editorial Team
How should a solutions architect handle the transmission and processing of over 30 TB of daily clickstream data for a company with more than 300 global websites and applications?
A
Utilize AWS Data Pipeline for data archiving to S3 and Amazon EMR for analytics generation.
B
Employ an Auto Scaling group of EC2 instances for data processing, directing output to an S3 data lake for Redshift analysis.
C
Leverage Amazon CloudFront for data caching, with S3 for storage, and AWS Lambda for processing upon new S3 object addition.
D
Use Amazon Kinesis Data Streams for data collection, Kinesis Data Firehose for data transmission to S3, and Redshift for data analysis.