How should a solutions architect handle the transmission and processing of over 30 TB of daily clickstream data for a company with more than 300 global websites and applications?
A
Utilize AWS Data Pipeline for data archiving to S3 and Amazon EMR for analytics generation.
B
Employ an Auto Scaling group of EC2 instances for data processing, directing output to an S3 data lake for Redshift analysis.
C
Leverage Amazon CloudFront for data caching, with S3 for storage, and AWS Lambda for processing upon new S3 object addition.
D
Use Amazon Kinesis Data Streams for data collection, Kinesis Data Firehose for data transmission to S3, and Redshift for data analysis.
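The Kinesis-based pipeline in option D (Kinesis Data Streams → Kinesis Data Firehose → S3 → Redshift) is the pattern commonly used for high-volume, multi-source clickstream ingestion. As a minimal sketch of the producer side, the helper below shapes events into `PutRecords` entries, batching at Kinesis's 500-records-per-call limit and partitioning by a hypothetical `site_id` field so the 300+ websites spread across shards (field names and batch contents are illustrative, not taken from the question):

```python
import json

def build_put_records_batch(events, max_batch=500):
    """Shape clickstream events into Kinesis PutRecords entries.

    Kinesis Data Streams accepts at most 500 records per PutRecords
    call, so events are chunked into batches of that size. Using the
    (hypothetical) site_id as the partition key distributes traffic
    from many websites across the stream's shards.
    """
    batches = []
    current = []
    for event in events:
        entry = {
            "Data": json.dumps(event).encode("utf-8"),
            "PartitionKey": event["site_id"],
        }
        current.append(entry)
        if len(current) == max_batch:
            batches.append(current)
            current = []
    if current:
        batches.append(current)
    return batches

# Example: 7 events with a batch limit of 5 yield 2 batches.
events = [{"site_id": f"site-{i % 3}", "page": "/home"} for i in range(7)]
batches = build_put_records_batch(events, max_batch=5)
```

Each batch would then be passed to the Kinesis client's `put_records` call; Firehose consumes the stream and delivers buffered objects to S3, where Redshift can load them for analysis.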