
Given a scenario where you need to normalize and parse logs from various AWS services to correlate events and identify potential security threats, which approach would you take? Consider the log formats from CloudTrail, VPC Flow Logs, and ELB access logs.
A. Use AWS Glue to create a data catalog and ETL jobs for normalization and parsing, then store the results in Amazon S3 for correlation.
B. Use Amazon Elasticsearch Service (now Amazon OpenSearch Service) with Logstash for real-time parsing and normalization, then use Kibana for correlation and visualization.
C. Use AWS Lambda functions to parse and normalize logs in real time, then store the results in Amazon DynamoDB for correlation.
D. Use Amazon Redshift for batch processing of log data, then use SQL queries for normalization and correlation.
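Whichever option is chosen, the core work is the same: map each service's native format (space-delimited VPC Flow Log records, JSON CloudTrail events, space-delimited ELB access log lines) onto one common schema so events can be joined on shared keys such as source IP and timestamp. The sketch below illustrates that normalization step for two of the three formats; the sample records are synthetic, and the common schema (`source`, `timestamp`, `src_ip`, `dst_ip`, `action`) is a hypothetical choice for illustration, not a prescribed AWS schema.

```python
import json
from datetime import datetime, timezone

def normalize_vpc_flow(line):
    """Parse one default-format (version 2) VPC Flow Log record.

    Field positions follow the default v2 layout:
    version account-id interface-id srcaddr dstaddr srcport dstport
    protocol packets bytes start end action log-status
    """
    f = line.split()
    return {
        "source": "vpc_flow",
        "timestamp": datetime.fromtimestamp(int(f[10]), tz=timezone.utc).isoformat(),
        "src_ip": f[3],
        "dst_ip": f[4],
        "action": f[12],  # ACCEPT or REJECT
    }

def normalize_cloudtrail(record):
    """Flatten one CloudTrail event record into the same schema."""
    return {
        "source": "cloudtrail",
        "timestamp": record["eventTime"],          # already ISO 8601
        "src_ip": record.get("sourceIPAddress"),
        "dst_ip": None,                            # not applicable to API events
        "action": record["eventName"],
    }

# Synthetic sample records for illustration
flow_line = ("2 123456789010 eni-abc123de 172.31.16.139 172.31.16.21 "
             "20641 22 6 20 4249 1418530010 1418530070 REJECT OK")
trail_record = {
    "eventTime": "2014-12-14T04:06:50Z",
    "eventName": "ConsoleLogin",
    "sourceIPAddress": "172.31.16.139",
}

events = [normalize_vpc_flow(flow_line), normalize_cloudtrail(trail_record)]

# Correlation step: group normalized events by source IP so an address
# that appears in multiple log feeds (e.g. a rejected SSH attempt plus a
# console login) surfaces as a single suspicious entity.
by_ip = {}
for e in events:
    by_ip.setdefault(e["src_ip"], []).append(e["source"])
print(by_ip["172.31.16.139"])  # feeds in which this IP was observed
```

In a Lambda-based pipeline (option C) each function would apply one of these normalizers to its triggering log batch; in a Glue-based pipeline (option A) the same mapping logic would live in the ETL job instead.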