
Answer-first summary for fast verification
Answer: Use AWS Glue to create a data catalog and ETL jobs for normalization and parsing, then store the results in Amazon S3 for correlation.
Option A is the most appropriate. The three log formats differ substantially: CloudTrail writes JSON events, VPC Flow Logs are space-delimited text, and ELB access logs are space-delimited with quoted request fields. AWS Glue crawlers can infer a schema for each format and register it in the Data Catalog, and Glue ETL jobs can transform all three into a common schema. Storing the normalized output in Amazon S3 then allows events to be correlated across services with standard analytics tools such as Amazon Athena. Option B is better suited to real-time dashboards than to batch normalization, option C makes DynamoDB an awkward fit for the ad hoc joins that correlation requires, and option D pushes parsing work onto Redshift, which is not designed for it.
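The normalization step that a Glue ETL job would perform can be illustrated with a plain-Python sketch. The field layouts below follow the documented default formats (VPC Flow Logs version 2, ALB access logs); the sample records, IP addresses, and the two-or-more-sources heuristic are made-up illustrations, not real data or a production detection rule:

```python
from collections import defaultdict
from datetime import datetime, timezone

def normalize_cloudtrail(event):
    """CloudTrail delivers JSON events; extract a common subset of fields."""
    return {
        "source": "cloudtrail",
        "timestamp": event["eventTime"],
        "src_ip": event.get("sourceIPAddress"),
        "detail": event.get("eventName"),
    }

def normalize_vpc_flow(line):
    """VPC Flow Logs (default format) are space-delimited:
    version account-id interface-id srcaddr dstaddr srcport dstport
    protocol packets bytes start end action log-status."""
    f = line.split()
    ts = datetime.fromtimestamp(int(f[10]), tz=timezone.utc)
    return {
        "source": "vpc_flow",
        "timestamp": ts.isoformat(),
        "src_ip": f[3],
        "detail": f[12],  # ACCEPT / REJECT
    }

def normalize_elb(line):
    """ALB access logs are space-delimited with quoted request fields;
    only the leading fixed-position fields are used here."""
    f = line.split()
    return {
        "source": "elb",
        "timestamp": f[1],
        "src_ip": f[3].split(":")[0],  # client:port -> client IP
        "detail": f[8],                # ELB status code
    }

# Made-up sample records, one per format.
events = [
    normalize_cloudtrail({"eventTime": "2024-01-15T10:00:00Z",
                          "sourceIPAddress": "203.0.113.7",
                          "eventName": "ConsoleLogin"}),
    normalize_vpc_flow("2 123456789012 eni-abc123 203.0.113.7 10.0.0.5 "
                       "49152 443 6 10 840 1705312800 1705312860 REJECT OK"),
    normalize_elb('http 2024-01-15T10:01:00.000000Z my-lb 203.0.113.7:54321 '
                  '10.0.0.5:80 0.001 0.002 0.000 403 403 100 200 '
                  '"GET http://example.com/ HTTP/1.1"'),
]

# Correlation: once every record shares one schema, an IP that shows up
# across several services becomes trivial to spot.
by_ip = defaultdict(set)
for ev in events:
    by_ip[ev["src_ip"]].add(ev["source"])

suspicious = {ip for ip, sources in by_ip.items() if len(sources) >= 2}
print(suspicious)  # -> {'203.0.113.7'}
```

In the Glue version of this pipeline the same mapping would run as a PySpark job at scale, with the output written back to S3 so that Athena can run the correlation query instead of the in-memory loop shown here.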
Author: LeetQuiz Editorial Team
Given a scenario where you need to normalize and parse logs from various AWS services to correlate events and identify potential security threats, which approach would you take? Consider the log formats from CloudTrail, VPC Flow Logs, and ELB access logs.
A
Use AWS Glue to create a data catalog and ETL jobs for normalization and parsing, then store the results in Amazon S3 for correlation.
B
Use Amazon Elasticsearch with Logstash for real-time parsing and normalization, then use Kibana for correlation and visualization.
C
Use AWS Lambda functions to parse and normalize logs in real-time, then store the results in Amazon DynamoDB for correlation.
D
Use Amazon Redshift for batch processing of log data, then use SQL queries for normalization and correlation.