
Answer-first summary for fast verification
Answer: Configure Organizational Log Sinks to export logs to a Cloud Pub/Sub Topic, which will be sent to the SIEM via Dataflow.
Option C is correct because it follows Google's recommended best practice for reliably exporting Google Cloud logs to an on-premises SIEM. An aggregated (organization-level) log sink routes logs from all child projects to a Cloud Pub/Sub topic, which provides durable, scalable message queuing; a Dataflow pipeline then streams the logs to the SIEM, commonly via the Splunk HTTP Event Collector. Google documents this exact pattern for Splunk integration, and Pub/Sub's at-least-once delivery combined with Dataflow's retry and dead-letter handling ensures reliable delivery. Option A (syslog) is less reliable for cross-environment delivery and does not scale well. Option B (BigQuery) introduces latency and is designed for batch analytics, not real-time SIEM ingestion. Option D (polling the REST APIs) would be inefficient, generate heavy API load, and lack the delivery guarantees of the Pub/Sub/Dataflow pipeline. The community discussion strongly supports C, with unanimous consensus and references to Google's official documentation.
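The pipeline described in option C can be sketched with gcloud roughly as follows. All IDs, names, and the Splunk HEC URL/token below are placeholders (the Cloud_PubSub_to_Splunk Dataflow template itself is Google-provided):

```shell
# Sketch only: ORG_ID, PROJECT, TOPIC, and the Splunk values are placeholders.
ORG_ID=123456789012
PROJECT=central-logging
TOPIC=siem-logs

# 1. Create the Pub/Sub topic and a pull subscription for Dataflow to read.
gcloud pubsub topics create "$TOPIC" --project="$PROJECT"
gcloud pubsub subscriptions create "${TOPIC}-sub" \
  --topic="$TOPIC" --project="$PROJECT"

# 2. Create an aggregated (organization-level) sink that routes logs from
#    all child projects to the topic.
gcloud logging sinks create siem-sink \
  "pubsub.googleapis.com/projects/${PROJECT}/topics/${TOPIC}" \
  --organization="$ORG_ID" --include-children

# 3. Grant the sink's writer identity (printed by the previous command)
#    permission to publish to the topic.
gcloud pubsub topics add-iam-policy-binding "$TOPIC" --project="$PROJECT" \
  --member="serviceAccount:SINK_WRITER_IDENTITY" \
  --role="roles/pubsub.publisher"

# 4. Launch Google's Pub/Sub-to-Splunk Dataflow template to stream the logs
#    to the on-premises SIEM over the Splunk HTTP Event Collector.
gcloud dataflow jobs run pubsub-to-splunk \
  --gcs-location=gs://dataflow-templates/latest/Cloud_PubSub_to_Splunk \
  --region=us-central1 \
  --parameters="inputSubscription=projects/${PROJECT}/subscriptions/${TOPIC}-sub,url=https://splunk.example.com:8088,token=SPLUNK_HEC_TOKEN"
```

Note that the organization sink's writer identity is created when the sink is created, so the IAM binding in step 3 must use the service account reported in the sink-create output.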
Author: LeetQuiz Editorial Team
What is the recommended method for reliably exporting Google Cloud Platform (Google Cloud) logs to an on-premises SIEM system?
A. Send all logs to the SIEM system via an existing protocol such as syslog.
B. Configure every project to export all their logs to a common BigQuery dataset, which will be queried by the SIEM system.
C. Configure Organizational Log Sinks to export logs to a Cloud Pub/Sub Topic, which will be sent to the SIEM via Dataflow.
D. Build a connector for the SIEM to query for all logs in real time from the GCP RESTful JSON APIs.