
Your company is migrating its on-premises data warehouse to Google BigQuery. The existing warehouse uses trigger-based change data capture (CDC) to update data from various transactional database sources on a daily schedule. A goal of the migration is to enhance the CDC process so that changes to the source systems are reflected in BigQuery in near real time using log-based CDC streams. In addition, the way changes are applied to the data warehouse should be optimized to improve performance and reduce compute overhead. Which two steps should you take to propagate changes to the BigQuery reporting table with minimal latency while also reducing the compute resources required? (Choose two.)
A
Perform a DML INSERT, UPDATE, or DELETE to replicate each individual CDC record in real time directly on the reporting table.
B
Insert each new CDC record and its corresponding operation type into a staging table in real time.
C
Periodically DELETE outdated records from the reporting table.
D
Periodically use a DML MERGE to perform several DML INSERT, UPDATE, and DELETE operations at the same time on the reporting table.
E
Insert each new CDC record and its corresponding operation type into the reporting table in real time, and use a materialized view to expose only the newest version of each unique record.
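
The staging-table-plus-periodic-MERGE pattern described in options B and D can be sketched in BigQuery SQL roughly as follows. All project, dataset, table, and column names here (`my_project.dwh.cdc_staging`, `reporting`, `id`, `op_type`, `change_ts`, `col1`, `col2`) are hypothetical placeholders, not names from the question:

```sql
-- Sketch only: streamed CDC rows land in a staging table with the source
-- key, the payload columns, an operation type ('I', 'U', or 'D'), and a
-- change timestamp. A scheduled job then applies them in one MERGE pass.
MERGE `my_project.dwh.reporting` AS r
USING (
  -- Deduplicate: keep only the latest change per key from staging.
  SELECT * EXCEPT (rn)
  FROM (
    SELECT s.*,
           ROW_NUMBER() OVER (PARTITION BY id ORDER BY change_ts DESC) AS rn
    FROM `my_project.dwh.cdc_staging` AS s
  )
  WHERE rn = 1
) AS c
ON r.id = c.id
WHEN MATCHED AND c.op_type = 'D' THEN
  DELETE
WHEN MATCHED AND c.op_type IN ('I', 'U') THEN
  UPDATE SET col1 = c.col1, col2 = c.col2
WHEN NOT MATCHED AND c.op_type IN ('I', 'U') THEN
  INSERT (id, col1, col2) VALUES (c.id, c.col1, c.col2);
```

Batching many inserts, updates, and deletes into a single periodic MERGE avoids the per-row DML cost of option A while keeping ingestion latency low, since the raw changes still arrive in the staging table in real time.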