For the TerramEarth case study, a new architecture writes all incoming data to BigQuery. You have observed that the data is dirty and want to implement an automated daily process to ensure data quality while managing cost. What should you do?
A. Set up a streaming Cloud Dataflow job that receives data from the ingestion process, and clean the data in the Cloud Dataflow pipeline.
B. Create a Cloud Function that reads data from BigQuery and cleans it. Trigger the Cloud Function from a Compute Engine instance.
C. Create a SQL statement on the data in BigQuery, and save it as a view. Run the view daily, and save the result to a new table.
D. Use Cloud Dataprep and configure the BigQuery tables as the source. Schedule a daily job to clean the data.
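For context on what the "run a cleaning query daily and save the result to a new table" approach in option C might look like, here is a minimal Python sketch using the `google-cloud-bigquery` client. All project, dataset, table, and column names (`terramearth-demo`, `telemetry.raw_events`, `event_id`, `ingest_time`) are hypothetical, as is the specific cleaning logic; the case study does not specify a schema.

```python
from google.cloud import bigquery

# Hypothetical identifiers for illustration only.
PROJECT = "terramearth-demo"
SOURCE = f"{PROJECT}.telemetry.raw_events"
DEST = f"{PROJECT}.telemetry.clean_events"

client = bigquery.Client(project=PROJECT)

# Example cleaning query: drop rows with a missing ID, then
# deduplicate on event_id, keeping the most recently ingested row.
query = f"""
SELECT * EXCEPT(row_num)
FROM (
  SELECT *,
         ROW_NUMBER() OVER (PARTITION BY event_id
                            ORDER BY ingest_time DESC) AS row_num
  FROM `{SOURCE}`
  WHERE event_id IS NOT NULL
)
WHERE row_num = 1
"""

# Write the cleaned result to a destination table, replacing
# yesterday's output on each daily run.
job_config = bigquery.QueryJobConfig(
    destination=DEST,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)
client.query(query, job_config=job_config).result()  # block until done
```

A script like this could be run daily with BigQuery scheduled queries or Cloud Scheduler. Note the trade-off the question is probing: this keeps everything in SQL but requires you to hand-write the cleaning rules, whereas Cloud Dataprep (option D) provides a managed, visual data-preparation service with built-in scheduling.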