
You have uploaded 5 years of application log data to Google Cloud Storage for further analysis and storage. Recently, a user reported that certain data points within these logs fall outside the expected operational ranges, suggesting the presence of errors in the dataset. Your task is to identify and correct these anomalies, ensure the integrity of future log updates, and maintain the original log data for compliance and auditing purposes. What steps should you take to achieve these objectives?
A
Import the data from Cloud Storage into BigQuery. Create a new BigQuery table, and skip the rows with errors.
B
Create a Compute Engine instance and make a new copy of the data in Cloud Storage, skipping the rows with errors.
C
Create a Dataflow pipeline that reads the data from Cloud Storage, checks for values outside the expected range, sets the value to an appropriate default, and writes the updated records to a new dataset in Cloud Storage.
D
Create a Dataflow pipeline that reads the data from Cloud Storage, checks for values outside the expected range, sets the value to an appropriate default, and writes the updated records to the same dataset in Cloud Storage.
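
For context, options C and D describe a read-validate-write pipeline run on Dataflow. The sketch below shows one way such a pipeline could look using the Apache Beam Python SDK; the log field name, expected range, default value, project ID, and bucket paths are illustrative assumptions, not details given in the question.

```python
# Minimal sketch of the pipeline described in options C and D.
# All field names, ranges, and GCS paths are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical operational range and default for a numeric log field.
EXPECTED_MIN, EXPECTED_MAX = 0, 10_000
DEFAULT_VALUE = 0


def clamp_out_of_range(line):
    """Parse a JSON log line and replace out-of-range values with a default."""
    record = json.loads(line)
    value = record.get("response_time_ms")  # hypothetical field name
    if value is None or not (EXPECTED_MIN <= value <= EXPECTED_MAX):
        record["response_time_ms"] = DEFAULT_VALUE
    return json.dumps(record)


def run():
    options = PipelineOptions(
        runner="DataflowRunner",            # run on the Dataflow service
        project="my-project",               # hypothetical project ID
        region="us-central1",
        temp_location="gs://my-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadLogs" >> beam.io.ReadFromText("gs://my-bucket/logs/*.json")
            | "FixOutOfRange" >> beam.Map(clamp_out_of_range)
            # Writing to a new prefix (as in option C) leaves the original
            # logs untouched for compliance and auditing.
            | "WriteCleaned" >> beam.io.WriteToText("gs://my-bucket/logs_cleaned/part")
        )


if __name__ == "__main__":
    run()
```

The only difference between options C and D is the write destination: writing the corrected records to a new location preserves the original log data, whereas writing back to the same dataset would overwrite it.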