
Answer-first summary for fast verification
Answer: Import the new records from the CSV file into a new BigQuery table. Create a BigQuery job that merges the new records with the existing records and writes the results to a new BigQuery table.
The correct answer is D. BigQuery DML statements such as UPDATE are subject to quotas because the engine is optimized for bulk, set-based operations rather than large volumes of individual row mutations; attempting to apply 1 million updates through repeated UPDATE statements triggers the quotaExceeded error. Instead, load the new records from the CSV file into a separate BigQuery table, then run a single job that merges the new records with the existing records and writes the results to a new table. A single merge job applies all the changes at once, staying within DML quota limits while handling the full dataset efficiently.
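As a sketch of this approach: assuming the existing data lives in a table `mydataset.customers`, the CSV has already been loaded into a staging table `mydataset.customer_updates` (e.g. via a load job), and rows are keyed by a hypothetical `customer_id` column (all of these names are illustrative, not from the question), the bulk update can be expressed as one MERGE job:

```sql
-- One MERGE counts as a single DML statement no matter how many rows
-- it touches, so all 1 million updates fit in one job.
MERGE `mydataset.customers` AS existing
USING `mydataset.customer_updates` AS updates
ON existing.customer_id = updates.customer_id
WHEN MATCHED THEN
  -- Overwrite the columns the marketing team refreshed
  UPDATE SET existing.email = updates.email,
             existing.segment = updates.segment
WHEN NOT MATCHED THEN
  -- Insert any genuinely new customers from the CSV
  INSERT (customer_id, email, segment)
  VALUES (updates.customer_id, updates.email, updates.segment);
```

The column names (`email`, `segment`) are assumed for illustration; the key point is that the row-by-row UPDATE workload collapses into a single set-based statement.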
Author: LeetQuiz Editorial Team
The marketing team at your organization is responsible for providing regular updates to a segment of your customer dataset. Recently, they provided you with a CSV file containing 1 million customer records that need to be updated in BigQuery. However, when you attempt to use the UPDATE statement in BigQuery to incorporate these records, you encounter a quotaExceeded error. What steps should you take to resolve this issue?
A
Reduce the number of records updated each day to stay within the BigQuery UPDATE DML statement limit.
B
Increase the BigQuery UPDATE DML statement limit in the Quota management section of the Google Cloud Platform Console.
C
Split the source CSV file into smaller CSV files in Cloud Storage to reduce the number of BigQuery UPDATE DML statements per BigQuery job.
D
Import the new records from the CSV file into a new BigQuery table. Create a BigQuery job that merges the new records with the existing records and writes the results to a new BigQuery table.