
Answer-first summary for fast verification
Answer (Option D):
1. Export the BigQuery table to a Cloud Storage bucket that uses the Archive storage class.
2. Set a locked retention policy on the bucket.
3. Create a BigQuery external table over the exported files.
**Why Option D is correct:**

1. **Export to Cloud Storage with the Archive storage class:** Archive is the lowest-cost storage class, intended for data accessed less than once a year, which makes it cost-effective for long-term retention of rarely accessed data.
2. **Set a locked retention policy:** Once locked, the retention policy cannot be removed or shortened, which guarantees the data remains immutable for the full 3 years and prevents deletion or modification.
3. **Create a BigQuery external table:** Lets you keep querying the data with SQL directly from Cloud Storage, without paying for BigQuery table storage.

**Why the other options are incorrect:**

- **A:** Keeping a duplicate copy in BigQuery increases storage costs and provides no immutability guarantee.
- **B:** Table snapshots are designed for point-in-time recovery, not long-term backup, and do not enforce immutability.
- **C:** Object versioning increases storage costs and does not provide the enforced immutability of a locked retention policy.
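The three steps above can be sketched with the `gcloud` and `bq` command-line tools. This is a minimal, hedged outline, not a production script: the project, bucket, dataset, and table names are hypothetical, and a locked retention policy is irreversible, so verify the policy before locking.

```shell
# All resource names below (my-archive-bucket, my_project, my_dataset,
# outdated_table) are hypothetical placeholders.

# 1. Create an Archive-class bucket and export the table to it.
gcloud storage buckets create gs://my-archive-bucket \
    --default-storage-class=ARCHIVE --location=US

bq extract --destination_format=PARQUET \
    'my_project:my_dataset.outdated_table' \
    'gs://my-archive-bucket/outdated/*.parquet'

# 2. Set a 3-year retention policy, then lock it.
#    WARNING: locking is permanent and cannot be undone.
gcloud storage buckets update gs://my-archive-bucket --retention-period=3y
gcloud storage buckets update gs://my-archive-bucket --lock-retention-period

# 3. Create an external table over the exported Parquet files
#    (Parquet is self-describing, so no explicit schema is needed).
bq mk \
    --external_table_definition=PARQUET=gs://my-archive-bucket/outdated/*.parquet \
    my_dataset.outdated_archive
```

After step 3, the original BigQuery table can be deleted; queries against `my_dataset.outdated_archive` read the archived files directly, so you pay only Archive-class storage plus per-query retrieval and scan costs.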
Author: LeetQuiz Editorial Team
You have 100 GB of outdated data in a BigQuery table, which will be accessed infrequently for analytics using SQL. Your goal is to store this data securely and immutably for backup purposes for 3 years while minimizing storage costs. What is the best approach to achieve this?
A. Keep a duplicate copy of the table in BigQuery.
B. Create a BigQuery table snapshot.
C. Export the data to a Cloud Storage bucket with object versioning enabled.
D. Perform a BigQuery export to a Cloud Storage bucket with the Archive storage class, set a locked retention policy on the bucket, and create a BigQuery external table on the exported files.