
Answer-first summary for fast verification
Answer: Create a Cloud Storage Transfer Service Job to copy the files to a Coldline Storage bucket.
The question asks for a low-cost way to archive database backup files from an on-premises data center to Google Cloud Storage. Per Google's documentation and the consensus in the community discussion, Storage Transfer Service is recommended when moving more than 1 TB of data from on-premises, which fits the Dress4Win case study's data volumes (roughly 600 TB used out of 1 PB of total storage). Option C (a Storage Transfer Service job to a Coldline Storage bucket) is optimal because: 1) Coldline Storage is priced for rarely accessed archival data, meeting the low-cost requirement; 2) Storage Transfer Service is a managed service, so it is more reliable and lower-maintenance than custom gsutil scripts, as Google's docs and the highest-upvoted comments note; and 3) it handles large transfers efficiently, with built-in scheduling and retries. Options A and B (gsutil driven by cron) are less suitable at this scale because the scripts must be maintained and monitored by hand and can fail silently, and option D targets Regional Storage, which is not cost-effective for archival compared to Coldline.
Author: LeetQuiz Editorial Team
At Dress4Win, an operations engineer needs to implement a low-cost solution for remotely archiving copies of database backup files. The database files are compressed tar archives located in their on-premises data center. What is the recommended approach?
A. Create a cron script using gsutil to copy the files to a Coldline Storage bucket.
B. Create a cron script using gsutil to copy the files to a Regional Storage bucket.
C. Create a Cloud Storage Transfer Service job to copy the files to a Coldline Storage bucket.
D. Create a Cloud Storage Transfer Service job to copy the files to a Regional Storage bucket.
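The recommended option (C) can be sketched with the gcloud CLI. This is a hedged sketch, not a definitive runbook: the bucket name, agent-pool name, and source path below are placeholders, transfers from an on-premises POSIX filesystem require Storage Transfer Service agents running in the data center, and the exact `gcloud transfer` flags should be verified against current documentation.

```shell
# Sketch only: names and paths are placeholders; verify flags against
# current gcloud documentation before use.

# 1. Create a Coldline bucket to hold the archived backups.
gcloud storage buckets create gs://dress4win-db-backups \
    --default-storage-class=COLDLINE --location=us-central1

# 2. Create an agent pool, then install a transfer agent on an
#    on-premises machine that can read the backup directory.
gcloud transfer agent-pools create on-prem-backup-pool
gcloud transfer agents install --pool=on-prem-backup-pool

# 3. Create a recurring Storage Transfer Service job that copies the
#    compressed tar archives from the on-prem filesystem to the bucket.
gcloud transfer jobs create \
    posix:///var/backups/db \
    gs://dress4win-db-backups \
    --source-agent-pool=on-prem-backup-pool \
    --schedule-repeats-every=1d
```

Because the job is managed by Google, retries, scheduling, and monitoring come from the service itself, which is the maintenance advantage over the cron-plus-gsutil approach in options A and B.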