
Answer-first summary for fast verification
Answer: A (Load logs into Google BigQuery) and E (Upload log files into Google Cloud Storage)

To archive approximately 100 TB of log data to the cloud, test the cloud's analytics capabilities, and retain the data for long-term disaster recovery, take the following two steps:

1. **A: Load logs into Google BigQuery.** BigQuery is a fully managed, cloud-native data warehouse built to process large-scale analytics workloads quickly and efficiently, so it lets you run advanced analytics over the log data.
2. **E: Upload log files into Google Cloud Storage.** Cloud Storage is ideal for archiving large volumes of data because of its scalability, durability, and cost-effective storage classes such as Coldline, which suits long-term retention and disaster recovery.

The remaining options do not fit the combined requirements: Google Cloud SQL is a managed relational database and is not designed for 100 TB analytical workloads; Google Stackdriver (now part of Google Cloud Operations Suite) is a monitoring and logging service, not an archive; and Google Cloud Bigtable is a NoSQL database optimized for low-latency operational workloads rather than long-term archival storage and ad hoc analytics.
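As a rough sketch, the two recommended steps can be carried out with the `gsutil` and `bq` command-line tools. The bucket name, dataset name, region, and log format below are placeholders, not values from the question; adjust them to your environment.

```shell
# Create a Coldline bucket for long-term archive / disaster recovery
# (bucket name and region are hypothetical placeholders)
gsutil mb -c coldline -l us-central1 gs://example-log-archive/

# Step E: upload the log files to Cloud Storage
# (-m parallelizes the copy, useful at 100 TB scale)
gsutil -m cp -r ./logs/ gs://example-log-archive/logs/

# Step A: load the archived logs into BigQuery for analytics
# (assumes newline-delimited JSON logs; change --source_format for CSV, etc.)
bq mk example_dataset
bq load --source_format=NEWLINE_DELIMITED_JSON \
    --autodetect \
    example_dataset.logs \
    "gs://example-log-archive/logs/*.json"
```

Loading directly from Cloud Storage URIs (rather than from local files) keeps the archive as the durable source of truth while BigQuery serves the analytics side.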
Author: LeetQuiz Editorial Team
Your company is considering transitioning to cloud-based services in an effort to minimize risk. As part of this initiative, they plan to archive approximately 100 TB of log data to the cloud. Additionally, they want to explore the cloud’s analytics capabilities and maintain the data for long-term disaster recovery. Which two steps should you take to achieve these goals? (Choose two.)
A. Load logs into Google BigQuery
B. Load logs into Google Cloud SQL
C. Import logs into Google Stackdriver
D. Insert logs into Google Cloud Bigtable
E. Upload log files into Google Cloud Storage