
Answer-first summary for fast verification
Answer: Employ Cloud Dataflow, initiating the process with a Cloud Storage Avro to Bigtable template
The optimal solution is Cloud Dataflow with the Google-provided Cloud Storage Avro to Bigtable template. This minimizes development effort and leverages a managed service with built-in reliability and job monitoring. A custom Python program would require unnecessary development and ongoing maintenance, and gsutil only copies objects to and from Cloud Storage; it cannot load data into Bigtable. The Storage Transfer Service is designed for transfers into Cloud Storage from other object storage systems, not for Bigtable. For more details, refer to [Google Cloud's documentation](https://cloud.google.com/architecture/streaming-avro-records-into-bigquery-using-dataflow).
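As a rough sketch of how the recommended approach might be launched, the Google-provided template can be run with the `gcloud dataflow jobs run` command. The template path, parameter names, and all resource identifiers below (`my-project`, `my-instance`, `my-table`, the bucket names, and the region) are illustrative assumptions; verify them against the current template documentation before use.

```shell
# Launch the Google-provided "Cloud Storage Avro to Bigtable" Dataflow template.
# All values below are placeholders; substitute your own project, instance,
# table, bucket, and region.
gcloud dataflow jobs run avro-to-bigtable-load \
  --region=us-central1 \
  --gcs-location=gs://dataflow-templates-us-central1/latest/GCS_Avro_to_Cloud_Bigtable \
  --parameters \
bigtableProjectId=my-project,\
bigtableInstanceId=my-instance,\
bigtableTableId=my-table,\
inputFilePattern=gs://my-bucket/avro/*.avro
```

Once submitted, the job appears in the Dataflow console, where its progress, logs, and any errors can be monitored without custom tooling, which is the reliability and monitoring benefit the answer refers to.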
Author: LeetQuiz Editorial Team
A data engineer is tasked with loading data from Avro files stored in Cloud Storage into Bigtable. They seek a reliable and easily monitored solution for this data transfer. Which method would you recommend?
A
Develop a custom Python 3 program for the task
B
Utilize the Storage Transfer Service for the data transfer
C
Employ Cloud Dataflow, initiating the process with a Cloud Storage Avro to Bigtable template
D
Use gsutil to directly load the data into Bigtable