
Answer-first summary for fast verification
Answer: Use Cloud Dataflow with a Cloud Storage Avro to Bigtable template.
This approach is reliable, easily monitored through the Dataflow console, and avoids extensive custom development. The alternatives fall short: a custom Python program demands significantly more engineering and monitoring effort; gsutil copies objects between Cloud Storage locations and cannot write to Bigtable; and the Storage Transfer Service is designed for moving data into Cloud Storage from other object storage systems, not for loading data into Bigtable. For more details, refer to [Google Cloud's documentation](https://cloud.google.com/architecture/streaming-avro-records-into-bigquery-using-dataflow).
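As a rough sketch, launching the template from the command line might look like the following. The project, instance, table, and bucket names are placeholders, and the template path and parameter names (`bigtableProjectId`, `inputFilePattern`, etc.) are assumptions that should be verified against the current Dataflow templates documentation:

```shell
# Illustrative only: launch the Cloud Storage Avro to Bigtable Dataflow
# template. Template location and parameter names are assumptions --
# check the Dataflow templates docs for the exact values.
gcloud dataflow jobs run avro-to-bigtable-import \
  --region=us-central1 \
  --gcs-location=gs://dataflow-templates-us-central1/latest/GCS_Avro_to_Cloud_Bigtable \
  --parameters=\
bigtableProjectId=my-project,\
bigtableInstanceId=my-instance,\
bigtableTableId=my-table,\
inputFilePattern=gs://my-bucket/avro/*.avro
```

Once the job is running, progress, throughput, and errors can be observed in the Dataflow monitoring console, which is what makes this option "easily monitored" compared to a custom script.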
Author: LeetQuiz Editorial Team
A data engineer is tasked with loading data from Avro files stored in Cloud Storage into Bigtable. They need a reliable and easily monitored solution for this data transfer. Which method would you recommend?
A
Develop a custom Python 3 program for the task.
B
Utilize gsutil for transferring the data directly.
C
Employ the Storage Transfer Service for the operation.
D
Use Cloud Dataflow with a Cloud Storage Avro to Bigtable template.