
Answer-first summary for fast verification
Answer: Store the data in Cloud Storage and create an extract, transform, and load (ETL) pipeline.
The recommended approach for daily batch data is to land the JSON files in Cloud Storage, which serves as a durable, low-cost staging area, and then run an ETL (or ELT, depending on the use case) pipeline to load the data into a data warehouse such as BigQuery. Making the warehouse public (A) or exposing it through a custom public API (D) introduces security and operational risks, and persistent disks (C) are block storage attached to VM instances, not a scalable landing zone for files from external sources.
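As a rough illustration of the "T" in such a pipeline, the sketch below parses newline-delimited JSON (the format BigQuery load jobs expect), drops malformed lines, and normalizes records into rows ready for a batch load from Cloud Storage. The field names (`id`, `timestamp`, `data`) are hypothetical placeholders, not anything specified by the question.

```python
import json

def transform_records(raw_lines):
    """Transform step of a hypothetical ETL pipeline: parse daily
    newline-delimited JSON, skip malformed lines, and normalize field
    names before a batch load into the warehouse (e.g. a BigQuery
    load job reading from Cloud Storage)."""
    rows = []
    for line in raw_lines:
        try:
            record = json.loads(line)
        except json.JSONDecodeError:
            continue  # skip malformed input rather than failing the whole batch
        rows.append({
            "event_id": record.get("id"),          # hypothetical source field
            "event_time": record.get("timestamp"),  # hypothetical source field
            "payload": record.get("data", {}),      # hypothetical source field
        })
    return rows

raw = [
    '{"id": 1, "timestamp": "2024-01-01T00:00:00Z", "data": {"x": 1}}',
    'not valid json',
]
print(len(transform_records(raw)))  # → 1 (only the valid record survives)
```

In a real deployment this logic would typically run inside a managed service such as Dataflow or Cloud Functions, triggered when the daily files arrive in the bucket.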
Author: LeetQuiz Editorial Team
Your data engineering team receives data in JSON format from external sources at the end of each day. You need to design the data pipeline. What should you do?
A. Make your BigQuery data warehouse public and ask the external sources to insert the data.
B. Store the data in Cloud Storage and create an extract, transform, and load (ETL) pipeline.
C. Store the data in persistent disks and create an ETL pipeline.
D. Create a public API to allow external applications to add the data to your warehouse.