
As a data engineer, you are designing storage for very large text files in a data pipeline on Google Cloud. The design must support ANSI SQL queries and efficient compression, and it must allow parallel loading from the input locations, following Google-recommended practices. What should you do?
A. Transform text files to compressed Avro using Cloud Dataflow. Use BigQuery for storage and query.
B. Transform text files to compressed Avro using Cloud Dataflow. Use Cloud Storage and BigQuery permanent linked tables for query.
C. Compress text files to gzip using the Grid Computing Tools. Use BigQuery for storage and query.
D. Compress text files to gzip using the Grid Computing Tools. Use Cloud Storage, and then import into Cloud Bigtable for query.