
Answer-first summary for fast verification
Answer: Employ the BigQuery Data Transfer Service by leveraging the Java Database Connectivity (JDBC) driver with a FastExport connection for direct data transfer.
Option C is the optimal choice for this scenario. The BigQuery Data Transfer Service's Teradata migration agent can extract data over a JDBC connection with FastExport, streaming rows from Teradata through the agent to BigQuery without first staging large export files on the source system. This makes it the best fit when local storage is limited and programming effort must stay minimal. Option B (TPT tbuild) is also supported by the Transfer Service, but TPT extraction writes data to local files before upload, which conflicts with the storage constraint. Options A and D require writing and maintaining custom export scripts, adding programming effort the scenario asks to avoid.
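As a rough illustration of how little coding Option C involves, the flow below sketches initializing and running the Teradata migration agent with the JDBC driver. Jar names follow Google's documented pattern; the file paths and the configuration file name are placeholders, not values from this question.

```shell
# Initialize the migration agent; this interactive step generates a
# configuration file describing the Teradata source and target transfer.
# Paths to the Teradata JDBC driver (terajdbc4.jar) and the agent jar
# (mirroring-agent.jar) are placeholders for wherever you downloaded them.
java -cp /migration/terajdbc4.jar:/migration/mirroring-agent.jar \
  com.google.cloud.bigquery.dms.Agent --initialize

# Run the agent with the generated configuration. With a FastExport-based
# JDBC connection configured, rows are streamed through the agent rather
# than staged as large export files on local disk.
java -cp /migration/terajdbc4.jar:/migration/mirroring-agent.jar \
  com.google.cloud.bigquery.dms.Agent --configuration-file=config.json
```

The key design point is that the agent, not a hand-written script, owns extraction and upload; the transfer itself is defined once in the BigQuery Data Transfer Service.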
Author: LeetQuiz Editorial Team
You are planning to migrate your Teradata data warehouse to BigQuery. The objective is to transfer historical data to BigQuery as efficiently as possible with minimal programming effort, given the constraint of limited local storage space on your current data warehouse. What is the best approach to achieve this?
A
Develop a script to export historical data in batches, upload them to Cloud Storage, and then use BigQuery Data Transfer Service to move data from Cloud Storage to BigQuery.
B
Utilize the BigQuery Data Transfer Service with the Teradata Parallel Transporter (TPT) tbuild utility for data migration.
C
Employ the BigQuery Data Transfer Service by leveraging the Java Database Connectivity (JDBC) driver with a FastExport connection for direct data transfer.
D
Create a Teradata Parallel Transporter (TPT) export script for historical data and import it into BigQuery using the bq command-line tool.