
Answer-first summary for fast verification
Answer: Use the BigQuery Data Transfer Service with the Java Database Connectivity (JDBC) driver and a FastExport connection.
The correct answer is A. Because local storage on the existing data warehouse is limited, using the BigQuery Data Transfer Service with the JDBC driver and a FastExport connection is the most efficient option. This approach streams data directly from Teradata to BigQuery, avoids staging export files in local storage, and requires minimal coding. Google's migration documentation specifically recommends JDBC with FastExport when local storage space is constrained or when using the Teradata Parallel Transporter (TPT) is not feasible.
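For context, the extraction method is selected in the on-premises migration agent's configuration file, not in the transfer configuration itself. The sketch below is illustrative only: all values are placeholders, and the exact key names should be verified against the current BigQuery Data Transfer Service documentation for Teradata. The relevant point for this question is that leaving TPT disabled (`use-tpt: false`) makes the agent extract via the JDBC driver with FastExport.

```json
{
  "agent-id": "agent-placeholder-id",
  "transfer-configuration": {
    "project-id": "my-project",
    "location": "us",
    "id": "transfer-config-placeholder-id"
  },
  "source-type": "teradata",
  "teradata-config": {
    "connection": { "host": "teradata.example.com" },
    "local-processing-space": "/tmp/extracted",
    "max-local-storage": "50GB",
    "use-tpt": false,
    "max-sessions": 0,
    "spool-mode": "NoSpool"
  }
}
```

With TPT, the agent would first export tables to local files before uploading, which is exactly what the question's storage constraint rules out; the JDBC FastExport path keeps local staging to a minimum.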
Author: LeetQuiz Editorial Team
You are planning to migrate your existing Teradata data warehouse to Google BigQuery. Your primary goal is to transfer the historical data to BigQuery in the most efficient manner while minimizing the amount of coding required. Additionally, please note that the local storage capacity on your current data warehouse is constrained. Given these factors, what approach should you take to accomplish this migration?
A
Use the BigQuery Data Transfer Service with the Java Database Connectivity (JDBC) driver and a FastExport connection.
B
Create a Teradata Parallel Transporter (TPT) export script to export the historical data, and import to BigQuery by using the bq command-line tool.
C
Use BigQuery Data Transfer Service with the Teradata Parallel Transporter (TPT) tbuild utility.
D
Create a script to export the historical data, and upload in batches to Cloud Storage. Set up a BigQuery Data Transfer Service instance from Cloud Storage to BigQuery.