
You are planning to migrate your on-premises data to BigQuery on Google Cloud, with the flexibility to either stream or batch-load data as needed. Additionally, you need to obfuscate certain sensitive data before the transfer. Your objective is to achieve this programmatically while keeping costs to a minimum. What is the most efficient and cost-effective approach to accomplish this task?
A
Set up Datastream to replicate your on-premises data to BigQuery.
B
Use the BigQuery Data Transfer Service to schedule your migration. After the data is in BigQuery, connect to the Cloud Data Loss Prevention (Cloud DLP) API to de-identify the necessary data.
C
Create your pipeline with Dataflow through the Apache Beam SDK for Python, customizing separate options within your code for streaming, batch processing, and Cloud DLP, and select BigQuery as your data sink (see the sketch after option D).
D
Use Cloud Data Fusion to design your pipeline, use the Cloud DLP plug-in to de-identify data within your pipeline, and then move the data into BigQuery.
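The sketch below illustrates the approach described in option C: a single Apache Beam (Python) pipeline that can run on Dataflow in either batch or streaming mode, calls the Cloud DLP API to mask sensitive values, and writes to BigQuery. The project, input, output table, field names, and the choice of masking EMAIL_ADDRESS are hypothetical examples, not values from the question; this is a minimal sketch under those assumptions, not a production pipeline.

```python
# Minimal Beam sketch (assumed names: --project, --input, --output_table, "email" field).
import argparse
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions
from google.cloud import dlp_v2


class DeidentifyFn(beam.DoFn):
    """Masks sensitive fields in each record via the Cloud DLP API."""

    def __init__(self, project):
        self.project = project

    def setup(self):
        # One DLP client per worker.
        self.dlp = dlp_v2.DlpServiceClient()

    def process(self, record):
        response = self.dlp.deidentify_content(
            request={
                "parent": f"projects/{self.project}",
                "inspect_config": {"info_types": [{"name": "EMAIL_ADDRESS"}]},
                "deidentify_config": {
                    "info_type_transformations": {
                        "transformations": [
                            {"primitive_transformation": {
                                "character_mask_config": {"masking_character": "#"}}}
                        ]
                    }
                },
                "item": {"value": record["email"]},
            }
        )
        record["email"] = response.item.value
        yield record


def run(argv=None):
    parser = argparse.ArgumentParser()
    parser.add_argument("--project", required=True)
    parser.add_argument("--mode", choices=["batch", "streaming"], default="batch")
    parser.add_argument("--input", help="GCS path (batch) or Pub/Sub topic (streaming)")
    parser.add_argument("--output_table", help="PROJECT:DATASET.TABLE")
    known_args, pipeline_args = parser.parse_known_args(argv)

    # The same code handles both modes by toggling the streaming option.
    options = PipelineOptions(pipeline_args)
    options.view_as(StandardOptions).streaming = known_args.mode == "streaming"

    with beam.Pipeline(options=options) as p:
        if known_args.mode == "streaming":
            records = (
                p
                | "ReadPubSub" >> beam.io.ReadFromPubSub(topic=known_args.input)
                | "Decode" >> beam.Map(lambda b: json.loads(b.decode("utf-8")))
            )
        else:
            records = (
                p
                | "ReadGCS" >> beam.io.ReadFromText(known_args.input)
                | "Parse" >> beam.Map(json.loads)
            )

        (
            records
            | "Deidentify" >> beam.ParDo(DeidentifyFn(known_args.project))
            | "WriteBQ" >> beam.io.WriteToBigQuery(
                known_args.output_table,
                schema="name:STRING,email:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

Keeping the streaming/batch switch and the DLP call in one pipeline is what lets this approach satisfy the "programmatic, single solution" requirement without paying for a separate managed service such as Cloud Data Fusion.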