
You are tasked with developing a machine learning model that will handle continuous streaming data from multiple vendors. The goal is to use BigQuery ML for creating the model and Vertex AI for hosting it, facilitating near-real-time data processing. Given the possibility of invalid data values within the stream, what steps should you take to achieve this objective?
A
Create a new BigQuery dataset and use streaming inserts to land the data from multiple vendors. Configure your BigQuery ML model to use the 'ingestion' dataset as the training data.
B
Use BigQuery streaming inserts to land the data from multiple vendors in the dataset where your BigQuery ML model is deployed.
C
Create a Pub/Sub topic and send all vendor data to it. Connect a Cloud Function to the topic to process the data and store it in BigQuery.
D
Create a Pub/Sub topic and send all vendor data to it. Use Dataflow to process and sanitize the Pub/Sub data and stream it to BigQuery.
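For context, option D describes the common Google-recommended pattern: Dataflow sits between Pub/Sub and BigQuery so that invalid records can be filtered or repaired before they reach the table feeding BigQuery ML. Below is a minimal, hedged sketch of the kind of per-record sanitization logic a Dataflow `DoFn` might apply; the field names (`vendor_id`, `reading`) and validity rules are hypothetical, not part of the question.

```python
import json

# Hypothetical schema: each vendor message should carry a non-empty
# "vendor_id" and a numeric "reading". Records failing validation
# return None here; in a real Dataflow job they would instead be
# routed to a dead-letter output for inspection.
def sanitize(raw_message: bytes):
    try:
        record = json.loads(raw_message)
    except json.JSONDecodeError:
        return None  # malformed JSON: drop / dead-letter
    if not record.get("vendor_id"):
        return None  # missing vendor identifier
    try:
        record["reading"] = float(record["reading"])
    except (KeyError, TypeError, ValueError):
        return None  # missing or non-numeric reading
    return record

# Example batch of raw Pub/Sub payloads (illustrative only).
messages = [
    b'{"vendor_id": "v1", "reading": "42.5"}',
    b'{"vendor_id": "", "reading": 1}',
    b'not json',
    b'{"vendor_id": "v2", "reading": "abc"}',
]
clean = [r for m in messages if (r := sanitize(m)) is not None]
print(clean)  # only the first message survives
```

Only sanitized rows would then be streamed into the BigQuery table that the BigQuery ML model trains on, keeping invalid vendor data out of the model entirely.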