
In digital advertising, accurate data is crucial both for optimizing AI models and for effective historical analysis. Suppose you have a dataset of ads data that must serve two purposes: feeding AI models and analyzing historical trends. A key step in data preparation is identifying longtail and outlier data points, which can skew both the analysis and the performance of the AI models. To ensure the highest data quality, you want to cleanse the data in near-real time before it is integrated into your AI models. What should you do?
A. Use Cloud Storage as a data warehouse, shell scripts for processing, and BigQuery to create views of the desired datasets.
B. Use Dataflow to identify longtail and outlier data points programmatically, with BigQuery as a sink.
C. Use BigQuery to ingest, prepare, and analyze the data, then run queries to create views.
D. Use Cloud Composer to identify longtail and outlier data points, and output a usable dataset to BigQuery.
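For context on what option B looks like in practice, here is a minimal Apache Beam (Dataflow) sketch that filters outlier rows from a stream and writes the cleansed records to BigQuery as a sink. The Pub/Sub topic, table name, event schema, and outlier threshold are all illustrative assumptions, not details given in the question.

```python
# Minimal sketch of a streaming cleanse-then-sink pipeline (option B).
# All names below are hypothetical: the Pub/Sub topic, the BigQuery
# table, the "clicks" field, and the threshold are assumptions.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

OUTLIER_THRESHOLD = 10_000  # assumed cap on plausible click counts


def parse_event(message: bytes) -> dict:
    """Decode a Pub/Sub message payload into an ad-event dict."""
    return json.loads(message.decode("utf-8"))


def is_inlier(event: dict) -> bool:
    """Keep only rows inside the expected range, dropping
    longtail/outlier records before they reach BigQuery."""
    return 0 <= event.get("clicks", 0) <= OUTLIER_THRESHOLD


options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadAds" >> beam.io.ReadFromPubSub(
            topic="projects/my-project/topics/ads-events")
        | "Parse" >> beam.Map(parse_event)
        | "DropOutliers" >> beam.Filter(is_inlier)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "my-project:ads.cleansed_events",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```

The design point the option hinges on: Dataflow processes each record as it arrives, so the filtering happens in near-real time, and the cleansed output lands in BigQuery ready for both AI model serving and historical analysis. The other options either batch the work (shell scripts, scheduled Composer DAGs) or cleanse only after ingestion (BigQuery views).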