
In your role as a Machine Learning Engineer at a large enterprise, you are tasked with developing a model on AI Platform. The enterprise stores thousands of datasets in BigQuery, each with an accurate description. Given the scale of the data and the need for efficiency, how would you best locate the specific BigQuery table required for your model? The solution must be scalable, cost-effective, and leverage Google Cloud's native tools for optimal performance. Choose the best option from the following:
A
Implement a custom script to scan through all BigQuery table descriptions and match keywords related to your model's requirements. This script would run periodically to update a central registry of tables.
B
Query BigQuery's INFORMATION_SCHEMA metadata views to list all table names and descriptions in your project, then manually sift through the results to identify the needed table.
C
Develop a lookup table within BigQuery that maps table descriptions to table IDs. This lookup table would be queried each time you need to find data, requiring manual updates as new tables are added.
D
Utilize Google Cloud Data Catalog to search through BigQuery datasets by keywords found in the table descriptions, taking advantage of its search capabilities and integration with BigQuery.
E
Combine Data Catalog for initial discovery with a supplementary lookup table in BigQuery for frequently accessed tables, to speed up future searches.
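Option D is the approach Google recommends for this scenario: Data Catalog automatically indexes BigQuery table metadata, including descriptions, and exposes it through a search API, so no custom scanning or manually maintained lookup tables are needed. As a minimal sketch of how such a search might look, the helper below builds a Data Catalog search expression (the `type=table` and `description:` qualifiers follow Data Catalog's search syntax); the project ID and keyword in the usage comment are hypothetical placeholders.

```python
def catalog_query(keyword: str) -> str:
    """Build a Data Catalog search expression matching BigQuery tables
    whose description contains `keyword`."""
    # `type=table` restricts results to tables; `description:` searches
    # the free-text description attached to each table.
    return f"type=table description:{keyword}"

# Issued with the google-cloud-datacatalog client library, roughly:
#   from google.cloud import datacatalog_v1
#   client = datacatalog_v1.DataCatalogClient()
#   scope = datacatalog_v1.SearchCatalogRequest.Scope(
#       include_project_ids=["my-project"])  # hypothetical project ID
#   results = client.search_catalog(
#       request={"scope": scope, "query": catalog_query("transactions")})
```

Each result in the response carries the table's fully qualified resource name, so the matching table can be referenced directly in the training pipeline without any intermediate registry.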