
You are working with a query in Google BigQuery that filters a table using a WHERE clause on timestamp and ID columns. Running bq query --dry_run, you observe that the query causes a full table scan, even though the filter criteria target only a small portion of the data. To decrease the volume of data BigQuery scans while making minimal adjustments to your existing SQL queries, what should you do?
A. Create a separate table for each ID.
B. Use the LIMIT keyword to reduce the number of rows returned.
C. Recreate the table with a partitioning column and clustering column.
D. Use the bq query --maximum_bytes_billed flag to restrict the number of bytes billed.
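
Option C is the action that actually reduces the bytes scanned, because partitioning on the timestamp and clustering on the ID let BigQuery prune data at read time while the query text stays essentially unchanged. As a minimal sketch of what that recreation might look like, assuming a hypothetical table mydataset.events with a TIMESTAMP column event_ts and an ID column user_id (all names are placeholders, not from the question):

    -- Recreate the table partitioned on the timestamp column and
    -- clustered on the ID column, copying the existing rows across.
    CREATE TABLE mydataset.events_optimized
    PARTITION BY DATE(event_ts)
    CLUSTER BY user_id
    AS
    SELECT * FROM mydataset.events;

    -- The original filter needs no rewriting beyond the table name:
    -- BigQuery now prunes partitions by event_ts and skips clustered
    -- blocks that cannot contain the requested user_id.
    SELECT *
    FROM mydataset.events_optimized
    WHERE event_ts >= TIMESTAMP '2024-01-01'
      AND user_id = 'abc123';

Re-running bq query --dry_run against the recreated table should report far fewer bytes processed, since only the matching partitions and blocks are read. The other options fall short: LIMIT is applied after rows are scanned, so it does not reduce the data read; --maximum_bytes_billed only fails queries that exceed the cap rather than shrinking the scan; and one table per ID would require rewriting every query.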