You are working with a query in Google BigQuery that filters a table using a WHERE clause on timestamp and ID columns. Running bq query --dry_run shows that the query triggers a full table scan, even though the filter criteria match only a small portion of the data. What should you do to reduce the volume of data BigQuery scans while making minimal changes to your existing SQL queries?
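For illustration, here is one common way such a scenario is typically resolved: recreating the table partitioned on the timestamp column and clustered on the ID column, so the existing WHERE clause can prune partitions and blocks instead of scanning everything. This is a minimal sketch, not the question's official answer; all table and column names (mydataset.events, event_ts, device_id) are hypothetical.

```sql
-- Hypothetical names throughout; adjust to your dataset.
-- Recreate the table partitioned on the timestamp column and
-- clustered on the ID column so filters on those columns prune data.
CREATE TABLE mydataset.events_optimized
PARTITION BY DATE(event_ts)   -- enables partition pruning on the timestamp filter
CLUSTER BY device_id          -- enables block pruning on the ID filter
AS
SELECT * FROM mydataset.events;

-- The existing query then needs only a table-name change:
SELECT *
FROM mydataset.events_optimized
WHERE event_ts >= TIMESTAMP('2024-06-01')
  AND device_id = 'sensor-42';
```

Re-running bq query --dry_run against the partitioned table should then report a much smaller bytes-processed estimate, confirming that only the matching partitions and clustered blocks are read.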