
You manage a network of 1,000 sensors, each generating one timestamped metric per second. You currently store 1 TB of data, and it grows by 1 GB per day. Which storage solution best supports both low-latency retrieval of a specific sensor's metric at an exact timestamp and complex analytic queries run once a day?
A. Store your data in BigQuery. Concatenate the sensor ID and timestamp, and use it as the primary key.
B. Store your data in Bigtable. Concatenate the sensor ID and timestamp, and use it as the row key. Export to BigQuery every day.
C. Store your data in Bigtable. Concatenate the sensor ID and metric, and use it as the row key. Export to BigQuery every day.
D. Store your data in BigQuery. Use the metric as the primary key.
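The row-key scheme the options describe (sensor ID concatenated with a timestamp) can be sketched as below. This is a minimal illustration, not a Bigtable client call; the `make_row_key` helper, the `#` separator, and the zero-padded epoch-seconds format are assumptions chosen so that lexicographic key order matches time order, which is what makes both point reads and per-sensor range scans cheap in Bigtable.

```python
from datetime import datetime, timezone

def make_row_key(sensor_id: str, ts: datetime) -> str:
    """Build a Bigtable-style row key: sensor ID first, then a
    zero-padded epoch timestamp, so all rows for one sensor sort
    together and a point read needs only the exact key."""
    # Zero-padding keeps lexicographic order identical to time order.
    return f"{sensor_id}#{int(ts.timestamp()):012d}"

keys = [
    make_row_key("sensor-0042", datetime(2024, 1, 1, 12, 0, 0, tzinfo=timezone.utc)),
    make_row_key("sensor-0042", datetime(2024, 1, 1, 12, 0, 1, tzinfo=timezone.utc)),
    make_row_key("sensor-0007", datetime(2024, 1, 1, 12, 0, 0, tzinfo=timezone.utc)),
]
# Sorting shows each sensor's readings stay contiguous, in time order.
print(sorted(keys))
```

Keying by sensor ID + metric (option C) would instead collide or scatter readings taken at different times, which is why the timestamp belongs in the key.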