
Answer-first summary for fast verification
Answer: Store your data in Bigtable. Concatenate the sensor ID and timestamp and use it as the row key. Perform an export to BigQuery every day.
The correct answer is **B**: Storing your data in Bigtable with the sensor ID and timestamp concatenated as the row key ensures efficient, low-latency retrieval of a specific sensor's metric at a specific timestamp. Bigtable is designed for large volumes of time-series data with fast key-based access, and exporting to BigQuery daily supports the once-a-day complex analytic queries, playing to BigQuery's strengths in analytics and cost-effectiveness.

**Why the other options are less suitable:**

- **A**: BigQuery is not optimized for low-latency point lookups, so it is a poor fit for retrieving a single sensor reading quickly; it also has no primary keys in the lookup sense.
- **C**: Using the sensor ID and metric as the row key drops the timestamp from the key, so timestamp-specific reads cannot be served efficiently.
- **D**: BigQuery with the metric as a key lacks both the low-latency access needed for specific timestamp queries and a sensible key structure for this data.
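To make the row-key design in option B concrete, here is a minimal sketch of building `sensorID#timestamp` keys. The helper name and the `#` separator are illustrative choices, not a GCP API; zero-padding the epoch seconds makes lexicographic key order match chronological order, which is what enables both point reads and per-sensor time-range scans in Bigtable.

```python
def make_row_key(sensor_id: str, ts_epoch: int) -> bytes:
    """Hypothetical helper: Bigtable row key as 'sensorID#zero-padded-epoch'.

    Zero-padding keeps lexicographic order equal to chronological order
    within one sensor, so a contiguous range scan covers a time window.
    """
    return f"{sensor_id}#{ts_epoch:010d}".encode("utf-8")

# Point read: the exact key locates one metric with low latency.
key = make_row_key("sensor-0042", 1700000000)

# Range scan for one sensor's day: all keys share the "sensor-0042#" prefix,
# so [start, end) is a single contiguous slice of the table.
start = make_row_key("sensor-0042", 1700000000)
end = make_row_key("sensor-0042", 1700000000 + 86400)
```

With the real `google-cloud-bigtable` client, these byte keys would be passed to a point read or a row-range scan; the key layout is the part that matters for the question.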
Author: LeetQuiz Editorial Team
You manage a network of 1000 sensors, each generating time series data at a rate of one metric per second, complete with timestamps. Currently, you're dealing with 1 TB of data, which grows by 1 GB daily. What's the best storage solution to efficiently access data for both retrieving a specific sensor's metric at a precise timestamp with low latency and conducting complex analytic queries once a day?
A
Store your data in BigQuery. Concatenate the sensor ID and timestamp, and use it as the primary key.
B
Store your data in Bigtable. Concatenate the sensor ID and timestamp and use it as the row key. Perform an export to BigQuery every day.
C
Store your data in Bigtable. Concatenate the sensor ID and metric, and use it as the row key. Perform an export to BigQuery every day.
D
Store your data in BigQuery. Use the metric as a primary key.
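As a back-of-the-envelope check on the scenario's numbers (assuming a decimal gigabyte, an assumption not stated in the question):

```python
sensors = 1000
rows_per_day = sensors * 24 * 60 * 60   # one metric per second per sensor
daily_growth_bytes = 1 * 10**9          # ~1 GB/day growth, per the question

# Implied average storage cost per stored metric.
bytes_per_row = daily_growth_bytes / rows_per_day

print(rows_per_day)              # 86400000 rows/day
print(round(bytes_per_row, 1))   # ~11.6 bytes per metric
```

Roughly 86 million tiny rows per day is exactly the narrow-row, high-write-rate workload Bigtable's keyed storage model targets, which supports choosing B over a BigQuery-only design.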