Which database and data model are best suited for storing time series samples of CPU and memory usage, collected at one-second intervals from millions of computers, while supporting real-time, ad hoc analytics, scaling well, and avoiding per-query charges?
A
Design a wide table in Bigtable with a row key that combines the computer identifier with the sample time at minute granularity, and store each second's values as separate columns in the row.
B
Construct a narrow table in Bigtable with a row key that combines the Compute Engine instance identifier with the sample time at each second.
C
Develop a wide table in BigQuery with a column for the sample value at each second, and update the row as each second's interval arrives.
D
Establish a table in BigQuery, and append the new samples for CPU and memory to the table.
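
For context, option B follows Bigtable's recommended tall, narrow schema for time series: one row per sample, keyed by the instance identifier plus the sample timestamp. The sketch below shows one plausible way to write such a row with the Python google-cloud-bigtable client; the project, instance, table, and column family names are placeholder assumptions, not values from the question.

```python
# Minimal sketch of a narrow-table write per option B.
# Project/instance/table IDs and the "metrics" column family are assumptions.
import time
from google.cloud import bigtable

client = bigtable.Client(project="my-project")    # hypothetical project ID
instance = client.instance("metrics-instance")    # hypothetical Bigtable instance
table = instance.table("vm_metrics")              # hypothetical table ID


def write_sample(vm_id: str, cpu_pct: float, mem_mb: float) -> None:
    """Write one per-second sample as its own row (tall, narrow schema)."""
    # Row key: instance identifier + sample time at second granularity.
    epoch_second = int(time.time())
    row_key = f"{vm_id}#{epoch_second}".encode("utf-8")

    row = table.direct_row(row_key)
    row.set_cell("metrics", b"cpu", str(cpu_pct).encode("utf-8"))
    row.set_cell("metrics", b"mem", str(mem_mb).encode("utf-8"))
    row.commit()


write_sample("vm-1234", cpu_pct=37.5, mem_mb=2048.0)
```

Leading the row key with the instance identifier rather than the timestamp spreads writes across the key space, which helps avoid hotspotting a single node as millions of machines report each second.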