
Answer-first summary for fast verification
Answer:
1. Store the historical data in BigQuery for analytics.
2. In a Cloud SQL table, store the last state of each product after every product change.
3. Serve the last-state data directly from Cloud SQL to the API.
Option D is the most suitable solution. BigQuery's scalability makes it a natural fit for the 10 PB of historical product data and the analytics workload, while Cloud SQL delivers the low-latency operational reads the API needs. Because Cloud SQL holds only the latest state of each product (roughly 10 GB), it can comfortably sustain the required query rate with sub-second latency. Separating the analytics workload (BigQuery) from the operational query workload (Cloud SQL) optimizes both performance and cost for each use case.
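The split described above is a dual-write pattern: every product change is appended to the analytics store, and the latest state is upserted into the serving store. The sketch below illustrates the idea only; the table name, functions, and data are hypothetical, an in-memory list stands in for BigQuery's append-only streaming inserts, and SQLite stands in for Cloud SQL (the upsert syntax is the same as PostgreSQL's).

```python
import json
import sqlite3
import time

# Append-only history stream standing in for BigQuery streaming inserts.
# In production this would be an insert into a BigQuery table; every
# version of every product is kept here (the 10 PB side).
history_stream = []

# SQLite stands in for Cloud SQL (PostgreSQL). Only one row per product
# is kept (the ~10 GB side), so API reads are primary-key lookups.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE product_latest (
        product_id TEXT PRIMARY KEY,
        state      TEXT NOT NULL,   -- JSON document with the product fields
        updated_at REAL NOT NULL
    )
""")

def record_product_change(product_id: str, state: dict) -> None:
    """Dual-write: append the full event to the analytics store,
    then upsert the latest state into the serving store."""
    event = {"product_id": product_id, "state": state, "ts": time.time()}
    history_stream.append(event)          # analytics store keeps every version
    conn.execute(
        """INSERT INTO product_latest (product_id, state, updated_at)
           VALUES (?, ?, ?)
           ON CONFLICT(product_id) DO UPDATE SET
               state = excluded.state, updated_at = excluded.updated_at""",
        (product_id, json.dumps(state), event["ts"]),
    )
    conn.commit()

def get_latest_state(product_id: str):
    """API read path: a single primary-key lookup against the small table."""
    row = conn.execute(
        "SELECT state FROM product_latest WHERE product_id = ?",
        (product_id,),
    ).fetchone()
    return json.loads(row[0]) if row else None

record_product_change("sku-1", {"price": 10})
record_product_change("sku-1", {"price": 12})  # overwrites the latest state
print(get_latest_state("sku-1"))  # {'price': 12}
print(len(history_stream))        # 2 -- both versions kept for analytics
```

The key design point is that the serving table never grows with history: an upsert keyed on the product ID keeps it at one row per product, which is what makes high-QPS, low-latency reads feasible.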
Author: LeetQuiz Editorial Team
You are migrating a data backend for an application that stores and manages 10 petabytes (PB) of historical product data for analytics purposes. Only the most recent state of each product, approximately 10 gigabytes (GB) of data, needs to be accessible to other applications through an API. You must select a cost-effective persistent storage solution that meets the analytics requirements while sustaining API performance of up to 1,000 queries per second (QPS) with latency under 1 second. How should you proceed?