
You're managing a real-time application on Cloud Bigtable that handles a mix of read and write operations under heavy load. A new requirement has emerged to perform hourly analytical computations across the entire database. Your primary goals are to preserve the reliability of your production application while successfully executing the analytical workload. Which approach should you take?
A. Double the size of your existing cluster and execute the analytics workload on the newly resized cluster.
B. Export a copy of the Bigtable data to Google Cloud Storage (GCS) and run the hourly analytical job on the exported files.
C. Add a second cluster to the existing instance with single-cluster routing. Configure a live-traffic app profile to handle your regular workload, and a batch-analytics profile to handle the analytical workload.
D. Add a second cluster to the existing instance with multi-cluster routing. Configure a live-traffic app profile to handle your regular workload, and a batch-analytics profile to handle the analytical workload.
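The workload-isolation setup described in options C and D can be sketched with the gcloud CLI. The instance, cluster, and profile names below are illustrative assumptions, not values from the question; replication between clusters keeps the data in sync while each app profile pins a workload to one cluster (single-cluster routing):

```shell
# Assumed names: instance "prod-instance", existing cluster "cluster-a".

# Add a second cluster to the existing instance for the analytical workload.
gcloud bigtable clusters create cluster-b \
    --instance=prod-instance \
    --zone=us-east1-c \
    --num-nodes=3

# App profile for the live serving workload, routed only to the original cluster.
gcloud bigtable app-profiles create live-traffic \
    --instance=prod-instance \
    --route-to=cluster-a \
    --transactional-writes \
    --description="Production serving traffic"

# App profile for the hourly analytics job, routed only to the new cluster.
gcloud bigtable app-profiles create batch-analytics \
    --instance=prod-instance \
    --route-to=cluster-b \
    --description="Hourly analytical computations"
```

With single-cluster routing, the analytics job's heavy scans consume CPU only on `cluster-b`, so the serving cluster's latency is unaffected; multi-cluster routing (`--route-any`) would instead let either workload land on either cluster, defeating the isolation.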