
Answer-first summary for fast verification
Answer: Create a Google Kubernetes Engine cluster running the Locust or JMeter images to dynamically generate load tests. Analyze the results using Cloud Trace.
The question calls for a load-testing infrastructure that can autoscale to generate up to 6,000 transactions per second (5,000 reads + 1,000 writes) and help identify bottlenecks. Option B is the best choice: a Google Kubernetes Engine (GKE) cluster running Locust or JMeter can scale out dynamically to produce the necessary load, and Cloud Trace can then be used to analyze the results and pinpoint bottlenecks. This approach is scalable and relies on tools purpose-built for performance testing and analysis. The other options either lack the required scalability (A and D) or are not designed for high-throughput load testing (C).
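To make the traffic mix concrete: 5,000 reads and 1,000 writes per second is a 5:1 read/write ratio, which in a real Locust test file you would typically express with `@task(5)` and `@task(1)` weights. The following stdlib-only Python sketch illustrates that weighting logic (the operation names are purely illustrative, not part of the question):

```python
import random
from collections import Counter

# Traffic mix from the question: 5,000 reads and 1,000 writes
# per second -> a 5:1 read/write ratio.
READ_WEIGHT, WRITE_WEIGHT = 5, 1

def pick_operation(rng: random.Random) -> str:
    """Choose 'read' or 'write' with the 5:1 weighting that a
    Locust task set (e.g. @task(5) / @task(1)) would apply."""
    return rng.choices(["read", "write"],
                       weights=[READ_WEIGHT, WRITE_WEIGHT])[0]

def simulate(total_requests: int, seed: int = 42) -> Counter:
    """Simulate one second's worth of request decisions."""
    rng = random.Random(seed)
    return Counter(pick_operation(rng) for _ in range(total_requests))

if __name__ == "__main__":
    counts = simulate(6000)
    print(counts)  # roughly 5,000 reads and 1,000 writes
```

When Locust runs distributed across GKE pods, each worker applies this weighting independently, so the aggregate load preserves the 5:1 ratio as the cluster scales out.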
Author: LeetQuiz Editorial Team
You have developed a new service on Cloud Run that authenticates via a custom service and writes transactional data to a Cloud Spanner database. To ensure your application can handle up to 5,000 read and 1,000 write transactions per second while identifying potential bottlenecks, you need a test infrastructure capable of autoscaling. What is the recommended approach?
A. Build a test harness to generate requests and deploy it to Cloud Run. Analyze the VPC Flow Logs using Cloud Logging.
B. Create a Google Kubernetes Engine cluster running the Locust or JMeter images to dynamically generate load tests. Analyze the results using Cloud Trace.
C. Create a Cloud Task to generate a test load. Use Cloud Scheduler to run 60,000 Cloud Task transactions per minute for 10 minutes. Analyze the results using Cloud Monitoring.
D. Create a Compute Engine instance that uses a LAMP stack image from the Marketplace, and use Apache Bench to generate load tests against the service. Analyze the results using Cloud Trace.