
Answer-first summary for fast verification
Answer: Split traffic between versions using weights.
To run a randomized A/B test behind Google's global HTTP(S) load balancer, splitting traffic between versions using weights (Option A) is the correct approach. The load balancer's URL map supports weighted traffic splitting: each request is routed to one of two backend services according to configured weights (e.g., 90% to the current recommendation algorithm, 10% to the new one), so every user has an equal, random chance of landing on either version. This randomized allocation is what makes the resulting sales comparison statistically valid. Option B is unsuitable: enabling a feature flag on a single instance ties the experiment to instance-level load distribution, so users are neither consistently nor randomly assigned to the new version. Option C (traffic mirroring) sends a copy of each request to the new version but always returns the original version's response, so users never see the new recommendations and no sales impact can be measured. Option D (HTTP header-based routing) routes deterministically on client-supplied headers, which the scenario does not provide and which would require extra client-side logic to approximate randomization. Weight-based splitting (A) directly satisfies the requirement for randomized traffic distribution.
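As a rough sketch of how such a weighted split might be configured, the snippet below imports a URL map with `weightedBackendServices` route rules via `gcloud`. The project ID (`my-project`), URL map name (`recs-url-map`), and backend service names (`recs-v1`, `recs-v2`) are placeholder assumptions, not values from the question.

```shell
# Hypothetical names: my-project, recs-url-map, recs-v1 (current), recs-v2 (new).
# The URL map sends ~90% of requests to recs-v1 and ~10% to recs-v2.
cat > url-map.yaml <<'EOF'
name: recs-url-map
defaultService: projects/my-project/global/backendServices/recs-v1
hostRules:
- hosts: ['*']
  pathMatcher: recs-matcher
pathMatchers:
- name: recs-matcher
  defaultService: projects/my-project/global/backendServices/recs-v1
  routeRules:
  - priority: 1
    matchRules:
    - prefixMatch: /
    routeAction:
      weightedBackendServices:
      - backendService: projects/my-project/global/backendServices/recs-v1
        weight: 90
      - backendService: projects/my-project/global/backendServices/recs-v2
        weight: 10
EOF

# Apply the URL map to the global load balancer.
gcloud compute url-maps import recs-url-map \
    --source=url-map.yaml --global
```

Adjusting the two `weight` values (they are interpreted relative to their sum) changes the share of users exposed to the new algorithm without redeploying either backend.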
Author: LeetQuiz Editorial Team
How can you implement A/B testing for a new product recommendation algorithm in an ecommerce application deployed behind a global HTTP(S) load balancer to evaluate its impact on sales through randomized user allocation?
A
Split traffic between versions using weights.
B
Enable the new recommendation feature flag on a single instance.
C
Mirror traffic to the new version of your application.
D
Use HTTP header-based routing.