How would you architect a data processing pipeline that automatically scales with increasing load, ensures at-least-once message processing, and preserves message order within 1-hour intervals?
Explanation:
Option A is the correct choice. Cloud Pub/Sub is a fully managed messaging service that scales automatically with load and guarantees at-least-once delivery. Cloud Dataflow complements it with fully managed, autoscaling stream processing, and its windowing capabilities (for example, fixed 1-hour windows) allow message order to be preserved within each interval. Option B is incorrect because Cloud Dataproc is built around managed Hadoop/Spark clusters and is not well suited to this kind of serverless streaming analysis. Option C, while feasible, is less optimal than A because self-managed Kafka adds operational overhead. Option D is incorrect for the same reason as B regarding Cloud Dataproc.
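To make the Option A architecture concrete, the following is a minimal sketch of an Apache Beam (Python SDK) streaming pipeline that reads from Pub/Sub and groups messages into fixed 1-hour windows; it would typically be run on the Dataflow runner. The project and topic names are placeholders, and the final transform is a stand-in for whatever per-window processing the pipeline actually needs.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms import window

# Placeholder topic path; substitute your own project and topic.
TOPIC = "projects/my-project/topics/events"

# Streaming mode is required for an unbounded Pub/Sub source.
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(topic=TOPIC)
        # Fixed 1-hour windows bound the scope within which ordering
        # and grouping are applied, per the question's requirement.
        | "Window1Hour" >> beam.WindowInto(window.FixedWindows(60 * 60))
        # Placeholder transform; replace with real per-message logic.
        | "Decode" >> beam.Map(lambda msg: msg.decode("utf-8"))
    )
```

With this shape, Pub/Sub handles ingestion and at-least-once delivery, while Dataflow autoscaling absorbs load spikes and the windowing step defines the 1-hour intervals within which downstream grouping or sorting can be performed.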