
Answer-first summary for fast verification
Answer: Deploy a stream processing system that processes data in real-time, enabling immediate analysis and action for time-sensitive data like financial transactions.
Option B is correct because a stream processing system is designed to handle high-velocity data in real time, which is essential for time-sensitive workloads such as financial transactions. Stream processing systems also scale out to absorb IoT data spikes and can be configured to meet data privacy requirements for sensitive sources such as social media feeds. Batch processing (Option A) and distributed processing (Option D) have their uses, but neither is optimized for low-latency, real-time handling: batch jobs hold data until the next scheduled run, and Option D explicitly trades latency for throughput. Combining both systems (Option C) offers flexibility but adds complexity without improving on a pure streaming design for the real-time path.
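The latency argument against Option A can be made concrete with a minimal, framework-free Python sketch. It is not Fabric code; the `Event` class, tick units, and 10-tick batch interval are illustrative assumptions. It compares when each record is processed: a stream processor handles an event at its arrival tick, while a batch processor holds it until the next scheduled batch boundary.

```python
from dataclasses import dataclass

@dataclass
class Event:
    source: str        # illustrative: "finance", "iot", "social"
    arrival_tick: int  # time the event arrives at the pipeline

def stream_process(events):
    """Stream model: each event is processed the moment it arrives."""
    return [(e, e.arrival_tick) for e in events]

def batch_process(events, interval=10):
    """Batch model: events wait for the next scheduled batch run."""
    processed = []
    for e in events:
        # Next multiple of `interval` at or after the arrival tick.
        batch_tick = ((e.arrival_tick // interval) + 1) * interval
        processed.append((e, batch_tick))
    return processed

events = [Event("finance", t) for t in (1, 4, 9, 12)]
stream_latencies = [done - e.arrival_tick for e, done in stream_process(events)]
batch_latencies = [done - e.arrival_tick for e, done in batch_process(events)]
print(stream_latencies)  # [0, 0, 0, 0]
print(batch_latencies)   # [9, 6, 1, 8]
```

Every streamed transaction sees zero added latency, while batched transactions wait up to a full interval; this is the gap that rules out Option A for financial data, regardless of how the batch is sized.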
Author: LeetQuiz Editorial Team
As a Microsoft Fabric Analytics Engineer Associate, you are designing a data pipeline to ingest real-time streaming data into a lakehouse. The data sources include IoT sensor data, social media feeds, and financial transaction data, each with high volume and velocity. The solution must ensure low latency for financial transactions, scalability to handle IoT data spikes, and compliance with data privacy regulations for social media feeds. Considering these requirements, which approach would you choose to design the pipeline? (Choose one option)
A. Implement a batch processing system to collect and process data at scheduled intervals, ensuring all data is processed uniformly.
B. Deploy a stream processing system that processes data in real-time, enabling immediate analysis and action for time-sensitive data like financial transactions.
C. Combine batch and stream processing systems, using batch for historical data analysis and stream for real-time data, to cover all data processing needs.
D. Utilize a distributed processing system to parallelize data ingestion and processing, focusing on throughput rather than latency.