Consider a large-scale e-commerce platform that requires real-time analytics on customer behavior. How would you design a streaming data pipeline using Delta Lake to handle high volumes of data and ensure timely and accurate analytics?
A. Use traditional batch processing for data ingestion and transformation.
B. Implement a hybrid approach with both batch and streaming processing for different data types.
C. Leverage Delta Lake's streaming capabilities for continuous data ingestion and real-time updates to analytics tables (see the sketch below).
D. Store all data in a single large table for simplicity in querying.
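Option C describes the pattern the question is steering toward: Spark Structured Streaming reads events continuously and writes them to Delta tables, which downstream queries can in turn read as streams. Below is a minimal sketch of such a pipeline. The Kafka broker address, topic name, and storage paths are hypothetical placeholders, and it assumes the Delta Lake and Kafka connector packages are available on the Spark classpath.

```python
# Sketch of a streaming Delta Lake pipeline for clickstream analytics.
# Broker, topic, and /mnt/... paths are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, count, window

spark = (
    SparkSession.builder
    .appName("clickstream-delta-pipeline")
    # Assumes delta-spark is installed; these configs enable Delta support.
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# 1. Continuously ingest raw customer-behavior events from a stream source.
raw_events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "clickstream-events")         # hypothetical topic
    .load()
)

# 2. Land the raw stream in a Delta table; the checkpoint location is what
#    gives exactly-once delivery across restarts.
bronze_query = (
    raw_events
    .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)", "timestamp")
    .writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/bronze")  # hypothetical
    .outputMode("append")
    .start("/mnt/delta/bronze/clickstream")                   # hypothetical
)

# 3. Read the landed Delta table as a stream and maintain a continuously
#    updated aggregate (events per key per minute) for real-time analytics.
analytics_query = (
    spark.readStream.format("delta").load("/mnt/delta/bronze/clickstream")
    .withWatermark("timestamp", "10 minutes")  # bound state for late data
    .groupBy(window(col("timestamp"), "1 minute"), col("key"))
    .agg(count("*").alias("event_count"))
    .writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/gold")    # hypothetical
    .outputMode("append")  # append works with watermarked aggregations
    .start("/mnt/delta/gold/clickstream_counts")
)

spark.streams.awaitAnyTermination()
```

Because both queries checkpoint their progress and Delta commits are ACID, the pipeline can restart after a failure without duplicating or dropping events, which is what keeps the analytics tables both timely and accurate. The other options fall short: batch-only processing (A) sacrifices timeliness, a hybrid split (B) adds coordination overhead the streaming-native design avoids, and a single large table (D) ignores the staged ingestion and aggregation that high-volume analytics requires.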