When designing a lakehouse to accommodate both real-time and batch processing workloads, which strategy ensures data is efficiently accessible across both without redundant storage or processing?
A. Adopt an event sourcing pattern, recording all changes as immutable events in the lakehouse, and creating materialized views for various workload needs.
B. Use separate storage solutions for real-time and batch data, with periodic synchronization between them.
C. Employ a single Delta Lake table that accommodates both streaming writes and batch reads/updates, organizing data by ingestion time (see the sketch below).
D. Store streaming data temporarily and batch load it into the lakehouse at predetermined intervals.
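For concreteness, here is a minimal sketch of the pattern described in option C: a single Delta Lake table that receives streaming writes while remaining directly readable by batch jobs. It assumes PySpark with the delta-spark package installed; the paths, schema, and column names are illustrative, not taken from the question.

```python
# A minimal sketch, assuming PySpark with Delta Lake (delta-spark) available.
# All paths, the schema, and column names below are hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("lakehouse-demo")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config(
        "spark.sql.catalog.spark_catalog",
        "org.apache.spark.sql.delta.catalog.DeltaCatalog",
    )
    .getOrCreate()
)

# Streaming write: append incoming events to one Delta table,
# partitioned by ingestion date.
events = (
    spark.readStream.format("json")
    .schema("event_id STRING, payload STRING, ingest_date DATE")
    .load("/landing/events/")  # hypothetical streaming source path
)

stream_query = (
    events.writeStream.format("delta")
    .outputMode("append")
    .option("checkpointLocation", "/checkpoints/events/")  # hypothetical path
    .partitionBy("ingest_date")
    .start("/lakehouse/events/")  # hypothetical Delta table path
)

# Batch read: the same Delta table is queried directly for batch analytics,
# with no copy into a separate store and no synchronization job.
daily_counts = (
    spark.read.format("delta")
    .load("/lakehouse/events/")
    .groupBy("ingest_date")
    .count()
)
daily_counts.show()
```

Because the streaming writer and the batch query target the same Delta table, there is no second copy of the data and no periodic synchronization step, which is the property the question is probing.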