How would you model your data in Delta Lake to ensure fast and efficient queries for a reporting dashboard that handles time-sensitive data?
A. Normalize data across multiple Delta tables to minimize redundancy and save on storage costs.
B. Implement row-level security to filter data by time, thereby reducing query load times.
C. Denormalize data into wide tables and leverage Delta Lake's data skipping features on time columns.
D. Store all data in a single, large Delta table partitioned by time, such as by the hour of the day.