When optimizing a Databricks Lakehouse for BI tools, which data modeling and performance optimization techniques would you employ to ensure that BI reporting is both responsive and scalable?
A
Partitioning and clustering tables by frequently queried dimensions to enhance scan efficiency
B
Employing materialized views to pre-compute and store complex aggregations
C
Adopting incremental load strategies in Delta tables to reduce data movement and refresh durations
D
Flattening data into wide tables to reduce the need for query-time joins and aggregations
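Minimal sketches of how each option might look in practice follow; all schema, table, and column names are hypothetical. For option A, a fact table can be partitioned on a commonly filtered date column and then Z-ordered on a second high-cardinality filter column so BI queries skip unrelated files. The `raw_sales` source, `analytics.sales` target, `order_date`, and `customer_id` columns are assumptions for illustration.

```python
# Option A sketch: partition by a frequent filter dimension, then Z-order a second one.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided as `spark` in Databricks notebooks

# Write the fact table partitioned by the most common date filter for coarse pruning.
(spark.table("raw_sales")                   # hypothetical source table
     .write.format("delta")
     .partitionBy("order_date")
     .mode("overwrite")
     .saveAsTable("analytics.sales"))

# Co-locate data on a high-cardinality filter column so scans skip unrelated files.
spark.sql("OPTIMIZE analytics.sales ZORDER BY (customer_id)")
```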
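For option B, a heavy aggregation can be pre-computed once so dashboards read a small result set instead of re-aggregating the fact table on every refresh. This sketch assumes Unity Catalog and a Databricks SQL warehouse that supports materialized views; the `analytics.daily_revenue` view and its grouping columns are hypothetical.

```python
# Option B sketch: a materialized view that pre-aggregates revenue for BI dashboards.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Databricks SQL DDL; requires Unity Catalog and a supported SQL warehouse.
spark.sql("""
    CREATE MATERIALIZED VIEW analytics.daily_revenue AS
    SELECT order_date,
           region,
           SUM(amount) AS total_revenue,
           COUNT(*)    AS order_count
    FROM analytics.sales
    GROUP BY order_date, region
""")
```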
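For option C, an incremental batch can be upserted into the Delta table with MERGE rather than reloading the full table, which keeps refresh windows short. The `staging.sales_updates` view and the `order_id` join key are assumptions for illustration.

```python
# Option C sketch: incremental upsert into a Delta table instead of a full reload.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

updates = spark.table("staging.sales_updates")          # only new or changed rows
target  = DeltaTable.forName(spark, "analytics.sales")  # existing Delta table

# Apply just the delta; untouched files remain available to BI queries during the load.
(target.alias("t")
       .merge(updates.alias("s"), "t.order_id = s.order_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())
```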
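For option D, dimensions can be joined into the fact table once and persisted as a single wide Delta table, trading extra storage for fewer query-time joins in the BI layer. The `analytics.sales`, `analytics.customers`, and `analytics.products` tables and their join keys are hypothetical.

```python
# Option D sketch: persist a denormalized wide table for BI tools to query directly.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

sales     = spark.table("analytics.sales")
customers = spark.table("analytics.customers")
products  = spark.table("analytics.products")

# Join the dimensions in once so dashboards hit a single flat table.
wide = (sales
        .join(customers, "customer_id", "left")
        .join(products, "product_id", "left"))

(wide.write.format("delta")
     .mode("overwrite")
     .saveAsTable("analytics.sales_wide"))
```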