
Answer-first summary for fast verification
Answer: Create views in Databricks SQL that filter data at the row level and assign access to these views based on user roles.
The most suitable approach for implementing row-level security in Delta Lake with Databricks' native capabilities is to create views in Databricks SQL that filter data at the row level and grant access to those views based on user roles. The filtering logic lives in the view definition, so sensitive rows are hidden from unauthorized users, while access to the underlying table is withheld and SELECT on each view is granted only to the appropriate groups. This approach scales as users and roles grow, since access is adjusted by updating grants rather than restructuring data, and it supports compliance with financial regulations by ensuring sensitive customer data is visible only to authorized users.
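A minimal sketch of this pattern in Databricks SQL is shown below. The table, schema, and group names (`finance.customer_accounts`, `auditors`, `emea_analysts`) are hypothetical placeholders; `is_account_group_member()` is a built-in Databricks SQL function that checks the caller's group membership at query time.

```sql
-- Hypothetical base table: finance.customer_accounts with a `region` column.
-- The view returns only the rows the calling user's group may see.
CREATE OR REPLACE VIEW finance.restricted_customer_accounts AS
SELECT *
FROM finance.customer_accounts
WHERE is_account_group_member('auditors')               -- auditors see all rows
   OR (region = 'EMEA'
       AND is_account_group_member('emea_analysts'));   -- analysts see only their region

-- Grant SELECT on the view, not on the base table, so users
-- can only read data through the row-filtering view.
GRANT SELECT ON VIEW finance.restricted_customer_accounts TO `emea_analysts`;
GRANT SELECT ON VIEW finance.restricted_customer_accounts TO `auditors`;
```

Because the filter runs at query time against the caller's identity, changing who sees which rows is a matter of updating group membership or grants, with no changes to the Delta table itself.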
Author: LeetQuiz Editorial Team
A financial services company stores sensitive customer data in Delta Lake and needs to implement row-level security to comply with financial regulations. How would you configure this using Databricks' native capabilities?
A
Encrypt specific rows with Azure Key Vault and manage decryption keys based on user roles.
B
Create views in Databricks SQL that filter data at the row level and assign access to these views based on user roles.
C
Use Delta Lake's dynamic partition pruning to implement custom logic that filters rows based on user access levels.
D
Apply Unity Catalog permissions to create row-level security policies directly in Delta Lake.