You have an Azure Data Lake Storage account with a staging zone. You need to design a daily process to ingest incremental data from this staging zone, transform the data by running an R script, and then load the transformed data into an Azure Synapse Analytics data warehouse.
Proposed Solution: You schedule an Azure Databricks job that runs an R notebook and then inserts the transformed data into the data warehouse.
Does this solution meet the goal?
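For context, a minimal sketch of what the proposed job's R notebook might look like, using SparkR and the Databricks Synapse connector (`com.databricks.spark.sqldw`). All storage paths, the JDBC URL, column names, and the target table are hypothetical placeholders, and the filter stands in for whatever the real R transformation logic would be:

```r
library(SparkR)

# Read the day's incremental files from the staging zone in ADLS
# (container, account, and path are hypothetical)
staged <- read.df(
  "abfss://staging@examplelake.dfs.core.windows.net/incremental/",
  source = "parquet"
)

# Transformation step; a trivial filter stands in for the real R logic
# (the ingest_date column is a hypothetical example)
transformed <- filter(staged, staged$ingest_date == as.character(Sys.Date()))

# Load the result into Azure Synapse Analytics via the Databricks
# Synapse connector; URL, tempDir, and table name are placeholders
write.df(
  transformed,
  source  = "com.databricks.spark.sqldw",
  url     = "jdbc:sqlserver://exampleserver.sql.azuresynapse.net:1433;database=ExampleDW",
  forwardSparkAzureStorageCredentials = "true",
  dbTable = "dbo.StagedFacts",
  tempDir = "abfss://tempdata@examplelake.dfs.core.windows.net/synapse/",
  mode    = "append"
)
```

A notebook along these lines could be attached to a scheduled Databricks job so it runs once per day; the connector stages the data through the `tempDir` location in ADLS before loading it into the Synapse table.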