You have an Azure Data Lake Storage account with a staging zone. You need to design a daily process to ingest incremental data from this staging zone, transform the data by running an R script, and then load the transformed data into a data warehouse in Azure Synapse Analytics.
Proposed Solution: You use an Azure Data Factory schedule trigger to run a pipeline that executes a mapping data flow and then inserts the data into the data warehouse.
Does this solution meet the goal?
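For context, the schedule trigger in the proposed solution would be defined with a daily recurrence in Azure Data Factory. A rough sketch of such a trigger definition follows; the trigger and pipeline names, start time, and time zone are illustrative assumptions, not part of the question:

```json
{
  "name": "DailyStagingTrigger",
  "properties": {
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Day",
        "interval": 1,
        "startTime": "2024-01-01T00:00:00Z",
        "timeZone": "UTC"
      }
    },
    "pipelines": [
      {
        "pipelineReference": {
          "type": "PipelineReference",
          "referenceName": "IngestTransformLoadPipeline"
        }
      }
    ]
  }
}
```

The `pipelines` array ties the trigger to the pipeline that performs the ingest, transform, and load steps; whether the activities inside that pipeline satisfy the R-script requirement is what the question asks you to evaluate.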