You have an Azure Data Lake Storage account with a staging zone. You need to design a daily process to ingest incremental data from this staging zone, transform the data by running an R script, and then load the transformed data into an Azure Synapse Analytics data warehouse.
Proposed Solution: You use an Azure Data Factory schedule trigger to run a pipeline. This pipeline copies the data to a staging table in the data warehouse and then uses a stored procedure to execute the R script.
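For reference, the proposed design would map onto Data Factory's JSON pipeline definition roughly as follows. This is a minimal illustrative sketch, not a working deployment: the pipeline, dataset, linked service, and stored procedure names (DailyIngestPipeline, StagingZoneDataset, DWStagingTableDataset, SynapseDWLinkedService, dbo.usp_TransformWithR) are all hypothetical, and the source/sink types assume delimited text files in the staging zone.

```json
{
  "name": "DailyIngestPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopyIncrementalToStagingTable",
        "type": "Copy",
        "inputs": [ { "referenceName": "StagingZoneDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "DWStagingTableDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink": { "type": "SqlDWSink" }
        }
      },
      {
        "name": "TransformViaStoredProcedure",
        "type": "SqlServerStoredProcedure",
        "dependsOn": [
          { "activity": "CopyIncrementalToStagingTable", "dependencyConditions": [ "Succeeded" ] }
        ],
        "linkedServiceName": {
          "referenceName": "SynapseDWLinkedService",
          "type": "LinkedServiceReference"
        },
        "typeProperties": { "storedProcedureName": "dbo.usp_TransformWithR" }
      }
    ]
  }
}
```

The dependsOn block makes the stored procedure activity run only after the copy activity succeeds, and a schedule trigger attached to this pipeline would supply the daily cadence. The sketch simply mirrors the proposed solution as stated; whether the stored procedure step can actually execute an R script against the data warehouse is the point the question asks you to evaluate.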
Does this solution meet the goal?