
Answer-first summary for fast verification
Answer: Configure the 'Log storage settings' in the pipeline to store error logs in an Azure Blob Storage account.
Robust error handling in Azure Data Factory requires a logging mechanism that captures and persists error information. This is achieved by configuring the 'Log storage settings' in the pipeline so that error logs are written to an Azure Blob Storage account, where they can later be reviewed for troubleshooting and analysis. Other options, such as stopping on data validation errors or implementing custom error-handling logic, may form part of a broader error-handling strategy, but they do not address the logging and storage of error information.
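As a minimal sketch, the log storage settings are typically declared in the pipeline's JSON definition on the Copy activity. The structure below follows the documented `logSettings` shape for the Copy activity; the linked service name (`ErrorLogBlobStorage`) and the blob path are placeholder assumptions you would replace with your own resources.

```json
{
  "name": "CopyWithErrorLogging",
  "type": "Copy",
  "typeProperties": {
    "source": { "type": "DelimitedTextSource" },
    "sink": { "type": "AzureSqlSink" },
    "logSettings": {
      "enableCopyActivityLog": true,
      "copyActivityLogSettings": {
        "logLevel": "Warning",
        "enableReliableLogging": false
      },
      "logLocationSettings": {
        "linkedServiceName": {
          "referenceName": "ErrorLogBlobStorage",
          "type": "LinkedServiceReference"
        },
        "path": "errorlogs/copyactivity"
      }
    }
  }
}
```

With this configuration, skipped rows and execution errors are written as log files under the given container path in the Blob Storage account referenced by the linked service, so they can be inspected after a pipeline run.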
Author: LeetQuiz Editorial Team
In a scenario where you are tasked with configuring error handling for a data transformation process in Azure Data Factory, what steps would you take to ensure robust error handling and logging?
A
Enable the 'Stop on data validation errors' option in the dataset settings.
B
Add a Try-Catch block in the data flow mapping to handle any transformation errors.
C
Configure the 'Log storage settings' in the pipeline to store error logs in an Azure Blob Storage account.
D
Implement a custom error handling logic in the source query of the Copy Data activity.