You are configuring exception handling in a batch processing solution that reads from and writes to Delta Lake tables. What strategies would you implement to ensure that the pipeline continues to function smoothly even in the presence of errors?
A. Implement try-except blocks around critical operations and log errors for manual review.
B. Use Azure Data Factory's built-in error handling and retry mechanisms.
C. Set up alerts for any exceptions and pause the pipeline until manual intervention occurs.
D. Ignore exceptions and continue processing, assuming that minor errors will not impact overall results.
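
The approach in option A can be illustrated with a minimal PySpark sketch. It assumes a Spark session with the Delta Lake connector available and uses hypothetical paths (SOURCE_PATH, TARGET_PATH): the critical read/write step is wrapped in try-except so failures are logged for manual review instead of crashing the whole batch job.

import logging

from pyspark.sql import SparkSession
from pyspark.sql.utils import AnalysisException

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("delta_batch_pipeline")

spark = SparkSession.builder.appName("delta-batch-demo").getOrCreate()

SOURCE_PATH = "/mnt/raw/orders"      # hypothetical source Delta table
TARGET_PATH = "/mnt/curated/orders"  # hypothetical target Delta table

try:
    # Critical operation: read the batch and append it to the target table.
    df = spark.read.format("delta").load(SOURCE_PATH)
    df.write.format("delta").mode("append").save(TARGET_PATH)
except AnalysisException as exc:
    # Missing paths or schema mismatches: record the failure for manual review.
    logger.error("Delta read/write failed: %s", exc)
except Exception:
    # Any other failure is logged with a traceback rather than stopping the job.
    logger.exception("Unexpected error in Delta batch step")

In a managed orchestrator such as Azure Data Factory (option B), the same idea is expressed declaratively through activity retry counts and failure paths rather than in code.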