You are working on a data pipeline that processes data from multiple sources with varying data formats. How would you approach the problem of data integration and transformation to ensure consistency and accuracy?
A. Create a unified data model that all sources must conform to before ingestion.
B. Implement a data transformation layer that can handle different data formats and convert them to a common format.
C. Use a data virtualization approach to query data from sources directly, without integration.
D. Restrict the pipeline to only process data from sources with compatible formats.
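To make the options concrete, here is a minimal sketch of the transformation-layer idea from option B: a dispatcher that accepts records in different formats (JSON and CSV here) and normalizes them into one common schema. All field names (`user_id`, `amount`, `ts`) and the two-format scope are hypothetical, chosen only for illustration.

```python
import csv
import io
import json

def normalize_record(record: dict) -> dict:
    """Map a raw record onto a common schema (hypothetical fields)."""
    return {
        "user_id": str(record.get("user_id") or record.get("id", "")),
        "amount": float(record.get("amount", 0.0)),
        "ts": record.get("timestamp") or record.get("ts", ""),
    }

def transform(source_format: str, payload: str) -> list[dict]:
    """Parse the payload per its source format, then convert every
    record to the common schema."""
    if source_format == "json":
        raw = json.loads(payload)
        records = raw if isinstance(raw, list) else [raw]
    elif source_format == "csv":
        records = list(csv.DictReader(io.StringIO(payload)))
    else:
        raise ValueError(f"unsupported format: {source_format}")
    return [normalize_record(r) for r in records]

# Two sources with different formats and field names yield identical output.
json_rows = transform("json", '[{"id": "u1", "amount": "9.5", "timestamp": "2024-01-01"}]')
csv_rows = transform("csv", "user_id,amount,ts\nu1,9.5,2024-01-01")
print(json_rows == csv_rows)  # → True
```

Because downstream stages see only the normalized schema, adding a new source format means adding one parser branch rather than touching the rest of the pipeline, which is the main argument for option B over the more restrictive options A and D.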