
Which of the following changes will need to be made to the pipeline when migrating to Delta Live Tables?
A. None of these changes will need to be made
B. The pipeline will need to stop using the medallion-based multi-hop architecture
C. The pipeline will need to be written entirely in SQL
D. The pipeline will need to use a batch source in place of a streaming source
E. The pipeline will need to be written entirely in Python
Explanation:
Delta Live Tables (DLT) is designed to be compatible with existing Databricks pipelines and provides flexibility in implementation. Let's analyze each option:
Option A (Correct): None of these changes will need to be made - This is correct because DLT supports the medallion multi-hop architecture, pipelines written in SQL, Python, or a mix of both, and both batch and streaming sources, so an existing pipeline can be migrated without any of the listed changes.
Option B: The pipeline will need to stop using the medallion-based multi-hop architecture - Incorrect. DLT encourages the medallion architecture and has built-in support for multi-hop data pipelines.
Option C: The pipeline will need to be written entirely in SQL - Incorrect. DLT supports both SQL and Python, allowing you to use either language or a combination of both.
Option D: The pipeline will need to use a batch source in place of a streaming source - Incorrect. DLT supports both batch and streaming processing, and you can use streaming sources in DLT pipelines.
Option E: The pipeline will need to be written entirely in Python - Incorrect. As mentioned, DLT supports both SQL and Python.
Key Points:
DLT supports the medallion (bronze/silver/gold) multi-hop architecture out of the box.
Pipelines can be declared in SQL, Python, or a combination of both.
Both batch and streaming sources are supported, so existing streaming reads can be kept, as in the sketch below.
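To illustrate why no structural changes are required, here is a minimal sketch of a DLT pipeline that keeps a streaming source and a medallion bronze-to-silver hop, written with the DLT Python API. The source path, file format, and column names are hypothetical placeholders and would need to be adapted to the existing pipeline.

```python
import dlt
from pyspark.sql.functions import col

# Bronze: ingest from the existing streaming source unchanged.
# The Auto Loader path and JSON format below are hypothetical placeholders;
# `spark` is provided by the DLT runtime.
@dlt.table(comment="Raw events ingested from the streaming source (bronze).")
def bronze_events():
    return (
        spark.readStream
        .format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/mnt/raw/events")
    )

# Silver: the next hop in the same medallion architecture,
# still consumed as a stream from the bronze table.
@dlt.table(comment="Cleaned events (silver).")
def silver_events():
    return (
        dlt.read_stream("bronze_events")
        .where(col("event_type").isNotNull())
    )
```

The same tables could equally be declared in SQL with CREATE OR REFRESH STREAMING TABLE statements, which is why the migration does not force the pipeline into either language.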