
As a Microsoft Fabric Analytics Engineer, you are optimizing a dataflow to improve performance by implementing Fast Copy. The solution must ensure minimal latency, support large volumes of data, and maintain data integrity. Given these requirements, which of the following steps is the MOST appropriate way to implement Fast Copy effectively? (Choose one option.)
A. Add a Fast Copy activity to the dataflow and configure the source and destination settings to leverage the built-in optimization features for high-speed data transfer.
B. Replace the dataflow with a data pipeline, as pipelines inherently support faster data copying mechanisms without the need for additional configuration.
C. Develop a custom script within the dataflow to manually handle the data copying process, aiming to bypass any potential bottlenecks in the standard Fast Copy functionality.
D. Disable Fast Copy in the dataflow settings to prioritize data integrity over performance, ensuring that all data is copied without any risk of corruption.
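For intuition on the trade-off the options describe (an engine-optimized bulk copy path versus a hand-rolled per-row script), here is a minimal, purely illustrative Python sketch. It uses the standard library's sqlite3 module, not any Microsoft Fabric API; the table name `t` and the row count are hypothetical. It only demonstrates why batched, engine-level copying generally outperforms row-by-row scripting on large data volumes.

```python
# Illustrative only, NOT Microsoft Fabric code: contrasts a hand-rolled
# row-by-row copy (akin to a custom script) with a bulk, driver-optimized
# copy (akin to the built-in optimized path Fast Copy is meant to provide).
import sqlite3
import time

src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")

src.execute("CREATE TABLE t (id INTEGER, val TEXT)")
src.executemany("INSERT INTO t VALUES (?, ?)",
                [(i, f"row-{i}") for i in range(100_000)])  # hypothetical volume
src.commit()

dst.execute("CREATE TABLE t (id INTEGER, val TEXT)")
rows = src.execute("SELECT id, val FROM t").fetchall()

# Row-by-row copy: one statement per row, high per-row overhead.
start = time.perf_counter()
for r in rows:
    dst.execute("INSERT INTO t VALUES (?, ?)", r)
dst.commit()
print(f"row-by-row copy: {time.perf_counter() - start:.2f}s")

# Bulk copy: the driver batches the work, far less overhead per row.
dst.execute("DELETE FROM t")
start = time.perf_counter()
dst.executemany("INSERT INTO t VALUES (?, ?)", rows)
dst.commit()
print(f"bulk copy:       {time.perf_counter() - start:.2f}s")
```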