
Answer-first summary for fast verification
Answer: Create a parameter rule only.
The question asks how to ensure that Dataflow1's data source points to the correct storage location when deploying from development (Workspace1) to production (Workspace2), given that each environment uses a different storage account. In Microsoft Fabric deployment pipelines, parameter rules are designed for exactly this kind of environment-specific configuration: you define a parameter in the dataflow (for example, the storage account URL), then create a parameter rule that overrides its value in the production stage. Option A (a data source rule only) is insufficient because data source rules chiefly manage connection details rather than remapping paths between environments. Option C (both rules) is unnecessary, since a parameter rule alone handles the path mapping. Option D (manual changes after deployment) defeats the purpose of automating deployments through a pipeline. The community discussion shows 100% consensus on option B, with users confirming that this is a common pattern for handling environment-specific data source references in Fabric deployment pipelines.
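To make the mechanism concrete, here is a minimal sketch of what a parameter rule does at deploy time: the dataflow carries a named parameter, and the rule substitutes a stage-specific value when the item lands in the target workspace. The names (`StorageAccountUrl`, the URLs, and the helper function) are illustrative assumptions, not the actual Fabric API.

```python
# Sketch of parameter-rule behavior in a deployment pipeline.
# All names below are hypothetical illustrations, not Fabric APIs.

# Parameter values as authored in the development workspace (Workspace1).
dev_params = {"StorageAccountUrl": "https://devstorage.blob.core.windows.net"}

# A parameter rule defined on the production stage (Workspace2):
# "when deploying, override this parameter with this value".
parameter_rules = {"StorageAccountUrl": "https://prodstorage.blob.core.windows.net"}

def apply_parameter_rules(params, rules):
    """Return a copy of params with any rule overrides applied;
    parameters without a matching rule keep their source value."""
    return {name: rules.get(name, value) for name, value in params.items()}

prod_params = apply_parameter_rules(dev_params, parameter_rules)
print(prod_params["StorageAccountUrl"])
```

The dataflow itself references only the parameter, never a hard-coded path, which is why a single parameter rule is enough to retarget the CSV source between storage accounts.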
Author: LeetQuiz Editorial Team
You have a Fabric tenant with two workspaces: Workspace1 for development and Workspace2 for production. Each environment uses a different storage account. Workspace1 contains a Dataflow Gen2 named Dataflow1, which sources data from a CSV file in blob storage. You plan to use a deployment pipeline to deploy items from Workspace1 to Workspace2.
What should you do to ensure the data source in Dataflow1 points to the correct production location after deployment?
A
Create a data source rule only.
B
Create a parameter rule only.
C
Create a data source rule and a parameter rule.
D
After implementing the deployment pipeline, manually change the data source.