Microsoft Fabric Analytics Engineer Associate DP-600

You need to ensure the data loading activities in the AnalyticsPOC workspace are executed in the appropriate sequence. The solution must meet the technical requirements. What should you do?

Overview: Litware, Inc. is a manufacturing company with offices throughout North America. The analytics team at Litware consists of data engineers, analytics engineers, data analysts, and data scientists.

Fabric Environment: Litware has been using a Microsoft Power BI tenant for three years but has not yet enabled any Fabric capacities or features.

Available Data: Litware has various datasets that need to be analyzed. The Product data contains a single table with multiple columns, and the customer satisfaction data contains the following tables: Survey, Question, and Response. For each survey submitted, one row is added to the Survey table, and one row is added to the Response table for each question in the survey. The Question table contains the text of each survey question. The third question in each survey response is an overall satisfaction score. Customers can submit a survey after each purchase.
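
For orientation, here is a minimal pandas sketch of how the three customer satisfaction tables relate. The case study names only the tables; every column name below (SurveyID, QuestionID, Score, and so on) is an illustrative assumption, not part of the scenario.

```python
import pandas as pd

# Hypothetical column names; the case study only names the tables.
survey = pd.DataFrame({
    "SurveyID": [1, 2],
    "CustomerID": [101, 102],
    "SubmittedDate": pd.to_datetime(["2024-03-01", "2024-04-15"]),
})
question = pd.DataFrame({
    "QuestionID": [1, 2, 3],
    "QuestionText": ["Delivery speed", "Product quality", "Overall satisfaction"],
})
# One Response row per question for each submitted survey.
response = pd.DataFrame({
    "SurveyID": [1, 1, 1, 2, 2, 2],
    "QuestionID": [1, 2, 3, 1, 2, 3],
    "Score": [4, 5, 5, 3, 4, 4],
})

# The third question in each survey is the overall satisfaction score.
overall = response[response["QuestionID"] == 3].merge(survey, on="SurveyID")
print(overall[["SurveyID", "SubmittedDate", "Score"]])
```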

User Problems: The analytics team deals with large volumes of semi-structured data. They aim to use Fabric to create a new data store. Product data is often classified into three pricing groups: high, medium, and low. This logic is implemented in several databases and semantic models but is not consistently applied across implementations.

Requirements: Planned Changes: Litware plans to enable Fabric features in the existing tenant. The analytics team will create a new data store as a proof of concept (PoC), which will be conducted using a Fabric trial capacity. The broader Litware user base will gain access to Fabric features once the PoC is complete.

Three workspaces will be created:

  1. AnalyticsPOC: Will contain the data store, semantic models, reports, and the pipelines, dataflows, and notebooks used to populate the data store.
  2. DataEngPOC: Will contain all pipelines, dataflows, and notebooks used to populate OneLake.
  3. DataSciPOC: Will contain all notebooks and reports created by the data scientists.

In the AnalyticsPOC workspace, the following will be created:

  • A data store (type to be decided)
  • A custom semantic model
  • A default semantic model
  • Interactive reports

Data engineers will create data pipelines to load data into OneLake either hourly or daily, depending on the source. Analytics engineers will ingest, transform, and load the data into the data store in the AnalyticsPOC workspace daily. Whenever possible, data engineers will use low-code tools for ingestion. The choice of data cleansing and transformation tools will be at the data engineers' discretion.

All semantic models and reports in the AnalyticsPOC workspace will use the data store as the sole data source.

Technical Requirements: The data store must support:

  • Read access using T-SQL or Python
  • Semi-structured and unstructured data
  • Row-level security (RLS) for users executing T-SQL queries

Files loaded by the data engineers into OneLake will be stored in Parquet format and will meet Delta Lake specifications. Data will be loaded without transformation into one area of the AnalyticsPOC data store, and will then be cleansed, merged, and transformed into a dimensional model.

The data load process must ensure that the raw and cleansed data is updated completely before the dimensional model is populated. The dimensional model must include a date dimension with dates from 2010 through the current year; the Litware fiscal year matches the calendar year.
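
A minimal sketch of how such a date dimension could be generated with pandas follows; the column names and the choice of pandas are assumptions for illustration only, since the case study does not prescribe a tool.

```python
import pandas as pd

# Date dimension covering 2010-01-01 through the end of the current year.
# The fiscal year matches the calendar year, so fiscal columns mirror calendar ones.
today = pd.Timestamp.today()
dates = pd.date_range("2010-01-01", f"{today.year}-12-31", freq="D")

dim_date = pd.DataFrame({
    "DateKey": dates.strftime("%Y%m%d").astype(int),
    "Date": dates,
    "Year": dates.year,
    "Quarter": dates.quarter,
    "Month": dates.month,
    "FiscalYear": dates.year,        # fiscal year == calendar year
    "FiscalQuarter": dates.quarter,  # fiscal quarter == calendar quarter
})
print(dim_date.head())
```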

The product pricing group logic must be centralized by the analytics engineers and made available in the data store for T-SQL queries and in the default semantic model. The pricing group logic is as follows:

  • List prices <= 50: low pricing group
  • List prices > 50 and <= 1,000: medium pricing group
  • List prices > 1,000: high pricing group
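
As a reference point only, here is a minimal Python sketch of that centralized rule. The case study leaves the implementation language and location open (T-SQL object, semantic model logic, or otherwise), so the function and its name are assumptions.

```python
def pricing_group(list_price: float) -> str:
    """Classify a product list price into Litware's pricing groups (hypothetical helper)."""
    if list_price <= 50:
        return "low"
    elif list_price <= 1000:
        return "medium"
    else:
        return "high"

# Boundary checks for the three groups.
assert pricing_group(50) == "low"
assert pricing_group(50.01) == "medium"
assert pricing_group(1000) == "medium"
assert pricing_group(1000.01) == "high"
```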

Security Requirements: Only Fabric administrators and the analytics team can see the Fabric items created for the PoC. Security requirements include:

  • Fabric administrators as workspace administrators
  • Data engineers get read and write access to the data store, but no access to datasets or reports
  • Analytics engineers get read and write access to the data store, can create schemas in it, and can create and share semantic models and modify reports
  • Data scientists get read-only access to the data store via Spark notebooks
  • Data analysts get read access to dimensional model objects and can create Power BI reports using the semantic models
  • Date dimension must be accessible to all users of the data store
  • Principle of least privilege must be followed

Both the default and custom semantic models must include only tables or views from the dimensional model in the data store. Existing Microsoft Entra security groups are as follows:

  • FabricAdmins: Fabric administrators
  • AnalyticsTeam: Analytics team members
  • DataAnalysts: Data analysts
  • DataScientists: Data scientists
  • DataEngineers: Data engineers
  • AnalyticsEngineers: Analytics engineers

Report Requirements: Data analysts must create a customer satisfaction report that:

  • Enables product selection to filter customer survey responses
  • Displays the average overall satisfaction score for surveys submitted in the last 12 months up to a selected date
  • Shows data immediately after it is updated in the data store
  • Contains data only from the current and previous year
  • Respects table-level security specified in the source data store
  • Minimizes execution time of report queries
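
To make the second requirement concrete, here is a hedged pandas sketch of the "average overall satisfaction for the 12 months ending on a selected date" calculation. The DataFrame and column names (overall, SubmittedDate, Score) are the same hypothetical ones sketched earlier and are not defined by the case study; the actual report would express this logic against the semantic model.

```python
import pandas as pd

def avg_satisfaction_last_12_months(overall: pd.DataFrame, selected_date: str) -> float:
    """Average overall satisfaction score for surveys submitted in the
    12 months ending on the selected date (inclusive). Columns are illustrative."""
    end = pd.Timestamp(selected_date)
    start = end - pd.DateOffset(months=12)
    window = overall[(overall["SubmittedDate"] > start) & (overall["SubmittedDate"] <= end)]
    return window["Score"].mean()

# Example usage with the hypothetical 'overall' frame from the earlier sketch:
# avg_satisfaction_last_12_months(overall, "2024-06-30")
```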
