Quiz

1/10
Topic 1, Contoso, Ltd
Case Study
Overview
This is a case study. Case studies are not timed separately. You can use as much exam time as you
would like to complete each case. However, there may be additional case studies and sections on
this exam. You must manage your time to ensure that you are able to complete all questions included
on this exam in the time provided.
To answer the questions included in a case study, you will need to reference information that is
provided in the case study. Case studies might contain exhibits and other resources that provide
more information about the scenario that is described in the case study. Each question is
independent of the other questions in this case study.
At the end of this case study, a review screen will appear. This screen allows you to review your
answers and to make changes before you move to the next section of the exam. After you begin a
new section, you cannot return to this section.
To start the case study
To display the first question in this case study, click the Next button. Use the buttons in the left pane
to explore the content of the case study before you answer the questions. Clicking these buttons
displays information such as business requirements, existing environment, and problem statements.
If the case study has an All Information tab, note that the information displayed is identical to the
information displayed on the subsequent tabs. When you are ready to answer a question, click the
Question button to return to the question.
Overview. Company Overview
Contoso, Ltd. is an online retail company that wants to modernize its analytics platform by moving to
Fabric. The company plans to begin using Fabric for marketing analytics.
Overview. IT Structure
The company’s IT department has a team of data analysts and a team of data engineers that use
analytics systems.
The data engineers perform the ingestion, transformation, and loading of data. They prefer to use
Python or SQL to transform the data.
The data analysts query data and create semantic models and reports. They are qualified to write
queries in Power Query and T-SQL.
Existing Environment. Fabric
Contoso has an F64 capacity named Cap1. All Fabric users are allowed to create items.
Contoso has two workspaces named WorkspaceA and WorkspaceB that currently use Pro license
mode.
Existing Environment. Source Systems
Contoso has a point of sale (POS) system named POS1 that uses an instance of SQL Server on Azure
Virtual Machines in the same Microsoft Entra tenant as Fabric. The host virtual machine is on a
private virtual network that has public access blocked. POS1 contains all the sales transactions that
were processed on the company’s website.
The company has a software as a service (SaaS) online marketing app named MAR1. MAR1 has seven
entities. The entities contain data that relates to email open rates and interaction rates, as well as
website interactions. The data can be exported from MAR1 by calling REST APIs. Each entity has a
different endpoint.
Contoso has been using MAR1 for one year. Data from prior years is stored in Parquet files in an
Amazon Simple Storage Service (Amazon S3) bucket. There are 12 files that range in size from
300 MB to 900 MB and relate to email interactions.
Existing Environment. Product Data
POS1 contains a product list and related data. The data comes from the following three tables:
Products
ProductCategories
ProductSubcategories
In the data, products are related to product subcategories, and subcategories are related to product
categories.
Existing Environment. Azure
Contoso has a Microsoft Entra tenant that has the following mail-enabled security groups:
DataAnalysts: Contains the data analysts
DataEngineers: Contains the data engineers
Contoso has an Azure subscription.
The company has an existing Azure DevOps organization and creates a new project for repositories
that relate to Fabric.
Existing Environment. User Problems
The VP of marketing at Contoso requires analysis on the effectiveness of different types of email
content. It typically takes a week to manually compile and analyze the data. Contoso wants to reduce
the time to less than one day by using Fabric.
The data engineering team has successfully exported data from MAR1, but the team experiences
transient connectivity errors that cause the data exports to fail.
Requirements. Planned Changes
Contoso plans to create the following two lakehouses:
Lakehouse1: Will store both raw and cleansed data from the sources
Lakehouse2: Will serve data in a dimensional model to users for analytical queries
Additional items will be added to facilitate data ingestion and transformation.
Contoso plans to use Azure Repos for source control in Fabric.
Requirements. Technical Requirements
The new lakehouses must follow a medallion architecture by using the following three layers: bronze,
silver, and gold. There will be extensive data cleansing required to populate the MAR1 data in the
silver layer, including deduplication, the handling of missing values, and the standardizing of
capitalization.
Each layer must be fully populated before moving on to the next layer. If any step in populating the
lakehouses fails, an email must be sent to the data engineers.
Data imports must run simultaneously, when possible.
The use of email data from the Amazon S3 bucket must meet the following requirements:
Minimize egress costs associated with cross-cloud data access.
Prevent saving a copy of the raw data in the lakehouses.
Items that relate to data ingestion must meet the following requirements:
The items must be source controlled alongside other workspace items.
Ingested data must land in the bronze layer of Lakehouse1 in the Delta format.
No changes other than changes to the file formats must be implemented before the data lands in the
bronze layer.
Development effort must be minimized and a built-in connection must be used to import the source
data.
In the event of a connectivity error, the ingestion processes must attempt the connection again.
Lakehouses, data pipelines, and notebooks must be stored in WorkspaceA. Semantic models,
reports, and dataflows must be stored in WorkspaceB.
Once a week, old files that are no longer referenced by a Delta table log must be removed.
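For reference, removing files that are no longer referenced by a Delta table log is typically handled
by the Delta Lake VACUUM command. A minimal sketch of a weekly cleanup cell, assuming a
hypothetical table name:

    # Weekly maintenance: remove files no longer referenced by the Delta table log.
    # The table name is a hypothetical example; 168 hours matches the default
    # seven-day retention window.
    spark.sql("VACUUM silver_mar1_email RETAIN 168 HOURS")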
Requirements. Data Transformation
In the POS1 product data, ProductID values are unique. The product dimension in the gold layer must
include only active products from the product list. Active products are identified by an IsActive value of 1.
Some product categories and subcategories are NOT assigned to any product. They are NOT
analytically relevant and must be omitted from the product dimension in the gold layer.
Requirements. Data Security
Security in Fabric must meet the following requirements:
The data engineers must have read and write access to all the lakehouses, including the underlying
files.
The data analysts must only have read access to the Delta tables in the gold layer.
The data analysts must NOT have access to the data in the bronze and silver layers.
The data engineers must be able to commit changes to source control in WorkspaceA.
You need to ensure that the data analysts can access the gold layer lakehouse.
What should you do?
Select the answer
1 correct answer
A.
Add the DataAnalyst group to the Viewer role for WorkspaceA.
B.
Share the lakehouse with the DataAnalysts group and grant the Build reports on the default semantic model permission.
C.
Share the lakehouse with the DataAnalysts group and grant the Read all SQL Endpoint data permission.
D.
Share the lakehouse with the DataAnalysts group and grant the Read all Apache Spark permission.

Quiz

2/10
Topic 1, Contoso, Ltd
Case Study (identical to the case study presented in Question 1)
You need to recommend a method to populate the POS1 data to the lakehouse medallion layers.
What should you recommend for each layer? To answer, select the appropriate options in the answer
area.


NOTE: Each correct selection is worth one point.

Select the answer
1 correct answer
Bronze Layer: A pipeline Copy activity
The bronze layer stores raw, unprocessed data, and the requirements specify that no changes other
than file format changes may be applied before the data lands in this layer. A pipeline Copy activity
minimizes development effort, provides a built-in connection to the source, and can write the
ingested data directly to the bronze layer in Delta format.
Silver Layer: A notebook
The silver layer involves extensive data cleansing (deduplication, handling missing values, and
standardizing capitalization). A notebook provides the flexibility to implement complex
transformations and is well-suited for this task.
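A minimal PySpark sketch of the silver-layer cleansing steps described above, as it might appear in a
notebook cell; the table names (bronze_mar1_email, silver_mar1_email) and column names are
illustrative assumptions, not identifiers from the case study:

    # Hypothetical silver-layer notebook cell for Lakehouse1.
    # All table and column names below are illustrative assumptions.
    from pyspark.sql import functions as F

    raw = spark.read.table("bronze_mar1_email")          # raw MAR1 data in the bronze layer

    cleansed = (
        raw.dropDuplicates()                             # deduplication
           .fillna({"email_subject": "unknown"})         # handle missing values
           .withColumn("email_address",
                       F.lower(F.col("email_address")))  # standardize capitalization
    )

    cleansed.write.format("delta").mode("overwrite").saveAsTable("silver_mar1_email")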

Quiz

3/10
Topic 1, Contoso, Ltd
Case Study (identical to the case study presented in Question 1)
You need to ensure that usage of the data in the Amazon S3 bucket meets the technical
requirements.
What should you do?
Select the answer
1 correct answer
A.
Create a workspace identity and enable high concurrency for the notebooks.
B.
Create a shortcut and ensure that caching is disabled for the workspace.
C.
Create a workspace identity and use the identity in a data pipeline.
D.
Create a shortcut and ensure that caching is enabled for the workspace.

Quiz

4/10
Topic 1, Contoso, Ltd
Case Study (identical to the case study presented in Question 1)
You need to create the product dimension.
How should you complete the Apache Spark SQL code? To answer, select the appropriate options in
the answer area.


NOTE: Each correct selection is worth one point.

Select the answer
1 correct answer
Join between Products and ProductSubcategories:
Use an INNER JOIN.
The goal is to include only products that are assigned to a subcategory. An INNER JOIN ensures that
only matching records (that is, products with a valid subcategory) are included.
Join between ProductSubcategories and ProductCategories:
Use an INNER JOIN.
By the same logic, only subcategories that are assigned to a valid product category should be
included. An INNER JOIN ensures this condition is met.
WHERE clause
Condition: IsActive = 1
Only active products (where IsActive equals 1) should be included in the gold layer. This filters out
inactive products.
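Putting these selections together, a sketch of the completed query as it might run in a Fabric
notebook; only the table names, ProductID, and IsActive come from the case study, while the join
keys and descriptive columns are illustrative assumptions:

    # Sketch of the gold-layer product dimension query. Join key and descriptive
    # column names are assumptions; the original exhibit is not reproduced here.
    dim_product = spark.sql("""
        SELECT
            p.ProductID,
            p.ProductName,                     -- assumed column
            s.SubcategoryName,                 -- assumed column
            c.CategoryName                     -- assumed column
        FROM Products AS p
        INNER JOIN ProductSubcategories AS s   -- keeps only products with a subcategory
            ON p.ProductSubcategoryID = s.ProductSubcategoryID   -- assumed keys
        INNER JOIN ProductCategories AS c      -- keeps only subcategories with a category
            ON s.ProductCategoryID = c.ProductCategoryID         -- assumed keys
        WHERE p.IsActive = 1                   -- include only active products
    """)

    dim_product.write.format("delta").mode("overwrite").saveAsTable("dim_product")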

Quiz

5/10
Topic 1, Contoso, Ltd
Case Study (identical to the case study presented in Question 1)
You need to populate the MAR1 data in the bronze layer.
Which two types of activities should you include in the pipeline? Each correct answer presents part
of the solution.
NOTE: Each correct selection is worth one point.
Select the answer
2 correct answers
A.
ForEach
B.
Copy data
C.
WebHook
D.
Stored procedure

Quiz

6/10
Topic 1, Contoso, Ltd
Case Study (identical to the case study presented in Question 1)
You need to schedule the population of the medallion layers to meet the technical requirements.
What should you do?
Select the answer
1 correct answer
A.
Schedule a data pipeline that calls other data pipelines.
B.
Schedule a notebook.
C.
Schedule an Apache Spark job.
D.
Schedule multiple data pipelines.

Quiz

7/10
Topic 1, Contoso, Ltd
Case Study (identical to the case study presented in Question 1)
You need to recommend a solution to resolve the MAR1 connectivity issues. The solution must
minimize development effort.
What should you recommend?
Select the answer
1 correct answer
A.
Add a ForEach activity to the data pipeline.
B.
Configure retries for the Copy data activity.
C.
Configure Fault tolerance for the Copy data activity.
D.
Call a notebook from the data pipeline.

Quiz

8/10
Topic 1, Contoso, Ltd
You need to ensure that the data engineers are notified if any step in populating the lakehouses fails.
The solution must meet the technical requirements and minimize development effort.
What should you use? To answer, select the appropriate options in the answer area.


NOTE: Each correct selection is worth one point.

[Exhibit: answer area]
Select the answer
1 correct answer
[Exhibit: answer selections]
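One common low-effort pattern for this kind of notification requirement is to let each population step fail loudly and attach an email action (for example, an Office 365 Outlook activity) to the pipeline's On failure path, so the DataEngineers group is mailed automatically. A hypothetical notebook-side sketch, assuming each layer's load logic is wrapped in a function:

def populate_layer(layer_fn, layer_name: str) -> None:
    # Run one layer-population step; re-raising makes the notebook activity
    # fail, which lets the pipeline's On-failure path send the email.
    try:
        layer_fn()
    except Exception as exc:
        print(f"Populating the {layer_name} layer failed: {exc}")
        raise

# Hypothetical usage, honoring the layer ordering requirement:
# populate_layer(load_bronze, "bronze")
# populate_layer(load_silver, "silver")
# populate_layer(load_gold, "gold")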

Quiz

9/10
Topic 1, Contoso, Ltd
You need to recommend a solution for handling old files. The solution must meet the technical
requirements. What should you include in the recommendation?
Select the answer
1 correct answer
A. a data pipeline that includes a Copy data activity
B. a notebook that runs the VACUUM command
C. a notebook that runs the OPTIMIZE command
D. a data pipeline that includes a Delete data activity
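For reference, the weekly cleanup requirement ("old files that are no longer referenced by a Delta table log must be removed") maps directly onto Delta Lake's VACUUM command, which deletes unreferenced data files older than a retention threshold; OPTIMIZE, by contrast, compacts small files and removes nothing. A minimal sketch for a weekly-scheduled Fabric notebook, with a hypothetical table name (168 hours is the default 7-day retention; spark is the session predefined in the notebook):

# Remove files no longer referenced by the Delta log.
spark.sql("VACUUM Lakehouse1.bronze_mar1_email RETAIN 168 HOURS")

# Equivalent call through the Delta Lake Python API:
from delta.tables import DeltaTable
DeltaTable.forName(spark, "Lakehouse1.bronze_mar1_email").vacuum(168)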

Quiz

10/10
Topic 1, Contoso, Ltd
You need to ensure that WorkspaceA can be configured for source control. Which two actions should
you perform?
Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.
Select the answer
2 correct answers
A. Assign WorkspaceA to Cap1.
B. From the tenant settings, set Users can synchronize workspace items with their Git repositories to Enabled.
C. Configure WorkspaceA to use a Premium Per User (PPU) license.
D. From the tenant settings, set Users can sync workspace items with GitHub repositories to Enabled.
Looking for more questions? Buy now.

The Microsoft Implementing Data Engineering Solutions Using Microsoft Fabric practice test unlocks all online simulator questions.

Thank you for choosing the free version of the Microsoft Implementing Data Engineering Solutions Using Microsoft Fabric practice test! To deepen your knowledge further, unlock the full version of our Microsoft Implementing Data Engineering Solutions Using Microsoft Fabric Simulator: you will be able to take tests with over 106 constantly updated questions and easily pass your exam. 98% of people who prepare with our 106 questions pass the exam on the first attempt.


What to expect from our Microsoft Implementing Data Engineering Solutions Using Microsoft Fabric practice tests, and how should you prepare for the exam?

The Microsoft Implementing Data Engineering Solutions Using Microsoft Fabric Simulator practice tests are part of the Microsoft database and are the best way to prepare for the Microsoft Implementing Data Engineering Solutions Using Microsoft Fabric exam. The practice tests consist of 106 questions written by experts to prepare you to pass the exam on the first attempt. The database includes questions from previous and related exams, which means you will be able to practice by simulating both past and future questions. Preparing with the Simulator will also give you an idea of the time it takes to complete each section of the practice test. Note that the Simulator does not replace the classic Microsoft Implementing Data Engineering Solutions Using Microsoft Fabric study guides; however, it provides valuable insight into what to expect and how much work is needed to prepare for the exam.


The Microsoft Implementing Data Engineering Solutions Using Microsoft Fabric practice test is therefore an excellent tool for preparing for the actual exam, together with our other Microsoft practice tests. Our Simulator will help you assess your level of preparation and understand your strengths and weaknesses. Below you can see the details of the quizzes in our Simulator and how its unique database is made up of real questions:

Info quiz:

  • Quiz name: Microsoft Implementing Data Engineering Solutions Using Microsoft Fabric
  • Total number of questions: 106
  • Number of questions for the test: 50
  • Pass score: 80%

You can prepare for the Microsoft Implementing Data Engineering Solutions Using Microsoft Fabric exam with our mobile app. It is very easy to use and even works offline in case of a network failure, with all the functions you need to study and practice with our Microsoft Implementing Data Engineering Solutions Using Microsoft Fabric Simulator.

Use our mobile app, available for both Android and iOS devices, with our Microsoft Implementing Data Engineering Solutions Using Microsoft Fabric Simulator. You can use it anywhere, and remember that the app is free and available in both app stores.

Our mobile app contains all the Microsoft Implementing Data Engineering Solutions Using Microsoft Fabric practice tests, which consist of 106 questions, and also provides study material to help you pass the final Microsoft Implementing Data Engineering Solutions Using Microsoft Fabric exam. Our database contains hundreds of questions and Microsoft tests related to the Microsoft Implementing Data Engineering Solutions Using Microsoft Fabric exam, so you can practice anywhere you want, even offline without internet access.
