Question 1/10
Topic 1, Contoso, Ltd Case Study

Overview
This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study
To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.

Overview. Company Overview
Contoso, Ltd. is an online retail company that wants to modernize its analytics platform by moving to Fabric. The company plans to begin using Fabric for marketing analytics.

Overview. IT Structure
The company's IT department has a team of data analysts and a team of data engineers that use analytics systems. The data engineers perform the ingestion, transformation, and loading of data. They prefer to use Python or SQL to transform the data. The data analysts query data and create semantic models and reports. They are qualified to write queries in Power Query and T-SQL.

Existing Environment. Fabric
Contoso has an F64 capacity named Cap1. All Fabric users are allowed to create items. Contoso has two workspaces named WorkspaceA and WorkspaceB that currently use Pro license mode.

Existing Environment. Source Systems
Contoso has a point of sale (POS) system named POS1 that uses an instance of SQL Server on Azure Virtual Machines in the same Microsoft Entra tenant as Fabric. The host virtual machine is on a private virtual network that has public access blocked. POS1 contains all the sales transactions that were processed on the company's website.
The company has a software as a service (SaaS) online marketing app named MAR1. MAR1 has seven entities. The entities contain data that relates to email open rates and interaction rates, as well as website interactions. The data can be exported from MAR1 by calling REST APIs. Each entity has a different endpoint. Contoso has been using MAR1 for one year. Data from prior years is stored in Parquet files in an Amazon Simple Storage Service (Amazon S3) bucket. There are 12 files that range in size from 300 MB to 900 MB and relate to email interactions.

Existing Environment. Product Data
POS1 contains a product list and related data. The data comes from the following three tables:
- Products
- ProductCategories
- ProductSubcategories
In the data, products are related to product subcategories, and subcategories are related to product categories.

Existing Environment. Azure
Contoso has a Microsoft Entra tenant that has the following mail-enabled security groups:
- DataAnalysts: Contains the data analysts
- DataEngineers: Contains the data engineers
Contoso has an Azure subscription. The company has an existing Azure DevOps organization and creates a new project for repositories that relate to Fabric.

Existing Environment. User Problems
The VP of marketing at Contoso requires analysis on the effectiveness of different types of email content. It typically takes a week to manually compile and analyze the data. Contoso wants to reduce the time to less than one day by using Fabric.
The data engineering team has successfully exported data from MAR1. The team experiences transient connectivity errors, which cause the data exports to fail.

Requirements. Planned Changes
Contoso plans to create the following two lakehouses:
- Lakehouse1: Will store both raw and cleansed data from the sources
- Lakehouse2: Will serve data in a dimensional model to users for analytical queries
Additional items will be added to facilitate data ingestion and transformation. Contoso plans to use Azure Repos for source control in Fabric.

Requirements. Technical Requirements
The new lakehouses must follow a medallion architecture by using the following three layers: bronze, silver, and gold. There will be extensive data cleansing required to populate the MAR1 data in the silver layer, including deduplication, the handling of missing values, and the standardizing of capitalization. Each layer must be fully populated before moving on to the next layer. If any step in populating the lakehouses fails, an email must be sent to the data engineers. Data imports must run simultaneously, when possible.
The use of email data from the Amazon S3 bucket must meet the following requirements:
- Minimize egress costs associated with cross-cloud data access.
- Prevent saving a copy of the raw data in the lakehouses.
Items that relate to data ingestion must meet the following requirements:
- The items must be source controlled alongside other workspace items.
- Ingested data must land in the bronze layer of Lakehouse1 in the Delta format.
- No changes other than changes to the file formats must be implemented before the data lands in the bronze layer.
- Development effort must be minimized and a built-in connection must be used to import the source data.
- In the event of a connectivity error, the ingestion processes must attempt the connection again.
Lakehouses, data pipelines, and notebooks must be stored in WorkspaceA. Semantic models, reports, and dataflows must be stored in WorkspaceB. Once a week, old files that are no longer referenced by a Delta table log must be removed.

Requirements. Data Transformation
In the POS1 product data, ProductID values are unique. The product dimension in the gold layer must include only active products from the product list. Active products are identified by an IsActive value of 1. Some product categories and subcategories are NOT assigned to any product. They are NOT analytically relevant and must be omitted from the product dimension in the gold layer.

Requirements. Data Security
Security in Fabric must meet the following requirements:
- The data engineers must have read and write access to all the lakehouses, including the underlying files.
- The data analysts must only have read access to the Delta tables in the gold layer.
- The data analysts must NOT have access to the data in the bronze and silver layers.
- The data engineers must be able to commit changes to source control in WorkspaceA.
You need to ensure that the data analysts can access the gold layer lakehouse. What should you do?
Select the answer
1 correct answer
A. Add the DataAnalysts group to the Viewer role for WorkspaceA.
B. Share the lakehouse with the DataAnalysts group and grant the Build reports on the default semantic model permission.
C. Share the lakehouse with the DataAnalysts group and grant the Read all SQL Endpoint data permission.
D. Share the lakehouse with the DataAnalysts group and grant the Read all Apache Spark permission.

Question 2/10
Topic 1, Contoso, Ltd Case Study (see Question 1 for the case study details)
HOTSPOT
You need to recommend a method to populate the POS1 data to the lakehouse medallion layers. What should you recommend for each layer? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
Select the answer
1 correct answer
Bronze layer: A pipeline Copy activity. The bronze layer is used to store raw, unprocessed data. The requirements specify that no transformations should be applied before landing the data in this layer. Using a pipeline Copy activity ensures minimal development effort, built-in connectors, and the ability to ingest the data directly into the Delta format in the bronze layer.
Silver layer: A notebook. The silver layer involves extensive data cleansing (deduplication, handling missing values, and standardizing capitalization). A notebook provides the flexibility to implement complex transformations and is well-suited for this task.
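For illustration, a minimal PySpark sketch of the kind of silver-layer cleansing such a notebook would perform is shown below. The table and column names (bronze_mar1_email, silver_mar1_email, email_subject, open_rate) are hypothetical; the case study does not define the actual schema.

```python
from pyspark.sql import functions as F

# spark is the SparkSession that Fabric notebooks provide by default.
# The bronze table name below is an assumption for illustration only.
bronze_df = spark.read.table("Lakehouse1.bronze_mar1_email")

silver_df = (
    bronze_df
    .dropDuplicates()                                         # deduplication
    .fillna({"open_rate": 0.0})                               # handle missing values
    .withColumn("email_subject", F.initcap("email_subject"))  # standardize capitalization
)

# Persist the cleansed data to the silver layer in Delta format.
silver_df.write.format("delta").mode("overwrite").saveAsTable("Lakehouse1.silver_mar1_email")
```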

Question 3/10
Topic 1, Contoso, Ltd Case Study (see Question 1 for the case study details)
You need to ensure that usage of the data in the Amazon S3 bucket meets the technical requirements. What should you do?
Select the answer
1 correct answer
A. Create a workspace identity and enable high concurrency for the notebooks.
B. Create a shortcut and ensure that caching is disabled for the workspace.
C. Create a workspace identity and use the identity in a data pipeline.
D. Create a shortcut and ensure that caching is enabled for the workspace.

Question 4/10
Topic 1, Contoso, Ltd Case Study (see Question 1 for the case study details)
HOTSPOT
You need to create the product dimension. How should you complete the Apache Spark SQL code? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
Select the answer
1 correct answer
Join between Products and ProductSubcategories: Use an INNER JOIN. The goal is to include only products that are assigned to a subcategory. An INNER JOIN ensures that only matching records (that is, products with a valid subcategory) are included.
Join between ProductSubcategories and ProductCategories: Use an INNER JOIN. Following the same logic, only subcategories that are assigned to a valid product category should be included, and an INNER JOIN ensures this condition is met.
WHERE clause condition: IsActive = 1. Only active products (where IsActive equals 1) should be included in the gold layer. This filters out inactive products.
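A minimal Spark SQL sketch of this logic, run from a Python notebook, might look like the following. Column names such as ProductName, ProductSubcategoryID, ProductSubcategoryName, ProductCategoryID, and ProductCategoryName, and the target table Lakehouse2.dim_product, are assumptions; the case study does not spell out the exact schema.

```python
# spark is the SparkSession provided by a Fabric notebook.
dim_product_df = spark.sql("""
    SELECT
        p.ProductID,
        p.ProductName,
        s.ProductSubcategoryName,
        c.ProductCategoryName
    FROM Products AS p
    INNER JOIN ProductSubcategories AS s   -- keep only products assigned to a subcategory
        ON p.ProductSubcategoryID = s.ProductSubcategoryID
    INNER JOIN ProductCategories AS c      -- keep only subcategories assigned to a category
        ON s.ProductCategoryID = c.ProductCategoryID
    WHERE p.IsActive = 1                   -- only active products
""")

# Write the product dimension to the gold-layer lakehouse in Delta format.
dim_product_df.write.format("delta").mode("overwrite").saveAsTable("Lakehouse2.dim_product")
```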

Question 5/10
Topic 1, Contoso, Ltd Case Study (see Question 1 for the case study details)
You need to populate the MAR1 data in the bronze layer. Which two types of activities should you include in the pipeline? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.
Select the answer
2 correct answers
A. ForEach
B. Copy data
C. WebHook
D. Stored procedure

Question 6/10
Topic 2, Litware, Inc Case Study

Overview
This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study
To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.

Overview
Litware, Inc. is a publishing company that has an online bookstore and several retail bookstores worldwide. Litware also manages an online advertising business for the authors it represents.

Existing Environment. Fabric Environment
Litware has a Fabric workspace named Workspace1. High concurrency is enabled for Workspace1. The company has a data engineering team that uses Python for data processing.

Existing Environment. Data Processing
The retail bookstores send sales data at the end of each business day, while the online bookstore constantly provides logs and sales data to a central enterprise resource planning (ERP) system. Litware implements a medallion architecture by using the following three layers: bronze, silver, and gold. The sales data is ingested from the ERP system as Parquet files that land in the Files folder in a lakehouse. Notebooks are used to transform the files into Delta tables for the bronze and silver layers. The gold layer is in a warehouse that has V-Order disabled.
Litware has image files of book covers in Azure Blob Storage. The files are loaded into the Files folder.

Existing Environment. Sales Data
Month-end sales data is processed on the first calendar day of each month. Data that is older than one month never changes. In the source system, the sales data refreshes every six hours starting at midnight each day. The sales data is captured in a Dataflow Gen1 dataflow. When the dataflow runs, new and historical data is captured. The dataflow captures the following fields of the source:
- Sales Date
- Author
- Price
- Units
- SKU
A table named AuthorSales stores the sales data that relates to each author. The table contains a column named AuthorEmail. Authors authenticate to a guest Fabric tenant by using their email address.

Existing Environment. Security Groups
Litware has the following security groups:
- Sales
- Fabric Admins
- Streaming Admins

Existing Environment. Performance Issues
Business users perform ad-hoc queries against the warehouse. The business users indicate that reports against the warehouse sometimes run for two hours and fail to load as expected. Upon further investigation, the data engineering team receives the following error message when the reports fail to load: "The SQL query failed while running." The data engineering team wants to debug the issue and find queries that cause more than one failure.
When the authors have new book releases, there is often an increase in sales activity. This increase slows the data ingestion process. The company's sales team reports that during the last month, the sales data has NOT been up-to-date when they arrive at work in the morning.

Requirements. Planned Changes
Litware recently signed a contract to receive book reviews. The provider of the reviews exposes the data in Amazon Simple Storage Service (Amazon S3) buckets. Litware plans to manage Search Engine Optimization (SEO) for the authors. The SEO data will be streamed from a REST API.

Requirements. Version Control
Litware plans to implement a version control solution in Fabric that will use GitHub integration and follow the principle of least privilege.

Requirements. Governance Requirements
To control data platform costs, the data platform must use only Fabric services and items. Additional Azure resources must NOT be provisioned.

Requirements. Data Requirements
Litware identifies the following data requirements:
- Process the SEO data in near-real-time (NRT).
- Make the book reviews available in the lakehouse without making a copy of the data.
- When a new book cover image arrives in the Files folder, process the image as soon as possible.
You need to implement the solution for the book reviews. What should you do?
Select the answer
1 correct answer
A. Create a Dataflow Gen2 dataflow.
B. Create a shortcut.
C. Enable external data sharing.
D. Create a data pipeline.

Question 7/10
Topic 2, Litware, Inc Case Study (see Question 6 for the case study details)
You need to resolve the sales data issue. The solution must minimize the amount of data transferred. What should you do?
Select the answer
1 correct answer
A.
Split the dataflow into two dataflows.
B.
Configure scheduled refresh for the dataflow.
C.
Configure incremental refresh for the dataflow. Set Store rows from the past to 1 Month.
D.
Configure incremental refresh for the dataflow. Set Refresh rows from the past to 1 Year.
E.
Configure incremental refresh for the dataflow. Set Refresh rows from the past to 1 Month.

Quiz

8/10
Topic 2, Litware, Inc Case Study (refer to the case study shown with the previous question)
HOTSPOT
You need to troubleshoot the ad-hoc query issue. How should you complete the statement? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
Select the answer
1 correct answer
SELECT last_run_start_time, last_run_command: these fields identify when each problematic query last ran and which command it executed.
FROM queryinsights.long_running_queries: the queryinsights.long_running_queries view provides insights into queries that take longer than expected to execute.
WHERE last_run_total_elapsed_time_ms > 7200000: this condition filters for queries that took more than two hours to complete (7,200,000 milliseconds), which matches the reported issue.
AND number_of_failed_runs > 1: this condition isolates the queries that have failed more than once, which are the queries the data engineering team needs to investigate.
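Assembled into a single statement, the completed query would look like the following sketch. It assumes the statement is run against the warehouse through its SQL analytics endpoint, and it uses only the view and column names called out in the explanation above.

SELECT last_run_start_time,
       last_run_command
FROM queryinsights.long_running_queries
WHERE last_run_total_elapsed_time_ms > 7200000  -- ran longer than 2 hours (7,200,000 ms)
  AND number_of_failed_runs > 1;                -- failed more than once

Together, the two filters should surface only the ad-hoc queries that both ran for more than two hours and failed repeatedly, which is the set of queries the data engineering team wants to debug.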

Quiz

9/10
You have a Fabric workspace. You have semi-structured data. You need to read the data by using T-SQL, KQL, and Apache Spark. The data will only be written by using Spark. What should you use to store the data?
Select the answer
1 correct answer
A.
a lakehouse
B.
an eventhouse
C.
a datamart
D.
a warehouse

Quiz

10/10
You have a Fabric workspace that contains a warehouse named Warehouse1. You have an on-premises Microsoft SQL Server database named Database1 that is accessed by using an on-premises data gateway. You need to copy data from Database1 to Warehouse1. Which item should you use?
Select the answer
1 correct answer
A.
a Dataflow Gen1 dataflow
B.
a data pipeline
C.
a KQL queryset
D.
a notebook
Looking for more questions? Buy now.

Microsoft Certified: Fabric Data Engineer Associate Practice test unlocks all online simulator questions

Thank you for choosing the free version of the Microsoft Certified: Fabric Data Engineer Associate practice test! To further deepen your knowledge, unlock the full version of our Microsoft Certified: Fabric Data Engineer Associate Simulator: you will be able to take tests with over 463 constantly updated questions and easily pass your exam. 98% of people pass the exam on the first attempt after preparing with our 463 questions.


What to expect from our Microsoft Certified: Fabric Data Engineer Associate practice tests and how to prepare for any exam?

The Microsoft Certified: Fabric Data Engineer Associate Simulator practice tests are part of the Microsoft Database and are the best way to prepare for any Microsoft Certified: Fabric Data Engineer Associate exam. The Microsoft Certified: Fabric Data Engineer Associate practice tests consist of 463 questions and are written by experts to help you prepare to pass the exam on the first attempt. The Microsoft Certified: Fabric Data Engineer Associate database includes questions from previous and other exams, which means you will be able to practice with both past and likely future questions. Preparation with the Microsoft Certified: Fabric Data Engineer Associate Simulator will also give you an idea of the time it will take to complete each section of the Microsoft Certified: Fabric Data Engineer Associate practice test. It is important to note that the Microsoft Certified: Fabric Data Engineer Associate Simulator does not replace the classic Microsoft Certified: Fabric Data Engineer Associate study guides; however, the Simulator provides valuable insights into what to expect and how much work needs to be done to prepare for the Microsoft Certified: Fabric Data Engineer Associate exam.


The Microsoft Certified: Fabric Data Engineer Associate practice test therefore represents an excellent tool to prepare for the actual exam, together with our other Microsoft practice tests. Our Microsoft Certified: Fabric Data Engineer Associate Simulator will help you assess your level of preparation and understand your strengths and weaknesses. Below you can see the details of the quiz you will find in our Microsoft Certified: Fabric Data Engineer Associate Simulator and how our unique Microsoft Certified: Fabric Data Engineer Associate database of real questions is organized:

Info quiz:

  • Quiz name: Microsoft Certified: Fabric Data Engineer Associate
  • Total number of questions: 463
  • Number of questions for the test: 50
  • Pass score: 80%

You can prepare for the Microsoft Certified: Fabric Data Engineer Associate exam with our mobile app. It is very easy to use and even works offline in case of network failure, with all the functions you need to study and practice with our Microsoft Certified: Fabric Data Engineer Associate Simulator.

Use our Mobile App, available for both Android and iOS devices, with our Microsoft Certified: Fabric Data Engineer Associate Simulator. You can use it anywhere, and remember that our mobile app is free and available in all app stores.

Our Mobile App contains all the Microsoft Certified: Fabric Data Engineer Associate practice tests, which consist of 463 questions, and also provides study material to pass the final Microsoft Certified: Fabric Data Engineer Associate exam with guaranteed success. Our Microsoft Certified: Fabric Data Engineer Associate database contains hundreds of questions and Microsoft tests related to the Microsoft Certified: Fabric Data Engineer Associate exam. This way you can practice anywhere you want, even offline without the internet.
