
Microsoft Certified: Azure Data Engineer Associate DP-203 Microsoft Study Notes

Question 8

You have a trigger in Azure Data Factory configured as shown in the following exhibit.

Use the drop-down menus to select the answer choice that completes each statement based upon the information presented in the graphic.

Options:

Question 9

You have an Azure Data Lake Storage Gen2 account named storage1.

You need to recommend a solution for accessing the content in storage1. The solution must meet the following requirements:

    List and read permissions must be granted at the storage account level.

    Additional permissions can be applied to individual objects in storage1.

    Security principals from Microsoft Entra ID (formerly Azure Active Directory) must be used for authentication.

What should you use? To answer, drag the appropriate components to the correct requirements. Each component may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

NOTE: Each correct selection is worth one point.

Options:
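The requirements above map to the two authorization mechanisms ADLS Gen2 supports: Azure RBAC role assignments, which apply at the storage-account (or container) scope, and POSIX-style access control lists (ACLs), which apply to individual directories and files. The sketch below models how a request is authorized under this layered scheme; the function names and the role-to-permission table are illustrative assumptions, not an Azure SDK API.

```python
# Hypothetical model of ADLS Gen2 authorization: account-level RBAC roles are
# checked first; if no role grants the operation, the object's ACL is consulted.
# ROLE_PERMISSIONS is an assumed, simplified mapping for illustration only.

ROLE_PERMISSIONS = {
    "Storage Blob Data Reader": {"read", "list"},
    "Storage Blob Data Contributor": {"read", "list", "write", "delete"},
}

def is_authorized(principal, operation, account_roles, object_acl):
    """Return True if the principal may perform the operation on the object."""
    # Account-level check: e.g. "Storage Blob Data Reader" grants read/list
    # on every object in the account.
    for role in account_roles.get(principal, []):
        if operation in ROLE_PERMISSIONS.get(role, set()):
            return True
    # Per-object check: ACLs can grant additional permissions on this object.
    return operation in object_acl.get(principal, set())
```

For example, a principal holding Storage Blob Data Reader at the account level can read and list everything, while another principal with no role assignment can still be granted write access to one specific file through that file's ACL.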

Question 10

You are creating an Azure Data Factory data flow that will ingest data from a CSV file, cast columns to specified data types, and insert the data into a table in an Azure Synapse Analytics dedicated SQL pool. The CSV file contains three columns named username, comment, and date.

The data flow already contains the following:

    A source transformation.

    A Derived Column transformation to set the appropriate data types.

    A sink transformation to land the data in the pool.

You need to ensure that the data flow meets the following requirements:

    All valid rows must be written to the destination table.

    Truncation errors in the comment column must be avoided proactively.

    Any rows containing comment values that will cause truncation errors upon insert must be written to a file in blob storage.

Which two actions should you perform? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

Options:

A.

To the data flow, add a sink transformation to write the rows to a file in blob storage.

B.

To the data flow, add a Conditional Split transformation to separate the rows that will cause truncation errors.

C.

To the data flow, add a filter transformation to filter out rows that will cause truncation errors.

D.

Add a select transformation to select only the rows that will cause truncation errors.
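The pattern the correct options describe, a Conditional Split that routes oversized rows to a second sink, can be sketched in plain Python (this is illustrative pseudologic, not ADF data flow syntax; the 100-character limit is an assumed width for the comment column):

```python
# Illustrative sketch of a Conditional Split: rows whose comment would be
# truncated on insert go to an "error" branch (written to blob storage),
# while all other rows continue to the SQL pool sink.

MAX_COMMENT_LEN = 100  # assumed width of the comment column in the target table

def conditional_split(rows, max_len=MAX_COMMENT_LEN):
    """Split rows into (valid, too_long) based on comment length."""
    valid, too_long = [], []
    for row in rows:
        if len(row["comment"]) > max_len:
            too_long.append(row)   # -> sink writing to a blob storage file
        else:
            valid.append(row)      # -> sink writing to the dedicated SQL pool
    return valid, too_long
```

A filter transformation (option C) would silently discard the oversized rows rather than preserving them, and a select transformation (option D) projects columns rather than routing rows, which is why the split-plus-extra-sink combination is the one that satisfies all three requirements.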

Question 11

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You are designing an Azure Stream Analytics solution that will analyze Twitter data.

You need to count the tweets in each 10-second window. The solution must ensure that each tweet is counted only once.

Solution: You use a tumbling window, and you set the window size to 10 seconds.

Does this meet the goal?

Options:

A.

Yes

B.

No
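A tumbling window divides the timeline into fixed, contiguous, non-overlapping segments, so every event belongs to exactly one window and is counted exactly once, which is precisely what this scenario requires. A minimal sketch of that semantics (plain Python, not Stream Analytics query language):

```python
# Tumbling-window counting: each event timestamp (in seconds) maps to exactly
# one 10-second window, identified here by the window's start time.
from collections import Counter

def tumbling_window_counts(event_times, window_seconds=10):
    """Count events per tumbling window, keyed by window start time."""
    counts = Counter()
    for t in event_times:
        window_start = (t // window_seconds) * window_seconds
        counts[window_start] += 1
    return dict(counts)
```

Because the windows do not overlap, the per-window counts always sum to the total number of events; a hopping or sliding window of the same size would count some events in multiple windows.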

Exam Code: DP-203
Exam Name: Data Engineering on Microsoft Azure
Last Update: Jan 22, 2025
Questions: 355