
Exam DP-203 topic 2 question 86 discussion

Actual exam question from Microsoft's DP-203
Question #: 86
Topic #: 2

HOTSPOT
You have an Azure Synapse Analytics pipeline named Pipeline1 that contains a data flow activity named Dataflow1.
Pipeline1 retrieves files from an Azure Data Lake Storage Gen2 account named storage1.
Dataflow1 uses the AutoResolveIntegrationRuntime integration runtime configured with a core count of 128.
You need to optimize the number of cores used by Dataflow1 to accommodate the size of the files in storage1.
What should you configure? To answer, select the appropriate options in the answer area.
Hot Area:

Suggested Answer:
Box 1: A Get Metadata activity

Dynamically size data flow compute at runtime: the Core Count and Compute Type properties can be set dynamically to adjust to the size of the incoming source data. Use pipeline activities like Lookup or Get Metadata to find the size of the source dataset, then use Add Dynamic Content in the Data Flow activity properties.

Box 2: Dynamic content
Reference:
https://docs.microsoft.com/en-us/azure/data-factory/control-flow-execute-data-flow-activity
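As an illustration only, here is a minimal sketch of the two activities in pipeline JSON, assuming a single-file source dataset. The activity name GetFileSize, the dataset reference SourceFileDS, the 1 GiB threshold, and the specific core counts are hypothetical choices, not part of the question:

```json
{
  "activities": [
    {
      "name": "GetFileSize",
      "description": "Sketch only: returns the size (in bytes) of the source file",
      "type": "GetMetadata",
      "typeProperties": {
        "dataset": { "referenceName": "SourceFileDS", "type": "DatasetReference" },
        "fieldList": [ "size" ]
      }
    },
    {
      "name": "Dataflow1",
      "description": "Sketch only: core count picked at runtime from the measured size",
      "type": "ExecuteDataFlow",
      "dependsOn": [
        { "activity": "GetFileSize", "dependencyConditions": [ "Succeeded" ] }
      ],
      "typeProperties": {
        "dataflow": { "referenceName": "Dataflow1", "type": "DataFlowReference" },
        "compute": {
          "computeType": "General",
          "coreCount": {
            "value": "@if(greater(activity('GetFileSize').output.size, 1073741824), 128, 8)",
            "type": "Expression"
          }
        }
      }
    }
  ]
}
```

In the authoring UI, the equivalent is to open the data flow activity's Settings tab and use Add dynamic content on the Core count property.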

Comments

dom271219
Highly Voted 2 years, 9 months ago
Correct: Use pipeline activities like Lookup or Get Metadata to find the size of the source dataset, then use Add Dynamic Content in the Data Flow activity properties. You can choose small, medium, or large compute sizes. Optionally, pick Custom and configure the compute type and number of cores manually.
upvoted 15 times
dmitriypo
Highly Voted 2 years, 7 months ago
Looks correct. Checked in the doc.
upvoted 6 times
iceberge
Most Recent 11 months ago
From Copilot: To optimize the number of cores used by Dataflow1 in your Azure Synapse Analytics pipeline, configure the core count dynamically based on the size of the incoming source data. Use a Lookup or Get Metadata activity in your pipeline to determine the size of the source dataset in storage1, then add dynamic content to the Data Flow activity properties to set the core count based on that size.
upvoted 2 times
Alongi
1 year, 4 months ago
Correct
upvoted 1 times
kkk5566
1 year, 9 months ago
Get Metadata & Dynamic Content
upvoted 1 times
anks84
2 years, 9 months ago
Looks correct!
upvoted 3 times