Exam DP-201 topic 2 question 39 discussion

Actual exam question from Microsoft's DP-201
Question #: 39
Topic #: 2

HOTSPOT -
You are designing an Azure Data Factory solution that will download up to 5 TB of data from several REST APIs.
The solution must meet the following staging requirements:
✑ Ensure that the data can be landed quickly and in parallel to a staging area.
✑ Minimize the need to return to the API sources to retrieve the data again should a later activity in the pipeline fail.
The solution must meet the following analysis requirements:
✑ Ensure that the data can be loaded in parallel.
✑ Ensure that users and applications can query the data without requiring an additional compute engine.
What should you include in the solution to meet the requirements? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

Suggested Answer:
Box 1: Azure Blob storage -
When you activate the staging feature, the data is first copied from the source data store to the staging storage (bring your own Azure Blob storage or Azure Data Lake Storage Gen2), and then copied from staging to the sink data store. Staging the data means a failed downstream activity can be retried from the staged copy rather than by returning to the REST API sources.
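As a rough sketch only (activity, dataset, and linked-service names here are placeholders, not from the question), enabling staging in a copy activity's `typeProperties` might look like this:

```json
{
  "name": "CopyFromRestApiStaged",
  "type": "Copy",
  "typeProperties": {
    "source": { "type": "RestSource" },
    "sink": { "type": "ParquetSink" },
    "enableStaging": true,
    "stagingSettings": {
      "linkedServiceName": {
        "referenceName": "StagingBlobStorage",
        "type": "LinkedServiceReference"
      },
      "path": "staging-container/rest-landing"
    }
  }
}
```

The copy activity manages the two-stage flow automatically and cleans up temporary data from the staging storage after the movement completes.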

Box 2: Azure Synapse Analytics -
The Azure Synapse Analytics connector in the copy activity provides built-in data partitioning to copy data in parallel. In addition, a dedicated SQL pool lets users and applications query the data with T-SQL directly, without requiring an additional compute engine, which satisfies the analysis requirements.
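As an illustrative sketch (property values are assumptions, not part of the suggested answer), one way ADF loads a dedicated SQL pool in parallel is a Synapse sink configured to use the COPY statement together with staging:

```json
{
  "sink": {
    "type": "SqlDWSink",
    "allowCopyCommand": true,
    "tableOption": "autoCreate"
  },
  "enableStaging": true
}
```

With `allowCopyCommand` (or `allowPolyBase`) set, the staged files are bulk-loaded into the dedicated SQL pool in parallel rather than row by row.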
Reference:
https://docs.microsoft.com/en-us/azure/data-factory/copy-activity-performance-features
https://docs.microsoft.com/en-us/azure/data-factory/connector-azure-sql-data-warehouse

Comments

Marcus1612
3 years, 8 months ago
Look at this: https://docs.microsoft.com/en-us/azure/data-factory/copy-activity-performance-features When you activate the staging feature, first the data is copied from the source data store to the staging storage (bring your own Azure Blob or Azure Data Lake Storage Gen2). Next, the data is copied from the staging to the sink data store. The copy activity automatically manages the two-stage flow for you, and also cleans up temporary data from the staging storage after the data movement is complete.
upvoted 2 times
...
anamaster
4 years, 1 month ago
Correct, but the explanation for Synapse is that ASA allows querying the data directly.
upvoted 2 times
...