Exam DP-200 topic 2 question 2 discussion

Actual exam question from Microsoft's DP-200
Question #: 2
Topic #: 2

You develop data engineering solutions for a company.
You must integrate the company's on-premises Microsoft SQL Server data with Microsoft Azure SQL Database. Data must be transformed incrementally.
You need to implement the data integration solution.
Which tool should you use to configure a pipeline to copy data?

  • A. Use the Copy Data tool with Blob storage linked service as the source
  • B. Use Azure PowerShell with SQL Server linked service as a source
  • C. Use Azure Data Factory UI with Blob storage linked service as a source
  • D. Use the .NET Data Factory API with Blob storage linked service as the source
Suggested Answer: C
The Integration Runtime is a customer managed data integration infrastructure used by Azure Data Factory to provide data integration capabilities across different network environments.
A linked service defines the information needed for Azure Data Factory to connect to a data resource. We have three resources in this scenario for which linked services are needed:
✑ On-premises SQL Server
✑ Azure Blob Storage
✑ Azure SQL database
Note: Azure Data Factory is a fully managed cloud-based data integration service that orchestrates and automates the movement and transformation of data. The key concept in the ADF model is pipeline. A pipeline is a logical grouping of Activities, each of which defines the actions to perform on the data contained in
Datasets. Linked services are used to define the information needed for Data Factory to connect to the data resources.
References:
https://docs.microsoft.com/en-us/azure/machine-learning/team-data-science-process/move-sql-azure-adf
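To make the linked-service idea concrete, here is a minimal PowerShell sketch (the resource names, file path, and connection string are placeholder assumptions, not part of the question) showing how the on-premises SQL Server linked service could be registered with the Az.DataFactory module; the Blob storage and Azure SQL Database linked services follow the same pattern:

```powershell
# Minimal sketch; all names and the connection string are placeholders.
$rg = "myResourceGroup"
$df = "myDataFactory"

# JSON definition for the on-premises SQL Server linked service.
# "connectVia" points at a self-hosted integration runtime that bridges the on-prem network.
@'
{
  "name": "OnPremSqlServerLinkedService",
  "properties": {
    "type": "SqlServer",
    "typeProperties": {
      "connectionString": "Server=onpremsql01;Database=SourceDb;Integrated Security=True"
    },
    "connectVia": {
      "referenceName": "SelfHostedIR",
      "type": "IntegrationRuntimeReference"
    }
  }
}
'@ | Set-Content -Path ".\OnPremSqlServerLinkedService.json"

# Register the linked service in the data factory.
Set-AzDataFactoryV2LinkedService -ResourceGroupName $rg -DataFactoryName $df `
    -Name "OnPremSqlServerLinkedService" -DefinitionFile ".\OnPremSqlServerLinkedService.json"

# Repeat with their own JSON definitions for the Blob storage (staging) and
# Azure SQL Database (sink) linked services.
```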

Comments

brcdbrcd
Highly Voted 4 years, 7 months ago
The answer is B. Use Azure PowerShell with SQL Server linked service as a source https://docs.microsoft.com/en-us/azure/data-factory/tutorial-incremental-copy-multiple-tables-powershell
upvoted 35 times
tankzzz
4 years, 1 month ago
This is the answer. 100%
upvoted 5 times
...
...
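For context on what the linked PowerShell tutorial does, here is a hedged sketch of the incremental (watermark-based) copy pattern. It assumes a factory that already contains the linked services, datasets, and a pipeline named IncrementalCopyPipeline; the table and column names are invented for illustration:

```powershell
# Sketch only; assumes the pipeline and its dependencies already exist in the factory.
$rg = "myResourceGroup"
$df = "myDataFactory"

# The heart of the incremental load is the Copy activity's source query inside the
# pipeline JSON: it pulls only rows whose change-tracking column falls between the
# watermark stored from the previous run and the current high-water mark.
# (Assigned to a variable here purely to illustrate the pattern.)
$illustrativeSourceQuery = @'
SELECT * FROM dbo.SourceTable
WHERE LastModifiedTime >  '@{activity('LookupOldWatermark').output.firstRow.WatermarkValue}'
  AND LastModifiedTime <= '@{activity('LookupNewWatermark').output.firstRow.NewWatermarkValue}'
'@

# Trigger a run from PowerShell and poll until it finishes.
$runId = Invoke-AzDataFactoryV2Pipeline -ResourceGroupName $rg -DataFactoryName $df `
    -PipelineName "IncrementalCopyPipeline"

while ($true) {
    $run = Get-AzDataFactoryV2PipelineRun -ResourceGroupName $rg -DataFactoryName $df `
        -PipelineRunId $runId
    if ($run.Status -ne "InProgress") { break }
    Start-Sleep -Seconds 30
}
$run.Status
```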
BungyTex
Highly Voted 4 years, 9 months ago
C says the linked service is Blob storage. The question says it's coming from SQL Server. Why would you use Blob storage as the source?
upvoted 7 times
...
111222333
Most Recent 4 years ago
Correct answer is B.
1. Both *Azure Data Factory UI (Azure Portal)* and *Azure PowerShell* can be used to incrementally load data from multiple tables in SQL Server to a database in Azure SQL Database.
2. Both *Azure Data Factory UI (Azure Portal)* and *Azure PowerShell* can use an SQL Server linked service as a source for this incremental copy.
Evidence for these two statements:
- *ADF GUI*: https://docs.microsoft.com/en-us/azure/data-factory/tutorial-incremental-copy-multiple-tables-portal
- *PowerShell*: https://docs.microsoft.com/en-us/azure/data-factory/tutorial-incremental-copy-multiple-tables-powershell
Hence, use *Azure Data Factory UI (Azure Portal) with SQL Server linked service* or use *Azure PowerShell with SQL Server linked service*. Since *ADF with SQL Server linked service* is not an option among the answers ("Blob storage linked service as a source" in answer C is really not necessary), the correct answer is B: Use Azure PowerShell with SQL Server linked service as a source.
upvoted 5 times
...
Qrm_1972
4 years, 1 month ago
The correct answer is C, 100%.
upvoted 2 times
...
cadio30
4 years, 1 month ago
This should be ADF
upvoted 1 times
cadio30
4 years, 1 month ago
Disregard this; the answer is 'B', as the other options' linked service relies on Azure Blob storage while the source is on-premises SQL Server.
upvoted 1 times
...
...
Sai2609
4 years, 1 month ago
The answer is B, since the catch is the incremental movement of data, which can be done easily through PowerShell.
upvoted 3 times
...
Nevia
4 years, 1 month ago
In my opinion the key word is "pipeline". ADF UI is the only one that creates a pipeline
upvoted 1 times
maciejt
4 years ago
ADF is not good with incremental or delta loads. Besides, all answers except B refer to Blob as the source, and the source is on-prem SQL, so it can only be B.
upvoted 1 times
...
...
felmasri
4 years, 3 months ago
Looks like the question is based on a scenario presented in one of Microsoft's help docs: https://docs.microsoft.com/en-us/azure/machine-learning/team-data-science-process/move-sql-azure-adf. The scenario assumes the following: we set up an ADF pipeline that composes two data migration activities. Together they move data on a daily basis between a SQL Server database and Azure SQL Database. The two activities are: (1) copy data from a SQL Server database to an Azure Blob Storage account, and (2) copy data from the Azure Blob Storage account to Azure SQL Database.
upvoted 2 times
...
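A rough sketch of what that two-activity pipeline could look like in ADF v2 terms (the referenced scenario predates v2, and the dataset and pipeline names here are invented for illustration; the datasets are assumed to already exist in the factory):

```powershell
# Sketch only; dataset and factory names are placeholders.
@'
{
  "name": "TwoHopCopyPipeline",
  "properties": {
    "activities": [
      {
        "name": "SqlServerToBlob",
        "type": "Copy",
        "inputs":  [ { "referenceName": "OnPremSqlDataset",   "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "StagingBlobDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "SqlServerSource" },
          "sink":   { "type": "BlobSink" }
        }
      },
      {
        "name": "BlobToAzureSql",
        "type": "Copy",
        "dependsOn": [ { "activity": "SqlServerToBlob", "dependencyConditions": [ "Succeeded" ] } ],
        "inputs":  [ { "referenceName": "StagingBlobDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "AzureSqlDataset",    "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "BlobSource" },
          "sink":   { "type": "SqlSink" }
        }
      }
    ]
  }
}
'@ | Set-Content -Path ".\TwoHopCopyPipeline.json"

Set-AzDataFactoryV2Pipeline -ResourceGroupName "myResourceGroup" -DataFactoryName "myDataFactory" `
    -Name "TwoHopCopyPipeline" -DefinitionFile ".\TwoHopCopyPipeline.json"
```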
dumpsm42
4 years, 5 months ago
Hi to all. The text says "...tool...", so for me that leaves out B and D. Answer C seems right, but the source is SQL Server on-prem, not the Blob storage! So for me it's A, because its source is SQL on-prem and because of this link: https://docs.microsoft.com/pt-pt/azure/data-factory/tutorial-hybrid-copy-data-tool. So it's A for me. Regards
upvoted 1 times
dumpsm42
4 years, 5 months ago
Under New Linked Service, search for SQL Server, and then select Continue. In the New Linked Service (SQL Server) dialog box, under Name, enter SqlServerLinkedService. Select +New under Connect via integration runtime. You must create a self-hosted integration runtime, download it to your machine, and register it with Data Factory. The self-hosted integration runtime copies data between your on-premises environment and the cloud.
upvoted 1 times
...
...
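For completeness, the self-hosted integration runtime described above can also be provisioned from PowerShell rather than the UI; a minimal sketch with placeholder names:

```powershell
# Sketch only; resource and runtime names are placeholders.
$rg = "myResourceGroup"
$df = "myDataFactory"

# Create the self-hosted integration runtime entry in the factory.
Set-AzDataFactoryV2IntegrationRuntime -ResourceGroupName $rg -DataFactoryName $df `
    -Name "SelfHostedIR" -Type SelfHosted `
    -Description "Copies data between the on-premises environment and the cloud"

# Retrieve the authentication key to paste into the integration runtime
# installer on the on-premises machine.
Get-AzDataFactoryV2IntegrationRuntimeKey -ResourceGroupName $rg -DataFactoryName $df -Name "SelfHostedIR"
```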
anarvekar
4 years, 8 months ago
Can someone please explain how Blob storage comes into the picture in this scenario? The data transformation is being carried out using ADF, and the data moves between an on-premises SQL Server instance and Azure SQL DB.
upvoted 3 times
poomazuretest
4 years, 8 months ago
ADF needs a staging area in the cloud, using Blob storage.
upvoted 3 times
big_data_au
4 years, 7 months ago
Not if you are using a self-hosted integration runtime - ADF can draw directly from on-prem SQL.
upvoted 4 times
maciejt
4 years ago
I confirm: the ADF Copy activity can draw from on-prem; data flows need staging in the cloud as they run on the IR.
upvoted 1 times
...
...
...
...
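To illustrate the distinction discussed in this thread: with a self-hosted integration runtime the Copy activity can read on-premises SQL Server directly, and Blob staging is an optional setting on the activity itself. A hedged sketch of the relevant fragment of a Copy activity definition (all names are placeholders):

```powershell
# Sketch only: the staged-copy option on a Copy activity's typeProperties.
# If "enableStaging" is omitted or false, the self-hosted IR copies directly
# from on-premises SQL Server to Azure SQL Database with no Blob hop.
$copyActivityTypeProperties = @'
{
  "source": { "type": "SqlServerSource" },
  "sink":   { "type": "SqlSink" },
  "enableStaging": true,
  "stagingSettings": {
    "linkedServiceName": { "referenceName": "AzureStorageLinkedService", "type": "LinkedServiceReference" },
    "path": "stagingcontainer/adf-staging"
  }
}
'@
```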
Community vote distribution: A (35%), C (25%), B (20%), Other