Exam DP-201 topic 2 question 21 discussion

Actual exam question from Microsoft's DP-201
Question #: 21
Topic #: 2

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure Data Lake Storage account that contains a staging zone.
You need to design a daily process to ingest incremental data from the staging zone, transform the data by executing an R script, and then insert the transformed data into a data warehouse in Azure Synapse Analytics.
Solution: You schedule an Azure Databricks job that executes an R notebook, and then inserts the data into the data warehouse.
Does this meet the goal?

  • A. Yes
  • B. No
Suggested Answer: B
You should use Azure Data Factory, not an Azure Databricks job.
Reference:
https://docs.microsoft.com/en-US/azure/data-factory/transform-data
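
For readers weighing the suggested answer against the comments below: the R transformation itself is a natural fit for SparkR in a Databricks notebook. The following is a minimal sketch of just the transform step; the path, column names, and business logic are invented for illustration, and the ingest and load steps are sketched in the comment threads further down.

```r
library(SparkR)
sparkR.session()

# Read one day of staged data (hypothetical path; see the ingest sketch in the
# comments below for mounting and reading from ADLS).
staged <- read.df("/mnt/staging/sales/2021/06/01", source = "parquet")

# Transform in R: keep completed orders and aggregate revenue per customer.
completed   <- filter(staged, staged$status == "Completed")
transformed <- agg(
  groupBy(completed, "customer_id"),
  total_revenue = sum(completed$amount),
  order_count   = count(completed$order_id)
)

head(transformed)
```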

Comments

avix
Highly Voted 4 years, 9 months ago
But you can do it this way too!
upvoted 20 times
andreeavi
4 years, 5 months ago
It is possible, but first you need to ingest the data from the staging source.
upvoted 3 times
Kalo
4 years, 3 months ago
With a DBFS mount, we can ingest data from ADLS.
upvoted 4 times
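
A minimal SparkR sketch of that ingest step, assuming the staging container is already mounted at /mnt/staging (for example once via dbutils.fs.mount from a Python or Scala cell) and assuming a hypothetical date-partitioned folder layout for the daily increments:

```r
library(SparkR)
sparkR.session()

# Assumption: daily increments land under /mnt/staging/<dataset>/<yyyy>/<mm>/<dd>.
run_date <- format(Sys.Date() - 1, "%Y/%m/%d")
staged   <- read.df(paste0("/mnt/staging/sales/", run_date), source = "parquet")

printSchema(staged)
nrow(staged)  # size of the day's increment
```

Reading an abfss:// path directly is also an option if the cluster is already configured with credentials for the storage account.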
Shrikant_Kulkarni
Highly Voted 4 years, 7 months ago
The answer should be Yes.
upvoted 17 times
cadio30
Most Recent 4 years ago
This requirement can be met by using an R script in an Azure Databricks job. Therefore, the answer should be 'Yes'.
upvoted 2 times
mohowzeh
4 years, 4 months ago
A scheduled daily Databricks job does the trick. Data Factory isn't the only tool that can bring data from one place to another... Answer should be yes.
upvoted 4 times
Psycho360
4 years, 6 months ago
Who is going to stop me from using Databricks? There seems to be no technical limitation in this approach.
upvoted 3 times
Akva
4 years, 7 months ago
I think it should be YES. https://docs.microsoft.com/en-us/azure/databricks/scenarios/databricks-extract-load-sql-data-warehouse
upvoted 9 times
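
The load step in that tutorial is shown in Scala, but it has a SparkR equivalent via write.df and the Databricks Synapse connector (com.databricks.spark.sqldw). A sketch, assuming a transformed SparkDataFrame named transformed and placeholder values for the JDBC URL, target table, and tempDir staging path; the session must already have access to the tempDir storage account, since the connector stages data there before loading:

```r
library(SparkR)

# All connection values below are placeholders, not from the question.
jdbc_url <- "jdbc:sqlserver://<server>.database.windows.net:1433;database=<dw>;user=<user>;password=<password>;encrypt=true"

write.df(
  transformed,
  source  = "com.databricks.spark.sqldw",
  mode    = "append",
  url     = jdbc_url,
  dbTable = "dbo.FactDailySales",
  tempDir = "abfss://tempdata@<storageaccount>.dfs.core.windows.net/synapse-staging",
  forwardSparkAzureStorageCredentials = "true"
)
```

Scheduling the notebook to run once a day is then just a Databricks job with a daily schedule, which is the scenario the question describes.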
Community vote distribution: A (35%), C (25%), B (20%), Other