
Exam DP-203 topic 2 question 77 discussion

Actual exam question from Microsoft's DP-203
Question #: 77
Topic #: 2

DRAG DROP -
You have an Azure Data Lake Storage Gen2 account that contains a JSON file for customers. The file contains two attributes named FirstName and LastName.
You need to copy the data from the JSON file to an Azure Synapse Analytics table by using Azure Databricks. A new column must be created that concatenates the FirstName and LastName values.
You create the following components:
✑ A destination table in Azure Synapse
✑ An Azure Blob storage container
✑ A service principal
In which order should you perform the actions? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Select and Place:

Suggested Answer:
Step 1: Mount the Data Lake Storage onto DBFS
Begin with creating a file system in the Azure Data Lake Storage Gen2 account.
Step 2: Read the file into a DataFrame.
You can load the JSON file as a DataFrame in Azure Databricks.
Step 3: Perform transformations on the data frame.
Step 4: Specify a temporary folder to stage the data
Specify a temporary folder to use while moving data between Azure Databricks and Azure Synapse.
Step 5: Write the results to a table in Azure Synapse.
You upload the transformed DataFrame to Azure Synapse by using the Azure Synapse connector for Azure Databricks, which writes a DataFrame directly to a table in Azure Synapse.
Reference:
https://docs.microsoft.com/en-us/azure/azure-databricks/databricks-extract-load-sql-data-warehouse
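The five steps above can be sketched as a single Databricks notebook routine in PySpark. This is an illustrative outline, not the exam's reference solution: every storage account name, container, secret scope, server, path, and table name below is a placeholder assumption, and `spark`/`dbutils` are the globals Databricks provides to a notebook.

```python
def run_pipeline(spark, dbutils):
    """Sketch of the suggested five-step flow; all names are placeholders."""
    # pyspark is available by default on a Databricks cluster.
    from pyspark.sql.functions import col, concat_ws

    # Step 1: Mount the Data Lake Storage Gen2 file system onto DBFS,
    # authenticating with the service principal (placeholder values).
    configs = {
        "fs.azure.account.auth.type": "OAuth",
        "fs.azure.account.oauth.provider.type":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id": "<service-principal-app-id>",
        "fs.azure.account.oauth2.client.secret":
            dbutils.secrets.get("my-scope", "sp-secret"),
        "fs.azure.account.oauth2.client.endpoint":
            "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
    }
    dbutils.fs.mount(
        source="abfss://customers@<adls-account>.dfs.core.windows.net/",
        mount_point="/mnt/customers",
        extra_configs=configs,
    )

    # Step 2: Read the JSON file into a DataFrame.
    df = spark.read.json("/mnt/customers/customers.json")

    # Step 3: Transform - add a column concatenating FirstName and LastName.
    df = df.withColumn("FullName",
                       concat_ws(" ", col("FirstName"), col("LastName")))

    # Steps 4-5: Specify a temporary Blob storage folder to stage the data,
    # then write the results to the destination table in Azure Synapse
    # via the Azure Synapse connector.
    (df.write
       .format("com.databricks.spark.sqldw")
       .option("url",
               "jdbc:sqlserver://<server>.database.windows.net:1433;"
               "database=<db>")
       .option("forwardSparkAzureStorageCredentials", "true")
       .option("dbTable", "dbo.Customers")
       .option("tempDir",
               "wasbs://tempdata@<blob-account>.blob.core.windows.net/stage")
       .save())
```

Note how the temporary folder (`tempDir`) is only supplied at write time: the connector stages the DataFrame in the Blob storage container before loading it into Synapse, which is why "specify a temporary folder" sits between the transformation and the final write in the suggested order.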

Comments

Feljoud
Highly Voted 2 years, 5 months ago
Similar to another question in this dump. Seems correct!
upvoted 20 times
...
Alongi
Most Recent 6 months, 1 week ago
Correct
upvoted 2 times
...
kkk5566
1 year, 2 months ago
correct
upvoted 2 times
...
rzeng
2 years ago
correct
upvoted 4 times
...
dom271219
2 years, 2 months ago
Shouldn't "Specify a temporary folder to stage the data" come before creating the DataFrame? Or am I wrong?
upvoted 1 times
Karl_Cen
1 year, 9 months ago
As mentioned earlier, the Azure Synapse connector uses Azure Blob storage as temporary storage while moving data between Azure Databricks and Azure Synapse, so the temporary folder is only needed at the point where you write the data to Azure Synapse, not when reading from ADLS. https://learn.microsoft.com/en-us/azure/databricks/scenarios/databricks-extract-load-sql-data-warehouse
upvoted 3 times
...
...
Deeksha1234
2 years, 2 months ago
correct
upvoted 2 times
...
nefarious_smalls
2 years, 4 months ago
correct
upvoted 1 times
...
demirsamuel
2 years, 5 months ago
The answer is correct. Similar to a duplicate question in this catalog.
upvoted 3 times
...