Exam AZ-304 topic 3 question 32 discussion

Actual exam question from Microsoft's AZ-304
Question #: 32
Topic #: 3

You have data files in Azure Blob storage.
You plan to transform the files and move them to Azure Data Lake Storage.
You need to transform the data by using mapping data flow.
Which Azure service should you use?

  • A. Azure Data Box Gateway
  • B. Azure Storage Sync
  • C. Azure Data Factory
  • D. Azure Databricks
Suggested Answer: C

Comments

razvi
Highly Voted 4 years, 9 months ago
C is correct. https://docs.microsoft.com/en-us/azure/data-factory/concepts-data-flow-overview#:~:text=Mapping%20data%20flows%20are%20visually%20designed%20data%20transformations,Factory%20pipelines%20that%20use%20scaled-out%20Apache%20Spark%20clusters.
upvoted 34 times
folkmusic99
Highly Voted 4 years ago
Seems like Azure Data Factory is the new favorite for a lot of answers.
upvoted 28 times
GregoryGerard
3 years, 9 months ago
Perhaps because it is the most costly :-)
upvoted 7 times
examineezer
3 years, 5 months ago
ADF is cheap, generally speaking.
upvoted 2 times
Dawn7
Most Recent 3 years, 3 months ago
Selected Answer: C
Always Data Factory 😂
upvoted 3 times
syu31svc
3 years, 8 months ago
C is the answer for sure
upvoted 3 times
Gautam1985
3 years, 9 months ago
Correct
upvoted 2 times
glam
4 years, 4 months ago
C. Azure Data Factory
upvoted 5 times
kopper2019
4 years, 5 months ago
Data flow = Azure Data Factory
upvoted 9 times
certmonster
4 years, 8 months ago
Mapping data flows are visually designed data transformations in Azure Data Factory. Data flows allow data engineers to develop data transformation logic without writing code. Data flow activities can be operationalized using existing Azure Data Factory scheduling, control, flow, and monitoring capabilities.
upvoted 17 times
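The quote above describes how mapping data flows are designed visually and run on Data Factory's managed Spark compute. To make that concrete, here is a minimal sketch of how such a flow (Blob-backed source, Data Lake sink) and the pipeline that runs it could be registered with the azure-mgmt-datafactory Python SDK. The subscription, resource group, factory, dataset, flow, and pipeline names are all placeholders, and the transformation script itself is omitted because the ADF Studio designer normally generates it, so treat this as an illustration rather than a working deployment.

```python
# Illustrative sketch (placeholder names throughout): register a mapping data
# flow with a Blob-backed source and an ADLS Gen2-backed sink, then run it
# from a pipeline via an Execute Data Flow activity.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    DataFlowResource, MappingDataFlow, DataFlowSource, DataFlowSink,
    DatasetReference, DataFlowReference, ExecuteDataFlowActivity, PipelineResource,
)

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
rg, factory = "my-rg", "my-factory"  # placeholder resource group / factory

# The data flow definition. The transformation script (the data flow DSL that
# the visual designer generates) is left out of this sketch for brevity.
data_flow = DataFlowResource(
    properties=MappingDataFlow(
        sources=[DataFlowSource(
            name="BlobSource",
            dataset=DatasetReference(reference_name="BlobFilesDataset"))],
        sinks=[DataFlowSink(
            name="LakeSink",
            dataset=DatasetReference(reference_name="DataLakeDataset"))],
    )
)
adf.data_flows.create_or_update(rg, factory, "TransformFilesFlow", data_flow)

# A pipeline whose single step executes the data flow; ADF provisions and
# manages the scaled-out Spark cluster that actually runs the transformation.
pipeline = PipelineResource(activities=[
    ExecuteDataFlowActivity(
        name="RunTransform",
        data_flow=DataFlowReference(reference_name="TransformFilesFlow"),
    )
])
adf.pipelines.create_or_update(rg, factory, "TransformPipeline", pipeline)
run = adf.pipelines.create_run(rg, factory, "TransformPipeline")
print("Started pipeline run:", run.run_id)
```

The point of the sketch is that scheduling, monitoring, and the Spark compute all stay inside Data Factory, which is why the answer is Data Factory rather than Databricks.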
speedminer
4 years, 9 months ago
For the Copy activity, this connector lets you:
  • Copy data from/to Azure Data Lake Storage Gen2 by using account key, service principal, or managed identity authentication.
  • Copy files as-is, or parse or generate files with supported file formats and compression codecs.
  • Preserve file metadata during copy.
  • Preserve ACLs when copying from Azure Data Lake Storage Gen1/Gen2.
upvoted 3 times
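For contrast with the mapping data flow above, the Copy activity described in this comment moves files without transforming them. A minimal sketch, again assuming the azure-mgmt-datafactory SDK and using placeholder dataset, factory, and pipeline names:

```python
# Illustrative sketch: a Copy activity that moves files as-is from Blob
# storage to ADLS Gen2 (no transformation). All names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    CopyActivity, DatasetReference, BlobSource, AzureBlobFSSink, PipelineResource,
)

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

copy_step = CopyActivity(
    name="CopyBlobToLake",
    inputs=[DatasetReference(reference_name="BlobFilesDataset")],
    outputs=[DatasetReference(reference_name="DataLakeDataset")],
    source=BlobSource(),     # read from Azure Blob storage
    sink=AzureBlobFSSink(),  # write to the ADLS Gen2 (Blob FS) endpoint
)
adf.pipelines.create_or_update("my-rg", "my-factory", "CopyOnlyPipeline",
                               PipelineResource(activities=[copy_step]))
```

Because the question explicitly asks for transformation via mapping data flow, a Copy activity alone would not satisfy it, but it is often chained before or after a data flow in the same pipeline.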
Community vote distribution: A (35%), C (25%), B (20%), Other