
Exam DP-700 topic 2 question 13 discussion

Actual exam question from Microsoft's DP-700
Question #: 13
Topic #: 2

You have a Fabric workspace that contains a lakehouse named Lakehouse1.
In an external data source, you have data files that are 500 GB each. A new file is added every day.
You need to ingest the data into Lakehouse1 without applying any transformations. The solution must meet the following requirements:
  • Trigger the process when a new file is added.
  • Provide the highest throughput.
Which type of item should you use to ingest the data?

  • A. Eventstream
  • B. Dataflow Gen2
  • C. Streaming dataset
  • D. Data pipeline
Suggested Answer: D

Comments

IshtarSQL
Highly Voted 4 months, 3 weeks ago
Selected Answer: D
Eventstream is designed for ingesting real-time or streaming data from sources like IoT devices or logs. It's not optimized for batch processing of large files, so a data pipeline is the better fit here.
upvoted 9 times
fassil
Most Recent 2 months ago
Selected Answer: D
D. Data pipeline. Data pipelines are designed to handle large volumes of data efficiently and can be configured to trigger the ingestion process automatically when new files are added to the external data source. They also provide high throughput, making them suitable for handling 500 GB files daily without applying any transformations.
upvoted 4 times
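
For readers who want to see the moving parts, here is a minimal sketch of kicking off a Fabric data pipeline on demand through the Fabric REST API's job scheduler endpoint. The workspace ID, pipeline item ID, and token below are placeholders, and in practice the call would live inside whatever event handler (for example, an Azure Event Grid subscription on the external store) fires when a new file arrives; the built-in storage event trigger achieves the same result without code.

```python
import requests

# Placeholder identifiers; substitute your own workspace and pipeline IDs.
WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"
PIPELINE_ID = "11111111-1111-1111-1111-111111111111"
TOKEN = "<entra-id-bearer-token>"  # needs permission to run the pipeline

# Fabric job scheduler endpoint: runs the pipeline item on demand.
url = (
    "https://api.fabric.microsoft.com/v1"
    f"/workspaces/{WORKSPACE_ID}/items/{PIPELINE_ID}"
    "/jobs/instances?jobType=Pipeline"
)

resp = requests.post(url, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()  # a successful submission returns 202 Accepted

# The Location header points at the created job instance for status polling.
print(resp.status_code, resp.headers.get("Location"))
```
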
henryphchan
2 months, 2 weeks ago
Selected Answer: D
I would prefer using a data pipeline, although the storage event trigger is still a preview feature. Eventstream and streaming datasets are designed for real-time events.
upvoted 1 times
2e6975f
2 months, 2 weeks ago
Selected Answer: D
For high-throughput, event-triggered ingestion of large files into a lakehouse without transformations, Data pipeline is the most appropriate and efficient item in Fabric.
upvoted 2 times
4371883
3 months, 1 week ago
Selected Answer: C
Streaming dataset is the only answer that ticks both requirements: a storage trigger and high throughput. Data pipeline is not right as of January 2025 because the storage event trigger is still in preview, so it doesn't satisfy the requirement, but it's probably the best option.
upvoted 2 times
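
If the preview status of the storage event trigger is a concern, a scheduled poller is a common workaround. Below is a rough sketch that assumes the external source is an Azure Blob Storage or ADLS Gen2 container and uses a simple timestamp watermark; the container URL, SAS token, and watermark handling are illustrative, not taken from the question.

```python
from datetime import datetime, timezone

from azure.storage.blob import ContainerClient

# Illustrative connection details; not part of the original question.
CONTAINER_URL = "https://externalaccount.blob.core.windows.net/dailyfiles"
SAS_TOKEN = "<sas-token>"
last_run = datetime(2025, 1, 1, tzinfo=timezone.utc)  # persisted watermark

container = ContainerClient.from_container_url(f"{CONTAINER_URL}?{SAS_TOKEN}")

# Collect files that landed since the previous scheduled run.
new_files = [
    blob.name
    for blob in container.list_blobs()
    if blob.last_modified and blob.last_modified > last_run
]

for name in new_files:
    # Start the copy pipeline for each new file, e.g. via the Fabric REST
    # call sketched earlier, passing the file name as a pipeline parameter.
    print(f"would trigger pipeline for {name}")
```
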
Community vote distribution: A (35%), C (25%), B (20%), Other