Exam DP-700 topic 2 question 8 discussion

Actual exam question from Microsoft's DP-700
Question #: 8
Topic #: 2

HOTSPOT -
You have an Azure Event Hubs data source that contains weather data.
You ingest the data from the data source by using an eventstream named Eventstream1. Eventstream1 uses a lakehouse as the destination.
You need to batch ingest only rows from the data source where the City attribute has a value of Kansas. The filter must be added before the destination. The solution must minimize development effort.
What should you use for the data processor and filtering? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Suggested Answer:

Comments

4371883
Highly Voted 5 months, 1 week ago
1. An eventstream with an external data source
2. An eventstream processor
https://learn.microsoft.com/en-us/fabric/real-time-intelligence/event-streams/add-source-azure-event-hubs?pivots=enhanced-capabilities
https://learn.microsoft.com/en-us/fabric/real-time-intelligence/event-streams/process-events-using-event-processor-editor?pivots=enhanced-capabilities
upvoted 21 times
...
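[Editor's note] The filter at the heart of this question (keep only rows where City is "Kansas") is configured in Fabric's no-code editors rather than written by hand, but the streaming-vs-batch distinction the thread debates can be sketched in plain Python. All data and names below are hypothetical illustrations, not Fabric APIs:

```python
# Conceptual sketch only: Fabric's eventstream processor filter and the
# Dataflow Gen2 filter are both built in a UI. This just illustrates the
# row-level predicate (City == "Kansas") that either option applies.

events = [
    {"City": "Kansas", "TempF": 71},
    {"City": "Topeka", "TempF": 68},
    {"City": "Kansas", "TempF": 73},
]

def stream_filter(source, city="Kansas"):
    """Streaming view (eventstream processor): filter each event as it
    arrives, before it reaches the lakehouse destination."""
    for event in source:
        if event["City"] == city:
            yield event  # only matching rows are forwarded

# Batch view (Dataflow Gen2): filter the whole data set in one pass.
batch = [row for row in events if row["City"] == "Kansas"]

# Either route keeps the same rows; only the processing model differs.
assert list(stream_filter(events)) == batch
print(batch)  # → [{'City': 'Kansas', 'TempF': 71}, {'City': 'Kansas', 'TempF': 73}]
```

The exam's deciding factor is not the predicate itself but where it runs: the eventstream processor applies it per event before the destination, while Dataflow Gen2 applies it to data already landed in batch.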
18e18d0
Highly Voted 4 months, 3 weeks ago
The answer is correct. The question states that the eventstream already exists and uses the lakehouse as the destination. It also states that the rows need to be batch ingested. Thus 1) a Dataflow and 2) a Filter activity are the best fit in this situation.
upvoted 5 times
...
79de7b8
Most Recent 2 weeks ago
I think the second one has to be an eventstream processor, but why is the first one an eventstream with an external data source? Lakehouse is a default data source in an eventstream.
upvoted 1 times
...
rohitbinnani
2 weeks, 2 days ago
Since the question specifies "batch ingest only rows", that changes the approach. Real-time eventstream processors are typically used for streaming ingestion, not batch.
Data processor: A Dataflow Gen2 dataflow. Dataflow Gen2 supports batch processing and can connect to Event Hubs via staging or intermediate storage.
Filtering: A filter in a Dataflow Gen2 dataflow. You can apply a row-level filter (e.g., City == "Kansas") in the transformation steps of the dataflow.
upvoted 1 times
...
DarioReymago
3 weeks, 1 day ago
1. eventstream with an external data source 2. eventstream processor
upvoted 1 times
...
kim32
1 month, 3 weeks ago
Correct solution:
Data processor: An eventstream with an external data source
Filtering: An eventstream processor
upvoted 1 times
...
13d2a97
2 months, 2 weeks ago
Data processor: An eventstream with a custom endpoint. Eventstreams allow real-time processing of data using processors, and a custom endpoint provides flexibility in routing data downstream, especially after transformations or filters.
Filtering: An eventstream processor. Eventstream processors are used within the eventstream pipeline to apply transformations, filters (like the City = 'Kansas' logic), or aggregations before writing to a destination. This is the most efficient and low-code way to apply this filter.
upvoted 1 times
...
hebertorosillo
3 months, 2 weeks ago
1. An eventstream with an external data source 2. An eventstream processor. Dataflow does not support Event Hubs.
upvoted 2 times
...
henryphchan
4 months, 2 weeks ago
Regarding the provided answer: this question is asking how you can batch ingest only rows from the data source where the City attribute has a value of Kansas. To minimize development effort, the data processor must be Dataflow Gen2 and the filtering should use the Filter in Dataflow Gen2.
upvoted 3 times
zxc01
3 months ago
The problem is "You need to batch ingest only rows from the data source where the City attribute has a value of Kansas." Where is the data source in this one? It is very hard to build a Dataflow Gen2 if the data source is an event hub. The key words "batch ingest" look like they point to Dataflow Gen2. But the question said they already set up an eventstream to save data in the lakehouse. If the data source means the table in the lakehouse, then I can agree Dataflow Gen2 is the best option.
upvoted 1 times
...
...
Community vote distribution: A (35%), C (25%), B (20%), Other