Exam DP-200 topic 4 question 19 discussion

Actual exam question from Microsoft's DP-200
Question #: 19
Topic #: 4

HOTSPOT -
You have a new Azure Data Factory environment.
You need to periodically analyze pipeline executions from the last 60 days to identify trends in execution durations. The solution must use Azure Log Analytics to query the data and create charts.
Which diagnostic settings should you configure in Data Factory? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

Suggested Answer:
Log type: PipelineRuns -
A pipeline run in Azure Data Factory defines an instance of a pipeline execution.
Storage location: An Azure Storage account
Data Factory stores pipeline-run data for only 45 days. Use Monitor if you want to keep that data for a longer time. With Monitor, you can route diagnostic logs for analysis. You can also keep them in a storage account so that you have factory information for your chosen duration.
Save your diagnostic logs to a storage account for auditing or manual inspection. You can use the diagnostic settings to specify the retention time in days.
Reference:
https://docs.microsoft.com/en-us/azure/data-factory/concepts-pipeline-execution-triggers
https://docs.microsoft.com/en-us/azure/data-factory/monitor-using-azure-monitor
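Several comments below argue the destination should be Log Analytics itself, since the question requires querying and charting the data there. As an illustrative sketch only: if the Data Factory diagnostic setting routes logs to a Log Analytics workspace in resource-specific mode, pipeline runs land in the ADFPipelineRun table, and a 60-day duration-trend query could look like the following KQL. The column names (Start, End, Status, PipelineName) follow the documented ADFPipelineRun schema, but verify them against your own workspace.

```kusto
// Sketch: average pipeline run duration per pipeline per day, last 60 days.
// Assumes diagnostics are sent to a Log Analytics workspace in
// resource-specific mode, which populates the ADFPipelineRun table.
ADFPipelineRun
| where TimeGenerated > ago(60d)
| where Status == "Succeeded"
| extend DurationMin = datetime_diff("minute", End, Start)
| summarize AvgDurationMin = avg(DurationMin)
    by PipelineName, bin(TimeGenerated, 1d)
| render timechart
```

The `render timechart` step produces the trend chart the question asks for; with a storage-account sink alone, the data would first have to be ingested into Log Analytics before such a query could run.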

Comments

AUgigi
Highly Voted 5 years, 3 months ago
Should it be sent to Azure Log Analytics? Considering "The solution must use Azure Log Analytics to query the data and create charts."?
upvoted 25 times
avestabrzn
Highly Voted 5 years, 3 months ago
You would need to send your data to a storage account and then query the logs and create charts via Log Analytics. But instead, you can store it directly in Log Analytics. The question is still tricky, though.
upvoted 14 times
Pairon
Most Recent 4 years, 2 months ago
I don't know if it's possible to choose "Azure Log Analytics" as a sink in Data Factory. If it isn't, the correct answer is Storage account.
upvoted 1 times
akram786
4 years, 3 months ago
It should be Log Analytics: https://docs.microsoft.com/en-us/answers/questions/214414/data-factory-diagnostic-settings.html
upvoted 6 times
AyeshJr
4 years, 4 months ago
Log Analytics should be the correct answer, since it can retain data for longer periods of time, which in turn means you can query the data from the past 60 days. https://docs.microsoft.com/en-us/azure/azure-monitor/platform/manage-cost-storage#log-analytics-and-security-center
upvoted 2 times
satyamkishoresingh
3 years, 8 months ago
Agreed on retention, but where in the link does it say ADF can store that info directly in Log Analytics?
upvoted 1 times
KRV
4 years, 5 months ago
Since pipeline executions have to be analyzed beyond 45 days (60 in this case), "Azure Log Analytics" would be the incorrect answer choice, and the right answer should be "Azure Storage Account", which is already correctly selected here. The catch is to read the question carefully. Using Azure Monitor you can route the logs to multiple different targets: https://docs.microsoft.com/en-us/azure/data-factory/monitor-using-azure-monitor
upvoted 1 times
AyeshJr
4 years, 4 months ago
That is incorrect; Log Analytics can retain data longer for a cost.
upvoted 1 times
hoangton
4 years ago
Data Factory stores pipeline-run data for only 45 days. Use Azure Monitor if you want to keep that data for a longer time. With Monitor, you can route diagnostic logs for analysis to multiple different targets.
upvoted 2 times
calvintcy
4 years, 5 months ago
Should be Azure Log Analytics
upvoted 1 times
syu31svc
4 years, 6 months ago
The question already gives away the answer for the storage location.
upvoted 1 times
dumpsm42
4 years, 6 months ago
Hi all, no: krisspark is correct. You must pay attention to the retention period (60 days). Azure Log Analytics does not keep data for 60 days by default, so you must use a storage account. Regards.
upvoted 1 times
rsm2020
4 years, 9 months ago
It should be Azure Log Analytics.
upvoted 1 times
krisspark
4 years, 11 months ago
The correct answer is "storage account", as the question asks for an explicit 60-day retention, which is not available in the Azure Log Analytics and Event Hub configs.
upvoted 10 times
akn1
4 years, 11 months ago
Azure Monitor - It is also possible to specify different retention settings for individual data types from 30 to 730 days https://docs.microsoft.com/en-us/azure/azure-monitor/platform/manage-cost-storage#retention-by-data-type
upvoted 6 times
Rohan21
4 years, 11 months ago
Storage Account is correct: with a storage account you can specify retention in days, but the same is not possible with Log Analytics.
upvoted 6 times
amar111
4 years, 11 months ago
The retention period is nowhere mentioned in the question.
upvoted 2 times
amar111
4 years, 11 months ago
It's just the time period you want to query on.
upvoted 1 times
SachinKumar2
5 years, 3 months ago
Yes, it should be Azure Log Analytics.
upvoted 9 times
Community vote distribution: A (35%), C (25%), B (20%), Other