
Exam DP-200 topic 3 question 16 discussion

Actual exam question from Microsoft's DP-200
Question #: 16
Topic #: 3

DRAG DROP -
You have an Azure data factory.
You need to ensure that pipeline-run data is retained for 120 days. The solution must ensure that you can query the data by using the Kusto query language.
Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
NOTE: More than one order of answer choices is correct. You will receive credit for any of the correct orders you select.
Select and Place:

Suggested Answer:
Step 1: Create an Azure Storage account that has a lifecycle policy
To automate common data management tasks, Microsoft created a solution based on Azure Data Factory. The service, Data Lifecycle Management, makes frequently accessed data available and archives or purges other data according to retention policies. Teams across the company use the service to reduce storage costs, improve app performance, and comply with data retention policies.
Step 2: Create a Log Analytics workspace that has Data Retention set to 120 days.
Data Factory stores pipeline-run data for only 45 days. Use Azure Monitor if you want to keep that data for a longer time. With Monitor, you can route diagnostic logs for analysis to multiple different targets, such as a storage account: save your diagnostic logs to a storage account for auditing or manual inspection. You can use the diagnostic settings to specify the retention time in days.
Step 3: From Azure Portal, add a diagnostic setting.
Step 4: Send the data to a Log Analytics workspace.
Event Hub: A pipeline that transfers events from services to Azure Data Explorer.
Keeping Azure Data Factory metrics and pipeline-run data.
Configure diagnostic settings and workspace.
Create or add diagnostic settings for your data factory.
1. In the portal, go to Monitor. Select Settings > Diagnostic settings.
2. Select the data factory for which you want to set a diagnostic setting.
3. If no settings exist on the selected data factory, you're prompted to create a setting. Select Turn on diagnostics.
4. Give your setting a name, select Send to Log Analytics, and then select a workspace from Log Analytics Workspace.
5. Select Save.
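The portal steps above can also be sketched with the Azure CLI. This is a hedged sketch, not the exam's official solution: the resource group, workspace, and factory names (rg-adf, la-adf-logs, adf-demo) are hypothetical placeholders, and the subscription ID and workspace GUID are left as placeholders for you to fill in.

```shell
# Hypothetical sketch of the suggested configuration via Azure CLI.
# All names (rg-adf, la-adf-logs, adf-demo) are placeholders.

# 1. Create a Log Analytics workspace with data retention set to 120 days.
az monitor log-analytics workspace create \
  --resource-group rg-adf \
  --workspace-name la-adf-logs \
  --retention-time 120

# 2. Add a diagnostic setting on the data factory that sends the
#    PipelineRuns log category to the workspace.
az monitor diagnostic-settings create \
  --name adf-pipeline-runs \
  --resource "/subscriptions/<sub-id>/resourceGroups/rg-adf/providers/Microsoft.DataFactory/factories/adf-demo" \
  --workspace "/subscriptions/<sub-id>/resourceGroups/rg-adf/providers/Microsoft.OperationalInsights/workspaces/la-adf-logs" \
  --logs '[{"category": "PipelineRuns", "enabled": true}]'

# 3. Once logs are flowing, query pipeline runs with the Kusto query
#    language. With resource-specific logging the data lands in the
#    ADFPipelineRun table; in Azure-diagnostics mode it lands in
#    AzureDiagnostics instead.
az monitor log-analytics query \
  --workspace "<workspace-guid>" \
  --analytics-query "ADFPipelineRun | project TimeGenerated, PipelineName, Status | take 10"
```

Note that the retention the question asks about comes from the workspace's 120-day setting, which is why no storage account is strictly required for this scenario.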
Reference:
https://docs.microsoft.com/en-us/azure/data-factory/monitor-using-azure-monitor

Comments

MsIrene
Highly Voted 4 years, 1 month ago
Storage account is not needed here. I would say the order would be like that: Step 1: Create a Log Analytics workspace that has Data Retention set to 120 days. Step 2: From Azure Portal, add a diagnostic setting. Step 3: Select the PipelineRuns Category Step 4: Send the data to a Log Analytics workspace.
upvoted 42 times
Maddaa
4 years, 1 month ago
Agree!
upvoted 2 times
Kenai
4 years, 1 month ago
100% agree
upvoted 5 times
cadio30
4 years, 1 month ago
Support this solution
upvoted 2 times
Wendy_DK
Highly Voted 4 years, 1 month ago
Step 1: Create a Log Analytics workspace that has Data Retention set to 120 days. Step 2: From Azure Portal, add a diagnostic setting. Step 3: Select the PipelineRuns Category Step 4: Send the data to a Log Analytics workspace.
upvoted 5 times
mric
Most Recent 3 years, 11 months ago
According to the linked article, it's: first Storage Account, then Event Hub, and finally Log Analytics. So I would say: 1- Create an Azure Storage Account with a lifecycle policy 2- Stream to an Azure Event Hub 3- Create a Log Analytics workspace that has a Data Retention set to 120 days 4- Send the data to a Log Analytics Workspace Source: https://docs.microsoft.com/en-us/azure/data-factory/monitor-using-azure-monitor#keeping-azure-data-factory-metrics-and-pipeline-run-data
upvoted 1 times
Wendy_DK
4 years, 1 month ago
I agree. Step 1: Create a Log Analytics workspace that has Data Retention set to 120 days. Step 2: From Azure Portal, add a diagnostic setting. Step 3: Select the PipelineRuns Category Step 4: Send the data to a Log Analytics workspace.
upvoted 1 times