
Exam DP-100 topic 5 question 3 discussion

Actual exam question from Microsoft's DP-100
Question #: 3
Topic #: 5

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You train a classification model by using a logistic regression algorithm.
You must be able to explain the model's predictions by calculating the importance of each feature, both as an overall global relative importance value and as a measure of local importance for a specific set of predictions.
You need to create an explainer that you can use to retrieve the required global and local feature importance values.
Solution: Create a MimicExplainer.
Does the solution meet the goal?

  • A. Yes
  • B. No
Suggested Answer: A 🗳️


Comments
OlivierM
Highly Voted 3 years, 11 months ago
Is this correct? The documentation explicitly says that PFIExplainer is the only explainer that does not support local importance
upvoted 36 times
chevyli
2 years, 1 month ago
Neither the solution nor its explanation is correct.
upvoted 2 times
...
...
shivaborusu
Highly Voted 3 years, 11 months ago
The answer is NO, there is no local explainer for PFI
upvoted 20 times
...
evangelist
Most Recent 5 months, 2 weeks ago
How MimicExplainer works: it trains a simple, interpretable model (such as linear regression or a decision tree) to mimic the behavior of the original, complex model. The core idea is to train a new model whose output is as close as possible to the output of the original model, and then use that new model to explain the original, because the new model itself is easy to interpret.
upvoted 1 times
...
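The surrogate idea evangelist describes above can be sketched with plain scikit-learn. This is only an illustration of the mimic technique, not the azureml-interpret API; the variable names and the linear surrogate are assumptions made for the sketch:

```python
# Sketch of the "mimic" (surrogate) explanation idea using scikit-learn only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression, LinearRegression

X, y = make_classification(n_samples=500, n_features=4, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X, y)  # model to be explained

# Train an interpretable surrogate to mimic the model's predicted probabilities.
surrogate = LinearRegression().fit(X, model.predict_proba(X)[:, 1])

# Global importance: mean |coefficient * feature value| across the dataset.
global_importance = np.mean(np.abs(surrogate.coef_ * X), axis=0)

# Local importance for one prediction: per-feature contribution for that row.
x0 = X[0]
local_importance = surrogate.coef_ * x0

print(global_importance.shape, local_importance.shape)
```

Because the surrogate is linear, the same coefficients yield both a global ranking and per-prediction contributions, which is why a mimic explainer can serve both needs in the question.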
deyoz
8 months, 4 weeks ago
I think the answer is No because a mimic explainer is used to help interpret decisions made by black-box models such as ANNs. The model in this case is logistic regression, which isn't considered a black box. However, I am not sure why a mimic explainer cannot be used with logistic regression.
upvoted 1 times
...
Beauterham
10 months, 3 weeks ago
Answer is No. You can pass both 'global' and 'local', but only one value is returned. Parameters: explanation_types (list[str], required): a list of strings representing the types of explanations desired. Currently, 'global' and 'local' are supported. Both may be passed in at once; only one explanation will be returned. https://learn.microsoft.com/en-us/python/api/azureml-interpret/azureml.interpret.mimic_wrapper.mimicwrapper?view=azure-ml-py
upvoted 1 times
...
VuTon2025
1 year, 6 months ago
NO. PFIExplainer does not support local importance. Ref: https://learn.microsoft.com/en-us/training/modules/explain-machine-learning-models-with-azure-machine-learning/3-explainers
upvoted 1 times
...
phdykd
1 year, 8 months ago
A Yes
upvoted 1 times
...
therealola
2 years, 4 months ago
On exam 18-06-22
upvoted 2 times
...
synapse
2 years, 7 months ago
Selected Answer: A
PFIExplainer is the only explainer that does not support local importance
upvoted 2 times
...
TheCyanideLancer
2 years, 9 months ago
The question is: "Solution: Create a MimicExplainer. Does the solution meet the goal?" The answer should be No, as PFIExplainer does not support local feature importance.
upvoted 1 times
...
dija123
2 years, 10 months ago
Selected Answer: A
The answer should be Yes for MimicExplainer.
upvoted 6 times
JTWang
2 years ago
Only PFIExplainer doesn't support local importance.
upvoted 1 times
...
...
thhvancouver
3 years, 3 months ago
ExamTopics: the comments for PFIExplainer have been switched with those for MimicExplainer...
upvoted 8 times
Geezee999
2 years, 6 months ago
Thank you for clarifying this for me as I was almost confused
upvoted 3 times
...
...
VJPrakash
3 years, 3 months ago
The answer should be YES. The question is: does the solution (create a MimicExplainer) work? Based on the documentation, both the Mimic and Tabular explainers can explain global and local feature importance. https://docs.microsoft.com/en-us/learn/modules/explain-machine-learning-models-with-azure-machine-learning/3-explainers
upvoted 16 times
Moshekwa
3 years, 3 months ago
According to the documentation A is the answer
upvoted 2 times
...
...
YipingRuan
3 years, 3 months ago
The question is: "Solution: Create a MimicExplainer. Does the solution meet the goal?"
upvoted 2 times
...
azurecert2021
3 years, 4 months ago
The answer should be No. The Permutation Feature Importance (PFI) model explainer can only be used to explain how strongly the features contribute to the prediction at the dataset level; it doesn't support evaluation of local importances. The Mimic Explainer can be used for interpreting both the global and local importance of features, and the Tabular Explainer can likewise be used for interpreting both the global and local importance of features.
upvoted 4 times
...
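The dataset-level-only nature of PFI that azurecert2021 describes can be illustrated with scikit-learn's `permutation_importance`, which by design returns one global score per feature and no per-prediction values. This is a sketch using sklearn's generic permutation importance, not the azureml PFIExplainer API:

```python
# Permutation feature importance: shuffle one feature at a time and measure
# the drop in model score. The result is a single dataset-level number per
# feature -- global importance only, with no local (per-row) breakdown.
from sklearn.datasets import make_classification
from sklearn.inspection import permutation_importance
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=300, n_features=5, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X, y)

result = permutation_importance(model, X, y, n_repeats=5, random_state=0)
print(result.importances_mean.shape)  # one score per feature, dataset-level
```

Contrast this with the surrogate (mimic) approach, where the interpretable surrogate can be evaluated on any single row to produce local importances as well.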
iamnagesh
3 years, 4 months ago
https://docs.microsoft.com/en-us/learn/modules/explain-machine-learning-models-with-azure-machine-learning/3-explainers
upvoted 2 times
...
hachascloud
3 years, 9 months ago
Answer is No. The answers for this scenario are inverted.
upvoted 3 times
...
Community vote distribution
A (35%)
C (25%)
B (20%)
Other