Exam DP-100 topic 5 question 5 discussion

Actual exam question from Microsoft's DP-100
Question #: 5
Topic #: 5

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You train a classification model by using a logistic regression algorithm.
You must be able to explain the model's predictions by calculating the importance of each feature, both as an overall global relative importance value and as a measure of local importance for a specific set of predictions.
You need to create an explainer that you can use to retrieve the required global and local feature importance values.
Solution: Create a PFIExplainer.
Does the solution meet the goal?

  • A. Yes
  • B. No
Suggested Answer: B
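
To see concretely why the suggested answer is B, here is a minimal sketch using the open-source interpret-community package (the library behind azureml-interpret). The dataset, model setup, and variable names are illustrative assumptions, not part of the question: a PFIExplainer exposes explain_global() but has no explain_local(), while a TabularExplainer supports both.

from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from interpret.ext.blackbox import PFIExplainer, TabularExplainer

data = load_breast_cancer()
X_train, X_test, y_train, y_test = train_test_split(
    data.data, data.target, random_state=0
)

# The scenario's model: a classifier trained with logistic regression.
model = LogisticRegression(max_iter=5000).fit(X_train, y_train)

# PFIExplainer: global (dataset-level) importance only.
pfi = PFIExplainer(model, features=list(data.feature_names))
global_exp = pfi.explain_global(X_test, true_labels=y_test)
print(global_exp.get_feature_importance_dict())
# There is no pfi.explain_local(); per-prediction importances are
# unavailable, so PFI cannot meet the stated goal.

# TabularExplainer (or MimicExplainer) supports both global and local
# importance, so it would meet the goal instead.
tab = TabularExplainer(model, X_train, features=list(data.feature_names))
print(tab.explain_global(X_test).get_feature_importance_dict())
print(tab.explain_local(X_test[:3]).get_ranked_local_names())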

Comments

shivaborusu
Highly Voted 4 years, 2 months ago
There is no local importance explanation for Permutation Feature Importance; the Mimic and Tabular Explainers have it. The answer is YES
upvoted 38 times
aziti
4 years ago
Mimic explainer is based on the idea of training global surrogate models to mimic black-box models. The way I see it, it seems as if the Mimic explainer is one with only global importance https://docs.microsoft.com/en-us/azure/machine-learning/how-to-machine-learning-interpretability
upvoted 5 times
ralucabala
3 years, 9 months ago
I was thinking the same way, but the question says it's a logistic regression, not linear regression and not a decision tree or one of the other surrogate models supported by the Mimic Explainer. So, do we have explainability for logistic regression as well or not?
upvoted 1 times
ralucabala
3 years, 9 months ago
Found it here https://docs.microsoft.com/en-us/azure/machine-learning/how-to-machine-learning-interpretability-aml
upvoted 1 times
aziti
4 years ago
My bad, you're correct.
upvoted 2 times
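
To clarify the sub-thread above: in the Mimic approach, the surrogate model is trained to mimic the black-box model, so the model being explained can be any classifier, including logistic regression. A minimal sketch under the same interpret-community assumptions as the code under the suggested answer (reusing model, X_train, X_test, and data from there):

from interpret.ext.blackbox import MimicExplainer
from interpret.ext.glassbox import LGBMExplainableModel

# The surrogate (LGBMExplainableModel here) is what gets trained to
# mimic the black box; the logistic regression itself need not be one
# of the supported surrogate types.
mimic = MimicExplainer(
    model,                 # the fitted logistic regression from above
    X_train,               # initialization (training) examples
    LGBMExplainableModel,  # global surrogate trained to mimic the model
    features=list(data.feature_names),
)
# Unlike PFI, the Mimic explainer supports both global and local importance.
print(mimic.explain_global(X_test).get_feature_importance_dict())
print(mimic.explain_local(X_test[:3]).get_ranked_local_values())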
Abhinav_nasaiitkgp
Highly Voted 3 years, 12 months ago
Answer is Yes. Mimic explains both local and global feature importance: https://docs.microsoft.com/en-us/azure/machine-learning/how-to-machine-learning-interpretability-automl
upvoted 10 times
slashssab
3 years, 3 months ago
Question is about PFIExplainer, so answer should be "No"
upvoted 7 times
phdykd
Most Recent 1 year, 10 months ago
B. No. The PFIExplainer doesn't support local feature importance explanations.
upvoted 3 times
therealola
2 years, 7 months ago
On exam 18-06-22
upvoted 1 times
ning
2 years, 7 months ago
PFI cannot do local / instance level!
upvoted 3 times
eeah
2 years, 9 months ago
Ans is NO. This was the practice test official answer. Global/local arguments from discussion are correct.
upvoted 2 times
synapse
2 years, 10 months ago
Selected Answer: B
PFIExplainer is the only explainer that does not support local importance. So it does not meet the reqs in this case. Answer is B
upvoted 5 times
dija123
3 years, 1 month ago
The Answer should be NO
upvoted 1 times
dija123
3 years, 1 month ago
Selected Answer: B
PFI can explain the overall behavior of any underlying model but does not explain individual predictions.
upvoted 4 times
akuamorgan
3 years, 3 months ago
Why all this confusion? Mimic and Tabular support global and local; PFI only supports global. So the answer is No: the PFI solution doesn't meet the goal.
upvoted 5 times
frida321
3 years, 3 months ago
I suppose it should be NO. PFI can't explain local importance
upvoted 4 times
YipingRuan
3 years, 5 months ago
You need to create an explainer that you can use to retrieve the required global and local feature importance values. Solution: Create a PFIExplainer. Does the solution meet the goal? ????
upvoted 1 times
azurecert2021
3 years, 6 months ago
answer should be Yes. The Permutation Feature Importance (PFI) model explainer can only be used to explain how strongly the features contribute to the prediction at the dataset level; it doesn't support evaluation of local importances. The Mimic Explainer can be used for interpreting both the global and local importance of features, and the Tabular Explainer can likewise be used for interpreting both the global and local importance of features.
upvoted 2 times
deyoz
11 months, 2 weeks ago
Then why did you say yes?
upvoted 1 times
iamnagesh
3 years, 7 months ago
https://docs.microsoft.com/en-us/learn/modules/explain-machine-learning-models-with-azure-machine-learning/3-explainers
upvoted 1 times
dev2dev
3 years, 10 months ago
In the sample notebook, a comment states: "# Note: Do not run this cell if using PFIExplainer, it does not support local explanations". So answer is Yes. Given answer No is wrong. ref: https://github.com/interpretml/interpret-community/blob/master/notebooks/advanced-feature-transformations-explain-local.ipynb
upvoted 4 times