
Microsoft DP-100 Exam - Topic 9 Question 68 Discussion

Actual exam question for Microsoft's DP-100 exam
Question #: 68
Topic #: 9
[All DP-100 Questions]

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You train a classification model by using a logistic regression algorithm.

You must be able to explain the model's predictions by calculating the importance of each feature, both as an overall global relative importance value and as a measure of local importance for a specific set of predictions.

You need to create an explainer that you can use to retrieve the required global and local feature importance values.

Solution: Create a PFIExplainer.

Does the solution meet the goal?

Suggested Answer: No, the solution does not meet the goal.

Permutation Feature Importance (PFI) Explainer: Permutation Feature Importance is a technique used to explain classification and regression models. At a high level, it works by randomly shuffling the data one feature at a time across the entire dataset and measuring how much the performance metric of interest changes. The larger the change, the more important that feature is. PFI can explain the overall behavior of any underlying model, but it does not explain individual predictions. Because the scenario also requires local importance values for a specific set of predictions, a PFIExplainer alone does not meet the goal.


https://docs.microsoft.com/en-us/azure/machine-learning/how-to-machine-learning-interpretability
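To make the mechanism concrete, here is a minimal, self-contained sketch of permutation feature importance in plain Python. The toy model and dataset are invented for illustration (they are not from Azure ML or the exam). Note that the procedure yields exactly one score per feature for the whole dataset, a global measure, which is why this technique cannot supply local, per-prediction importance values.

```python
import random

def predict(row):
    # Toy stand-in for a trained logistic regression: the decision depends
    # strongly on feature 0 and, given the threshold, not at all on feature 1.
    return 1 if (2.0 * row[0] + 0.1 * row[1]) > 1.0 else 0

def accuracy(X, y):
    return sum(predict(r) == t for r, t in zip(X, y)) / len(y)

def permutation_importance(X, y, n_features, seed=0):
    """Shuffle one feature column at a time across the whole dataset and
    record how much accuracy drops. A bigger drop means a more important
    feature. Nothing here explains any individual prediction."""
    rng = random.Random(seed)
    baseline = accuracy(X, y)
    importances = []
    for j in range(n_features):
        col = [row[j] for row in X]
        rng.shuffle(col)
        # Rebuild the dataset with only column j permuted.
        X_perm = [row[:j] + [col[i]] + row[j + 1:] for i, row in enumerate(X)]
        importances.append(baseline - accuracy(X_perm, y))
    return importances

# Toy dataset: the label is exactly feature 0, so shuffling feature 0
# destroys accuracy while shuffling feature 1 changes nothing.
X = [[i % 2, (i // 2) % 2] for i in range(40)]
y = [row[0] for row in X]

imp = permutation_importance(X, y, n_features=2)
# imp[0] is large (shuffling the decisive feature hurts accuracy);
# imp[1] is 0.0 (feature 1 never affects the prediction).
```

In the Azure ML interpretability package the same idea is exposed as PFIExplainer, whose explain_global output matches this dataset-level view; an explainer that supports local importance (such as a SHAP-based explainer) is needed for per-prediction explanations.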

Contribute your Thoughts:

Talia
4 months ago
I thought there were other options too, though.
upvoted 0 times
Veronika
4 months ago
Yes, it meets the goal perfectly!
upvoted 0 times
King
4 months ago
Wait, are we sure it covers everything needed?
upvoted 0 times
Gaynell
4 months ago
Totally agree, it's a solid choice.
upvoted 0 times
Vivienne
5 months ago
PFIExplainer gives both global and local importance!
upvoted 0 times
Karon
5 months ago
I believe the PFIExplainer is designed for global feature importance, but I’m not sure if it covers local importance adequately. I might lean towards "No" for this one.
upvoted 0 times
Antonio
5 months ago
I think we practiced a similar question where we had to explain model predictions, and I recall that PFIExplainer was mentioned as a good option for global importance.
upvoted 0 times
Tammy
5 months ago
I remember we discussed feature importance in class, but I'm not entirely sure if a PFIExplainer is the right tool for both global and local importance.
upvoted 0 times
Dorothy
5 months ago
I'm a bit confused about whether PFIExplainer can handle local importance as well. I thought we might need a different method for that part.
upvoted 0 times
Neghir
4 years ago
Answer is NO: The PFIExplainer doesn't support local feature importance explanations. https://docs.microsoft.com/en-us/learn/modules/explain-machine-learning-models-with-azure-machine-learning/3-explainers
upvoted 1 times
