Exam DP-100 topic 5 question 14 discussion

Actual exam question from Microsoft's DP-100
Question #: 14
Topic #: 5

You are creating a binary classification model by using two-class logistic regression.
You need to evaluate the model results for imbalance.
Which evaluation metric should you use?

  • A. Relative Absolute Error
  • B. AUC Curve
  • C. Mean Absolute Error
  • D. Relative Squared Error
  • E. Accuracy
  • F. Root Mean Square Error
Suggested Answer: B

Comments

akgarg00
Highly Voted 4 years, 3 months ago
With 99% class 1 data and 1% class 2 data, a model that predicts class 1 for every sample still attains 99% accuracy. So accuracy is an incorrect answer.
upvoted 15 times
pancman
3 years, 1 month ago
Absolutely not. Funny thing is, you proved yourself wrong on why it shouldn't be accuracy in the answer you gave.
upvoted 1 times
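The pitfall akgarg00 describes is easy to reproduce. A minimal sketch with hypothetical toy counts (plain Python, no ML library needed):

```python
# Toy imbalanced dataset: 990 negatives, 10 positives (hypothetical numbers).
y_true = [0] * 990 + [1] * 10

# A useless "model" that always predicts the majority class.
y_pred = [0] * 1000

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
print(f"accuracy = {accuracy:.2f}")  # 0.99, despite never detecting a positive

# Minority-class recall exposes the failure.
hits = sum(1 for t, p in zip(y_true, y_pred) if t == p == 1)
recall = hits / sum(y_true)
print(f"minority recall = {recall:.2f}")  # 0.00
```

The 99% figure comes entirely from the class ratio, not from any skill of the classifier, which is exactly why accuracy is the trap option here.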
evangelist
Most Recent 11 months, 2 weeks ago
Selected Answer: B
For evaluating a binary classification model, especially with imbalanced datasets, the Area Under the Receiver Operating Characteristic (AUC-ROC) Curve is an excellent metric. It's insensitive to class imbalance and provides a good summary of the model's performance across different classification thresholds.
upvoted 1 times
evangelist
1 year ago
Selected Answer: B
AUC Curve (Area Under the Curve): The AUC-ROC (Receiver Operating Characteristic) curve is a performance measurement for classification problems at various threshold settings. AUC represents the degree or measure of separability, indicating how much the model is capable of distinguishing between classes. An AUC value of 0.5 suggests no discrimination (i.e., random guessing), whereas a value of 1.0 indicates perfect discrimination. The AUC-ROC curve is particularly useful for evaluating models on imbalanced datasets because it is insensitive to changes in the class distribution. It provides a single metric that captures the trade-off between sensitivity (true positive rate) and specificity (true negative rate).
upvoted 1 times
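To make the "various threshold settings" point concrete, here is a minimal sketch (hypothetical scores) that sweeps every candidate threshold, collects (FPR, TPR) points, and integrates the ROC curve with the trapezoidal rule:

```python
def roc_points(y_true, scores):
    """Return (FPR, TPR) pairs for every candidate threshold, highest first."""
    pos = sum(y_true)
    neg = len(y_true) - pos
    pts = []
    for thr in [float("inf")] + sorted(set(scores), reverse=True):
        tp = sum(1 for y, s in zip(y_true, scores) if y == 1 and s >= thr)
        fp = sum(1 for y, s in zip(y_true, scores) if y == 0 and s >= thr)
        pts.append((fp / neg, tp / pos))
    return pts

def auc(points):
    """Trapezoidal area under the (FPR, TPR) curve."""
    return sum((x1 - x0) * (y0 + y1) / 2
               for (x0, y0), (x1, y1) in zip(points, points[1:]))

# Hypothetical classifier scores on six samples.
y_true = [0, 0, 1, 0, 1, 1]
scores = [0.1, 0.3, 0.35, 0.4, 0.8, 0.9]
print(auc(roc_points(y_true, scores)))  # 0.888... (8/9)
```

Because the area aggregates over all thresholds, no single cutoff has to be chosen, which is what makes the metric a threshold-independent summary of the sensitivity/specificity trade-off.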
phdykd
2 years, 3 months ago
The appropriate evaluation metric to use for assessing imbalance in a binary classification model is the AUC Curve (B). AUC (Area Under the Curve) is a measure of the model's ability to distinguish between positive and negative classes. AUC ranges from 0 to 1, where an AUC of 1 indicates perfect separation between the positive and negative classes, and an AUC of 0.5 indicates random chance. A high AUC value indicates that the model has a strong ability to correctly classify positive and negative instances, which is especially important in imbalanced datasets where one class may have significantly fewer instances than the other. Therefore, the AUC curve is a commonly used metric to evaluate the performance of binary classification models in the presence of class imbalance.
upvoted 1 times
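The "insensitive to class distribution" claim can be checked directly. AUC has a rank interpretation (the probability that a randomly chosen positive outscores a randomly chosen negative), so replicating the majority class leaves it unchanged. A sketch on hypothetical scores:

```python
def rank_auc(y_true, scores):
    # Mann-Whitney form of ROC AUC: probability that a random positive
    # outscores a random negative (ties count as half a win).
    pos = [s for y, s in zip(y_true, scores) if y == 1]
    neg = [s for y, s in zip(y_true, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

y_true = [1, 1, 0, 0, 0]
scores = [0.9, 0.6, 0.7, 0.2, 0.1]

# Replicate every negative 10x: the class ratio goes from 2:3 to 2:30,
# but AUC is identical because it depends only on pairwise ranking.
y_big = [1, 1] + [0] * 30
s_big = [0.9, 0.6] + [0.7, 0.2, 0.1] * 10

print(rank_auc(y_true, scores), rank_auc(y_big, s_big))  # both 0.833... (5/6)
```

This is the property that accuracy lacks: duplicating the majority class inflates accuracy for a majority-class predictor but does not move the AUC at all.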
ning
2 years, 12 months ago
I guess weighted AUC is the best answer ...
upvoted 4 times
ning
2 years, 12 months ago
Or weighted accuracy
upvoted 1 times
[Removed]
3 years, 1 month ago
What does "evaluate the model results for imbalance" mean? Does it mean evaluate the extent/degree of imbalance in the dataset? Or does it simply mean to evaluate the model when the underlying data is imbalanced?
upvoted 1 times
pancman
3 years, 1 month ago
Selected Answer: B
AUC is the correct answer.
upvoted 4 times
synapse
3 years, 2 months ago
Selected Answer: B
AUC seems to be the right answer, as per this: https://stats.stackexchange.com/questions/260164/auc-and-class-imbalance-in-training-test-dataset
upvoted 2 times
anonymjason
3 years, 11 months ago
I would assume "AUC Curve" is a typo, because AUC already stands for Area Under the Curve. It seems to be the right answer, though.
upvoted 2 times
OmarF
4 years, 2 months ago
It should be E (Accuracy). The AUC is the area under the ROC curve, so it's a number, not a curve; there is no curve called the "AUC curve".
upvoted 1 times
sim39
3 years, 9 months ago
No, it can't be accuracy. I agree that there is nothing called an "AUC curve", but I assume it's supposed to say just AUC.
upvoted 1 times
Askme101
4 years, 5 months ago
Should Accuracy not be included along with AUC?
upvoted 2 times
Neuron
4 years, 4 months ago
no, accuracy can be misleading when the dataset is skewed (not balanced). AUC provides better insight overall.
upvoted 10 times
dijaa
3 years, 9 months ago
Accuracy fails when imbalance exists.
upvoted 3 times
Community vote distribution: A (35%), C (25%), B (20%), Other (20%)