
Exam AWS Certified Machine Learning - Specialty topic 1 question 107 discussion

A Machine Learning Specialist is assigned to a Fraud Detection team and must tune an XGBoost model that works appropriately on test data but does not perform as expected on unknown (unseen) data. The existing parameters are provided as follows.

Which parameter tuning guidelines should the Specialist follow to avoid overfitting?

  • A. Increase the max_depth parameter value.
  • B. Lower the max_depth parameter value.
  • C. Update the objective to binary:logistic.
  • D. Lower the min_child_weight parameter value.
Suggested Answer: B
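To make the suggested answer concrete, here is a minimal sketch using the open-source xgboost library. The question's original parameter listing is not reproduced above, so every value below (max_depth=4, eta=0.1, and so on) is an illustrative assumption, not the exam's configuration.

```python
# Minimal sketch of answer B: lower max_depth and watch validation performance.
# All parameter values here are illustrative assumptions, not the exam's.
import numpy as np
import xgboost as xgb
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the fraud-detection data.
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 20))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=2.0, size=5000) > 0).astype(int)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

params = {
    "objective": "binary:logistic",  # fraud vs. not-fraud is a binary target
    "max_depth": 4,          # lowered: shallower trees generalize better (B)
    "min_child_weight": 5,   # note: raising (not lowering) this also curbs overfitting
    "eta": 0.1,
    "eval_metric": "auc",
}
booster = xgb.train(
    params,
    xgb.DMatrix(X_tr, label=y_tr),
    num_boost_round=500,
    evals=[(xgb.DMatrix(X_val, label=y_val), "validation")],
    early_stopping_rounds=20,  # stop once validation AUC stops improving
)
```

With early stopping on a held-out set, the benefit of a lower max_depth shows up directly as a smaller gap between training and validation AUC.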

Comments

SophieSu
Highly Voted 2 years, 6 months ago
B (lower max_depth) is the correct answer. For D: min_child_weight means something like "stop trying to split once your sample size in a node goes below a given threshold." Lowering min_child_weight makes the tree deeper and more complex; increasing min_child_weight gives the tree fewer branches and less complexity.
upvoted 17 times
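A small sketch that makes SophieSu's point concrete: with the same max_depth, a lower min_child_weight lets XGBoost grow many more leaves in a single tree. The data and values are assumptions for illustration, not from the question.

```python
# Sketch: lower min_child_weight -> more splits -> a more complex tree.
# (For logistic loss, min_child_weight is a hessian sum, roughly
# proportional to node size.) Data and values are illustrative assumptions.
import numpy as np
import xgboost as xgb

rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 10))
y = (X[:, 0] + rng.normal(scale=1.5, size=2000) > 0).astype(int)
dtrain = xgb.DMatrix(X, label=y)

for mcw in (1, 50):  # low vs. high "stop splitting" threshold
    booster = xgb.train(
        {"objective": "binary:logistic", "max_depth": 10, "min_child_weight": mcw},
        dtrain, num_boost_round=1,
    )
    # Count leaf nodes of the single tree that was built.
    n_leaves = (booster.trees_to_dataframe()["Feature"] == "Leaf").sum()
    print(f"min_child_weight={mcw}: {n_leaves} leaves")
```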
Mickey321
Most Recent 8 months, 3 weeks ago
Selected Answer: B
The max_depth parameter controls the maximum depth of the decision trees in the XGBoost model. A higher max_depth value will result in more complex decision trees, which can lead to overfitting.
upvoted 1 times
ccpmad
9 months, 2 weeks ago
Selected Answer: B
Overfitting occurs when a model performs well on the training data but poorly on unseen or test data. In the context of XGBoost, reducing the max_depth parameter helps prevent overfitting. The max_depth parameter controls the maximum depth of the trees in the ensemble. A smaller max_depth value limits the complexity of the trees, making them less likely to memorize noise in the training data and improving generalization to unseen data.
upvoted 1 times
gcaria
11 months, 2 weeks ago
Selected Answer: B
It is B
upvoted 1 times
vbal
11 months, 2 weeks ago
B: overfitting problem.
upvoted 1 times
Shailendraa
1 year, 8 months ago
12-Sep Exam.
upvoted 1 times
[Removed]
2 years, 5 months ago
Selected Answer: B
When a model overfits, the remedies are (see the parameter sketch after this comment):
1. Reduce model flexibility and complexity
2. Reduce the number of feature combinations
3. Decrease n-gram size
4. Decrease the number of numeric attribute bins
5. Increase the amount of regularization
6. Add dropout
upvoted 1 times
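A sketch of how items 1 and 5 of that list map onto XGBoost parameters. The parameter choices are illustrative assumptions, not values from the question.

```python
# Items 1 (reduce complexity) and 5 (more regularization) as XGBoost params.
# All values are illustrative assumptions.
import numpy as np
import xgboost as xgb

rng = np.random.default_rng(2)
X = rng.normal(size=(2000, 10))
y = (X[:, 0] > 0).astype(int)

params = {
    "objective": "binary:logistic",
    "max_depth": 4,           # 1. reduce model flexibility/complexity
    "lambda": 10.0,           # 5. stronger L2 regularization on leaf weights
    "alpha": 1.0,             # 5. L1 regularization on leaf weights
    "subsample": 0.8,         # row subsampling per tree also reduces variance
    "colsample_bytree": 0.8,  # feature subsampling per tree
}
booster = xgb.train(params, xgb.DMatrix(X, label=y), num_boost_round=100)
```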
Dr_Kiko
2 years, 6 months ago
B. A 30-deep tree is crazy; normally it's 6-7, no more.
upvoted 2 times
cnethers
2 years, 7 months ago
A. Increase the max_depth parameter value (this would increase the complexity, resulting in overfitting).
B. Lower the max_depth parameter value (this would reduce the complexity and minimize overfitting).
C. Update the objective to binary:logistic (it depends on the target(s); generally you would have binary classification for fraud detection, but there is nothing to say you can't have multi-class, so there is not enough information given).
D. Lower the min_child_weight parameter value (this would reduce the complexity and minimize overfitting).
I find that there are two correct answers to this question, which does not help: B and D.
upvoted 2 times
arulrajjayaraj
2 years, 6 months ago
Ans: B. Lower max_depth values avoid over-fitting. No for D: for min_child_weight, larger values avoid over-fitting.
upvoted 8 times
cnethers
2 years, 7 months ago
Thus, those parameters can be used to control the complexity of the trees. It is important to tune them together in order to find a good trade-off between model bias and variance.
upvoted 2 times
cnethers
2 years, 7 months ago
min_child_weight is the minimum weight (or number of samples if all samples have a weight of 1) required in order to create a new node in the tree. A smaller min_child_weight allows the algorithm to create children that correspond to fewer samples, thus allowing for more complex trees, but again, more likely to overfit.
upvoted 1 times
cnethers
2 years, 7 months ago
max_depth is the maximum depth allowed from the root to the farthest leaf of a tree. Deeper trees can model more complex relationships by adding more nodes, but as we go deeper, splits become less relevant and are sometimes only due to noise, causing the model to overfit.
upvoted 2 times
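A sketch showing the effect cnethers describes: as max_depth grows, the gap between training and cross-validated AUC widens, which is the signature of overfitting. The data is synthetic and the values are assumptions, not from the question.

```python
# Sketch: deeper trees fit noise, so the train/validation gap grows.
# Synthetic data and parameter values are assumptions for illustration.
import numpy as np
import xgboost as xgb

rng = np.random.default_rng(3)
X = rng.normal(size=(3000, 15))
y = (X[:, 0] - X[:, 1] + rng.normal(scale=3.0, size=3000) > 0).astype(int)
dtrain = xgb.DMatrix(X, label=y)

for depth in (2, 6, 12):
    cv = xgb.cv(
        {"objective": "binary:logistic", "max_depth": depth, "eval_metric": "auc"},
        dtrain, num_boost_round=50, nfold=3, seed=0,
    )
    gap = cv["train-auc-mean"].iloc[-1] - cv["test-auc-mean"].iloc[-1]
    print(f"max_depth={depth}: train/val AUC gap = {gap:.3f}")
```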
Community vote distribution: A (35%), C (25%), B (20%), Other