
Exam AWS Certified Machine Learning - Specialty topic 1 question 44 discussion

A Machine Learning Specialist has created a deep learning neural network model that performs well on the training data but performs poorly on the test data.
Which of the following methods should the Specialist consider using to correct this? (Choose three.)

  • A. Decrease regularization.
  • B. Increase regularization.
  • C. Increase dropout.
  • D. Decrease dropout.
  • E. Increase feature combinations.
  • F. Decrease feature combinations.
Suggested Answer: BCF

Comments

cybe001
Highly Voted 3 years, 7 months ago
Yes, answer is BCF
upvoted 24 times
Phong
Highly Voted 3 years, 6 months ago
Go for BCF
upvoted 14 times
ninomfr64
Most Recent 10 months, 2 weeks ago
Selected Answer: BCF
I think the point here is the definition of "feature combinations". If you read it as "combine the features to generate a smaller but more effective feature set", you end up with a smaller feature set, which is good for overfitting. However, if you read it as "combine the features to generate additional features", you end up with a larger feature set, which is bad for overfitting. Also, in some cases feature combinations are implemented inside the model itself (see the hidden layers of a feed-forward network), which increases model complexity and is bad for overfitting. To me this question is poorly worded. I would still pick F: my best guess is that the feature combination is implemented in the model, so decreasing feature combinations decreases complexity and hence improves the overfitting issue.
upvoted 5 times
cloudera3
9 months, 4 weeks ago
Great callout - what exactly the feature combination is doing has not been elaborated. It can be: using PCA or t-SNE, which essentially optimizes the features - good for addressing overfitting, and should be done. Or it can be: using a Cartesian product, where features are combined to create additional features - this will worsen overfitting and should NOT be done. I wish questions and answer options were written clearly so that there is no room for ambiguity, especially considering that in real life this kind of communication/write-up would trigger follow-up questions until addressed satisfactorily.
upvoted 1 times
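For the "generate additional features" reading debated in this thread, a minimal sketch using scikit-learn's PolynomialFeatures on synthetic data shows how quickly crossed features multiply (the shapes and degree are illustrative; the opposite reading, combining features into fewer components, is what PCA does, as discussed further down):

import numpy as np
from sklearn.preprocessing import PolynomialFeatures

X = np.random.rand(100, 10)
# Degree-2 feature crosses: 10 columns become 65 (10 linear + 10 squared
# + 45 pairwise products), i.e. far more capacity to memorize noise.
X_crossed = PolynomialFeatures(degree=2, include_bias=False).fit_transform(X)
print(X.shape, "->", X_crossed.shape)  # (100, 10) -> (100, 65)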
Denise123
1 year, 1 month ago
Selected Answer: BCE
About option E: When increasing feature combinations, the goal is not to simply add more features indiscriminately, which could indeed lead to overfitting. Instead, it involves selecting and combining features in a way that captures important patterns and relationships in the data. When done effectively, increasing feature combinations can help the model generalize better to unseen data by providing more informative and discriminative features, thus reducing the risk of overfitting.
upvoted 1 times
Piyush_N
1 year, 2 months ago
Selected Answer: BCF
If your model is overfitting the training data, it makes sense to take actions that reduce model flexibility. To reduce model flexibility, try the following: Feature selection: consider using fewer feature combinations, decrease n-grams size, and decrease the number of numeric attribute bins. Increase the amount of regularization used. https://docs.aws.amazon.com/machine-learning/latest/dg/model-fit-underfitting-vs-overfitting.html
upvoted 1 times
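As a minimal sketch of the "use fewer feature combinations" advice quoted above, scikit-learn's univariate feature selection can shrink a synthetic feature set (the choice of k=5 and the data shape are illustrative, not taken from the AWS doc):

from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif

# Synthetic data: 50 candidate features, only 5 of which carry signal.
X, y = make_classification(n_samples=200, n_features=50, n_informative=5,
                           random_state=0)

# Keep only the 5 features most associated with the target; discarding
# the rest shrinks model capacity and leaves less room to overfit.
X_selected = SelectKBest(f_classif, k=5).fit_transform(X, y)
print(X.shape, "->", X_selected.shape)  # (200, 50) -> (200, 5)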
Neet1983
1 year, 4 months ago
Selected Answer: BCF
Best choices are B (Increase regularization), C (Increase dropout), and F (Decrease feature combinations), as these techniques are effective in reducing overfitting and improving the model's ability to generalize to new data.
upvoted 1 times
akgarg00
1 year, 5 months ago
Selected Answer: BCE
BCE. The model has learnt the training data. One approach is to increase complexity by increasing the features, or to remove some features to increase bias. In deep learning, I think increasing the feature set is more workable.
upvoted 1 times
kaike_reis
1 year, 9 months ago
Selected Answer: BCF
B-C-F. All of these options can be used to reduce model complexity and thus reduce overfitting.
upvoted 1 times
SRB1337
1 year, 10 months ago
It's BCF.
upvoted 1 times
jackzhao
2 years, 1 month ago
BCF is correct.
upvoted 2 times
AjoseO
2 years, 2 months ago
Selected Answer: BCF
Increasing regularization helps to prevent overfitting by adding a penalty term to the loss function to discourage the model from learning the noise in the data. Increasing dropout helps to prevent overfitting by randomly dropping out some neurons during training, which forces the model to learn more robust representations that do not depend on the presence of any single neuron. Decreasing the number of feature combinations helps to simplify the model, making it less likely to overfit.
upvoted 6 times
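To make options B and C concrete, here is a minimal Keras sketch combining L2 weight regularization with a dropout layer (the input width, layer sizes, l2 factor, and dropout rate are illustrative values, not tuned ones):

from tensorflow import keras
from tensorflow.keras import layers, regularizers

model = keras.Sequential([
    layers.Input(shape=(20,)),
    # B: an L2 penalty discourages large weights, so the network cannot
    # fit the noise in the training set as easily.
    layers.Dense(64, activation="relu",
                 kernel_regularizer=regularizers.l2(1e-3)),
    # C: dropout randomly silences 30% of the units on each training
    # step, forcing representations that survive missing neurons.
    layers.Dropout(0.3),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")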
Tomatoteacher
2 years, 3 months ago
Selected Answer: BCE
I see all the comments for BCF, but when you look at F it just says decrease 'feature combinations', not the features themselves. On one reading, decreasing feature combinations results in having more features (less feature engineering), which in turn would cause more overfitting. Unless the question is badly worded and 'fewer feature combinations' just means that those combined features will not be used, it has to be BCE.
upvoted 1 times
cpal012
2 years, 1 month ago
Decrease feature combinations - too many irrelevant features can influence the model by drowning out the signal with noise
upvoted 1 times
AjoseO
2 years, 2 months ago
Increasing the number of feature combinations can sometimes improve the performance of a model if the model is underfitting the data. However, in this context, it is not likely to be a solution to overfitting.
upvoted 1 times
Shailendraa
2 years, 7 months ago
BCF - always remember, in case of overfitting: reduce features, add regularisation, and increase dropout.
upvoted 3 times
ahquiceno
3 years, 6 months ago
BCE: The main objective of PCA (a technique for feature combination) is to simplify your model's features into fewer components, to help visualize patterns in your data and to help your model run faster. Using PCA also reduces the chance of overfitting your model by eliminating features with high correlation. https://towardsdatascience.com/dealing-with-highly-dimensional-data-using-principal-component-analysis-pca-fea1ca817fe6
upvoted 2 times
uninit
2 years, 3 months ago
AWS Documentation explicitly mentions reducing feature combinations to prevent overfitting - https://docs.aws.amazon.com/machine-learning/latest/dg/model-fit-underfitting-vs-overfitting.html It's B C F
upvoted 3 times
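A minimal sketch of the PCA reading from this thread, using scikit-learn on synthetic data (the noisy duplicate columns and the 95% variance threshold are illustrative choices):

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
base = rng.normal(size=(100, 5))
# Append near-duplicates so many columns are highly correlated.
X = np.hstack([base, base + 0.01 * rng.normal(size=(100, 5))])

# Keep enough components to explain 95% of the variance; the 10
# correlated columns collapse to roughly 5 components.
X_reduced = PCA(n_components=0.95).fit_transform(X)
print(X.shape, "->", X_reduced.shape)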
cloud_trail
3 years, 6 months ago
B/C/F Easy peasy.
upvoted 1 times
apnu
3 years, 6 months ago
BCF 100%
upvoted 1 times
obaidur
3 years, 6 months ago
BCF. F is explained in the AWS documentation: "Feature selection: consider using fewer feature combinations, decrease n-grams size, and decrease the number of numeric attribute bins. Increase the amount of regularization used." https://docs.aws.amazon.com/machine-learning/latest/dg/model-fit-underfitting-vs-overfitting.html
upvoted 5 times
Community vote distribution: A (35%), C (25%), B (20%), Other