Exam DP-100 topic 1 question 38 discussion

Actual exam question from Microsoft's DP-100
Question #: 38
Topic #: 1

You are in the process of constructing a deep convolutional neural network (CNN). The CNN will be used for image classification.
You notice that the CNN model you constructed displays hints of overfitting.
You want to make sure that overfitting is minimized and that the model converges to an optimal fit.
Which of the following is TRUE with regard to achieving your goal?

  • A. You have to add an additional dense layer with 512 input units, and reduce the amount of training data.
  • B. You have to add L1/L2 regularization, and reduce the amount of training data.
  • C. You have to reduce the amount of training data and make use of training data augmentation.
  • D. You have to add L1/L2 regularization, and make use of training data augmentation.
  • E. You have to add an additional dense layer with 512 input units, and add L1/L2 regularization.
Suggested Answer: D
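
Editor's note: the question is framework-agnostic, but as a rough Python/Keras sketch of what option D looks like in practice, the snippet below combines L1/L2 weight regularization with on-the-fly image augmentation. The framework choice, layer sizes, and hyperparameter values are illustrative assumptions, not part of the exam question.

# Sketch of option D: L1/L2 regularization plus training-data augmentation.
# All layer sizes and hyperparameters here are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, regularizers

# On-the-fly augmentation: each epoch sees randomly flipped/rotated/zoomed
# variants of the same images, which effectively enlarges the training set.
augmentation = tf.keras.Sequential([
    layers.RandomFlip("horizontal"),
    layers.RandomRotation(0.1),
    layers.RandomZoom(0.1),
])

model = tf.keras.Sequential([
    tf.keras.Input(shape=(32, 32, 3)),       # e.g. CIFAR-10-sized RGB images
    augmentation,                             # active only during training
    layers.Conv2D(32, 3, activation="relu",
                  kernel_regularizer=regularizers.l2(1e-4)),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu",
                  kernel_regularizer=regularizers.l2(1e-4)),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu",
                 kernel_regularizer=regularizers.l1_l2(l1=1e-5, l2=1e-4)),
    layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

During training, the augmentation layers perturb each batch while the regularizers add their weight penalties to the loss, which is exactly the combination option D describes.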

Comments

exam_monkey1234
Highly Voted 2 years, 10 months ago
I would say answer D
upvoted 50 times
Moshekwa
2 years, 9 months ago
"data augmentation simply means increasing size of the data that is increasing the number of images present in the dataset.. using data augmentation a lot of similar images can be generated. This helps in increasing the dataset size and thus reduce overfitting." https://www.kdnuggets.com/2019/12/5-techniques-prevent-overfitting-neural-networks.html
upvoted 14 times
...
Nghia1
11 months, 1 week ago
Agree. Option B's suggestion to reduce the amount of training data would lead to more overfitting, not less.
upvoted 1 times
...
...
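Editor's note: to make the quoted KDnuggets point concrete, the Python/Keras sketch below (an editorial illustration, not from the thread; the parameter values are assumptions) generates several randomly perturbed variants of a single image. This is how augmentation enlarges the effective dataset without collecting new photos.

# Sketch: generating several augmented variants of one image (illustrative values).
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

augment = tf.keras.Sequential([
    layers.RandomFlip("horizontal"),
    layers.RandomRotation(0.15),           # rotate by up to +/-15% of a full turn
    layers.RandomTranslation(0.1, 0.1),    # shift by up to 10% of height/width
])

image = np.random.rand(1, 64, 64, 3).astype("float32")   # stand-in for a real photo

# Each call with training=True draws new random transforms, so one original
# image yields many slightly different training examples.
variants = [augment(image, training=True) for _ in range(5)]
print(len(variants), variants[0].shape)    # 5 (1, 64, 64, 3)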
phdykd
Highly Voted 1 year, 3 months ago
Moderator, you should correct the answers, not me! D. You have to add L1/L2 regularization and make use of training data augmentation. Overfitting occurs when a model is too complex for the data it is being trained on and memorizes the training data instead of generalizing to new data. To reduce overfitting, you can use regularization techniques such as L1 or L2 regularization, which add a penalty term to the loss function to discourage the model from learning overly complex representations. Increasing the amount of training data can also help reduce overfitting by giving the model more information to learn from. One common way to increase the amount of training data is to use data augmentation, which involves transforming the existing data in ways that preserve the labels to generate additional training examples.
upvoted 9 times
...
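Editor's note: the "penalty term" described in the comment above can be written out directly. The NumPy sketch below uses made-up weight and loss values purely to show the arithmetic; it is not a real training step.

# L1/L2 penalties added to the data loss (all numbers are hypothetical).
import numpy as np

weights = np.array([0.8, -1.5, 0.3, 2.0])   # hypothetical layer weights
data_loss = 0.42                            # hypothetical cross-entropy on a batch
l1_lambda, l2_lambda = 0.01, 0.001

l1_penalty = l1_lambda * np.sum(np.abs(weights))   # encourages sparse weights
l2_penalty = l2_lambda * np.sum(weights ** 2)      # discourages large weights

total_loss = data_loss + l1_penalty + l2_penalty
print(f"data {data_loss:.3f} + L1 {l1_penalty:.4f} + L2 {l2_penalty:.4f} = {total_loss:.4f}")

Because large weights now increase the loss, gradient descent is pushed toward smaller, simpler weight configurations, which is what reduces overfitting.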
dporwal04
Most Recent 4 months, 4 weeks ago
Selected Answer: D
D is the correct answer
upvoted 2 times
...
rakeshmk
7 months, 1 week ago
Selected Answer: D
The answer is D. Data augmentation really helps to reduce overfitting, and L1/L2 are the most commonly used regularization techniques in neural networks. Dear moderator, please provide the correct answers; many of those given here are misleading, which can cause confusion when taking the exam.
upvoted 1 times
...
endeesa
10 months, 4 weeks ago
Selected Answer: D
The only option that makes sense to me is D. Adding regularisation will reduce overfitting; similarly, adding more data adds more diversity to the training set, allowing the model to generalise better. So the answer is D.
upvoted 1 times
...
bvkr
1 year, 1 month ago
ChatGPT answer: Option D: You have to add L1/L2 regularization, and make use of training data augmentation. When a deep CNN model displays hints of overfitting, it means that the model is too complex and has learned to fit the training data too closely. One way to minimize overfitting is to add regularization to the model, which adds a penalty term to the loss function, encouraging the model to choose simpler solutions. L1/L2 regularization adds a penalty term to the loss function that discourages the model from using large weights in the network. This has the effect of reducing the complexity of the model and can help prevent overfitting. Data augmentation is another effective technique to minimize overfitting. It involves applying random transformations to the training data, such as random rotations or translations, to create new training examples that are similar to the original ones. This helps the model to generalize better to unseen data.
upvoted 2 times
...
Yoshizn
1 year, 2 months ago
Selected Answer: D
Answer is D
upvoted 1 times
...
MansoorDataScientist
1 year, 3 months ago
Steps for reducing overfitting: add more data; use data augmentation; use architectures that generalize well; add regularization (mostly dropout, but L1/L2 regularization is also possible); reduce architecture complexity.
upvoted 2 times
...
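Editor's note: dropout, which the comment above lists as the most common regularizer, is usually added as a layer between existing layers. The Python/Keras sketch below is illustrative only; the 0.5 rate and layer sizes are assumptions, not something specified by the question.

# Sketch: dropout as an alternative/additional regularizer (illustrative sizes).
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    tf.keras.Input(shape=(32, 32, 3)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dropout(0.5),                 # randomly zeroes 50% of activations during training
    layers.Dense(10, activation="softmax"),
])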
Peeking
1 year, 3 months ago
Selected Answer: D
D is definitely the answer.
upvoted 1 times
...
Edriv
1 year, 4 months ago
Why not C?
upvoted 1 times
...
lookaaaa
1 year, 5 months ago
Selected Answer: D
Increase the amount of data and simplify the model (decrease layers or NN units, etc.).
upvoted 2 times
...
zweic
1 year, 7 months ago
Selected Answer: D
I would say D
upvoted 2 times
...
jlopezfelizzola
1 year, 7 months ago
Selected Answer: D
My vote is for D. A & E are discarded because they increase the complexity of the architecture. B and C suggest reducing the amount of data. D will generate more data, allowing the CNN to generalize better.
upvoted 2 times
jlopezfelizzola
1 year, 7 months ago
Source https://towardsdatascience.com/deep-learning-3-more-on-cnns-handling-overfitting-2bd5d99abe5d
upvoted 1 times
...
...
synapse
2 years, 1 month ago
Selected Answer: D
Definitely D
upvoted 1 times
...
dija123
2 years, 4 months ago
Selected Answer: D
I vote for D
upvoted 1 times
...
jed_elhak
2 years, 7 months ago
Answer is D :)
upvoted 4 times
...
Maryam89
2 years, 8 months ago
Answer is D
upvoted 4 times
...