Exam AWS Certified Machine Learning - Specialty topic 1 question 16 discussion

During mini-batch training of a neural network for a classification problem, a Data Scientist notices that training accuracy oscillates.
What is the MOST likely cause of this issue?

  • A. The class distribution in the dataset is imbalanced.
  • B. Dataset shuffling is disabled.
  • C. The batch size is too big.
  • D. The learning rate is very high.
Suggested Answer: D

Comments

gaku1016
Highly Voted 3 years, 7 months ago
Answer is D. Should a weight be increased or decreased so that the error becomes smaller than its current value? To decide, we differentiate the loss and check whether the slope of the tangent is positive or negative, then update the weight in the direction that reduces the error. This operation is repeated over and over to approach the optimal solution. The size of each update step is critical here, and it is determined by the learning rate (a numeric sketch follows this comment).
upvoted 17 times
...
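To make gaku1016's point concrete, here is a minimal sketch (not from the thread): plain 1-D gradient descent on f(w) = w², whose minimum is at w = 0. With a small learning rate the iterates shrink smoothly toward 0; with a learning rate near 1, each update overshoots the minimum and the trajectory flips sign every step, the same oscillation the question describes in training accuracy.

```python
# Minimal sketch: 1-D gradient descent on f(w) = w^2 (minimum at w = 0).
def gradient_descent(lr, steps=8, w=1.0):
    trajectory = [w]
    for _ in range(steps):
        grad = 2 * w           # f'(w) = 2w
        w = w - lr * grad      # update; step size controlled by lr
        trajectory.append(round(w, 4))
    return trajectory

print(gradient_descent(lr=0.1))   # [1.0, 0.8, 0.64, ...]  converges smoothly
print(gradient_descent(lr=0.99))  # [1.0, -0.98, 0.9604, ...]  flips sign each step
```

Each step multiplies w by (1 - 2·lr): that is 0.8 for lr = 0.1, -0.98 for lr = 0.99, and anything with |1 - 2·lr| > 1 (lr > 1 here) diverges outright.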
ozan11
Highly Voted 3 years, 7 months ago
Maybe D?
upvoted 8 times
...
JonSno
Most Recent 2 months, 2 weeks ago
Selected Answer: D
D. The learning rate is very high. Explanation: When the learning rate is too high, the optimization process may overshoot the optimal weights in parameter space. Instead of gradually converging, the model updates weights in a highly unstable manner, causing fluctuations in training accuracy. The network fails to settle into a minimum because the updates are too aggressive.
upvoted 1 times
...
AjoseO
7 months, 1 week ago
Selected Answer: D
A high learning rate can cause oscillations in the training accuracy because the optimizer makes large updates to the model parameters in each iteration, which can cause overshooting the optimal values. This can result in the model oscillating back and forth across the optimal solution.
upvoted 3 times
...
Mickey321
7 months, 1 week ago
Selected Answer: D
If the learning rate is too high, the model weights may overshoot the optimal values and bounce back and forth around the minimum of the loss function. This can cause the training accuracy to oscillate and prevent the model from converging to a stable solution. The training accuracy is the proportion of correct predictions made by the model on the training data.
upvoted 2 times
...
Rejju
7 months, 1 week ago
When the learning rate is set too high, it can lead to oscillations or divergence during training. Here's why:
  • High learning rate: the model's parameters are updated by a large amount in each training step, which can cause the model to overshoot the optimal parameter values and destabilize training.
  • Oscillations: if the learning rate is excessively high, the updates become unstable, and the model bounces back and forth between parameter values instead of converging to an optimal solution.
To address this issue, try reducing the learning rate; it is often necessary to experiment with different rates to find the one that works best for your problem and dataset. Learning rate scheduling, such as reducing the rate over time, can also help stabilize training (a sketch follows this comment).
upvoted 2 times
...
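Rejju's two fixes can be sketched in code. This is a hedged illustration, not from the thread: the Keras model and the synthetic data are assumptions chosen only to show a modest fixed learning rate combined with a ReduceLROnPlateau schedule.

```python
import numpy as np
import tensorflow as tf

# Synthetic stand-in data (hypothetical): 1,000 samples, 20 features, binary labels.
x_train = np.random.rand(1000, 20).astype("float32")
y_train = (x_train.sum(axis=1) > 10.0).astype("int32")

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(20,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Fix 1: start from a modest fixed learning rate instead of a large one.
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
              loss="binary_crossentropy", metrics=["accuracy"])

# Fix 2: halve the learning rate whenever validation loss stops improving.
reduce_lr = tf.keras.callbacks.ReduceLROnPlateau(
    monitor="val_loss", factor=0.5, patience=3, min_lr=1e-5)

model.fit(x_train, y_train, validation_split=0.2,
          epochs=20, callbacks=[reduce_lr], verbose=0)
```

If accuracy still oscillates after these changes, the schedule's patience and factor are the usual knobs to tune.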
CKS1210
1 year, 10 months ago
Answer is D. A high learning rate means that the model parameters are updated by large magnitudes in each iteration. As a result, the optimization process may struggle to converge to the optimal solution, leading to erratic behavior and fluctuations in training accuracy.
upvoted 1 times
...
soonmo
1 year, 11 months ago
Selected Answer: D
If the learning rate is high, the accuracy fluctuates because the loss moves back and forth across the minimum instead of settling into it.
upvoted 1 times
...
Valcilio
2 years, 1 month ago
Selected Answer: D
A learning rate that is too big overshoots the true minimum.
upvoted 2 times
...
Tomatoteacher
2 years, 3 months ago
Selected Answer: D
D. The learning rate is too high; this is a textbook example. A lower learning rate will take more iterations (and longer to train) but will settle into the minimum.
upvoted 1 times
...
Shailendraa
2 years, 7 months ago
Appeared on the 12-Sep exam.
upvoted 1 times
...
Sam1610
2 years, 10 months ago
D: of course.
upvoted 1 times
...
missionml
3 years, 1 month ago
A company sells thousands of products on a public website and wants to automatically identify products with potential durability problems. The company has 1,000 reviews with date, star rating, review text, review summary, and customer email fields, but many reviews are incomplete and have empty fields. Each review has already been labeled with the correct durability result. A machine learning specialist must train a model to identify reviews expressing concerns over product durability. The first model needs to be trained and ready to review in 2 days. What is the MOST direct approach to solve this problem within 2 days?
  • A. Train a custom classifier by using Amazon Comprehend.
  • B. Build a recurrent neural network (RNN) in Amazon SageMaker by using Gluon and Apache MXNet.
  • C. Train a built-in BlazingText model using Word2Vec mode in Amazon SageMaker.
  • D. Use a built-in seq2seq model in Amazon SageMaker.
upvoted 1 times
missionml
3 years, 1 month ago
Is A valid option? (A Comprehend sketch follows this thread.)
upvoted 1 times
...
...
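On missionml's side question about option A: as a hedged sketch only, this is roughly how a custom classifier is started in Amazon Comprehend via boto3. The bucket, CSV file, classifier name, and IAM role ARN below are placeholders, not details from the thread.

```python
import boto3

comprehend = boto3.client("comprehend", region_name="us-east-1")

# Placeholders throughout: labeled reviews are assumed to sit in S3 as
# CSV rows of "label,review text", and the role lets Comprehend read them.
response = comprehend.create_document_classifier(
    DocumentClassifierName="durability-review-classifier",
    DataAccessRoleArn="arn:aws:iam::123456789012:role/ComprehendDataAccess",
    InputDataConfig={"S3Uri": "s3://example-bucket/labeled-reviews.csv"},
    LanguageCode="en",
)
print(response["DocumentClassifierArn"])
```

Training runs asynchronously; the classifier becomes usable once its status reaches TRAINED, which can be polled with describe_document_classifier.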
btsql
3 years, 6 months ago
D is correct. A big batch size can make the optimizer settle into local minima.
upvoted 1 times
...
jeetss1
3 years, 6 months ago
It is a multiple-answer question, and the answer should be both A and D.
upvoted 1 times
...
syu31svc
3 years, 6 months ago
Answer is D, 100%; a learning rate that is too high will cause exactly this behavior.
upvoted 3 times
...
deep_n
3 years, 6 months ago
The answer is D, per the Coursera Deep Learning Specialization (Course 2: Improving Deep Neural Networks).
upvoted 2 times
...