Professional Machine Learning Engineer exam, Topic 1, Question #109

Actual exam question from Google's Professional Machine Learning Engineer exam.

During batch training of a neural network, you notice that there is an oscillation in the loss. How should you adjust your model to ensure that it converges?

  • A. Decrease the size of the training batch.
  • B. Decrease the learning rate hyperparameter.
  • C. Increase the learning rate hyperparameter.
  • D. Increase the size of the training batch.
Suggested Answer: B
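Not part of the exam material, but a minimal sketch of why the suggested answer holds: plain gradient descent on f(x) = x², whose gradient is 2x. With a small step the iterates converge smoothly; with a large step each update overshoots the minimum and the sign of x flips on every iteration (the oscillation seen in the loss); with a step that is too large, the oscillation grows and training diverges.

```python
def descend(lr, x0=1.0, steps=20):
    """Run plain gradient descent on f(x) = x^2 and return the iterates.

    The update is x <- x - lr * f'(x), with f'(x) = 2x, so each step
    multiplies x by (1 - 2*lr). If |1 - 2*lr| > 1 the iterates oscillate
    with growing magnitude; if it is between -1 and 0 they oscillate but
    shrink; if it is between 0 and 1 they converge monotonically.
    """
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] - lr * 2 * xs[-1])
    return xs

smooth = descend(lr=0.1)   # small step: monotonic convergence toward 0
wobble = descend(lr=0.9)   # large step: overshoots, sign flips each iteration
blowup = descend(lr=1.1)   # too large: oscillation grows, loss diverges
```

Decreasing the batch size (options A/D) changes the noise in the gradient estimate, but it does not change this step-size geometry, which is why decreasing the learning rate is the direct fix.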

Comments

hiromi
Highly Voted 1 year, 10 months ago
Selected Answer: B
B. Larger learning rates can reduce training time but may cause the loss to oscillate and miss the optimal parameter values.
upvoted 9 times
...
desertlotus1211
Most Recent 2 months ago
Selected Answer: B
When you observe oscillations in the loss during training, it is often a sign that the learning rate is too high: a high learning rate causes the optimizer to overshoot the minimum of the loss function on each step.
upvoted 1 times
...
fitri001
6 months, 1 week ago
Selected Answer: B
  • A. Decrease batch size: a smaller batch size can sometimes help with convergence, but it also slows training and does not necessarily address the oscillation.
  • C. Increase learning rate: a higher learning rate makes the loss jump around more erratically, worsening the oscillation.
  • D. Increase batch size: a larger batch size gives smoother updates but can make the model less sensitive to local gradients and hinder convergence, especially with an already oscillating loss.
upvoted 1 times
...
Akel123
6 months, 2 weeks ago
Selected Answer: C
I don't understand
upvoted 2 times
...
M25
1 year, 5 months ago
Selected Answer: B
Went with B
upvoted 1 times
...
TNT87
1 year, 7 months ago
Selected Answer: B
Answer B
upvoted 1 times
...
enghabeth
1 year, 8 months ago
Selected Answer: B
A large learning rate results in instability or oscillations, so the first fix is to tune the learning rate by gradually decreasing it. https://towardsdatascience.com/8-common-pitfalls-in-neural-network-training-workarounds-for-them-7d3de51763ad
upvoted 1 times
...
mymy9418
1 year, 10 months ago
Selected Answer: B
https://ai.stackexchange.com/questions/14079/what-could-an-oscillating-training-loss-curve-represent#:~:text=Try%20lowering%20the%20learning%20rate,step%20and%20overshoot%20it%20again.
upvoted 2 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other