During batch training of a neural network, you notice that there is an oscillation in the loss. How should you adjust your model to ensure that it converges?
When you observe oscillations in the loss during training, it is often a sign that the learning rate is too high. A high learning rate can cause the optimizer to overshoot the minimum of the loss function, so the weights repeatedly jump across the valley instead of settling into it.
A. Decrease Batch Size: While a smaller batch size can sometimes help with convergence, it also makes gradient estimates noisier and training slower, and it does not directly address the oscillation.
B. Decrease Learning Rate: Correct. A smaller learning rate shrinks each update step, damping the oscillation and letting the optimizer settle into the minimum.
C. Increase Learning Rate: A higher learning rate makes the loss jump around even more erratically, worsening the oscillation.
D. Increase Batch Size: A larger batch size smooths the gradient estimates, but it does not reduce the step size that is causing the overshoot, so it can make the model less responsive to local gradients and still hinder convergence.
In short, a large learning rate results in instability and oscillations, so the first remedy is to tune the learning rate by gradually decreasing it.
https://towardsdatascience.com/8-common-pitfalls-in-neural-network-training-workarounds-for-them-7d3de51763ad
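As a minimal sketch of how this fix might look in practice (the model, toy data, and hyperparameter values below are illustrative assumptions, not part of the question), here is a Keras training setup that starts from a smaller learning rate and keeps reducing it whenever the loss plateaus:

```python
import numpy as np
import tensorflow as tf

# Toy data purely for illustration.
x = np.random.rand(256, 10).astype("float32")
y = np.random.rand(256, 1).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),
])

# A smaller initial learning rate (e.g. 1e-3 instead of 1e-1) keeps the
# optimizer from overshooting the minimum.
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=1e-3),
              loss="mse")

# Keep decreasing the learning rate whenever the loss stops improving.
reduce_lr = tf.keras.callbacks.ReduceLROnPlateau(
    monitor="loss", factor=0.5, patience=5, min_lr=1e-5)

model.fit(x, y, batch_size=32, epochs=50, callbacks=[reduce_lr], verbose=0)
```

The ReduceLROnPlateau callback automates the "gradually decrease" advice: each time the monitored loss fails to improve for `patience` epochs, the learning rate is multiplied by `factor`, so the step size shrinks until the oscillation dies down.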