All of the following are common optimization techniques in deep learning to determine weights that represent the strength of the connection between artificial neurons EXCEPT:
A. Gradient descent, which initially sets weights to arbitrary values and then adjusts them at each step.
B. Momentum, which improves the convergence speed and stability of neural network training.
C. Autoregression, which analyzes and makes predictions about time-series data.
D. Backpropagation, which starts from the last layer and works backwards.
The correct answer is C. Autoregression, which analyzes and makes predictions about time-series data.
Autoregression is the exception: it is a statistical modeling technique for time-series forecasting, not a method for optimizing the weights of a neural network. Gradient descent, momentum, and backpropagation are all part of the standard weight-optimization pipeline in deep learning.
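To make the contrast concrete, here is a minimal sketch (illustrative only, not from the exam source) of options A and B working together: gradient descent with momentum minimizing a toy loss f(w) = (w - 3)^2, whose gradient is 2(w - 3). The weight starts at an arbitrary value and is adjusted at each step, with a momentum term accumulating past updates.

```python
# Toy loss: f(w) = (w - 3)^2, minimized at w = 3.
def grad(w):
    return 2.0 * (w - 3.0)

w = 0.0          # option A: weight starts at an arbitrary value
velocity = 0.0   # option B: momentum accumulator smooths the updates
lr, beta = 0.1, 0.9

for _ in range(200):
    velocity = beta * velocity - lr * grad(w)  # blend past direction with new gradient
    w += velocity                              # adjust the weight at each step

print(w)  # ends very close to the minimizer w = 3
```

Backpropagation (option D) is how the gradient itself is computed layer by layer in a real network; here the toy gradient is written by hand. Autoregression plays no role in this loop, which is why it is the exception.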