All of the following are common optimization techniques in deep learning to determine weights that represent the strength of the connection between artificial neurons EXCEPT?
A.
Gradient descent, which initially sets the weights to arbitrary values and then iteratively adjusts them at each step to reduce the loss.
B.
Momentum, which improves the convergence speed and stability of neural network training.
C.
Autoregression, which analyzes and makes predictions about time-series data.
D.
Backpropagation, which computes gradients starting from the last layer and working backwards.
Autoregression is not a common optimization technique for determining the weights of artificial neurons in deep learning. Common techniques include gradient descent, momentum, and backpropagation. Autoregression is associated with time-series analysis and forecasting rather than with neural network weight optimization. Reference: AIGP BODY OF KNOWLEDGE, which discusses common optimization techniques used in deep learning.
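To make the distinction concrete, here is a minimal sketch of gradient descent with momentum on a toy one-dimensional problem, minimizing (w - 3)^2. All names and hyperparameter values are illustrative, not from the question or the AIGP material.

```python
def grad(w):
    # Derivative of the toy loss f(w) = (w - 3)^2
    return 2.0 * (w - 3.0)

w = 0.0          # weight starts at an arbitrary value
velocity = 0.0   # momentum accumulator smooths the updates
lr, beta = 0.1, 0.9  # illustrative learning rate and momentum factor

for _ in range(500):
    velocity = beta * velocity - lr * grad(w)  # blend past direction with new gradient
    w += velocity                              # step the weight

print(round(w, 3))  # → 3.0, the minimum of the toy loss
```

In a real network, backpropagation supplies the gradient for every weight (computed layer by layer from the output backwards), and an optimizer like the one above applies the update; autoregression plays no role in that loop.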