Backpropagation (short for “backward propagation of errors”) is the fundamental algorithm for training neural networks. It computes the error at the output layer and propagates it backward through the network, layer by layer, to update weights and biases via gradient descent.
Why the other options are incorrect:
A: Convolutions are feature-extraction operations specific to CNNs; they are not what gets propagated backward during training.
B: Accuracy is an evaluation metric, not used in weight updates.
C: Nodes are structural elements, not passed backward.
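The mechanism described above can be sketched in a few lines of plain Python. This is an illustrative toy example (not from the study guide): a network with one input, one hidden unit, and one output, trained on a single hypothetical example with a squared-error loss. The initial weights, learning rate, and training pair are arbitrary choices for demonstration.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical initial weights/biases and a single training example
w1, b1 = 0.5, 0.0    # input -> hidden
w2, b2 = -0.3, 0.0   # hidden -> output
x, y = 1.0, 1.0      # input and target
lr = 0.5             # learning rate

# Prediction before training, for comparison
o_init = sigmoid(w2 * sigmoid(w1 * x + b1) + b2)

for _ in range(1000):
    # Forward pass: compute activations layer by layer
    h = sigmoid(w1 * x + b1)
    o = sigmoid(w2 * h + b2)
    # Backward pass: propagate the output error toward the input
    # using the chain rule (loss = 0.5 * (o - y) ** 2)
    d_o = (o - y) * o * (1 - o)     # error signal at the output unit
    d_h = d_o * w2 * h * (1 - h)    # error propagated back to the hidden unit
    # Gradient-descent updates for each weight and bias
    w2 -= lr * d_o * h
    b2 -= lr * d_o
    w1 -= lr * d_h * x
    b1 -= lr * d_h

# Prediction after training moves toward the target y
o_final = sigmoid(w2 * sigmoid(w1 * x + b1) + b2)
```

After training, `o_final` is substantially closer to the target than `o_init`, showing the error signal (not convolutions, accuracy, or nodes) driving the weight updates.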
Official References:
CompTIA DataX (DY0-001) Official Study Guide – Section 4.3: “Backpropagation passes the error backward from the output layer to the input layer to adjust weights using gradient-based optimization.”
Deep Learning Textbook, Chapter 6: “The backpropagation algorithm is essential for computing gradients of the loss function with respect to each weight.”
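The gradient computation quoted above is the standard chain-rule recursion; in one common notation (with $W^{(l)}$ the weights, $a^{(l)}$ the activations, $z^{(l)}$ the pre-activations, and $\sigma$ the activation function), it can be written as:

$$
\delta^{(L)} = \nabla_{a} L \odot \sigma'\!\left(z^{(L)}\right), \qquad
\delta^{(l)} = \left(\left(W^{(l+1)}\right)^{\top} \delta^{(l+1)}\right) \odot \sigma'\!\left(z^{(l)}\right), \qquad
\frac{\partial L}{\partial w^{(l)}_{jk}} = a^{(l-1)}_{k}\, \delta^{(l)}_{j}
$$

The error term $\delta$ is computed at the output layer and passed backward through each layer's weights, which is exactly the “backward propagation of errors” the answer describes.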