
Backward propagation of errors (Back propagation)

Jan 20, 2024 · 1 min read

  • programming


This is the specific implementation of gradient descent applied to neural networks. The calculation has two stages:

  • Forward pass, where you evaluate the neural network on the training data and measure the error.
  • Backward propagation of that error, where you apply the chain rule for differentiation to each of the perceptrons.

For this to work, every perceptron needs a differentiable activation function.
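
A minimal sketch of the two stages for a single sigmoid perceptron; the squared-error loss, learning rate, and training example here are illustrative assumptions, not anything from this note:

```python
import math

def sigmoid(z):
    # Differentiable activation, so the chain rule can be applied.
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

# Hypothetical training example and initial parameters.
x, target = 0.5, 1.0
w, b = 0.1, 0.0
lr = 0.5  # learning rate

for step in range(100):
    # Forward pass: evaluate the perceptron and measure the error.
    z = w * x + b
    y = sigmoid(z)
    loss = 0.5 * (y - target) ** 2

    # Backward pass: chain rule, dL/dw = dL/dy * dy/dz * dz/dw.
    dL_dy = y - target
    dy_dz = sigmoid_prime(z)
    dL_dw = dL_dy * dy_dz * x
    dL_db = dL_dy * dy_dz * 1.0

    # Gradient descent step on the propagated gradients.
    w -= lr * dL_dw
    b -= lr * dL_db

print(f"loss={loss:.4f}, w={w:.3f}, b={b:.3f}")
```

In a multi-layer network the same chain-rule product is extended through each layer, reusing the gradients already computed for the layer above.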


