Backpropagation

#backpropagation #need #algorithm #error

Akash Deep Dec 14 2021 · 1 min read

What is backpropagation? How does it work? Why do we need it?

  • The backpropagation algorithm searches for the minimum of the error function in weight space, using a technique called the delta rule, or gradient descent.
  • The weights that minimize the error function are then considered a solution to the learning problem.
  • We need backpropagation to carry out the following steps (a minimal code sketch of this loop follows the list):

    Calculate the error – measure how far the model's output is from the actual (target) output.

    Check for minimum error – check whether the error has been minimized (for example, it is small enough or has stopped decreasing).

    Update the parameters – if the error is still large, update the parameters (weights and biases) in the direction that reduces the error, then check the error again.

    Repeat the process until the error reaches a minimum.

    Model is ready to make predictions – once the error is minimized, you can feed inputs to the model and it will produce outputs.
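
Here is a minimal sketch of that loop, not taken from the original post: a single-parameter model trained with gradient descent on a toy dataset. The dataset, learning rate, and stopping threshold are all assumptions chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: learn y = 2*x + 1 (assumed here purely for illustration).
X = rng.uniform(-1, 1, size=(100, 1))
y = 2 * X + 1

# Parameters (weight and bias) start at random values.
w = rng.normal()
b = rng.normal()
lr = 0.1  # learning rate

for step in range(1000):
    # Forward pass: compute the model's output.
    y_pred = w * X + b

    # Calculate the error: mean squared difference from the actual output.
    error = np.mean((y_pred - y) ** 2)

    # Check whether the error is minimized; stop once it is small enough.
    if error < 1e-6:
        break

    # Backpropagate: gradients of the error with respect to w and b.
    grad_w = np.mean(2 * (y_pred - y) * X)
    grad_b = np.mean(2 * (y_pred - y))

    # Update the parameters in the direction that reduces the error.
    w -= lr * grad_w
    b -= lr * grad_b

# Model is ready to make predictions once the loop exits.
print(f"learned w={w:.3f}, b={b:.3f}, error={error:.6f}")
```

For a multi-layer network the same loop applies; backpropagation simply uses the chain rule to carry the gradient backward through each layer before the update step.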
