Backpropagation Algorithm in Neural Networks

Hritika Agarwal
Jul 22, 2020

--

“Backpropagation” is neural-network terminology for minimizing our cost function, just as we did with gradient descent in logistic and linear regression. Our goal is to find an optimal set of parameters Θ that minimizes the cost function J(Θ).
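As a refresher on the gradient-descent idea the article builds on, here is a minimal sketch of minimizing the linear-regression cost J(θ) with the update θ := θ − α·∂J/∂θ. The toy data, learning rate, and iteration count are illustrative assumptions, not from the article:

```python
import numpy as np

# Toy data lying exactly on y = 1 + 2x (an assumed example)
X = np.array([[1.0, x] for x in [0, 1, 2, 3]])  # design matrix with bias column
y = np.array([1.0, 3.0, 5.0, 7.0])

theta = np.zeros(2)   # parameters to learn
alpha = 0.1           # learning rate (assumed)
m = len(y)

for _ in range(2000):
    # Gradient of J(theta) = (1/2m) * sum((X @ theta - y)^2)
    grad = X.T @ (X @ theta - y) / m
    theta -= alpha * grad  # gradient-descent step

print(np.round(theta, 2))  # → [1. 2.]
```

Backpropagation applies this same update rule to every weight in a neural network; the algorithm's job is computing the gradients efficiently.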

To do so, we use the following algorithm:

Let’s have a look at how it’s done:
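The algorithm can be sketched as a forward pass (compute activations), a backward pass (propagate the error term δ from the output layer back), and a gradient-descent update on each parameter matrix. The network below is a minimal illustrative example, with an assumed architecture (one hidden layer, sigmoid activations, squared error on the XOR toy problem), not the article's own notation:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Toy XOR dataset (an assumed example)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

Theta1 = rng.normal(size=(2, 4))  # input -> hidden weights
b1 = np.zeros((1, 4))
Theta2 = rng.normal(size=(4, 1))  # hidden -> output weights
b2 = np.zeros((1, 1))

alpha = 1.0  # learning rate (assumed)
losses = []
for _ in range(5000):
    # 1. Forward pass: compute activations layer by layer
    a1 = sigmoid(X @ Theta1 + b1)
    a2 = sigmoid(a1 @ Theta2 + b2)
    losses.append(np.mean((a2 - y) ** 2))

    # 2. Backward pass: propagate the error term (delta)
    #    from the output layer back through the hidden layer
    delta2 = (a2 - y) * a2 * (1 - a2)
    delta1 = (delta2 @ Theta2.T) * a1 * (1 - a1)

    # 3. Gradient-descent update on every parameter
    Theta2 -= alpha * a1.T @ delta2
    b2 -= alpha * delta2.sum(axis=0, keepdims=True)
    Theta1 -= alpha * X.T @ delta1
    b1 -= alpha * delta1.sum(axis=0, keepdims=True)

print(losses[0], losses[-1])  # loss should drop substantially
```

Note how `delta1` reuses `delta2`: this reuse of downstream error terms is what makes backpropagation efficient compared with differentiating each weight independently.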

Disclaimer: This series is based on notes I created for myself from various books and videos I’ve read or watched, so some of the text may be an exact quote from a book or video out there.

--
