I am implementing neural network code in C++, so I don't have access to automatic differentiation.

I can calculate gradients/backpropagation for (recurrent and) feedforward neural networks easily, but I have trouble deriving the same formulas for residual neural networks (ResNets).

Could somebody point me to an online resource that shows a step-by-step derivation of the gradient/backpropagation for residual networks (ResNet)?

Here is a link to a PDF that contains my calculations for neural network gradients (including an attempt at the skip connections):

https://github.com/cslr/dinrhiw2-private/blob/RBM_test/docs/neural_network_gradient_residual.pdf
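For context, the core fact being asked about is that for a residual block y = F(x) + x, the chain rule gives dL/dx = (dF/dx)^T dL/dy + dL/dy, i.e. the identity shortcut simply adds the incoming gradient to whatever the plain backward pass produces. A minimal sketch in C++ (all names here — ResidualBlock, forward, backward — are illustrative, not from the linked PDF; it assumes square weight matrices so the skip needs no projection):

```cpp
#include <vector>
#include <cassert>
#include <cmath>
#include <algorithm>

using Vec = std::vector<double>;
using Mat = std::vector<std::vector<double>>;

// y = W * x
Vec matvec(const Mat& W, const Vec& x) {
    Vec y(W.size(), 0.0);
    for (size_t i = 0; i < W.size(); ++i)
        for (size_t j = 0; j < x.size(); ++j)
            y[i] += W[i][j] * x[j];
    return y;
}

// y = W^T * g (used to propagate gradients backwards through a layer)
Vec matTvec(const Mat& W, const Vec& g) {
    Vec y(W[0].size(), 0.0);
    for (size_t i = 0; i < W.size(); ++i)
        for (size_t j = 0; j < W[0].size(); ++j)
            y[j] += W[i][j] * g[i];
    return y;
}

// Hypothetical two-layer residual block: y = W2 * relu(W1 * x) + x.
struct ResidualBlock {
    Mat W1, W2;
    Vec z;  // pre-activation cached by forward(), needed in backward()

    Vec forward(const Vec& x) {
        z = matvec(W1, x);
        Vec h = z;
        for (auto& v : h) v = std::max(0.0, v);       // ReLU
        Vec y = matvec(W2, h);
        for (size_t i = 0; i < y.size(); ++i)
            y[i] += x[i];                              // skip connection
        return y;
    }

    // Given dL/dy, return dL/dx. The ONLY change vs. a plain two-layer
    // block is the final "+ dy[i]" from the identity path.
    Vec backward(const Vec& dy) {
        Vec dh = matTvec(W2, dy);
        for (size_t i = 0; i < dh.size(); ++i)
            if (z[i] <= 0.0) dh[i] = 0.0;              // ReLU derivative
        Vec dx = matTvec(W1, dh);
        for (size_t i = 0; i < dx.size(); ++i)
            dx[i] += dy[i];                            // identity shortcut gradient
        return dx;
    }
};
```

A finite-difference check (perturb each input component, recompute the loss, and compare the numerical slope against backward()'s output) is a practical way to verify hand-derived formulas like these without autodiff.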
