In backpropagation neural network training using gradient descent, the derivatives of the error with respect to the weights are computed by the backpropagation algorithm. Is this computed value the exact gradient, or is it only an approximation of the gradient of the error with respect to the weights for each input?
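To make the distinction concrete, here is a minimal sketch (the tiny one-neuron network, the names `w`, `b`, `x`, `t`, and the squared-error loss are all illustrative assumptions, not from the question): backprop applies the chain rule symbolically, while finite differences give a genuinely approximate gradient one can check it against.

```python
import numpy as np

# Tiny network: y = sigmoid(w*x + b) with squared-error loss.
# All parameter names and values here are illustrative assumptions.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(w, b, x, t):
    y = sigmoid(w * x + b)
    return 0.5 * (y - t) ** 2

def backprop_grad(w, b, x, t):
    # Analytic gradient via the chain rule -- this is what backprop computes.
    y = sigmoid(w * x + b)
    dz = (y - t) * y * (1.0 - y)   # dL/dz, where z = w*x + b
    return dz * x, dz              # (dL/dw, dL/db)

def numeric_grad(w, b, x, t, eps=1e-6):
    # Central finite differences -- an *approximation*, useful only as a check.
    dw = (loss(w + eps, b, x, t) - loss(w - eps, b, x, t)) / (2 * eps)
    db = (loss(w, b + eps, x, t) - loss(w, b - eps, x, t)) / (2 * eps)
    return dw, db

w, b, x, t = 0.5, -0.3, 1.2, 1.0
gw, gb = backprop_grad(w, b, x, t)
nw, nb = numeric_grad(w, b, x, t)
print(abs(gw - nw), abs(gb - nb))  # differences are tiny, limited by eps
```

The analytic and numeric gradients agree to many decimal places, which is exactly the point: backprop's value is exact (up to floating-point rounding) for the given input, whereas the finite-difference value carries an O(eps^2) truncation error.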