While learning deep learning with a regression objective (Euclidean loss), I stumbled on a question: back-propagation minimizes the cost via gradient descent, and regression minimizes the cost of the prediction, so are they basically the same thing? Can anyone help me understand the difference?
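
For concreteness, here is a minimal sketch of what I mean (a toy one-parameter linear model I wrote myself, not code from any particular course): the Euclidean/MSE loss defines the cost, back-propagation (the chain rule) computes its gradients, and gradient descent applies the parameter updates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 3x + 1 plus a little noise
X = rng.normal(size=100)
y = 3.0 * X + 1.0 + 0.1 * rng.normal(size=100)

w, b = 0.0, 0.0   # model parameters
lr = 0.1          # learning rate

for step in range(200):
    # Forward pass: prediction and Euclidean (MSE) loss
    y_hat = w * X + b
    loss = np.mean((y_hat - y) ** 2)

    # Backward pass ("back-propagation"): chain rule gives dLoss/dw, dLoss/db
    grad_y_hat = 2.0 * (y_hat - y) / len(y)
    grad_w = np.sum(grad_y_hat * X)
    grad_b = np.sum(grad_y_hat)

    # Gradient descent: move the parameters against the gradient
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # should approach 3 and 1
```

In this picture, is back-propagation just the gradient-computation half of gradient descent, while "regression" only names the loss being minimized?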
