The cost function J of Linear Regression is the Mean Squared Error (MSE) between the predicted value h(x) and the true value y:

J = (1/2m) Sum[ (h(x) - y)^2 ]    ...(1)
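
For concreteness, here is a minimal sketch of equation (1) in Python, assuming h and y are NumPy arrays of predictions and true values (the function name and variables are illustrative, not from the original question):

```python
import numpy as np

def linear_cost(h, y):
    """Equation (1): J = (1/2m) * sum((h(x) - y)^2)."""
    m = len(y)  # number of training examples
    return np.sum((h - y) ** 2) / (2 * m)
```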

So, how can one interpret the Logistic Regression cost function below in the same way?

Logistic Regression Cost Function

J = (1/m) Sum[ Cost(h(x), y) ]

J = -(1/m) Sum[ y*log(h(x)) + (1-y)*log(1-h(x)) ]    ...(2)
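
As a rough sketch, equation (2) could be computed as follows, assuming h holds sigmoid outputs in (0, 1) and y holds 0/1 labels (again, the names are illustrative):

```python
import numpy as np

def logistic_cost(h, y):
    """Equation (2): J = -(1/m) * sum(y*log(h) + (1-y)*log(1-h))."""
    m = len(y)  # number of training examples
    return -np.sum(y * np.log(h) + (1 - y) * np.log(1 - h)) / m
```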

In equation (1) the term in the square brackets is simply the difference (the error), but in equation (2) the term in the square brackets is confusing. Is it also an error term?
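
To make the comparison concrete, the following sketch evaluates the per-example term from each equation for a single true label y = 1 and a few predictions h(x); the values are purely illustrative:

```python
import numpy as np

# Per-example terms being compared:
#   equation (1): squared difference (h - y)^2
#   equation (2): -[y*log(h) + (1-y)*log(1-h)]
y = 1.0
for h in (0.9, 0.5, 0.1):
    squared_error = (h - y) ** 2
    log_loss = -(y * np.log(h) + (1 - y) * np.log(1 - h))
    print(f"h(x)={h:.1f}  squared error={squared_error:.3f}  log loss={log_loss:.3f}")
```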
