I am writing a nonlinear data regression program, and the approach I am using is simple. After calculating estimated values for the variables (I'm using orthogonal distance regression, so there is error in both variables), I calculate the sum of squared errors as follows:

SSQ = (Xcalc - Xexp)^2 + (Ycalc - Yexp)^2

Let's suppose that I have one adjustable parameter "a". I can calculate the derivative of SSQ with respect to "a" by adding a small delta to "a", re-evaluating Xcalc and Ycalc, and then SSQ. This numerical (forward-difference) derivative is (SSQ(a + delta) - SSQ(a)) / delta.
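
For concreteness, here is a minimal Python sketch of that forward-difference derivative; the model function, the parameter value, and the experimental point are hypothetical placeholders, not my actual code:

    import numpy as np

    # Hypothetical model: given parameter a, return the calculated point.
    # As a placeholder we use the curve y = a * x**2 and simply take
    # x_calc = x_exp; a real ODR step would solve for the foot point.
    def model(a, x_exp, y_exp):
        x_calc = x_exp
        y_calc = a * x_calc**2
        return x_calc, y_calc

    def ssq(a, x_exp, y_exp):
        x_calc, y_calc = model(a, x_exp, y_exp)
        return (x_calc - x_exp)**2 + (y_calc - y_exp)**2

    def dssq_da_numerical(a, x_exp, y_exp, delta=1e-6):
        # forward difference: (SSQ(a + delta) - SSQ(a)) / delta
        return (ssq(a + delta, x_exp, y_exp) - ssq(a, x_exp, y_exp)) / delta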

For stability purposes (and since I am not converging yet), I want to replace this numerical derivative with an analytical one. The analytical derivative is

dSSQ/da = 2*(Xcalc - Xexp)*dXcalc/da + 2*(Ycalc - Yexp)*dYcalc/da
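
A matching sketch of the analytical (chain-rule) derivative for the same hypothetical toy model above, where dXcalc/da = 0 and dYcalc/da = Xcalc^2; in my real program these partials would come from the actual model equations:

    def dssq_da_analytical(a, x_exp, y_exp):
        x_calc, y_calc = model(a, x_exp, y_exp)
        dxcalc_da = 0.0          # placeholder partials for the toy model
        dycalc_da = x_calc**2
        return (2.0 * (x_calc - x_exp) * dxcalc_da
                + 2.0 * (y_calc - y_exp) * dycalc_da)

    # quick comparison on one hypothetical point
    a, x_exp, y_exp = 1.5, 2.0, 5.0
    print(dssq_da_numerical(a, x_exp, y_exp))   # forward difference
    print(dssq_da_analytical(a, x_exp, y_exp))  # chain-rule expression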

Can anyone explain to me why the analytical derivative is twice that of the numerical derivative?

-Thanks in advance.
