In adsorption work, a reviewer suggested that I not use linearized models, since in the age of computers it is possible to perform more precise nonlinear regression.
In nonlinear regression, the model equation is nonlinear with respect to the model parameters. For example, consider a weighted sum of exponential decays, modelled as a_i*exp(-b_i*x) terms: here both the linear weights a_i and the decay rates b_i are model parameters, and the b_i enter the model nonlinearly.
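As a hedged sketch of the example above (the parameter values and the use of scipy are my own illustration, not from the original post), fitting a sum of two exponential decays requires an iterative nonlinear least-squares routine:

```python
import numpy as np
from scipy.optimize import curve_fit

# Model that is nonlinear in its parameters: a1*exp(-b1*x) + a2*exp(-b2*x).
# The a_i enter linearly, but the b_i do not, so ordinary linear least
# squares cannot estimate all four parameters at once.
def double_exp(x, a1, b1, a2, b2):
    return a1 * np.exp(-b1 * x) + a2 * np.exp(-b2 * x)

x = np.linspace(0, 5, 50)
y = double_exp(x, 3.0, 1.5, 1.0, 0.2)  # noiseless synthetic data

# Starting guesses are required because the problem is nonconvex.
popt, pcov = curve_fit(double_exp, x, y, p0=[2.0, 1.0, 1.0, 0.1])
```

With noiseless data and starting values near the truth, the fit recovers the generating parameters.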
I would suggest that you look into the chapter on nonlinear regression in the book Applied Regression Analysis by Draper and Smith. These models are usually solved iteratively, with a linearization at each step and a pre-chosen threshold for convergence.
These methods are implemented in most commercial statistical analysis packages.
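The iterate-with-linearization idea can be sketched for a one-parameter model (this toy example, with model y = exp(-b*x) and synthetic data, is my own illustration of the Gauss-Newton scheme Draper and Smith describe):

```python
import math

# Gauss-Newton sketch for one parameter, model y = exp(-b*x): at each
# step the model is linearized around the current estimate and a linear
# least-squares correction is applied, until the step size falls below
# a pre-chosen convergence threshold.
xs = [0.5 * i for i in range(10)]
ys = [math.exp(-0.7 * x) for x in xs]  # synthetic data, true b = 0.7

b = 0.1  # initial estimate
for _ in range(100):
    resid = [y - math.exp(-b * x) for x, y in zip(xs, ys)]
    jac = [-x * math.exp(-b * x) for x in xs]   # derivative df/db
    # One-parameter Gauss-Newton step: (J.r) / (J.J)
    step = sum(j * r for j, r in zip(jac, resid)) / sum(j * j for j in jac)
    b += step
    if abs(step) < 1e-10:  # pre-chosen convergence threshold
        break
```

Each pass solves a small linear least-squares problem for the correction, which is exactly the "linearization at each step" mentioned above.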
Yes. Nonlinear regression usually works out well, but it can be a little tricky. You might look up examples for PROC NLIN in SAS or the nls() function in R. (The nlmrt package in R may work better for difficult models.)
One issue is that usually you will need to supply starting values for the parameters you wish to estimate. This can sometimes be difficult. Some software will allow you to ask it to try a range of starting values.
Another issue is that the software may converge on a local minimum rather than the true or best solution. This can occur if your starting values for the parameters are too far off, or if the data set doesn't clearly suggest the model you are trying to fit.
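The "try a range of starting values" advice can be sketched as a simple multi-start loop (the model, the grid of starting values, and the use of scipy are my own illustrative assumptions):

```python
import numpy as np
from scipy.optimize import curve_fit

# Multi-start sketch: a single poor starting guess can leave the
# optimizer in a local minimum or fail to converge, so try several
# starts and keep the fit with the smallest sum of squared residuals.
def model(x, a, b):
    return a * np.exp(-b * x)

x = np.linspace(0, 4, 40)
y = model(x, 2.0, 1.3)  # synthetic data, true (a, b) = (2.0, 1.3)

best = None
for b0 in (0.01, 0.1, 1.0, 10.0):       # grid of starting values for b
    try:
        popt, _ = curve_fit(model, x, y, p0=[1.0, b0], maxfev=2000)
    except RuntimeError:                # raised when a start fails to converge
        continue
    ssr = np.sum((y - model(x, *popt)) ** 2)
    if best is None or ssr < best[0]:
        best = (ssr, popt)
```

Keeping the lowest-SSR fit guards against the local-minimum problem described above, at the cost of a few extra fits.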
Autoregressive moving average (ARMA) modelling is nonlinear because the unobserved error (white noise) terms enter the model. Estimation must therefore be iterative, starting from an initial estimate. With most software the choice of initial estimate is built in, and that choice determines the path of the estimation process. Hence not all nonlinear systems can be solved by translation into an approximate linear one.
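To make the point concrete (this is my own sketch, not the poster's method): because the innovations e_t are unobserved, the ARMA(1,1) sum of squares is a nonlinear function of the coefficients and must be minimized iteratively from an initial estimate, here via conditional least squares:

```python
import numpy as np
from scipy.optimize import minimize

# Simulate an ARMA(1,1) series: y_t = phi*y_{t-1} + e_t + theta*e_{t-1}.
rng = np.random.default_rng(0)
phi_true, theta_true = 0.6, 0.3
e = rng.standard_normal(500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = phi_true * y[t - 1] + e[t] + theta_true * e[t - 1]

def css(params):
    # Conditional sum of squares: the residuals must be reconstructed
    # recursively because the true errors are unobserved, which is what
    # makes the problem nonlinear in (phi, theta).
    phi, theta = params
    resid = np.zeros_like(y)
    for t in range(1, len(y)):
        resid[t] = y[t] - phi * y[t - 1] - theta * resid[t - 1]
    return np.sum(resid[1:] ** 2)

# Iterative minimization from an initial estimate (0, 0).
res = minimize(css, x0=[0.0, 0.0], method="Nelder-Mead")
phi_hat, theta_hat = res.x
```

With 500 observations the conditional-least-squares estimates land close to the generating coefficients.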
In my experience, Genetic Programming (GP) is a very effective method for developing nonlinear models: GP can discover nonlinear model structures with higher accuracy than a fixed-form least-squares fitting approach.
Lots of dedicated data analysis software products allow you to define your own functions for use with their nonlinear regression routines...I'll mention Igor Pro from WaveMetrics (my favorite) and Origin.
You can even do this for any function with Excel, though it can be a little cumbersome. You first guess the necessary coefficients. You then calculate the function value from these guesses for each value of the independent variable in your data set. You then subtract this prediction from the experimental measurement and square the result. You do this for all of your data. Then you sum up these "difference squared" values and use the Excel Solver tool to minimize the sum by changing the coefficients you guessed. It's a little time consuming but not very difficult. You can even extend this to regressions with more than one independent variable.
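The spreadsheet procedure above can be sketched in a few lines of Python (the model and data are my own synthetic example; scipy's minimizer stands in for the Solver tool):

```python
import numpy as np
from scipy.optimize import minimize

# Mirror of the Excel workflow: compute predictions from guessed
# coefficients, square the differences from the measurements, sum them,
# and let an optimizer (the Solver analogue) adjust the coefficients.
x = np.linspace(0, 3, 30)
y = 2.5 * np.exp(-0.8 * x)          # "experimental" measurements (synthetic)

def sum_of_squares(coeffs):
    a, b = coeffs                   # the coefficients you would guess in Excel
    prediction = a * np.exp(-b * x)
    return np.sum((y - prediction) ** 2)

# Solver's "minimize by changing cells", starting from the guesses (1, 1).
result = minimize(sum_of_squares, x0=[1.0, 1.0])
a_fit, b_fit = result.x
```

Adding more independent variables just means the prediction line uses more columns, exactly as in the spreadsheet.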
Of course, if you need statistics related to the quality of the regression, then you might be better off with one of the commercially available data analysis products.
I. About linearized versus nonlinear least-squares regression:
A linearized correlation and the corresponding linearized plot are often quite convenient for qualitatively revealing scatter around the trendline and for emphasizing the major effects and the physical meaning of the correlation parameters. Parameter estimates obtained by least squares from the linearized correlation (transformed from the original nonlinear correlation) can then be refined by iterative nonlinear least-squares regression, to find unbiased least-squares estimates for the original correlation. It may be advisable to compare both the linearized and the nonlinear correlations.
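The two-step approach described above can be sketched for an exponential decay (the model y = a*exp(-b*x) and the parameter values are my own illustrative assumptions):

```python
import numpy as np
from scipy.optimize import curve_fit

# Step 1: linearized fit.  Taking logs of y = a*exp(-b*x) gives
# ln y = ln a - b*x, a straight line whose slope and intercept have a
# direct physical reading, fitted in closed form.
x = np.linspace(0.1, 4, 30)
y = 3.2 * np.exp(-0.9 * x)                  # synthetic data, (a, b) = (3.2, 0.9)
slope, intercept = np.polyfit(x, np.log(y), 1)
a0, b0 = np.exp(intercept), -slope          # back-transformed estimates

# Step 2: iterative nonlinear least-squares refinement on the original
# correlation, seeded by the linearized estimates.
popt, _ = curve_fit(lambda x, a, b: a * np.exp(-b * x), x, y, p0=[a0, b0])
```

The linearized step supplies good starting values and an interpretable plot; the nonlinear step removes the bias that the log transform introduces when the noise is additive on y.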
The following RG discussions may be of some interest concerning your query:
You can use a higher-order polynomial; the regression curve is then nonlinear. You can search among the shared MATLAB codes; they are free and work well.
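For what it's worth, here is a minimal polynomial-fit sketch (my own example data). One caveat worth noting: although the fitted curve is nonlinear in x, a polynomial model is still linear in its coefficients, so it can be solved in closed form without iteration or starting values:

```python
import numpy as np

# Cubic fit: y = c3*x^3 + c2*x^2 + c1*x + c0.  Linear in the c_i, so
# polyfit solves it directly by linear least squares.
x = np.linspace(-2, 2, 25)
y = 1.0 + 2.0 * x - 0.5 * x**2 + 0.25 * x**3   # synthetic cubic data

coeffs = np.polyfit(x, y, deg=3)   # returned highest power first
```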
The recursive least squares (RLS) algorithm allows real-time, dynamic application of least-squares (LS) regression to a time series of continuously acquired, time-stamped data points. As with LS, there may be several correlation equations with the corresponding set of dependent (observed) variables. In RLS with forgetting factor (RLS-FF), acquired data are weighted according to their age, with increased weight given to the most recent data. This is often convenient for adaptive control and/or real-time optimization.
Application example: while investigating adaptive control and energetic optimization of aerobic fermenters, I applied the RLS-FF algorithm to estimate the parameters of the KLa correlation used to predict O2 gas-liquid mass transfer. Estimates were improved by imposing sinusoidal disturbances on the air flow and agitation speed (the manipulated variables). Simulations assessed the effects of numerically generated white Gaussian noise (2-sigma truncated) and of first-order delay:
Thesis Controlo do Oxigénio Dissolvido em Fermentadores para Minimi...
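As a rough illustration of the RLS-FF update described above (a hypothetical two-parameter example of my own, not the thesis code), each new sample corrects the parameter estimate with older data down-weighted by the forgetting factor:

```python
import numpy as np

# RLS with forgetting factor for a linear-in-parameters correlation
# y = theta1*u1 + theta2*u2, updated one sample at a time.
rng = np.random.default_rng(1)
theta_true = np.array([2.0, -1.0])
lam = 0.98                          # forgetting factor (1.0 = ordinary RLS)

theta = np.zeros(2)                 # initial parameter estimate
P = np.eye(2) * 1000.0              # initial covariance (large = low confidence)
for _ in range(300):
    u = rng.standard_normal(2)      # regressor vector for this sample
    y = theta_true @ u              # noiseless observation
    k = P @ u / (lam + u @ P @ u)   # gain vector
    theta = theta + k * (y - u @ theta)
    P = (P - np.outer(k, u) @ P) / lam   # down-weight old data by lam
```

With lam below 1 the estimator keeps tracking if theta_true drifts over time, which is what makes it suitable for adaptive control.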
The nonlinear method is most suitable, as the errors involved are minimal. Unlike the linear approach, where the best fit is not guaranteed because you have no control over it, the nonlinear fit will calculate the constants for you. Try Origin's nonlinear fitting.