When the selected dataset does not fulfill the requirements for applying linear regression, what other suitable methods are there to fit a (linear) model, rather than turning to non-linear regression?
Andersen, R. (2009). Nonparametric methods for modeling nonlinearity in regression analysis. Annual Review of Sociology, 35(1), 67-85. doi:10.1146/annurev.soc.34.040507.134631
Dore, B., & Bolger, N. (2018). Population- and individual-level changes in life satisfaction surrounding major life stressors. Social Psychological and Personality Science, 9(7), 875-884.
Jones, K., & Almond, S. (1992). Moving out of the linear rut: The possibilities of generalized additive models. Transactions of the Institute of British Geographers, 434-447.
@Sudheeraka You haven't mentioned exactly which assumptions of linear regression are violated by your dataset. If the issue is normality, then you can go for the PLS approach in SmartPLS, as it doesn't require normality of the data. If your data violate the assumption of linearity, then you should think over what @Holger has suggested. Wish you luck with your work 👍🏼
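For the PLS option, here is a minimal sketch using scikit-learn's PLSRegression on toy data. Note this is PLS regression, not the full PLS-SEM workflow that SmartPLS implements, so treat it only as an illustration; the number of components and the simulated data are assumptions.

```python
# Minimal sketch (not the SmartPLS workflow): partial least squares regression
# with scikit-learn on toy data. SmartPLS implements PLS-SEM, a related but
# distinct technique; this only illustrates PLS regression.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))          # toy predictors
y = X @ np.array([1.0, 0.5, 0.0, -0.3, 0.2]) + rng.normal(scale=0.5, size=100)

pls = PLSRegression(n_components=2)    # number of latent components is a modelling choice
pls.fit(X, y)
print(pls.score(X, y))                 # R^2 on the training data
```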
You may want to consider harmonic analysis. Although harmonic analysis is commonly applied to time series, it can be applied along any other abscissa axis, regardless of its units. Prior interpolation may be required.
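A minimal sketch of what a harmonic (Fourier-term) fit could look like, using plain NumPy least squares; the assumed period, the number of harmonics, and the toy data are illustrative choices only.

```python
# Minimal sketch of harmonic regression: fit sine/cosine terms at an assumed
# fundamental period by ordinary least squares. The period and the number of
# harmonics must be chosen or estimated for your own data.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 200)                      # abscissa need not be time
y = 2.0 + 1.5 * np.sin(2 * np.pi * x / 5.0) + rng.normal(scale=0.3, size=x.size)

period = 5.0                                      # assumed fundamental period
harmonics = np.arange(1, 3)                       # first two harmonics
design = np.column_stack(
    [np.ones_like(x)]
    + [np.sin(2 * np.pi * h * x / period) for h in harmonics]
    + [np.cos(2 * np.pi * h * x / period) for h in harmonics]
)
coef, *_ = np.linalg.lstsq(design, y, rcond=None)
fitted = design @ coef
```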
OMG, what a question! Do what it takes to find the best possible analysis of your data. Start with Kutner, Applied Linear Statistical Models, 5th ed. (in the Z-library) and sweeten to taste. May the FORCE be with you, David Booth. PS: @Holger had one reasonable solution. You could also try generalized additive models (GAMs); I kind of like these. If you need nonlinear models, do it; "Carry On My Wayward Son" by Kansas is relevant here. Best, David Booth
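A minimal GAM sketch, assuming the third-party pyGAM package (pip install pygam) is available; the single spline term and the simulated data are illustrative, not a prescription.

```python
# Minimal GAM sketch with pyGAM (third-party package): one spline smoother on
# a single predictor. The smoothing penalty would normally be tuned, e.g. with
# gam.gridsearch, rather than left at the default.
import numpy as np
from pygam import LinearGAM, s

rng = np.random.default_rng(2)
X = rng.uniform(0, 10, size=(300, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.2, size=300)

gam = LinearGAM(s(0)).fit(X, y)     # s(0) = spline term on predictor column 0
gam.summary()                        # effective dof, pseudo R^2, etc.
```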
It may sound funny, but seriously: just as we have non-parametric tests as alternatives to parametric tests, there are non-parametric regression methods you can turn to, given the nature of your data and your research design objectives.
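As one hedged, concrete example of a distribution-free alternative for a simple straight-line fit, here is a Theil-Sen estimate via SciPy; the heavy-tailed toy data are illustrative only.

```python
# Sketch of a non-parametric straight-line fit: the Theil-Sen estimator
# (median of pairwise slopes), which is robust to outliers and makes no
# normality assumption about the errors.
import numpy as np
from scipy.stats import theilslopes

rng = np.random.default_rng(3)
x = np.arange(50, dtype=float)
y = 0.8 * x + 3 + rng.standard_t(df=2, size=50)   # heavy-tailed noise

slope, intercept, lo_slope, hi_slope = theilslopes(y, x)
print(slope, intercept)
```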
It may not be exactly the same usage, but this relates to weighted linear regression, with non-linear regression as the alternative when the variables are not independent and the model cannot be expressed as linear in its parameters. You could be more specific here: why do you think your data are not fit for linear regression, given that both linear and non-linear models can fit curves? Have you made any plots? What information do you have about the variance (homo- or heteroscedastic)? What is your response variable type (counts, continuous r.v., ...)? Can it meet the assumptions of an additive model or a logit/probit model? And, more importantly, what prior distribution and regression approach were used in similar research, or which gaps would you like to fill in? Lasso, Weibull growth, and polynomial models each have a "why" that makes them an appropriate specification beyond plain linear regression; a small Lasso sketch follows below.
Be more specific about your data, the method you tried previously, and why you think you need something that accounts for non-linearity. Then useful counsel, guidance, and literature can be given.
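A minimal sketch of one of the options named above, Lasso, using scikit-learn; the penalty strength and the toy data are assumptions, and in practice the penalty would be tuned (e.g. with LassoCV).

```python
# Hedged sketch of Lasso (L1-penalised) linear regression with scikit-learn.
# The penalty strength alpha is an illustrative choice, not a recommendation.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(4)
X = rng.normal(size=(200, 10))
y = 2 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(scale=0.5, size=200)

lasso = Lasso(alpha=0.1).fit(X, y)
print(lasso.coef_)                  # several coefficients shrunk exactly to zero
```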
And, on converting your data by taking logs: none of your observed variables have to be normal in linear regression analysis, which includes the t-test and ANOVA. The errors after modeling, however, should be normal to draw a valid conclusion by hypothesis testing (see https://data.library.virginia.edu/normality-assumption/).
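A minimal sketch of that point, assuming statsmodels and SciPy are available: fit the model first, then test the residuals (not the raw variables) for normality. The simulated data are illustrative only.

```python
# Sketch: check normality of the residuals after fitting, not of the raw
# variables. Uses statsmodels OLS and a Shapiro-Wilk test on the residuals.
import numpy as np
import statsmodels.api as sm
from scipy.stats import shapiro

rng = np.random.default_rng(5)
x = rng.uniform(0, 10, size=150)
y = 1.0 + 0.5 * x + rng.normal(scale=1.0, size=150)

X = sm.add_constant(x)
res = sm.OLS(y, X).fit()
stat, p = shapiro(res.resid)        # H0: residuals are normally distributed
print(p)                            # large p-value -> no evidence against normality
```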
If the dependent variable can be treated as categorical, you can use generalized linear models (such as a binomial/logistic or multinomial model). If this is not possible, try quantile regression; a short sketch follows below.
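A minimal sketch of quantile regression with statsmodels, here for the median (q = 0.5); quantile regression makes no normality or constant-variance assumption about the errors. The toy heteroscedastic data are illustrative only.

```python
# Hedged sketch of median (quantile) regression with statsmodels.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
x = rng.uniform(0, 10, size=200)
y = 2 + 0.7 * x + rng.standard_t(df=3, size=200) * (1 + 0.3 * x)  # heteroscedastic, heavy-tailed
df = pd.DataFrame({"x": x, "y": y})

median_fit = smf.quantreg("y ~ x", df).fit(q=0.5)   # q can be any quantile in (0, 1)
print(median_fit.params)
```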
I agree with the above on non-parametric vs. parametric. Whether to go linear or nonlinear depends on the complexity of your data. Otherwise, perhaps look at Giloni and Padberg (2002), who discuss alternative methods of linear regression.
A useful text/general guide to consider is the book by Brace, Snelgar and Kemp, SPSS for Psychologists (And Everybody Else), 2016.
The following might be applicable to your question, for regressions of the form
y = y* + e, where y* is the predicted y:
https://www.researchgate.net/project/OLS-Regression-Should-Not-Be-a-Default-for-WLS-Regression. See the various updates, in reverse chronological order, and note the upcoming paper in the Pakistan Journal of Statistics.
This work is about the nature and magnitude of heteroscedasticity, i.e. V(y|y*), in regressions of the form y = y* + e for finite populations. If you also have autocorrelation, then you need to go to GLS regression. As long as y = y* + e holds, the predicted value y* can take a variety of functional forms.
It appears to me that you were asking for WLS or GLS, as opposed to OLS regression, but I guess you could have meant something more robust.
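A minimal WLS sketch with statsmodels, assuming (purely for illustration) that the error variance grows with x; in practice the weight function has to be justified from your own data, which is what the linked project discusses.

```python
# Hedged WLS sketch: weights taken as inversely proportional to an *assumed*
# error-variance function (Var(e) ∝ x^2). With autocorrelation, sm.GLS with an
# error covariance matrix would be the analogue.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
x = rng.uniform(1, 10, size=200)
y = 3 + 2 * x + rng.normal(scale=0.5 * x)        # error SD grows with x

X = sm.add_constant(x)
weights = 1.0 / x**2                              # assumption: Var(e) proportional to x^2
wls_res = sm.WLS(y, X, weights=weights).fit()
ols_res = sm.OLS(y, X).fit()                      # for comparison
print(wls_res.params, ols_res.params)
```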
So your data don't meet the assumptions of linear regression (whatever those are), and you want alternatives to fit a linear model. Do you want to use a linear model or not? If your data don't meet the assumptions of a particular way of fitting a linear model (like OLS), then you should have said this. As it stands, I don't understand your question (you want to not do a linear regression, but you want to fit a linear model). I don't get how others seem to have made sense of this, unless they are making up things about what you have said.
Daniel Wright Actually, I wanted to get a preliminary idea about this. My thinking was to develop a linear predictive model using some kind of transformation, so that the usual linear-regression restrictions are minimized or can be ignored.
What problems are preventing you from applying linear regression to your datasets? The main issue usually arises when the data are skewed, and skewed badly; otherwise you may simply have data that are not normally distributed. This can often be handled by applying logarithms: you can convert one or both variables logarithmically and then carry out linear regression, or you can plot the data on semi-log or double-log graph paper. Either way you straighten the curve. I suggest you test your datasets for skewness first.
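A minimal sketch of that workflow, assuming SciPy and statsmodels are available: check skewness, log-transform the (strictly positive) response, then fit ordinary linear regression. The simulated right-skewed data are illustrative only.

```python
# Sketch: test for skewness first, then log-transform and fit OLS.
# A double-log fit would transform x as well; logs require strictly positive values.
import numpy as np
import statsmodels.api as sm
from scipy.stats import skew

rng = np.random.default_rng(8)
x = rng.uniform(1, 10, size=200)
y = np.exp(0.3 * x + rng.normal(scale=0.4, size=200))   # right-skewed response

print(skew(y))                       # large positive value -> badly right-skewed
log_y = np.log(y)
res = sm.OLS(log_y, sm.add_constant(x)).fit()
print(res.params)
```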
Alternative procedures include:
- Different linear model: fitting a linear model with additional X variable(s).
- Nonlinear model: fitting a nonlinear model when the linear model is inappropriate.
- Transformations: correcting non-normality, non-linearity, or unequal variances by transforming all the data values for X and/or Y.
- Alternative regression methods: dealing with problems by employing a non-least-squares method of fitting (see the sketch below).
- Removing outliers: refitting the linear model after removing outliers or high-leverage or influential data points.
- Mechanical methods: finding the best selection of X variables by mechanical means.
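A minimal sketch of the "non-least-squares" option above, using robust M-estimation with a Huber loss in statsmodels; it down-weights outlying observations instead of removing them by hand. The injected outliers are illustrative only.

```python
# Hedged sketch of a non-least-squares fit: robust (M-estimation) regression
# with a Huber loss in statsmodels.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(9)
x = rng.uniform(0, 10, size=100)
y = 1 + 2 * x + rng.normal(scale=1.0, size=100)
y[:5] += 25                                       # a few gross outliers

X = sm.add_constant(x)
robust_res = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit()
print(robust_res.params)                          # close to (1, 2) despite the outliers
```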
In fact, depending on the dataset and a graphical survey of it, both parametric and non-parametric methods could be used for the analysis. The types of data, the relationships between variables, and the central tendency and dispersion indices are all very important considerations in the data analysis.
@Sudheeraka, I have some basic ideas for you. Sometimes it depends on your concept of a 'linear regressor', the type of DV data (ordinal, binary, or repeated measures), the sample size (5