$Y_{n\times 2} = X_{n\times q}\,B_{q\times 2} + E_{n\times 2}$, where $Y_{n\times 2}$ is the $n \times 2$ response matrix, $X_{n\times q}$ is the $n \times q$ design matrix, $B_{q\times 2}$ is the $q \times 2$ parameter matrix, and $E_{n\times 2}$ is the $n \times 2$ residual error matrix. A good reference might be the book: Timm (2002), Applied Multivariate Analysis, Springer.
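A minimal sketch of fitting this model in R, assuming the two responses and the stimulus variable are called amplitude, latency and level (these names are hypothetical; the data are simulated only to make the example runnable):

```r
set.seed(1)
level     <- rep(1:5, each = 10)                   # the single independent variable
amplitude <- 2.0 * level + rnorm(50, sd = 1)       # hypothetical response 1
latency   <- 30 - 1.5 * level + rnorm(50, sd = 2)  # hypothetical response 2

Y   <- cbind(amplitude, latency)  # n x 2 response matrix
fit <- lm(Y ~ level)              # fits Y = XB + E; X holds the intercept and level
coef(fit)                         # B: one column of coefficients per response
head(residuals(fit))              # E: n x 2 residual matrix
```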
Chris, you wrote "ERG amplitude and latency". Are these two variables? If so, then I understand that you have one or more predictors (independent variables), i.e. different stimulus levels, for which you then measure two(!) responses (dependent variables).
This may be analyzed by "Multivariate (Multiple) Regression". This could be a primer:
Dear Jochen, Yes - sorry for being unclear. I have measured the so-called electroretinogram and we have analyzed those responses in terms of latency and amplitude. So you are totally correct: There is one independent variable (stimulus level) and two dependent (measured) variables. Because of the high noise levels we would like to do a multivariate regression to see if we can make sense of the two dependent variables combined. I am but a simple biologist and I have spent the weekend searching the internet to get a grasp of how to tackle this issue and failed. Your presentation looks very interesting and I will take a look! Thank you heaps
Dear Naceur: The thing is that we obtained 2 measures (dependent variables) for each stimulus level (the independent variable), so I am not sure whether your equation covers this rather unusual situation. It is not that we have two independent variables and one outcome variable. But please let me know if your model works - it is exactly this kind of formula I am looking for - I just need a linear equation as input for MATLAB to perform the regression. Thank you so much for your efforts so far.
Also google for "multivariate regression" (just be aware that some authors confuse "multiple" and "multivariate"; in several links you will - wrongly - find multiple regression for a single response).
Dear Jochen - That's exactly one of the problems - 9 out of 10 leads end up with info about multiple regression - and that's not what I am looking for :) I must thank you again for this information, I appreciate it !!
Out of the two variables that you have, first decide which one is dependent on the other; you are the best judge of that. After that, you can fit an equation of the type Y = aX + b using the standard method of least squares. This is a very simple analysis covered in all standard textbooks on statistics.
For the bivariate linear regression problem, data are collected on an independent or predictor variable (X) and a dependent or criterion variable (Y) for each individual. Bivariate linear regression computes an equation that relates predicted Y scores (Ŷ) to X scores. The regression equation includes a slope weight for the independent variable, Bslope (b), and an additive constant, Bconstant (a):
Ŷ = Bslope X + Bconstant
(or)
Ŷ = bX + a
Indices are computed to assess how accurately the Y scores are predicted by the linear equation.
We will focus on applications in which both the predictor and the criterion are quantitative (continuous – interval/ratio data) variables. However, bivariate regression analysis may be used in other applications. For example, a predictor could have two levels like gender and be scored 0 for females and 1 for males. A criterion may also have two levels like pass-fail performance, scored 0 for fail and 1 for pass.
Linear regression can be used to analyze data from experimental or non-experimental designs. If the data are collected using experimental methods (e.g., a tightly controlled study in which participants have been randomly assigned to different treatment groups), the X and Y variables may be referred to appropriately as the independent and the dependent variables, respectively. SPSS uses these terms. However, if the data are collected using non-experimental methods (e.g., a study in which subjects are measured on a variety of variables), the X and Y variables are more appropriately referred to as the predictor and the criterion, respectively.
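A minimal sketch of this simple least-squares fit in R, with made-up x and y data (any standard statistics package would give the same slope, intercept and R²):

```r
set.seed(2)
x <- 1:20
y <- 0.8 * x + 5 + rnorm(20)   # hypothetical criterion values

fit <- lm(y ~ x)               # ordinary least squares
coef(fit)                      # Bconstant (a) and Bslope (b)
summary(fit)$r.squared         # one index of how accurately Y is predicted
```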
The general formula for a bivariate linear regression analysis can be given by:
y = a·x1 + b·x2 + c
where x1 and x2 are the two independent variables of the study, a and b are the regression coefficients, and c is the intercept on the y-axis. Many statistical packages can perform this regression analysis (e.g., StatView, SPSS, Statistica, and even Excel).
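For what it is worth, a sketch of fitting this two-predictor formula in R (note that this is a multiple regression: one response, two predictors; the data below are simulated):

```r
set.seed(3)
x1 <- runif(30)
x2 <- runif(30)
y  <- 1.5 * x1 - 0.7 * x2 + 2 + rnorm(30, sd = 0.3)  # hypothetical response

fit <- lm(y ~ x1 + x2)   # one dependent variable, two independent variables
coef(fit)                # intercept c and coefficients a, b
```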
The ERG has been studied for a long time, so I would guess that there are already pretty good models out there for it. As has been pointed out, you only have one independent variable ("level"), so there's no bivariate anything required: you can simply perform linear regression separately for each of your dependent measures. You might then check whether there is any correlation between the residuals for the two measures.
Depending on the range of levels chosen, you don't necessarily expect this to be very linear.
Other input variables which are bound to have an influence are: stimulus size, duration, spatial structure, spectral composition, etc. If you vary one or more of these in addition to stimulus amplitude then you could test a multiparameter model - but don't expect it to be linear!
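To illustrate the per-measure approach suggested above, here is a short sketch in R: regress each dependent variable on stimulus level separately, then test whether the residuals of the two fits are correlated (it reuses the hypothetical amplitude, latency and level vectors from the earlier sketch):

```r
fit_amp <- lm(amplitude ~ level)   # separate fit for amplitude
fit_lat <- lm(latency   ~ level)   # separate fit for latency
cor.test(residuals(fit_amp), residuals(fit_lat))  # correlation between the two residual series
```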
The formula made available by Ehsan Mohamed drives home the point, and all the standard statistical/econometric software packages can perform this analysis with relative ease. However, it is worth noting that regression provides information on the functional relationship that exists between the variables (independent and dependent).
Guys, he's asking for multivariate regression (several dependent variables and one or more independent variables like in MANOVA) and not multiple regression (1 dependent variable and one or more independent variables like in Factorial ANOVA).
Stephan Haug's formulation and reference are correct... (actually, the MANOVA model is a multivariate regression model!)
If I understand your question well, you have some predictors and two variables to predict. Then any general linear model program will do the job, via MANOVA (multivariate analysis of variance). For instance, in R, combine your two dependent variables into a two-column matrix (e.g. with cbind), use this bivariate response on the left-hand side of the formula, write your set of predictors as usual, and check the result with the manova function. Unlike univariate regression, the criterion for the statistical test is not unique (F in the case of univariate analysis of variance); several criteria have been proposed by statisticians: Pillai's trace, Wilks' lambda, the Hotelling-Lawley trace, Roy's largest root, and probably more. In "good" cases all of them give the same indication. I have never met a "bad" case, but they may occur. The model is quite simple:
$Y = X\beta + E$, where $Y$ is the matrix with the variates under study as columns (two columns in your case), $X$ is the matrix of covariates (continuous or factors), $\beta$ is the matrix of coefficients for the predictors (one column per response, including the intercept), and $E$ is the matrix of residuals, with the same dimensions as $Y$. The coefficients are estimated by least squares; the criteria above are used for the multivariate hypothesis tests. The underlying assumptions are very restrictive and probably never fully met: the errors follow a multivariate normal distribution, and their variance-covariance matrix is exactly the same for all levels of the factors and all values of the predictors. João Maroco is fully right, and probably Stephan Haug also, as far as I understand his answers. All the others are answering a different question.
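A minimal sketch of this in R, again with the hypothetical amplitude, latency and level data from the first sketch; manova() fits the model and summary() reports the multivariate test statistics mentioned above:

```r
fit <- manova(cbind(amplitude, latency) ~ level)  # level could also be declared as a factor
summary(fit, test = "Pillai")            # Pillai's trace
summary(fit, test = "Wilks")             # Wilks' lambda
summary(fit, test = "Hotelling-Lawley")  # Hotelling-Lawley trace
summary(fit, test = "Roy")               # Roy's largest root
summary.aov(fit)                         # follow-up univariate ANOVAs, one per response
```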
Either R, SAS or Statistica will do the job perfectly.
I will look into MANOVA and your other answers for sure! Thanks all for your helpful answers, and special thanks to João Maroco for clarifying, Jean-Sébastien Pierre for the clear explanation, and Stephan Haug for understanding my apparently confusing question right away.