Simple linear regression (one predictor, one outcome) is equivalent to Pearson correlation. Regression expresses the relationship differently in terms of a slope (change in Y for every one-unit increase in X) rather than a coefficient bound between -1 and +1. The R-square of simple linear regression is equal to the correlation coefficient squared.
Multiple regression lets you relate the outcome to multiple predictor variables. With several predictors, multiple correlation coefficients become complicated and are not very useful.
The estimated coefficients in multiple regression analysis are partial regression coefficients: each one describes the effect of its predictor with the effects of the other predictors partialed out (they are closely related to, though not identical to, partial correlations). For correlation it is not important to think about cause and effect. This question was previously asked in many different forums, e.g.,