Correlations describe the interdependence of two variables, that is, how strongly variable "x" and variable "y" vary together. Note that correlation is symmetric: it does not by itself establish that x causes y.
Regressions describe the relationship between variables in a manner that allows prediction of a dependent variable from one or more independent variables. In other words, regression enables you to make predictions about future events or data.
A commonly used example of regression is the "best-fit line."
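The contrast can be sketched with a small, made-up dataset (the numbers below are illustrative only): correlation summarizes how tightly x and y move together, while the best-fit line gives the slope and intercept used for prediction.

```python
import numpy as np

# Hypothetical data: y is roughly twice x, plus a little noise.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Correlation: a symmetric measure of linear association (no direction implied).
r = np.corrcoef(x, y)[0, 1]

# Regression: the best-fit line y ~ slope * x + intercept, fit by least squares.
slope, intercept = np.polyfit(x, y, 1)

print(round(r, 3))          # close to 1: strong linear association
print(round(slope, 3))      # 1.96
print(round(intercept, 3))  # 0.14

# The fitted line can now predict y for new x values, which correlation alone cannot.
print(round(slope * 6.0 + intercept, 3))
```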
Correlation is a measure of linear association between two variables X and Y, while linear regression is a technique to make predictions, using the following model:
Y = a0 + a1 X1 + ... + ak Xk + Error
Here Y is the response (what we want to predict, for instance revenue), while the Xi's are the predictors (say gender, coded 0 = male, 1 = female; education level; age; and so on). The predictors are sometimes called independent variables, or features in machine learning.
Typically, the predictors are somewhat correlated to the response. In regression, we choose the parameters a0, ..., ak that maximize the absolute value of the correlation between the observed response and the linear combination of the predictors. The square of that correlation coefficient is called the R-squared coefficient. The coefficients a0, ..., ak are called the model parameters, and a0 (sometimes set to zero) is called the intercept.
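This link between the two concepts can be checked numerically: for a least-squares fit with an intercept, R-squared computed from the residuals equals the squared correlation between the observed response and the fitted values. The sketch below uses simulated data with hypothetical coefficients (1.0, 2.0, -0.5) chosen purely for illustration.

```python
import numpy as np

# Simulated data: response y driven by two predictors plus noise.
rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(scale=0.3, size=n)

# Least-squares fit of the model Y = a0 + a1*X1 + a2*X2.
X = np.column_stack([np.ones(n), x1, x2])   # column of ones gives the intercept a0
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ coef

# R-squared two ways:
# (1) from residual and total sums of squares,
# (2) as the squared correlation between observed and fitted responses.
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - np.mean(y)) ** 2)
r2_def = 1 - ss_res / ss_tot
r2_corr = np.corrcoef(y, y_hat)[0, 1] ** 2

print(np.isclose(r2_def, r2_corr))  # the two definitions agree
```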