In multiple regression analysis you have more than one predictor variable, so the question is how to interpret each regression coefficient. There are two easy-to-understand ways:
1) The regression coefficient is the coefficient from regressing the criterion, controlled for all other predictors, on the relevant predictor, also controlled for all other predictors — that is, it is a partial regression coefficient.
2) The regression coefficient describes the effect of the relevant predictor on the criterion while all other predictors are held constant.
Both descriptions explain how to interpret a "net effect". A "net effect" is the effect of a predictor on the criterion with all other predictors in the model controlled for (by holding them constant or by partialling them out). If you ran the analysis with only this one predictor, the results would change, because you would no longer be controlling for the possibly confounding variables.
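The equivalence of the two descriptions above is the Frisch-Waugh-Lovell theorem, and it is easy to verify numerically. The following sketch (simulated data, NumPy only) fits the full multiple regression and then recovers the same coefficient by regressing residuals on residuals:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x2 = rng.normal(size=n)                 # the "other" predictor (a confounder)
x1 = 0.6 * x2 + rng.normal(size=n)      # predictor of interest, correlated with x2
y = 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)

# Full multiple regression: y ~ 1 + x1 + x2
X = np.column_stack([np.ones(n), x1, x2])
beta_full = np.linalg.lstsq(X, y, rcond=None)[0]

# Partial-regression route: residualize both y and x1 on the
# remaining predictors (intercept and x2), then regress residual on residual.
Z = np.column_stack([np.ones(n), x2])
res_y = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
res_x1 = x1 - Z @ np.linalg.lstsq(Z, x1, rcond=None)[0]
beta_partial = np.linalg.lstsq(res_x1[:, None], res_y, rcond=None)[0][0]

# beta_full[1] (coefficient of x1) and beta_partial agree
```

Dropping x2 from the model and regressing y on x1 alone would instead give a coefficient biased away from 2.0, because x1 then picks up part of x2's effect.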
Another important point is that several predictors can explain the same variance in the criterion. The "net effect" tells you how much effect a predictor has on the criterion beyond the effect of the other predictors.
The coefficients in a multiple linear regression are more informative because they represent the change in the response associated with a given predictor for fixed values of the other predictors, and are therefore called "net effects".
It is not reasonable to compare either "gross" or "net" effects across predictors, because the regression coefficients depend on the units of measurement.
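The units dependence is easy to see: rescaling a predictor rescales its coefficient inversely, without changing the fit at all. A minimal sketch with a hypothetical height predictor measured in meters versus centimeters:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
height_m = rng.normal(1.7, 0.1, size=n)       # predictor in meters
y = 5.0 * height_m + rng.normal(size=n)

height_cm = height_m * 100                    # the same predictor in centimeters

X_m = np.column_stack([np.ones(n), height_m])
X_cm = np.column_stack([np.ones(n), height_cm])
b_m = np.linalg.lstsq(X_m, y, rcond=None)[0][1]
b_cm = np.linalg.lstsq(X_cm, y, rcond=None)[0][1]

# b_m == 100 * b_cm: identical model, coefficient differs only by the unit change
```

So a large coefficient may just reflect a small unit, which is why raw coefficients should not be compared across predictors.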
Simple and partial correlation coefficients, by contrast, are unitless and can be compared in much the same way that we compared gross and net effects. See, for example, http://data.princeton.edu/wws509/notes/c2s5.html
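To make the parallel concrete, here is a sketch (simulated data) that computes both: the simple ("gross") correlation between y and x1, and the partial ("net") correlation, obtained by correlating the residuals after removing the other predictor from both variables:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
x2 = rng.normal(size=n)
x1 = 0.5 * x2 + rng.normal(size=n)
y = x1 + 2.0 * x2 + rng.normal(size=n)

# Simple ("gross") correlation between y and x1
r_simple = np.corrcoef(y, x1)[0, 1]

# Partial ("net") correlation: correlate the residuals of y and x1
# after regressing each on the remaining predictor x2
Z = np.column_stack([np.ones(n), x2])
res_y = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
res_x1 = x1 - Z @ np.linalg.lstsq(Z, x1, rcond=None)[0]
r_partial = np.corrcoef(res_y, res_x1)[0, 1]
```

Because both quantities live on the same [-1, 1] scale regardless of measurement units, they can be compared across predictors in a way that raw coefficients cannot.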