The standardized regression coefficient is often termed beta in a simple or multiple regression - so I'll assume that's the context.
The coefficient is the estimated SD change in Y associated with a one SD change in X1 (holding any other X variables such as X2, X3 etc. in the model constant).
So multiplying the beta coefficient by the SD of Y turns this into the change in Y, in its original units, associated with a 1 SD change in X1. This can be helpful because the original units are usually much more meaningful. For example, when predicting income from height, a beta of +0.1 isn't very informative on its own, but if the SD of income is 4,000 dollars then each SD change in height is associated with a $400 increase in salary.
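Here's a small simulation sketch of that conversion (Python with numpy; the height and income numbers are made up, as in the example above). It shows that the standardized beta equals the unstandardized slope times SD(X)/SD(Y), so multiplying beta by SD(Y) recovers the change in Y, in its original units, per 1 SD change in X:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Hypothetical data: height in cm, income in dollars (made-up numbers)
height = rng.normal(175, 7, n)                            # SD of height ~ 7 cm
income = 40_000 + 30 * height + rng.normal(0, 4_000, n)   # noise SD ~ $4,000

# Unstandardized slope b from simple least squares
b = np.cov(height, income, ddof=1)[0, 1] / np.var(height, ddof=1)

# Standardized beta: the slope after z-scoring both variables,
# equivalently b * SD(X) / SD(Y)
beta = b * height.std(ddof=1) / income.std(ddof=1)

# Multiplying beta by SD(income) gives the dollar change in income
# associated with a one-SD change in height
print(f"b (dollars per cm):          {b:.1f}")
print(f"beta (SD per SD):            {beta:.3f}")
print(f"beta * SD(income) (dollars): {beta * income.std(ddof=1):.0f}")
print(f"check: b * SD(height):       {b * height.std(ddof=1):.0f}")
```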
I rather like this kind of careful use of partial standardization, though I generally prefer the unstandardized coefficient (usually b), as I'd like to know, say, that being taller by 10 cm is associated with a $300 increase in income. (Note these numbers are made up - but generally the original units are more helpful.)
For multiple linear regression I'm not certain this is as useful, though I suppose it could be. The 'independent' variables are rarely truly independent of one another: some collinearity, for example, is generally unavoidable. In the worst cases that can actually change the sign of a coefficient compared to a model where only that regressor (independent variable) is used. Even in more reasonable cases, ranking the importance of independent variables by their relative p-values just seems wrong to me, and collinearity will 'muddy the water' for standardized coefficients in the same way. A small simulated illustration of the sign-change point follows.
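This is a sketch with hypothetical simulated data (numpy and statsmodels), not from any real dataset: with two strongly collinear predictors, the coefficient on one of them in a simple regression can have the opposite sign from its coefficient in the joint model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000

# Two strongly collinear predictors (correlation ~ 0.9), simulated data
x1 = rng.normal(size=n)
x2 = 0.9 * x1 + np.sqrt(1 - 0.9**2) * rng.normal(size=n)

# Hypothetical true relationship: a small negative effect of x1
# and a large positive effect of x2
y = -0.5 * x1 + 2.0 * x2 + rng.normal(size=n)

# Regression on x1 alone: x1 proxies for x2, so its slope comes out positive
simple = sm.OLS(y, sm.add_constant(x1)).fit()

# Regression on both: the x1 coefficient recovers its negative sign
joint = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()

print("x1 slope, x1 only:   ", round(simple.params[1], 2))  # ~ +1.3
print("x1 slope, x1 and x2: ", round(joint.params[1], 2))   # ~ -0.5
```

The sign differs between the two models purely because x1 and x2 are highly correlated, which is why I'm wary of reading 'importance' off coefficients (standardized or not) in the presence of collinearity.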