A simple standardization process involves: 

R = (X - Mean) / Standard Deviation

such that our distribution has a mean of 0 and a standard deviation of 1.
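
Here is a minimal Python sketch of what I mean (NumPy, with a purely illustrative sample):

```python
import numpy as np

# Illustrative sample: 10,000 draws from a normal distribution with mean 50, sd 5.
x = np.random.normal(loc=50, scale=5, size=10_000)

# Standard z-score transformation.
z = (x - x.mean()) / x.std()

print(z.mean())  # approximately 0
print(z.std())   # approximately 1
```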

However, I want to standardize a distribution such that it has a mean of 0 but a standard deviation of 10. I guess it would be done as follows:

R = ((X - Mean) / Standard Deviation) * 10

I just need a little clarification on this. Do we arrive at this formula because we are rescaling the data from a standard deviation of 1 to a standard deviation of 10 (hence the simple multiplication)?
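
Extending the sketch above to the sd-of-10 case, this is what I believe should happen (again, just an illustrative check):

```python
# Rescale the z-scores: multiplying by 10 stretches the spread by a factor of 10.
r = z * 10  # equivalently: ((x - x.mean()) / x.std()) * 10

print(r.mean())  # still approximately 0 (a constant multiple of a zero mean stays 0)
print(r.std())   # approximately 10
```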
