06 June 2016

I am building a statistical distribution model for my data, and I am working with several data sets, each with a different data range.

In order to compare the results of my model I needed to unify the ranges of my data sets, so I thought about normalizing them. A technique called feature scaling (min-max normalization) is used, and it is performed as follows:

x_i' = (x_i - min(X)) / (max(X) - min(X)), where x_i is a data sample and X is the data set.
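For reference, here is a minimal sketch of this scaling in Python (using NumPy; the data values below are made up purely for illustration), which also prints the mean and standard deviation before and after scaling:

```python
import numpy as np

def min_max_scale(x):
    """Min-max feature scaling: maps the values of x into the range [0, 1]."""
    x = np.asarray(x, dtype=float)
    return (x - x.min()) / (x.max() - x.min())

# Illustrative data set (not my real data)
X = np.array([12.0, 45.0, 7.0, 33.0, 28.0, 51.0])
X_scaled = min_max_scale(X)

print(X.mean(), X.std())               # statistics of the original data
print(X_scaled.mean(), X_scaled.std()) # statistics after scaling to [0, 1]
```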

However, when I checked the mean and standard deviation of the normalized data, I found them to be almost zero. Does that affect the statistical distribution of my original data?
