The concept of degrees of freedom originated in classical mechanics, where it denotes the minimum number of independent coordinates needed to specify the position of a point, i.e., its position vector. Statisticians harness this concept to assess the significance of a test statistic: every theoretical distribution has a certain number of degrees of freedom, and with increasing degrees of freedom the shape of the theoretical curve flattens. We all use this statistical convention, but I want to know the mathematical logic behind it. Kindly help.
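One way to see the "mathematical logic" concretely is the classic case of the sample variance: the n deviations from the sample mean must sum to zero, so only n - 1 of them are free to vary, and dividing the sum of squares by n - 1 (rather than n) yields an unbiased estimator. Below is a minimal simulation sketch of that fact, assuming NumPy is available; the seed, sample size, and variance are arbitrary illustrative choices, not part of the original question.

```python
import numpy as np

rng = np.random.default_rng(0)  # arbitrary seed, for reproducibility

n, trials = 5, 200_000
true_var = 4.0
samples = rng.normal(loc=10.0, scale=np.sqrt(true_var), size=(trials, n))

# Sum of squared deviations about the *sample* mean. Estimating the mean
# imposes one linear constraint on the n deviations (they sum to zero),
# leaving only n - 1 of them free: one degree of freedom is "used up".
ss = ((samples - samples.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)

print("divide by n    :", (ss / n).mean())        # biased low, about 3.2
print("divide by n - 1:", (ss / (n - 1)).mean())  # about 4.0, unbiased
```

Running this shows the naive divisor n systematically underestimates the true variance, while the n - 1 divisor recovers it; the same counting-of-constraints argument is what fixes the degrees of freedom of the t, chi-square, and F distributions used in significance tests.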