I have been trying to find out why the deviance in GLMs is defined as -2*log(LR) (with LR being the likelihood ratio). Why the factor -2? Authors often state that with this definition the deviance of a normal model with identity link equals the residual sum of squares, but that looks like a consequence of the definition rather than the reason for it.
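For concreteness, here is the computation I have in mind for the normal case (a sketch assuming known variance $\sigma^2$, with $\hat\mu_i$ the fitted means; for $\sigma^2 = 1$ this is exactly the residual sum of squares):

$$
D \;=\; -2\log\frac{L_{\text{restr}}}{L_{\text{sat}}}
\;=\; 2\left[\ell_{\text{sat}} - \ell_{\text{restr}}\right]
\;=\; 2\sum_i\left[\frac{(y_i-\hat\mu_i)^2}{2\sigma^2} - \frac{(y_i-y_i)^2}{2\sigma^2}\right]
\;=\; \frac{1}{\sigma^2}\sum_i (y_i-\hat\mu_i)^2 .
$$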
I also found that, "under some regularity conditions", the deviance has a chi-squared distribution. Again, I don't think this is the reason but rather a consequence. (By the way, I would be glad if someone could explain what those conditions are.)
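For reference, the asymptotic result I am referring to is (as I understand it) Wilks' theorem: for two nested models with likelihoods $L_0 \le L_1$,

$$
-2\log\frac{L_0}{L_1} \;\xrightarrow{\;d\;}\; \chi^2_{k},
$$

where $k$ is the difference in the number of free parameters between the two models.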
Is there any *logical* reason why the deviance is calculated as -2*log(LR) and not, for instance, as -log(LR)? Or, to put it the other way around: what does the *squared* ratio of the likelihoods (L[sat] over L[restr]) tell us, given that its log is the deviance?
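To spell out what I mean by the squared ratio:

$$
D \;=\; -2\log\frac{L_{\text{restr}}}{L_{\text{sat}}}
\;=\; \log\!\left[\left(\frac{L_{\text{sat}}}{L_{\text{restr}}}\right)^{2}\right].
$$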