I've tried to compute the mean and variance of a greyscale image and of various downscaled versions of it (scaling the image down by a factor of 2, 3, 4, and so on). The mean stays the same, as expected, but the variance changes significantly once the image is scaled down by a factor of 2 or more. The original image has a variance of about 150, while every downscaled version (factors 2 through 10, and possibly beyond) has a variance of around 64.

We are using a Python script to read the image and compute the mean and variance of it and of its downscaled versions. We computed the variance both with the array's `.var()` method and from the image's histogram data, and both give the same result of around 64 for scale factors of 2 and above, so the variance computation itself does not appear to be the problem.
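For reference, here is a minimal sketch of the kind of check we ran. The file name, the use of OpenCV (`cv2.resize` with `INTER_AREA` interpolation) and the 0–255 histogram range are assumptions for illustration, not our exact code:

```python
import cv2
import numpy as np

# Hypothetical file name; any greyscale image will do.
img = cv2.imread("input.png", cv2.IMREAD_GRAYSCALE)

for factor in range(1, 11):
    # Downscale by the given factor. INTER_AREA averages pixel blocks,
    # which smooths the image and is one plausible cause of the variance drop.
    scaled = cv2.resize(img,
                        (img.shape[1] // factor, img.shape[0] // factor),
                        interpolation=cv2.INTER_AREA)

    # Variance directly from the pixel values.
    var_direct = scaled.astype(np.float64).var()

    # Variance recomputed from the grey-level histogram.
    hist, _ = np.histogram(scaled, bins=256, range=(0, 256))
    p = hist / hist.sum()                  # probability of each grey level
    levels = np.arange(256)
    mean_h = (p * levels).sum()
    var_hist = (p * (levels - mean_h) ** 2).sum()

    print(factor, scaled.mean(), var_direct, var_hist)
```

Both ways of computing the variance agree with each other at every scale factor; only the value itself drops once the image is downscaled.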

We are not sure why the variance changes like this. Also, if possible, are there any other statistical parameters that, like the mean, don't change when the image is scaled?
