We are scaling down a 5 Megapixel image (2048x2448) by 4 in Python, and the mean stays the same for the original and the downscaled image, but the variance changes significantly (e.g. from 150 to 64). Is this change in variance expected?
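Roughly what we do (a sketch, not our exact code; the file name is just a placeholder):

```python
import cv2

# Load the 5 MP image as grayscale and shrink it by 4 on both axes
img = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)   # placeholder path
small = cv2.resize(img, None, fx=0.25, fy=0.25,
                   interpolation=cv2.INTER_AREA)

print(img.mean(), img.var())      # variance ~150
print(small.mean(), small.var())  # mean stays the same, variance drops to ~64
```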
Any reasonable downsizing algorithm (I assume your "scaling down ... by 4" to be "scaling down by 2" on both axes) includes an appropriate low-pass filter to prevent undersampling artefacts.
Thus the reduction in variance doesn't come as a surprise. And the mean value should remain constant during downsizing, as a low-pass doesn't affect the DC component.
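A quick demonstration with plain NumPy (a sketch with synthetic noise, not your data): averaging 4x4 blocks, the simplest low-pass for shrinking by 4, leaves the mean untouched and divides the variance of uncorrelated noise by 16. On real, spatially correlated images the drop is smaller, e.g. your 150 to 64.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "image": flat grey level 128 plus white noise of variance ~150
img = 128 + np.sqrt(150) * rng.standard_normal((2048, 2448))

# Shrink by 4 on both axes via 4x4 block averaging (low-pass + decimate)
small = img.reshape(512, 4, 612, 4).mean(axis=(1, 3))

print(img.mean(), img.var())      # mean ~128, variance ~150
print(small.mean(), small.var())  # mean ~128, variance ~150/16 for white noise
```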
Scaling down by 4 means scaling down by 4 on both axes.
I see, we are using OpenCV's resize() function, which uses interpolation methods to scale an image; I assume that involves a low-pass filtering step.
Do you know of any statistical parameters that don't change with scaling?
Do you also know whether we can recover the original value of the variance after scaling?
I'm not too familiar with statistical properties, so beyond the unaltered mean value I cannot contribute to your question about unchanged parameters.
Regarding "upscaling":
Content that is gone (read: "filtered out") cannot be retrieved. So the variance won't be restored during upscaling.
(I have to admit that there are some algorithms that deal with "inverse filter functions", seemingly able to restore what has been filtered out. On closer inspection, these algorithms handle the data somewhat differently; but the basic rule "what's gone is gone" applies to them as well.)
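To make the point concrete, a small sketch (again with synthetic data, and assuming OpenCV since that's what you use): shrinking and then blowing the image back up to its original size leaves the variance at roughly the reduced value.

```python
import numpy as np
import cv2

rng = np.random.default_rng(1)
img = (128 + np.sqrt(150) * rng.standard_normal((512, 512))).astype(np.float32)

small = cv2.resize(img, None, fx=0.25, fy=0.25, interpolation=cv2.INTER_AREA)
back  = cv2.resize(small, (512, 512), interpolation=cv2.INTER_CUBIC)

print(img.var())    # ~150
print(small.var())  # strongly reduced by the averaging
print(back.var())   # stays near the reduced value: the filtered-out detail is gone
```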