I am aware of many algorithms that use probability in different aspects of image processing, but it seems that, at one level, capturing an image is a deterministic event (i.e., each pixel is assigned the same value even if we take repeated captures). I assume for the moment that noise is not a concern.

Probability implies some sort of inherent random variation (though my view is open to criticism). Are we capturing probability that is inherent in the scene itself? A more detailed explanation would be welcome.
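To make the question concrete, here is a minimal sketch of the two views. It assumes an idealized sensor and models photon arrival as a Poisson process (photon shot noise, which is a property of the light itself rather than of the sensor); the function names and the toy scene values are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)

# A fixed "scene": expected photon count per pixel over the exposure.
scene = np.array([[50.0, 200.0],
                  [800.0, 3200.0]])

def capture_deterministic(scene):
    """Idealized view: every repeated capture returns identical values."""
    return scene.copy()

def capture_poisson(scene, rng):
    """Photon arrival modeled as Poisson: repeated captures differ
    even with a perfect, noise-free sensor."""
    return rng.poisson(scene).astype(float)

# Repeated captures of the same, unchanging scene.
det = [capture_deterministic(scene) for _ in range(3)]
poi = [capture_poisson(scene, rng) for _ in range(3)]

print("deterministic captures identical:",
      all(np.array_equal(det[0], d) for d in det))   # True
print("poisson captures identical:",
      all(np.array_equal(poi[0], p) for p in poi))   # False

# Per-pixel standard deviation over many captures approaches
# sqrt(mean), the Poisson signature.
samples = np.stack([capture_poisson(scene, rng) for _ in range(1000)])
print(samples.std(axis=0))
print(np.sqrt(scene))
```

Under this (assumed) model, the random variation is contained in the photon stream itself, which is one sense in which a scene could be said to carry inherent probability even before sensor noise is considered.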
