Depending on the size of your images, you may run into memory issues when loading them all at once on a 32-bit operating system; on a 64-bit operating system you are much less likely to hit that limit.
If you do run out of memory, you would be obliged to load only part of your set of images, or, even less efficiently, to load two images at a time, since you only need two in memory to compute the distance between them.
But as I said, it depends on the actual size of the images.
Have a look at a function I wrote for myself to read thousands of text files in one go. You only need to adapt it to the image format you use...
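As a rough sketch of the same idea applied to images (the folder name and file extension here are placeholders, not your actual paths), a Matlab loop like this reads a whole folder into a cell array:

```matlab
% Read every image in a folder into a cell array of matrices.
% 'images' and '*.png' are placeholders; adjust them to your data.
files = dir(fullfile('images', '*.png'));
imgs  = cell(1, numel(files));
for k = 1:numel(files)
    imgs{k} = imread(fullfile('images', files(k).name));
end
```

This only works if all the images fit in memory at once, which is exactly the constraint discussed above.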
Do you need to calculate the Gaussian metric between every pair of images? As Francois Abram said, you could have problems with memory (Matlab running out of memory). If that is the case, an initial approach would be: compute the metric between the first image and all the others, then close it (erase it from memory); do the same with the second image and the remaining N-2 images, close it and clear the variable from memory; and so on. It may not be very fast, but you could be quite sure the memory problem won't arise.
And as the algorithm progresses, each pass gets faster and faster, since fewer comparisons remain. It is just an idea.
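The scheme described above could be sketched in Matlab as follows (the folder name, extension, and `gaussianMetric` are placeholders; substitute your own paths and distance function):

```matlab
% Pairwise distances with at most two images in memory at a time.
% 'gaussianMetric' stands in for whatever distance function you use.
files = dir(fullfile('images', '*.png'));
N = numel(files);
D = zeros(N);                         % pairwise distance matrix
for i = 1:N-1
    A = imread(fullfile('images', files(i).name));
    for j = i+1:N
        B = imread(fullfile('images', files(j).name));
        D(i,j) = gaussianMetric(A, B);
        D(j,i) = D(i,j);              % the metric is symmetric
        clear B                       % free memory before the next image
    end
    clear A                           % done with image i
end
```

The inner loop shrinks on each pass (N-1 comparisons, then N-2, and so on), which is why the computation speeds up as it goes.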
Hi Rakesh, I remember answering your last question, but since I cannot see it, I suspect I might have done something wrong. Nevertheless, here is my answer once again:
That piece of code is a Matlab function that you can call from the Matlab command line or from any of your programs.