I work with images from the DDSM database. These images are very large (e.g. 3000*5000 pixels), and I always get an 'out of memory' error when I process them in MATLAB.
Images of 3000*5000 pixels are 15 MB as monochrome bytes, 45 MB as RGB color, or 60 MB as monochrome 32-bit integer/float. That should still be manageable, even when running on a 32-bit operating system.
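A quick way to check those footprints yourself (just a sketch, using the dimensions mentioned above):
im_byte   = zeros(5000, 3000, 'uint8');    % monochrome, 1 byte per pixel -> ~15 MB
im_rgb    = zeros(5000, 3000, 3, 'uint8'); % RGB, 3 bytes per pixel -> ~45 MB
im_single = zeros(5000, 3000, 'single');   % 32-bit float, 4 bytes per pixel -> ~60 MB
whos im_byte im_rgb im_single              % reports the bytes each array actually uses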
Plug in some more RAM so you have at least 4 GB, or adjust your MATLAB code to free the memory held by duplicate variables you no longer need.
But to answer your question: YES, downsampling leads to loss of information and COULD lead to degradation in the accuracy of your algorithm. Depends on your algorithm though.
What is the size of the features that you are trying to analyze? Could you elaborate a bit on what your algorithm does? In general I would calculate the resolution that remains after resizing (pixels/mm) and make sure that the smallest feature I am trying to extract is still covered by at least two pixels.
Also you could simulate an approximation of what you are trying to detect in the images and then see if your algorithm picks it up.
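To make that resolution check concrete (the numbers below are only illustrative assumptions, not DDSM specifications):
pixels_per_mm      = 20;               % assumed original scan resolution (50 µm pixels)
scale              = 0.25;             % downsampling factor being considered
new_pixels_per_mm  = pixels_per_mm * scale;              % 5 pixels/mm after resizing
feature_size_mm    = 0.5;              % assumed size of the smallest feature of interest
pixels_per_feature = feature_size_mm * new_pixels_per_mm % want this to be at least ~2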
As others have noted, the size of the feature/s (relative to the total size of your image) that you're analyzing is an important consideration, as is the nature of the algorithm itself (and yes, rescaling images will affect the "quality" of the information).
If the features you're looking for are small, you might do better to crop (divide) the original image into several smaller images, keeping the original resolution. If the features are very large, then the chances are better that downsampling will not have a dramatic effect on your algorithm's performance (it would be as if you had simply captured the image with a lower-resolution sensor).
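If you go the cropping route, a minimal sketch of splitting the full-resolution image into tiles (the tile size is arbitrary, and im stands for whatever your image variable is called):
tile = 1024;                                  % arbitrary tile size in pixels
[rows, cols] = size(im);
for r = 1:tile:rows
    for c = 1:tile:cols
        block = im(r:min(r+tile-1, rows), c:min(c+tile-1, cols));
        % ... run the analysis on this block and collect the results ...
    end
end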
At the same time, I wonder if there might be something else wrong with the specific MATLAB code (or configuration) that you're using... MATLAB on any reasonably modern computer should be pretty good at handling images of this size, which are actually pretty modest compared to many other scientific applications where MATLAB is commonly used.
Image processing is very application-oriented. You have to determine the sensitivity of your application to find out how far you can resize your image. You may also have to rewrite your algorithm in a modular manner so that it can process the image in blocks and stages.
An example: co-occurrence matrices, from which entropy, energy and contrast can be calculated, will probably give different results simply because you changed the size of the image you are testing.
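A quick way to see this effect (just a sketch; it assumes the Image Processing Toolbox and an image variable im):
glcm_full   = graycomatrix(im, 'NumLevels', 16, 'Offset', [0 1]);
stats_full  = graycoprops(glcm_full, {'Contrast', 'Energy'});
im_small    = imresize(im, 0.25);          % downsample by a factor of 4
glcm_small  = graycomatrix(im_small, 'NumLevels', 16, 'Offset', [0 1]);
stats_small = graycoprops(glcm_small, {'Contrast', 'Energy'});
% compare stats_full and stats_small - the texture measures will generally differ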
The image size you mention does not sound excessive… there are a lot of things you could try to improve the situation before resorting to resizing the image, e.g.
- Be sure to delete variables when they are no longer needed… or reuse variable names instead of creating entirely new variables where appropriate, e.g.
im = imfilter(im, fspecial('gaussian', 9, 1));
instead of
im2 = imfilter(im, fspecial('gaussian', 9, 1));
Dividing a long script into shorter functions may also help with this.
- Crop out regions of the image you definitely aren't interested in - but keep what remains at its original size. You could do this in MATLAB directly, or beforehand in other free software with a graphical user interface (e.g. ImageJ).
- By default, many MATLAB computations are made with double precision, when single would be quite adequate. Therefore you could look to convert any double arrays to single from the very first time they are used, e.g.
my_variable = single(my_variable);
If your input image is a double matrix, try converting it to single before calling your analysis function. If you aren't sure what type your variables are, try the whos function.
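For example (my_variable is just a placeholder for whichever array you want to inspect):
my_variable = rand(5000, 3000);    % double by default: 8 bytes per element, ~120 MB
whos my_variable                   % reports size, bytes and class
my_variable = single(my_variable); % convert in place
whos my_variable                   % now ~60 MB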
- Some functions can be replaced with less memory-intensive ones. For example, if you are labelling features, bwlabel may require a lot of memory, while bwconncomp needs much less. Similarly, most uses of repmat can be replaced with bsxfun.
- If you need to chop up the image into blocks, check out the blockproc function
- Use the debugger to stop the running code at various places (the profiler can also help locate expensive lines). When it has stopped, call whos to find out which variables are around and taking up memory.
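A minimal sketch of that (the function name my_analysis and the line number are placeholders, not anything specific to your code):
dbstop in my_analysis at 42   % pause at line 42 of your analysis function
my_analysis(im);              % run as normal; execution stops at the breakpoint
whos                          % while stopped, list all variables and the memory they use
dbclear all                   % remove the breakpoints when done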
"Out of Memory" is a big problem to MATLAB. One way is to enlarge your DRM. As to resize your image, it does lose message. What you should know is whether these lost messages are important to your purpose.