Huge data and Excel in the same sentence, I am puzzled! ;) When I hear "huge data" my first instinct is to get out of Excel, so I'll be interested in seeing what answers you get, as I would be glad to be proven wrong about my Excel prejudice... :)
There are no good or bad answers here, although I agree with Frenanda Dorea that large datasets and Excel might not mix well.
Use what you know. If you don't know any programming languages, ask around and see what the people around you use. When I was studying meteorology, I used Fortran and C because that was what others were using, and that way I could re-use parts of their code. Now I mainly use C++.
If the objective is to look at climate model outputs, then I assume the data may be saved in NetCDF files. One possible choice is the Climate Data Analysis Tools (CDAT), which is based on Python. It gives you some built-in functionality such as regridding, smoothing, annual cycle calculations, etc. But it is definitely not as well documented as Matlab.
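To give a feel for the kind of operation CDAT provides, here is a minimal sketch of an annual cycle calculation in plain NumPy (not CDAT's actual API). The monthly temperature series is synthetic; in practice you would read it from a NetCDF file.

```python
import numpy as np

# Synthetic stand-in for 10 years of monthly temperatures (120 values);
# with CDAT or netCDF4-python you would load this from a NetCDF file.
rng = np.random.default_rng(0)
months = np.arange(120)
monthly = 15 + 10 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 1, 120)

# Annual cycle: average each calendar month across all years.
annual_cycle = monthly.reshape(-1, 12).mean(axis=0)  # shape (12,)

# Anomalies: subtract the annual cycle from the full series.
anomalies = monthly - np.tile(annual_cycle, 10)
```

The same two-step pattern (climatology, then anomalies) underlies most smoothing and de-seasonalizing workflows, whichever tool you end up using.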
You can also use Fortran to analyze data in NetCDF files (http://www.unidata.ucar.edu/software/netcdf/docs/netcdf-f90/, http://www.unidata.ucar.edu/software/netcdf/docs/netcdf-f77/). You can find suitable software for different data and metadata formats on the British Atmospheric Data Centre web page (http://badc.nerc.ac.uk/help/formats/index.html).
Thanks Carlos..., but I need to analyse data in .xls, ASCII, .grib, .netcdf, .csv, .txt, and sometimes .dat files, so I need one common language, like Matlab, to analyse them all. Perl, Python, Fortran, C++ and VB are all available open source, but learning and excelling in them would consume a lot of time. Moreover, each language is best with only a few data formats, so which one is a one-stop solution for all? That's my issue. In the end I settled on R and Matlab and installed both on my laptop. I still find Matlab easier to learn than R. As far as I am concerned, I am going with Matlab; if anything is much better than this for the above purpose, I would be thankful to know about it.
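For what it's worth, the plain-text formats in that list (.csv, .txt, whitespace-delimited .dat) can all be read with Python's standard library alone. A minimal sketch, using a hypothetical inline sample instead of a real file:

```python
import csv
import io

# Hypothetical sample: the same tabular data might arrive as .csv, .txt or .dat.
raw = "station,year,temp\nSTN001,2010,26.4\nSTN001,2011,27.1\n"

# csv.DictReader parses comma-separated text; for a whitespace-delimited
# .dat file you would pass delimiter=' ' (or split each line yourself).
rows = list(csv.DictReader(io.StringIO(raw)))
temps = [float(r["temp"]) for r in rows]
mean_temp = sum(temps) / len(temps)  # simple summary statistic
```

The binary formats need extra packages (e.g. netCDF4-python for .netcdf, and GRIB/xls readers exist as well), which is part of why a single language rarely covers everything out of the box.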