PLOFS (piecewise linear orthonormal floating search), described here:
Jiang Li, Jianhua Yao, Ronald M. Summers, Nicholas Petrick, Michael T. Manry, and Amy K. Hara, "An Efficient Feature Selection Algorithm for Computer-Aided Polyp Detection," special issue of the International Journal on Artificial Intelligence Tools (IJAIT), vol. 15, no. 6, December 2006, pp. 893-915.
Jiang Li, Michael T. Manry, Pramod Narasimha, and Changhua Yu, "Feature Selection Using a Piecewise Linear Network," IEEE Trans. on Neural Networks, vol. 17, no. 5, September 2006, pp. 1101-1115.
Well, I suggest you use WEKA's Attribute Selection component, which offers many modern algorithms such as CfsSubsetEval, ChiSquaredAttributeEval, GainRatioAttributeEval, and so on. I have used many of them, and GainRatio usually works well on my datasets.
Moreover, if you want an attribute extraction method rather than selection, you can use PCA, LDA, ISOMAP, LLE, and so on.
But first I recommend analyzing your dataset in the WEKA Explorer; after that you will have a clearer picture of the situation and can make a more informed decision.
WEKA is an open-source project with a well-designed GUI called Explorer; you can find the project and its sources here: http://www.cs.waikato.ac.nz/ml/weka/index_downloading.html
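WEKA itself is used through the Explorer GUI or its Java API, but if you prefer scripting, roughly equivalent steps can be sketched in Python with scikit-learn. This is only an analogue, not WEKA: `mutual_info_classif` stands in for an information-gain-style evaluator like GainRatioAttributeEval, `PCA` covers the extraction route, and the dataset and `k=2` are arbitrary illustration choices.

```python
# Sketch: scikit-learn analogues of the WEKA evaluators mentioned above.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, mutual_info_classif

X, y = load_iris(return_X_y=True)

# Feature *selection*: rank attributes by mutual information with the class
# (an information-gain-style criterion) and keep the top 2.
selector = SelectKBest(score_func=mutual_info_classif, k=2).fit(X, y)
X_selected = selector.transform(X)

# Feature *extraction*: project onto the first 2 principal components.
X_pca = PCA(n_components=2).fit_transform(X)

print(X_selected.shape, X_pca.shape)  # (150, 2) (150, 2)
```

Either path reduces the data to 2 columns; the difference is that selection keeps original attributes (interpretable), while PCA produces linear combinations of them.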
If you're looking for differentially expressed features, an adaptation of a method I recently developed could be useful. I've attached a paper on it.
You can check the Quadratic Programming Feature Selection (QPFS) method. It proved to be very efficient on some of the problems I am working on; it scales well to large datasets, and the authors provide an implementation. Here is a link to the online publication:
This is actually a very BIG topic. Much of the recent focus in feature selection is on sparse learning; examples are the LASSO and LARS, though these are a little bit older by now.