JC: Feature selection is commonly associated with classification or regression tasks. In both scenarios, the objective is to improve some performance measure with fewer variables than the original set: for classification, the accuracy of a classifier; for regression, the MSE of a regression algorithm. The algorithm could be a simple nearest-neighbor classifier, a Naive Bayes classifier, or a more powerful one such as a Support Vector Machine or a neural network. So, in short, your objective function must be built around the classification or regression algorithm, i.e. improve its accuracy or its MSE. You must adapt the harmony search to embed this algorithm so that its performance is what gets optimized.
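A minimal sketch of such a wrapper-style objective, assuming a scikit-learn setup and that each harmony vector is encoded as a binary mask (1 = keep the feature, 0 = drop it); the k-NN classifier and the 5-fold cross-validated accuracy are just placeholder choices you would swap for your own:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def objective(mask, X, y):
    """Cross-validated accuracy of the feature subset encoded in `mask`."""
    selected = np.flatnonzero(mask)
    if selected.size == 0:              # an empty subset gets the worst score
        return 0.0
    clf = KNeighborsClassifier(n_neighbors=3)
    scores = cross_val_score(clf, X[:, selected], y, cv=5, scoring="accuracy")
    return scores.mean()                # harmony search then maximizes this value
```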
I would say cross-validation... Also, there is one thing you should keep in mind: harmony search is a stochastic algorithm. What does this mean? From one run to another, the solution may differ, so picking the single best solution could give a biased result. How do you solve this? With several runs of your harmony search experiment. For example, you could use bootstrap sampling and apply your harmony search to each bootstrap sample; at the end, some variables or features will consistently appear in the final subsets. Those variables or features are the interesting ones (see the sketch below). Hope this helps.
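A rough sketch of that bootstrap-stability idea, assuming a hypothetical `run_harmony_search(X, y)` routine (your own HS wrapper) that returns a binary feature mask; only the resampling and the frequency count are shown, and the run count and threshold are arbitrary:

```python
import numpy as np

def stable_features(X, y, run_harmony_search, n_runs=30, threshold=0.7, seed=0):
    """Count how often each feature is selected across bootstrap runs."""
    rng = np.random.default_rng(seed)
    n_samples, n_features = X.shape
    counts = np.zeros(n_features)
    for _ in range(n_runs):
        idx = rng.integers(0, n_samples, size=n_samples)   # sample with replacement
        mask = run_harmony_search(X[idx], y[idx])          # hypothetical HS wrapper, returns a 0/1 mask
        counts += np.asarray(mask, dtype=float)
    # keep the features selected in at least `threshold` of the runs
    return np.flatnonzero(counts / n_runs >= threshold)
```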
You can use the accuracy of any classifier as the objective function and set the search to maximize it, so that you obtain the best features. For example, you can use a k-NN or Naive Bayes classifier.
If you work with harmony search, you should be aware that harmony search is in fact a special case of evolution strategies, and that some results reported by its "inventor", Z.W. Geem, seem extremely unlikely: http://www.dennisweyland.net/blog/?p=12
I would like to recommend the following article; it may be helpful to you:
Mahamad Nabab Alam, Biswarup Das, Vinay Pant, A comparative study of metaheuristic optimization approaches for directional overcurrent relays coordination, Electric Power Systems Research, Volume 128, November 2015, Pages 39-52, ISSN 0378-7796, http://dx.doi.org/10.1016/j.epsr.2015.06.018.