It is possible, in general, for classification accuracy to decrease when you increase the number of features. It depends on your data: the amount of training data, the particular features, the classifier, and so on.
Would you mind posting more specifics about your problem, such as the number of training samples, the classifier used, and the input data?
First of all, thank you, Mr. Matthew, for your answer.
I totally agree with you, but consider this: the features present in, say, a 28-dimensional feature vector taken in zig-zag fashion will also be present in a 78-dimensional feature vector taken in the same fashion. Shouldn't the accuracy be preserved, then? How come there is a difference in accuracy? I am not getting it.
The number of training samples is 665, the classifier is an SVM with a radial basis function (RBF) kernel, and the input data are microscopic blood images.
I agree that the original 28 features will be present in the 78-dimensional feature vector, but in general it is not true that classifier accuracy will remain the same if we merely add new features. This is especially the case with an SVM using an RBF kernel, which can be prone to overfitting if the training parameters (the cost C and the kernel width gamma) are not set properly.
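For what it's worth, here is a minimal sketch of how one might tune C and gamma with cross-validated grid search using scikit-learn. X and y below are random placeholders standing in for your own 665 samples, not anything taken from your actual pipeline:

    # Minimal sketch: cross-validated grid search over C and gamma
    # for an RBF-kernel SVM (scikit-learn). X and y are placeholders.
    import numpy as np
    from sklearn.model_selection import GridSearchCV, StratifiedKFold
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X = np.random.rand(665, 78)       # placeholder feature matrix
    y = np.random.randint(0, 2, 665)  # placeholder labels

    # Standardising features matters for RBF kernels: the kernel width
    # is shared across dimensions, so unscaled features can dominate.
    pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))

    param_grid = {
        "svc__C": [0.1, 1, 10, 100],
        "svc__gamma": ["scale", 0.001, 0.01, 0.1],
    }
    search = GridSearchCV(pipe, param_grid, cv=StratifiedKFold(5))
    search.fit(X, y)
    print(search.best_params_, search.best_score_)

If the cross-validated score changes a lot across this grid, that is a sign your earlier accuracy differences may be driven by hyperparameters rather than by the features themselves.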
I'm more confused by the ~30% drop in accuracy going from 48 to 78 features, which is then almost all gained back going from 78 to 92 features. A massive performance drop followed by a massive performance gain is unexpected. You might want to check your code to make sure nothing went wrong.
It is important to note that, in general, by adding features you are adding complexity to the classifier. You can therefore expect a gain in performance if the added features are discriminative and there is no overfitting; otherwise, performance will probably drop. Moreover, the more features you add to the feature vector, the more training samples you will typically need.
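One quick sanity check for this trade-off is to measure cross-validated accuracy at each feature count you tried. A rough sketch, assuming scikit-learn and that the columns of X are already in your zig-zag order (X and y are again random placeholders for your own data):

    # Rough sketch: cross-validated accuracy as features are appended
    # in zig-zag order, to see where performance peaks or dips.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X = np.random.rand(665, 92)       # placeholder: columns in zig-zag order
    y = np.random.randint(0, 2, 665)  # placeholder labels

    for k in (28, 48, 78, 92):
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
        scores = cross_val_score(clf, X[:, :k], y, cv=5)
        print(f"{k:3d} features: {scores.mean():.3f} +/- {scores.std():.3f}")

If the dip at 78 features survives cross-validation with fixed hyperparameters, it is more likely a property of the features than a bug in the evaluation.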
I agree with Matthew concerning the drop in accuracy when you go from 48 to 78 features. It is unexpected to see the accuracy rise again when going from 78 to 92 features.
You may want to consider feature selection techniques to get a closer look at the discriminative power of each of the feature vector's dimensions. You can find an excellent review of these techniques at http://core.ac.uk/download/pdf/23799612.pdf
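As one simple starting point (the linked review surveys many more elaborate methods), here is a small sketch of ranking features by a univariate ANOVA F-test with scikit-learn; X and y are once more placeholders for your own data:

    # Small sketch: rank features by univariate discriminative power
    # using the ANOVA F-test (scikit-learn). X and y are placeholders.
    import numpy as np
    from sklearn.feature_selection import f_classif

    X = np.random.rand(665, 92)       # placeholder feature matrix
    y = np.random.randint(0, 2, 665)  # placeholder labels

    f_scores, p_values = f_classif(X, y)
    ranking = np.argsort(f_scores)[::-1]  # feature indices, best first
    print("Top 10 features by F-score:", ranking[:10])

Looking at where the features added between 48 and 78 land in such a ranking might help explain why that particular block hurt accuracy.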