Is there any rule for how many feature selection methods should be applied to a particular dataset, or does it depend on the dataset? And how do we decide that number?
Hello Paramjit, a practical answer to your question is to treat the number of features as a hyper-parameter and use a validation set to find the optimum. Both the optimal number of features and the optimal combination of feature types can be found this way.
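The idea above can be sketched in plain Python: rank features by a simple relevance score, then sweep the number of retained features k and keep the k that scores best on a held-out validation split. Everything here (the synthetic data, the mean-difference score, the centroid classifier) is an illustrative assumption, not a specific library's API.

```python
# Hypothetical sketch: tune the number of retained features (k) as a
# hyper-parameter using a held-out validation set.
import random

random.seed(0)

def make_data(n=200, n_feats=6):
    # Only the first two features carry signal; the rest are noise.
    X, y = [], []
    for _ in range(n):
        label = random.randint(0, 1)
        row = [label + random.gauss(0, 0.5),   # informative
               label + random.gauss(0, 0.8),   # informative
               *[random.gauss(0, 1) for _ in range(n_feats - 2)]]  # noise
        X.append(row)
        y.append(label)
    return X, y

def relevance(X, y, j):
    # Absolute difference of per-class means: a crude relevance score.
    a = [X[i][j] for i in range(len(y)) if y[i] == 0]
    b = [X[i][j] for i in range(len(y)) if y[i] == 1]
    return abs(sum(a) / len(a) - sum(b) / len(b))

def centroid_accuracy(X_tr, y_tr, X_va, y_va, feats):
    # Nearest-class-centroid classifier restricted to the chosen features.
    cents = {}
    for c in (0, 1):
        rows = [X_tr[i] for i in range(len(y_tr)) if y_tr[i] == c]
        cents[c] = [sum(r[j] for r in rows) / len(rows) for j in feats]
    correct = 0
    for x, y_true in zip(X_va, y_va):
        pred = min((sum((x[j] - cents[c][k]) ** 2
                        for k, j in enumerate(feats)), c)
                   for c in (0, 1))[1]
        correct += pred == y_true
    return correct / len(y_va)

X, y = make_data()
split = len(X) // 2
X_tr, y_tr, X_va, y_va = X[:split], y[:split], X[split:], y[split:]

# Rank features once on the training half, then sweep k on the validation half.
ranked = sorted(range(len(X[0])), key=lambda j: -relevance(X_tr, y_tr, j))
best_k, best_acc = max(
    ((k, centroid_accuracy(X_tr, y_tr, X_va, y_va, ranked[:k]))
     for k in range(1, len(ranked) + 1)),
    key=lambda t: t[1])
print(best_k, round(best_acc, 2))
```

The same sweep works with any real scorer and model; the point is only that k is chosen by validation performance rather than by a fixed rule.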
Usually a single feature selection method is used to identify significant features among the given/generated feature vectors. The number of features to retain or discard sometimes depends on the feature selection method, but most of the time it has to be decided by the user.
If your dataset is fixed, first understand its statistical characteristics (type, origin, number of instances, features and classes, missing values, attribute types, high dependency among features, etc.) before choosing the list of feature selection algorithms to apply. This will help you shortlist an appropriate algorithm and will also narrow the selection. As far as the number of algorithms to apply to a dataset is concerned, there is no restriction on searching for the optimal FS algorithm. But if you are considering an ensemble of FS algorithms, then this factor needs to be analyzed with respect to the dataset's behavior; it is not fixed in general across all types of datasets or applications.
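One common way to build such an ensemble is rank aggregation: run several FS methods, convert each method's scores to ranks, and keep the features with the best mean rank. The method names and scores below are made up purely for illustration.

```python
# Hypothetical sketch of ensemble feature selection by mean-rank aggregation.
# Per-method relevance scores for 5 features (higher = more relevant);
# the method names are illustrative placeholders, not real library calls.
scores_by_method = {
    "chi2_like":   [0.9, 0.1, 0.7, 0.3, 0.5],
    "mi_like":     [0.8, 0.2, 0.6, 0.1, 0.4],
    "relief_like": [0.7, 0.3, 0.9, 0.2, 0.6],
}

def ranks(scores):
    # Rank 0 = most relevant feature under this method.
    order = sorted(range(len(scores)), key=lambda j: -scores[j])
    r = [0] * len(scores)
    for pos, j in enumerate(order):
        r[j] = pos
    return r

n = 5
mean_rank = [0.0] * n
for scores in scores_by_method.values():
    for j, r in enumerate(ranks(scores)):
        mean_rank[j] += r / len(scores_by_method)

# Keep the top-2 features by mean rank (k itself is dataset-dependent,
# as discussed above).
selected = sorted(range(n), key=lambda j: mean_rank[j])[:2]
print(selected)  # → [0, 2]
```

How many methods to aggregate, and how to weight them, is exactly the dataset-dependent factor mentioned above; rank aggregation just makes their combination explicit.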