I have been doing research on feature selection and I am failing to understand the difference between these two approaches.

According to most authors in the literature, feature selection algorithms fall into three categories. The first two, filter and wrapper, are easy to understand and there is general agreement on them. However, for the last category there seems to be some confusion. Some authors, such as H. Liu, name the last category hybrid. In contrast, V. Kumar names it embedded. In addition, there are cases where authors define four categories, including both embedded and hybrid algorithms, as is the case with P. Abinaya.

Authors explain hybrid algorithms as the combination of a filter approach and a wrapper approach. The main idea behind these algorithms is to use a filter approach to reduce the search space for a wrapper approach.
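
To make my understanding of the hybrid idea concrete, here is a small sketch of what I think such a pipeline looks like (using scikit-learn; the particular filter score, wrapper, classifier, and parameter values are just arbitrary choices on my part):

```python
# Sketch of a "hybrid" approach as I understand it: a cheap filter first
# shrinks the candidate set, then a wrapper searches among the survivors
# using a learning algorithm.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif, RFE
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, n_features=100,
                           n_informative=10, random_state=0)

# Filter step: keep the 30 features with the highest ANOVA F-score
filter_step = SelectKBest(score_func=f_classif, k=30)
X_filtered = filter_step.fit_transform(X, y)

# Wrapper step: recursive feature elimination with a classifier,
# searching only inside the 30 pre-selected features
wrapper_step = RFE(estimator=LogisticRegression(max_iter=1000),
                   n_features_to_select=10)
wrapper_step.fit(X_filtered, y)

print("Features kept by the wrapper:", wrapper_step.support_.sum())
```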

On the other hand, the definition of embedded algorithms in the literature varies a lot depending on the source. Some use almost the same definition as for hybrid algorithms, as is the case with the Wikipedia page. Others give more abstract definitions, such as: methods that perform feature selection during the learning of the optimal parameters, and methods that incorporate knowledge about the specific structure of the class of functions used by a certain learning machine.
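
If those abstract definitions are right, then I would expect something like L1-regularised regression to count as an embedded method, since the coefficients that end up at exactly zero are decided during training, without any separate search over feature subsets. Again, this is just my own sketch with arbitrary parameter values:

```python
# Sketch of what I take "embedded" to mean: selection happens inside the
# training of the model itself. L1 regularisation drives some coefficients
# to exactly zero while the optimal parameters are being learned.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso

X, y = make_regression(n_samples=200, n_features=50,
                       n_informative=5, noise=1.0, random_state=0)

model = Lasso(alpha=0.5)
model.fit(X, y)

selected = np.flatnonzero(model.coef_)  # features with non-zero weights
print("Features implicitly selected during training:", selected)
```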

So I would appreciate it if anyone could explain to me the difference between these two approaches, or give a less abstract definition of embedded methods.
