You may want to look at some of our work from recent years. The papers I am pointing to are listed at http://www.ece.rice.edu/~erzsebet/publications-EMerenyi.pdf (and probably also here on ResearchGate). These studies address complex cluster structures in high-dimensional data: improved representation and visualization of an SOM's knowledge for better cluster identification (Tasdemir and Merenyi, IEEE TNN, 2009); a new similarity measure for SOM prototypes (weights) for better clustering and cluster verification (the CONNindex; Tasdemir and Merenyi, IEEE Trans. Sys., Man, and Cyb. B, 2011); and advances in topology preservation measures (the Weighted Differential Topographic Function; Zhang and Merenyi, ESANN 2006; and Merenyi et al., "Learning Highly Structured Manifolds ...", Springer book chapter, 2009). We also looked at forced magnification under theoretically unsupported conditions, i.e., for high-dimensional data with correlated dimensions (Merenyi et al., "Explicit magnification control ...", IEEE TNN, 2007), which can improve the discovery potential for small, interesting clusters among large ones. These publications point to work by many other people, of course.
A very interesting line of SOM-related research is GRLVQ (Generalized Relevance Learning Vector Quantization), a supervised relevance-learning approach that derives the relative importance of input features. (More precisely, it is a derivative of the Learning Vector Quantization paradigm rather than of the SOM itself.) Matrix Relevance Learning Vector Quantization (MRLVQ) has proven particularly successful lately. Search for names like Michael Biehl, Petra Schneider, and Kerstin Bunte for MRLVQ, and Barbara Hammer, Thomas Villmann, and Michael Mendenhall for GRLVQ-related papers.
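To make the relevance-learning idea concrete, here is a minimal sketch of one GRLVQ-style update step. This is my own simplified illustration, not the authors' reference implementation: the function name, learning rates, and the choice of identity transfer function in the cost term are all my assumptions; consult the Hammer/Villmann papers for the actual formulation.

```python
import numpy as np

def grlvq_step(x, label, protos, proto_labels, lam, lr_w=0.05, lr_l=0.01):
    """One simplified GRLVQ update step (schematic).

    protos: (k, d) prototype vectors; lam: (d,) relevance weights summing to 1.
    Finds the closest correct and closest incorrect prototype under the
    relevance-weighted squared distance, attracts/repels them, and adapts lam.
    """
    # relevance-weighted squared distances to all prototypes
    d = ((x - protos) ** 2 * lam).sum(axis=1)
    correct = proto_labels == label
    jp = np.flatnonzero(correct)[d[correct].argmin()]    # best matching correct
    jm = np.flatnonzero(~correct)[d[~correct].argmin()]  # best matching incorrect
    dp, dm = d[jp], d[jm]
    denom = (dp + dm) ** 2
    # gradient factors of the relative-distance cost (dp - dm) / (dp + dm)
    gp = 2.0 * dm / denom
    gm = 2.0 * dp / denom
    protos[jp] += lr_w * gp * lam * (x - protos[jp])     # attract correct prototype
    protos[jm] -= lr_w * gm * lam * (x - protos[jm])     # repel incorrect prototype
    # relevance update: down-weight dimensions that increase the cost
    lam = lam - lr_l * (gp * (x - protos[jp]) ** 2 - gm * (x - protos[jm]) ** 2)
    lam = np.clip(lam, 0.0, None)
    lam /= lam.sum()                                     # keep relevances normalized
    return protos, lam
```

The key point the sketch shows: the same gradient signal that moves the prototypes also reshapes the metric itself, so irrelevant input dimensions lose influence over winner selection as training proceeds.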
There is still a pressing need for an efficient SOM-like mechanism for mixed-type data.
Also quite interesting is the coupling of the SOM mechanism with some other (highly abstracted) plausible biological mechanism. Habituation is an interesting candidate, with the aim of obtaining a dynamic magnification process in which repetitive patterns receive less and less attention from the map, and/or a dynamic expansion of the map to accommodate new patterns:
A Comparison Between Habituation and Conscience Mechanism in Self-Organizing Maps
Riccardo Rizzo and Antonio Chella
IEEE TRANSACTIONS ON NEURAL NETWORKS, VOL. 17, NO. 3, MAY 2006
Abstract—In this letter, a preliminary study of habituation in self-organizing networks is reported. The habituation model implemented allows us to obtain a faster learning process and better clustering performances. The habituable neuron is a generalization of the typical neuron and can be used in many self-organizing network models. The habituation mechanism is implemented in a SOM and the clustering performances of the network are compared to the conscience learning mechanism that follows roughly the same principle but is less sophisticated.
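For readers unfamiliar with the conscience mechanism that the paper compares against: the idea (due to DeSieno) is to penalize units that win too often, so over-used prototypes get "less attention" and under-used ones can win, which is roughly the effect habituation achieves by a more biologically grounded route. A minimal sketch of conscience-biased winner selection follows; the bias factor and frequency step size are illustrative values of my choosing.

```python
import numpy as np

def conscience_winner(x, weights, win_freq, bias_factor=10.0, freq_step=0.001):
    """Winner selection with a DeSieno-style conscience bias (schematic).

    weights: (n, d) SOM prototype vectors.
    win_freq: (n,) running estimate of each unit's winning frequency.
    Units winning more often than the fair share 1/n accumulate a penalty
    added to their distance, handicapping them in future competitions.
    """
    n = len(weights)
    d = np.linalg.norm(weights - x, axis=1) ** 2
    bias = bias_factor * (win_freq - 1.0 / n)  # penalty for frequent winners
    winner = int(np.argmin(d + bias))
    # exponential running update of the winning-frequency estimates
    win_freq *= (1.0 - freq_step)
    win_freq[winner] += freq_step
    return winner, win_freq
```

Note that, unlike habituation, this bias acts only on winner selection and has no per-neuron dynamics of its own, which is one sense in which the letter calls it "less sophisticated."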
or
A self-organising network that grows when required
Stephen Marsland, Jonathan Shapiro, Ulrich Nehmzow
Neural Networks 15 (2002) 1041–1058
Abstract
The ability to grow extra nodes is a potentially useful facility for a self-organising neural network. A network that can add nodes into its map space can approximate the input space more accurately, and often more parsimoniously, than a network with predefined structure and size, such as the Self-Organising Map. In addition, a growing network can deal with dynamic input distributions. Most of the growing networks that have been proposed in the literature add new nodes to support the node that has accumulated the highest error during previous iterations or to support topological structures. This usually means that new nodes are added only when the number of iterations is an integer multiple of some pre-defined constant, λ.
This paper suggests a way in which the learning algorithm can add nodes whenever the network in its current state does not sufficiently match the input. In this way the network grows very quickly when new data is presented, but stops growing once the network has matched the data. This is particularly important when we consider dynamic data sets, where the distribution of inputs can change to a new regime after some time.
We also demonstrate the preservation of neighbourhood relations in the data by the network. The new network is compared to an existing growing network, the Growing Neural Gas (GNG), on an artificial dataset, showing how the network deals with a change in input distribution after some time. Finally, the new network is applied to several novelty detection tasks and is compared with both the GNG and an unsupervised form of the Reduced Coulomb Energy network on a robotic inspection task, and with a Support Vector Machine on two benchmark novelty detection tasks.
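The growth criterion described in the abstract can be sketched very compactly. In the Grow-When-Required network, a new node is inserted whenever the best-matching node both fits the input poorly (low activity) and is already well trained (low habituation counter), rather than on a fixed iteration schedule. The threshold values and parameter names below are mine, chosen for illustration:

```python
import numpy as np

def gwr_should_grow(x, winner_w, winner_habituation,
                    activity_threshold=0.8, habituation_threshold=0.1):
    """Grow-When-Required node-insertion test (schematic).

    activity is exp(-distance), so it lies in (0, 1] with 1 meaning a
    perfect match between the input x and the winning node's weights.
    The map grows only when the winner matches poorly AND has already
    been trained enough (its habituation counter has decayed) that more
    training of the existing node is unlikely to fix the mismatch.
    """
    activity = np.exp(-np.linalg.norm(x - winner_w))
    return (activity < activity_threshold
            and winner_habituation < habituation_threshold)
```

This is what lets the network grow in a burst when a new input regime appears and then stop once the new region is covered, the behaviour the abstract emphasizes for dynamic data sets.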
There is a workshop series on SOMs (and also LVQ) where recent developments are discussed. The 10th workshop, WSOM 2014, will be held in July 2014 in Germany; see www.WSOM2014.de