My broader goal is to develop a theoretical framework for analyzing computation and information flow in dynamical neuronal networks of the sort found in the brain.

I need a place to start, so I'm examining the nature of the solutions found by trained recurrent neural networks. I've begun with the space of solutions for a delayed XOR task.
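For concreteness, here is one way the task can be posed (a sketch only; the specific delay d and the binary encoding are assumptions, since nothing here pins them down): the target at time t is the XOR of the input at time t and the input d steps earlier.

    import numpy as np

    def delayed_xor_sequence(T, d=2, seed=0):
        # Random binary input stream; target at time t is x[t] XOR x[t-d].
        rng = np.random.default_rng(seed)
        x = rng.integers(0, 2, size=T)
        y = np.zeros(T, dtype=int)
        y[d:] = x[d:] ^ x[:-d]
        return x, y

    x, y = delayed_xor_sequence(20, d=2)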

What's interesting is that although there appear to be a large number of solutions, once you account for the permutation symmetry of the weight matrix and eliminate weight matrices that are mere permutations of one another, you are left with orders of magnitude fewer distinct solutions.[1]
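To make "mere permutations of one another" concrete: two hidden-to-hidden matrices describe the same network up to relabelling of hidden units when one is P W P^T for a permutation matrix P (with the input and output weights permuted correspondingly). A minimal brute-force canonicalization sketch, assuming the networks are small enough to enumerate permutations and that only the recurrent matrix is compared:

    import itertools
    import numpy as np

    def canonical_form(W, decimals=6):
        # Lexicographically smallest relabelling P W P^T over all hidden-unit
        # permutations; feasible only when the number of units is small.
        n = W.shape[0]
        best_key, best_W = None, None
        for perm in itertools.permutations(range(n)):
            idx = np.array(perm)
            Wp = W[np.ix_(idx, idx)]            # permute rows and columns together
            key = tuple(np.round(Wp, decimals).ravel())
            if best_key is None or key < best_key:
                best_key, best_W = key, Wp
        return best_W

    def count_distinct(solutions, tol=1e-3):
        # Greedy deduplication of canonicalized matrices.
        reps = []
        for W in solutions:
            Wc = canonical_form(W)
            if not any(np.linalg.norm(Wc - R) < tol for R in reps):
                reps.append(Wc)
        return len(reps)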

I'm wondering whether any papers actually analyze the solution space of an artificial neural network for a particular task and try to understand exactly what is going on.

[1] This diagram clusters the solution matrices from 1000 different trainings of a delayed XOR recurrent network (implementing the algorithm in the classic paper "A Learning Algorithm for Continually Running Fully Recurrent Neural Networks", i.e. Williams and Zipser's real-time recurrent learning).
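A sketch of one way such a clustering can be produced (the choices here are illustrative: Euclidean distance between canonicalized weight matrices and SciPy's average-linkage hierarchical clustering):

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist

    def cluster_solutions(canonical_mats, threshold=0.5):
        # canonical_mats: weight matrices already reduced to a canonical form
        # (e.g. as in the permutation sketch above), one per trained network.
        X = np.stack([W.ravel() for W in canonical_mats])
        D = pdist(X, metric='euclidean')    # pairwise distances between solutions
        Z = linkage(D, method='average')    # agglomerative (average-linkage) tree
        return fcluster(Z, t=threshold, criterion='distance')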
