Dynamic Attractors and Basin Class Capacity in Binary Neural Networks
Abstract:
The wide repertoire of attractors and basins of attraction that appear in dynamic neural networks not only serves as a model of brain activity patterns, but also creates possibilities for new computational paradigms that use attractors and their basins. To develop such paradigms, it is first critical to assess a neural network's capacity for attractors and for differing basins of attraction as a function of the number of neurons and the weights. In this paper, the authors analyze the attractors and basins of attraction of recurrent, fully connected, single-layer binary networks. They utilize the network transition graph -- a graph showing every transition from one state to another for a given neural network -- to display all oscillations and fixed-point attractors, along with their basins of attraction. Conditions are derived under which pairs of state transitions can be realized by the same network. They derive a lower bound of $2^{n^2-n}$ on the number of possible transition graphs for an n-neuron network. Simulation results show a wide variety of transition graphs and basins of attraction; networks sometimes have more attractors than neurons. The authors count thousands of basin classes -- networks with differing basins of attraction -- in networks with as few as five neurons. Dynamic networks show promise for overcoming the limitations of static neural networks through the use of dynamic attractors and their basins. The results show that dynamic networks have a high capacity for basin classes, can have more attractors than neurons, and have more stable basin boundaries than the Hopfield associative memory.
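To make the enumeration concrete, the following is a minimal Python sketch (not the authors' code) of building the transition graph for a small recurrent, fully connected binary network. The assumptions are one common convention: states in {-1, +1}^n, synchronous sign-threshold updates, no self-connections (hence n^2 - n weights, matching the exponent in the lower bound above), and a zero net input leaving a neuron unchanged. Because the dynamics are deterministic on a finite state space, every trajectory eventually revisits a state; the revisited segment is the attractor (a fixed point or an oscillation), and all states flowing into it form its basin.

```python
import itertools
import numpy as np

def next_state(state, W):
    """Synchronous update: each neuron takes the sign of its net input.
    Convention (an assumption): zero net input keeps the old state."""
    net = W @ state
    new = np.sign(net).astype(int)
    new[net == 0] = state[net == 0]
    return new

def transition_graph(W):
    """Map every state in {-1, +1}^n to its successor state."""
    n = W.shape[0]
    graph = {}
    for bits in itertools.product((-1, 1), repeat=n):
        s = np.array(bits)
        graph[bits] = tuple(int(v) for v in next_state(s, W))
    return graph

def attractors_and_basins(graph):
    """Follow each state until it revisits a state on its own path; the
    revisited segment is the attractor, and every state on the path
    belongs to that attractor's basin."""
    basins = {}
    for start in graph:
        path, seen = [], {}
        s = start
        while s not in seen:
            seen[s] = len(path)
            path.append(s)
            s = graph[s]
        cycle = tuple(sorted(path[seen[s]:]))  # canonical attractor key
        basins.setdefault(cycle, set()).update(path)
    return basins

# Example: a random 5-neuron network with no self-connections.
rng = np.random.default_rng(0)
n = 5
W = rng.normal(size=(n, n))
np.fill_diagonal(W, 0)  # assumption: n^2 - n free weights

basins = attractors_and_basins(transition_graph(W))
for cycle, basin in basins.items():
    kind = "fixed point" if len(cycle) == 1 else f"{len(cycle)}-state oscillation"
    print(f"{kind}: basin size {len(basin)}")
```

Since each of the 2^n states has exactly one successor, the basin sizes printed by this sketch always sum to 2^n, and counting the distinct attractor keys for many random weight matrices gives a direct (if brute-force) way to observe networks with more attractors than neurons.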