Self-organizing neural networks with Hebbian and anti-Hebbian learning rules were found to be robust against variations in the parameters of the network's neurons, such as neural activities, learning rates, and input noise. Robustness was evaluated with respect to the properties of soft competition for input correlations. Two models were studied: a neural network with presynaptic Hebbian learning and a similar network that models growing connections. Both networks were trained on pixel-discretized two-dimensional images. When extended local objects of the two-dimensional world, placed at random positions, are presented to the network, the neurons develop spatial filters, and the inhibitory connections between neurons come to reflect the neighboring relation; i.e., the network builds up an internal representation of the geometry of the external world. Both networks were found to be robust against input noise and variations in the network parameters. A lower estimate of the change in all parameters, and of the pixel noise, that does not deteriorate performance is ±10%, making the system attractive for analog hardware implementation. It was also found that a small amount of input noise and small variations in the network parameters can further improve the geometry representation.
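The abstract's combination of Hebbian feedforward learning with anti-Hebbian lateral inhibition can be sketched as follows. This is our illustration of the general technique (Földiák-style soft competition), not the paper's exact model; the network sizes, learning rates, and clipping bound are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 16, 4                       # input pixels, output neurons (assumed sizes)
W = rng.normal(0, 0.1, (n_out, n_in))     # feedforward (Hebbian) weights
Q = np.zeros((n_out, n_out))              # lateral (anti-Hebbian, inhibitory) weights
eta_w, eta_q = 0.05, 0.02                 # learning rates (assumed values)

def step(x, W, Q, n_iter=20):
    """One stimulus presentation: settle activities, then update weights."""
    y = W @ x
    for _ in range(n_iter):               # relax activities under lateral inhibition
        y = W @ x + Q @ y
    # Hebbian feedforward update, with row normalization to bound the weights
    W = W + eta_w * np.outer(y, x)
    W /= np.linalg.norm(W, axis=1, keepdims=True)
    # Anti-Hebbian lateral update: co-active neurons inhibit each other more
    Q = Q - eta_q * np.outer(y, y)
    np.fill_diagonal(Q, 0.0)              # no self-inhibition
    Q = np.clip(Q, -0.2, 0.0)             # keep connections inhibitory and the relaxation stable
    return W, Q

for _ in range(100):
    x = rng.random(n_in)                  # stand-in for a pixel-discretized image patch
    W, Q = step(x, W, Q)
```

The anti-Hebbian term decorrelates the output neurons, so that neurons responding to overlapping (neighboring) input regions develop the strongest mutual inhibition; this is the sense in which the inhibitory connections encode the neighboring relation.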
- Number of pages: 19
- Journal: Neural Network World
- Publication status: Published - Dec 1 1994
ASJC Scopus subject areas
- Hardware and Architecture
- Artificial Intelligence