Robustness of Hebbian and anti-Hebbian learning

Tibor Fomin, Andras Lorincz

Research output: Contribution to journal › Article

Abstract

Self-organizing neural networks with Hebbian and anti-Hebbian learning rules were found to be robust against variations in the parameters of the network's neurons, such as neural activities, learning rates, and noisy inputs. Robustness was evaluated in terms of the properties of soft competition for input correlations. Two models were studied: a neural network with presynaptic Hebbian learning and a similar network that models growing connections. Both networks were trained on pixel-discretized two-dimensional images. If extended local objects of the two-dimensional world, placed at random positions, are presented to the network, the neurons develop spatial filters with inhibitory connections between neurons that correspond to the neighboring relation; that is, the network builds up an internal representation of the geometry of the external world. Both networks were found to be robust against input noise and variations in network parameters. A lower estimate for the change in all of the parameters and in pixel noise that does not deteriorate performance is ±10%, making the system attractive for analog hardware implementation. It was also found that a small amount of input noise and small variations in the network parameters can further improve the geometry representation.
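The combination described in the abstract — Hebbian feedforward learning together with anti-Hebbian (inhibitory) lateral connections that decorrelate neurons — can be illustrated with a minimal sketch. This is not the authors' exact model; the network sizes, learning rates, input statistics, and the Oja-style weight normalization are all assumptions chosen for a self-contained toy example:

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 16, 4            # assumed sizes: 16 input pixels, 4 neurons
eta_h, eta_a = 0.01, 0.01      # assumed Hebbian / anti-Hebbian learning rates

W = rng.normal(scale=0.1, size=(n_out, n_in))  # feedforward weights
L = np.zeros((n_out, n_out))                   # lateral inhibitory weights

for _ in range(500):
    x = rng.random(n_in)       # stand-in for one noisy pixel image
    y0 = W @ x                 # feedforward activity
    y = y0 - L @ y0            # one-step lateral inhibition

    # Hebbian update with Oja-style decay to keep weight norms bounded
    W += eta_h * (np.outer(y, x) - (y ** 2)[:, None] * W)

    # anti-Hebbian update: lateral weights leakily track output correlations,
    # which pushes the neurons toward decorrelated responses
    L += eta_a * (np.outer(y, y) - L)
    np.fill_diagonal(L, 0.0)             # no self-inhibition
    L = np.clip(L, 0.0, None)            # keep lateral connections inhibitory
```

In the paper's setting, inputs are extended local objects at random positions rather than uniform noise, and under soft competition the learned lateral inhibition ends up reflecting the neighboring relation between the neurons' spatial filters.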

Original language: English
Pages (from-to): 699-717
Number of pages: 19
Journal: Neural Network World
Volume: 4
Issue number: 6
Publication status: Published - Dec 1 1994

ASJC Scopus subject areas

  • Software
  • Neuroscience(all)
  • Hardware and Architecture
  • Artificial Intelligence

