Abstracting spatial prototypes through short-term suppression of Hebbian weights in a continuously changing environment

S. Tavitian, T. Fomin, A. Lőrincz

Research output: Contribution to journal › Article

Abstract

A step towards assumption-free self-organization is proposed. We address the problem of learning a stable representation of the environment from inputs that continuously change at an unpredictable rate. The vulnerability of competitive Hebbian learning to low rate changes is assessed. It is shown that anti-Hebbian suppression of the feed-forward Hebbian weights broadens the range of rates in which learning is possible, and reduces the influence of the rate on the emerging representation. The resulting robustness during real-time training is demonstrated through simulations, and is compared to an alternative non-synaptic suppression scheme. Some particular passive short-term response properties of high-level visual areas are pointed out as biological clues for this form of short-term plasticity. A question is raised about a possible stabilizing role of the proposed mechanism in the learning of invariances.

Original language: English
Pages (from-to): 707-727
Number of pages: 21
Journal: Neural Network World
Volume: 7
Issue number: 6
Publication status: Published - 1997

ASJC Scopus subject areas

  • Software

Cite this

Abstracting spatial prototypes through short-term suppression of Hebbian weights in a continuously changing environment. / Tavitian, S.; Fomin, T.; Lőrincz, A.

In: Neural Network World, Vol. 7, No. 6, 1997, p. 707-727.

Research output: Contribution to journal › Article

@article{3a35feb2efac4d68b83039929a46871f,
  title = "Abstracting spatial prototypes through short-term suppression of Hebbian weights in a continuously changing environment",
  abstract = "A step towards assumption-free self-organization is proposed. We address the problem of learning a stable representation of the environment from inputs that continuously change at an unpredictable rate. The vulnerability of competitive Hebbian learning to low rate changes is assessed. It is shown that anti-Hebbian suppression of the feed-forward Hebbian weights broadens the range of rates in which learning is possible, and reduces the influence of the rate on the emerging representation. The resulting robustness during real-time training is demonstrated through simulations, and is compared to an alternative non-synaptic suppression scheme. Some particular passive short-term response properties of high-level visual areas are pointed out as biological clues for this form of short-term plasticity. A question is raised about a possible stabilizing role of the proposed mechanism in the learning of invariances.",
  author = "S. Tavitian and T. Fomin and A. Lőrincz",
  year = "1997",
  language = "English",
  volume = "7",
  pages = "707--727",
  journal = "Neural Network World",
  issn = "1210-0552",
  publisher = "Institute of Computer Science",
  number = "6",
}

TY - JOUR
T1 - Abstracting spatial prototypes through short-term suppression of Hebbian weights in a continuously changing environment
AU - Tavitian, S.
AU - Fomin, T.
AU - Lőrincz, A.
PY - 1997
Y1 - 1997
N2 - A step towards assumption-free self-organization is proposed. We address the problem of learning a stable representation of the environment from inputs that continuously change at an unpredictable rate. The vulnerability of competitive Hebbian learning to low rate changes is assessed. It is shown that anti-Hebbian suppression of the feed-forward Hebbian weights broadens the range of rates in which learning is possible, and reduces the influence of the rate on the emerging representation. The resulting robustness during real-time training is demonstrated through simulations, and is compared to an alternative non-synaptic suppression scheme. Some particular passive short-term response properties of high-level visual areas are pointed out as biological clues for this form of short-term plasticity. A question is raised about a possible stabilizing role of the proposed mechanism in the learning of invariances.
AB - A step towards assumption-free self-organization is proposed. We address the problem of learning a stable representation of the environment from inputs that continuously change at an unpredictable rate. The vulnerability of competitive Hebbian learning to low rate changes is assessed. It is shown that anti-Hebbian suppression of the feed-forward Hebbian weights broadens the range of rates in which learning is possible, and reduces the influence of the rate on the emerging representation. The resulting robustness during real-time training is demonstrated through simulations, and is compared to an alternative non-synaptic suppression scheme. Some particular passive short-term response properties of high-level visual areas are pointed out as biological clues for this form of short-term plasticity. A question is raised about a possible stabilizing role of the proposed mechanism in the learning of invariances.
UR - http://www.scopus.com/inward/record.url?scp=0031383586&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=0031383586&partnerID=8YFLogxK
M3 - Article
AN - SCOPUS:0031383586
VL - 7
SP - 707
EP - 727
JO - Neural Network World
JF - Neural Network World
SN - 1210-0552
IS - 6
ER -