Gradient computation of continuous-time cellular neural/nonlinear networks with linear templates via the CNN universal machine

Mátyás Brendel, T. Roska, Gusztáv Bártfai

Research output: Contribution to journal › Article

4 Citations (Scopus)

Abstract

Single-layer, continuous-time cellular neural/nonlinear networks (CNN) are considered with linear templates. The networks are programmed by the template parameters. A fundamental question in template training or adaptation is the computation or approximation of the gradient of the error as a function of the template parameters. Exact equations are developed for computing the gradients. These equations are similar to the CNN network equations, i.e. they have the same neighborhood and connectivity as the original CNN. It is shown that a CNN with a modified output function can compute the gradients. Thus, fast on-line gradient computation is possible via the CNN Universal Machine, which allows on-line adaptation and training. The method for computing the gradient on-chip is investigated and demonstrated.
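
The abstract's central point is that the gradient (sensitivity) equations have the same neighborhood and connectivity as the CNN state equation itself. The following sketch is only a rough illustration of that structure, not the paper's derivation or on-chip method: it integrates a single-layer continuous-time CNN with a linear 3x3 feedback template A and, in parallel, the forward sensitivities s_kl = ∂x/∂A_kl. The function names, explicit-Euler integration, and zero boundary condition are all assumptions made for the example.

```python
# Hypothetical sketch: CNN state equation plus forward-sensitivity equations
# for the 3x3 feedback template A. Names and numerical scheme are illustrative.
import numpy as np

def f(x):
    """Standard CNN piecewise-linear output function."""
    return 0.5 * (np.abs(x + 1.0) - np.abs(x - 1.0))

def f_prime(x):
    """Derivative of the output function (1 inside the linear region, 0 outside)."""
    return (np.abs(x) < 1.0).astype(x.dtype)

def conv3x3(img, T):
    """3x3 neighborhood sum with zero boundary (same connectivity as the CNN)."""
    H, W = img.shape
    p = np.pad(img, 1)
    out = np.zeros_like(img)
    for k in range(3):
        for l in range(3):
            out += T[k, l] * p[k:k + H, l:l + W]
    return out

def cnn_with_sensitivities(u, A, B, z, T=5.0, dt=0.01):
    """Integrate x' = -x + A*f(x) + B*u + z together with the sensitivity
    equations s_kl' = -s_kl + A*(f'(x) . s_kl) + shifted f(x), which use the
    same 3x3 neighborhood as the state dynamics."""
    H, W = u.shape
    x = np.zeros_like(u)
    s = np.zeros((3, 3) + u.shape)        # one sensitivity map per entry of A
    bu = conv3x3(u, B) + z                # constant input contribution
    for _ in range(int(T / dt)):
        y, dy = f(x), f_prime(x)
        dx = -x + conv3x3(y, A) + bu
        for k in range(3):
            for l in range(3):
                # the shifted output y drives the sensitivity w.r.t. A[k, l]
                yp = np.pad(y, 1)[k:k + H, l:l + W]
                ds = -s[k, l] + conv3x3(dy * s[k, l], A) + yp
                s[k, l] += dt * ds
        x += dt * dx
    return x, s

# For a quadratic error E = 0.5 * sum((f(x) - y_ref)**2) with some reference
# output y_ref (hypothetical), the template gradient would follow by the chain
# rule: dE/dA[k, l] = sum((f(x) - y_ref) * f_prime(x) * s[k, l]).
```

Because the sensitivity dynamics involve only the same 3x3 neighborhood as the state equation, they can in principle be evaluated by CNN-type hardware, which is the structural observation behind the on-chip gradient computation described in the abstract.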

Original language: English
Pages (from-to): 111-120
Number of pages: 10
Journal: Neural Processing Letters
Volume: 16
Issue number: 2
DOIs
Publication status: Published - Oct 2002

Keywords

  • Cellular neural network
  • Gradient-method
  • Learning
  • Training

ASJC Scopus subject areas

  • Artificial Intelligence
  • Neuroscience (all)
