Adaptive least squares algorithm for the efficient training of artificial neural networks

Research output: Contribution to journal › Article

1 Citation (Scopus)

Abstract

Recently, a number of publications have proposed alternative methods for the least mean square (LMS) algorithm in order to improve its convergence rate. It has also been shown that variable step-size methods can provide faster convergence than fixed step-size ones. This paper introduces a new algorithm for the on-line calculation of the step size and investigates its applicability to the training of multilayer neural networks. The proposed method appears to be efficient, at least in the case of low-level additive input noise.
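To illustrate the general idea of a variable step-size LMS update, the sketch below adapts the step size from the instantaneous error in the style of a standard VSS-LMS rule (Kwong-Johnston type). This is an illustrative example only, not the specific algorithm proposed in the paper; all parameter values (`alpha`, `gamma`, the step-size bounds) and the system-identification setup are assumptions chosen for the demo.

```python
import numpy as np

def vss_lms(x, d, num_taps=4, mu=0.05, alpha=0.97, gamma=4.8e-4,
            mu_min=0.005, mu_max=0.1):
    """Variable step-size LMS sketch (Kwong-Johnston style update,
    not the paper's algorithm). x: input signal, d: desired signal."""
    w = np.zeros(num_taps)
    errors = []
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]   # current tap-input vector
        e = d[n] - w @ u                      # instantaneous error
        w = w + mu * e * u                    # standard LMS weight update
        # adapt the step size from the squared error, kept within bounds
        mu = float(np.clip(alpha * mu + gamma * e**2, mu_min, mu_max))
        errors.append(e)
    return w, np.array(errors)

# demo: identify an assumed "unknown" 4-tap FIR system from noisy data
rng = np.random.default_rng(0)
h = np.array([0.6, -0.3, 0.2, 0.1])           # hypothetical system
x = rng.standard_normal(2000)
d = np.convolve(x, h, mode="full")[:len(x)] + 0.01 * rng.standard_normal(len(x))
w, errors = vss_lms(x, d)
```

A larger error drives a larger step size (faster adaptation), while a small residual error lets the step size decay toward its lower bound, reducing steady-state misadjustment; this trade-off is the motivation shared by variable step-size schemes.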

Original language: English
Pages (from-to): 119-129
Number of pages: 11
Journal: Periodica Polytechnica Electrical Engineering
Volume: 37
Issue number: 2
Publication status: Published - Jan 1 1993

ASJC Scopus subject areas

  • Electrical and Electronic Engineering

