Exploiting the functional training approach in B-splines

António E. Ruano, Cristiano L. Cabrita, Pedro M. Ferreira, L. Kóczy

Research output: Chapter in Book/Report/Conference proceeding - Conference contribution

2 Citations (Scopus)

Abstract

When used for function approximation purposes, neural networks belong to a class of models whose parameters can be separated into linear and nonlinear, according to their influence on the model output. This concept of parameter separability can also be applied when the training problem is formulated as the minimization of the integral of the (functional) squared error over the input domain. Using this approach, the computation of the gradient involves terms that depend only on the model and the input domain, and terms that are the projection of the target function onto the basis functions and onto their derivatives with respect to the nonlinear parameters, over the input domain. These latter terms can be computed numerically from the data. The use of the functional approach is introduced here for B-splines. An example shows that, besides substantial savings in computational complexity, this approach obtains better results than the standard, discrete technique, because the performance surface employed is closer to the one induced by the function underlying the data. In some cases, as shown in the example, a complete analytical solution can be found.
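The separability idea in the abstract can be sketched for the linear weights of a fixed-knot spline model: the Gram matrix of basis-function inner products depends only on the model and the input domain, while the right-hand side is the projection of the target onto the basis, estimated numerically from the data. The sketch below (an illustration of this decomposition, not the paper's algorithm; the degree-1 hat basis, the sine target, and all names are assumptions) compares the functional solution with ordinary discrete least squares:

```python
import numpy as np

# Degree-1 B-spline ("hat") basis on uniform knots over [0, 1].
def hat_basis(x, knots):
    h = knots[1] - knots[0]
    B = np.zeros((len(x), len(knots)))
    for i, k in enumerate(knots):
        B[:, i] = np.clip(1.0 - np.abs(x - k) / h, 0.0, None)
    return B

n_knots = 9
knots = np.linspace(0.0, 1.0, n_knots)
h = knots[1] - knots[0]

# Model-only term: Gram matrix G_ij = integral of B_i(x) B_j(x) over [0, 1],
# known in closed form for the hat basis (the 1-D FEM "mass matrix").
G = np.diag(np.full(n_knots, 2.0 * h / 3.0))
G[0, 0] = G[-1, -1] = h / 3.0                 # boundary half-hats
off = np.full(n_knots - 1, h / 6.0)
G += np.diag(off, 1) + np.diag(off, -1)

# Noisy samples of an underlying target function (assumed for illustration).
rng = np.random.default_rng(0)
x_data = np.sort(rng.uniform(0.0, 1.0, 400))
y_data = np.sin(2 * np.pi * x_data) + 0.05 * rng.standard_normal(400)

# Data-dependent term: projections p_i = integral of t(x) B_i(x) dx,
# estimated from the samples by the trapezoidal rule.
Bd = hat_basis(x_data, knots)
f = y_data[:, None] * Bd
p = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x_data)[:, None], axis=0)

# Functional (integral) solution vs. standard discrete least squares.
w_functional = np.linalg.solve(G, p)
w_discrete, *_ = np.linalg.lstsq(Bd, y_data, rcond=None)

x_test = np.linspace(0.0, 1.0, 200)
Bt = hat_basis(x_test, knots)
err_f = np.max(np.abs(Bt @ w_functional - np.sin(2 * np.pi * x_test)))
err_d = np.max(np.abs(Bt @ w_discrete - np.sin(2 * np.pi * x_test)))
```

Note that only the projection vector `p` touches the data; `G` can be precomputed (here analytically) once per model structure, which is the source of the computational savings the abstract refers to. The paper extends the same decomposition to gradients with respect to the nonlinear (knot) parameters, which this sketch does not cover.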

Original language: English
Title of host publication: IFAC Proceedings Volumes (IFAC-PapersOnline)
Pages: 127-132
Number of pages: 6
DOI: 10.3182/20120403-3-DE-3010.00070
ISBN: 9783902661975
Publication status: Published - 2012
Event: 1st IFAC Conference on Embedded Systems, Computational Intelligence and Telematics in Control, CESCIT 2012 - Würzburg, Germany
Duration: Apr 3, 2012 - Apr 5, 2012



Keywords

  • Functional back-propagation
  • Neural networks training
  • Parameter separability

ASJC Scopus subject areas

  • Control and Systems Engineering

Cite this

Ruano, A. E., Cabrita, C. L., Ferreira, P. M., & Kóczy, L. (2012). Exploiting the functional training approach in B-splines. In IFAC Proceedings Volumes (IFAC-PapersOnline) (pp. 127-132). https://doi.org/10.3182/20120403-3-DE-3010.00070

