Robust radial basis function networks based on least trimmed squares-support vector regression

Shun Feng Su, Jin Tsong Jeng, Yue Shiang Liu, Chen Chia Chuang, I. Rudas

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

In this paper, a robust radial basis function network (RRBFN) based on least trimmed squares-support vector regression (LTS-SVR) is proposed for modeling problems in which the training data may contain outliers and noise. The proposed RRBFN approach has two stages. In stage I, outliers and large noise are trimmed via the LTS-SVR procedure so that their influence is reduced. In other words, the LTS-SVR procedure gives the RRBFN an appropriate initial structure, which helps avoid overfitting and can also lead to fast convergence. After stage I, the remaining training data are used directly to adjust the parameters of the RRBFN through a gradient-descent learning algorithm; hence, unlike M-estimate-type approaches, no extra time is needed to compute the weights of a robust cost function. Simulation results show that the proposed system outperforms both the annealing robust RBFN and the Wilcoxon generalized RBFN when the training data contain outliers and noise.
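
Below is a minimal, illustrative sketch of the two-stage idea described in the abstract. It assumes scikit-learn's SVR as a stand-in for the LTS-SVR solver, a fixed trimming fraction, k-means initialisation of Gaussian RBF centers, and a single shared width; none of these specifics are taken from the paper, which should be consulted for the actual LTS-SVR formulation and network structure.

```python
# Illustrative sketch only -- not the authors' implementation.
# Assumed choices: scikit-learn's SVR as the regression solver in stage I,
# a fixed keep_ratio for trimming, and k-means-initialised Gaussian RBF units.
import numpy as np
from sklearn.svm import SVR
from sklearn.cluster import KMeans

def lts_svr_trim(X, y, keep_ratio=0.8, n_iter=5):
    """Stage I: iteratively refit an SVR and keep only the samples with the
    smallest squared residuals (the least-trimmed-squares idea)."""
    idx = np.arange(len(y))
    for _ in range(n_iter):
        svr = SVR(kernel="rbf").fit(X[idx], y[idx])
        res2 = (y - svr.predict(X)) ** 2
        h = int(keep_ratio * len(y))
        idx = np.argsort(res2)[:h]          # drop likely outliers / large noise
    return idx

def train_rrbfn(X, y, n_centers=10, lr=1e-2, epochs=500, keep_ratio=0.8):
    """Stage II: gradient-descent training of a Gaussian RBF network on the
    trimmed data, using an ordinary (non-robust) squared-error cost."""
    idx = lts_svr_trim(X, y, keep_ratio)
    Xt, yt = X[idx], y[idx]
    centers = KMeans(n_clusters=n_centers, n_init=10).fit(Xt).cluster_centers_
    sigma = np.std(Xt) + 1e-8               # single shared width (assumption)
    w = np.zeros(n_centers)

    def phi(X_):
        # Gaussian RBF activations for every sample/center pair
        d2 = ((X_[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    for _ in range(epochs):
        P = phi(Xt)
        err = P @ w - yt
        w -= lr * P.T @ err / len(yt)        # plain gradient step on squared error
    return lambda X_: phi(X_) @ w
```

Calling train_rrbfn(X, y) returns a predictor fitted only to the trimmed samples, which mirrors the abstract's point that stage II needs no robust weighting of the cost function.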

Original language: English
Title of host publication: Proceedings of the 2013 Joint IFSA World Congress and NAFIPS Annual Meeting, IFSA/NAFIPS 2013
Pages: 1-6
Number of pages: 6
ISBN (Print): 9781479903474
DOI: 10.1109/IFSA-NAFIPS.2013.6608365
Publication status: Published - 2013
Event: 9th Joint World Congress on Fuzzy Systems and NAFIPS Annual Meeting, IFSA/NAFIPS 2013 - Edmonton, AB, Canada
Duration: Jun 24, 2013 - Jun 28, 2013

Other

Other: 9th Joint World Congress on Fuzzy Systems and NAFIPS Annual Meeting, IFSA/NAFIPS 2013
Country: Canada
City: Edmonton, AB
Period: 6/24/13 - 6/28/13

Fingerprint

Radial basis function networks
Cost functions
Learning algorithms
Annealing

ASJC Scopus subject areas

  • Computer Networks and Communications

Cite this

Su, S. F., Jeng, J. T., Liu, Y. S., Chuang, C. C., & Rudas, I. (2013). Robust radial basis function networks based on least trimmed squares-support vector regression. In Proceedings of the 2013 Joint IFSA World Congress and NAFIPS Annual Meeting, IFSA/NAFIPS 2013 (pp. 1-6). [6608365] https://doi.org/10.1109/IFSA-NAFIPS.2013.6608365
