Exact rate of convergence of kernel-based classification rule

Maik Döring, László Györfi, Harro Walk

Research output: Chapter in Book/Report/Conference proceeding › Chapter

1 Citation (Scopus)

Abstract

A binary classification problem is considered, in which the a posteriori probability is estimated by a nonparametric kernel regression estimate with the naive kernel. The excess error probability of the corresponding plug-in classification rule over the error probability of the Bayes decision is studied and decomposed into an approximation error and an estimation error. A general formula is derived for the approximation error. Under a weak margin condition and various smoothness conditions, tight upper bounds on the approximation error are presented. Using a Berry-Esseen type central limit theorem, a general expression for the estimation error is obtained.
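
To make the setup concrete, the following Python sketch (not taken from the chapter; the bandwidth h, the Euclidean norm, and the default decision on an empty window are illustrative assumptions) shows a plug-in classifier built on the nonparametric kernel regression estimate with the naive (window) kernel: the posterior probability is estimated by the fraction of positive labels among training points within distance h of the query point, and class 1 is predicted when this estimate exceeds 1/2.

```python
import numpy as np

def naive_kernel_plugin_classifier(X_train, y_train, x, h):
    """Plug-in classification with the naive (window) kernel.

    Estimates the posterior P(Y = 1 | X = x) by averaging the labels of
    training points within distance h of x, then decides class 1 iff the
    estimate exceeds 1/2. The bandwidth h and the default vote on an
    empty window are illustrative choices, not the chapter's settings.
    """
    # Indicator of the closed ball of radius h around the query point x.
    in_window = np.linalg.norm(X_train - x, axis=1) <= h
    if not in_window.any():
        return 0  # arbitrary default when no training point falls in the window
    # Nonparametric kernel regression estimate of the posterior probability.
    posterior_estimate = y_train[in_window].mean()
    # Plug-in decision: mimic the Bayes rule with the estimated posterior.
    return int(posterior_estimate > 0.5)

# Example usage with synthetic data.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 2))
y_train = (X_train[:, 0] + rng.normal(scale=0.5, size=200) > 0).astype(int)
print(naive_kernel_plugin_classifier(X_train, y_train, np.array([1.0, 0.0]), h=0.5))
```

The excess error probability studied in the chapter is the gap between the error probability of such a plug-in rule and that of the Bayes decision, which the authors split into an approximation error and an estimation error.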

Original language: English
Title of host publication: Challenges in Computational Statistics and Data Mining
Publisher: Springer International Publishing
Pages: 71-91
Number of pages: 21
Volume: 605
ISBN (Print): 9783319187815, 9783319187808
DOI: 10.1007/978-3-319-18781-5_5
Publication status: Published - Jul 7 2015

Keywords

  • Classification error probability
  • Kernel rule
  • Lower bound
  • Margin condition
  • Upper bound

ASJC Scopus subject areas

  • Computer Science (all)
  • Mathematics (all)


Cite this

Döring, M., Györfi, L., & Walk, H. (2015). Exact rate of convergence of kernel-based classification rule. In Challenges in Computational Statistics and Data Mining (Vol. 605, pp. 71-91). Springer International Publishing. https://doi.org/10.1007/978-3-319-18781-5_5