Exact rate of convergence of kernel-based classification rule

Maik Döring, László Györfi, Harro Walk

Research output: Chapter in Book/Report/Conference proceeding › Chapter

1 Citation (Scopus)


A binary classification problem is considered, where the posterior probability is estimated by the nonparametric kernel regression estimate with the naive kernel. The excess error probability of the corresponding plug-in classification rule relative to the error probability of the Bayes decision is studied, and is decomposed into an approximation error and an estimation error. A general formula is derived for the approximation error. Under a weak margin condition and various smoothness conditions, tight upper bounds are presented on the approximation error. Via a Berry-Esseen type central limit theorem, a general expression for the estimation error is derived.
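As an informal illustration of the rule studied in the abstract, the following is a minimal sketch of a plug-in classifier based on a naive-kernel (moving-window) regression estimate of the posterior probability. All names, the tie-handling convention, and the empty-window convention are illustrative assumptions, not taken from the chapter itself.

```python
import numpy as np

def plugin_kernel_classify(X_train, y_train, x, h):
    """Plug-in classification via a naive-kernel regression estimate.

    The posterior eta(x) = P(Y = 1 | X = x) is estimated by averaging the
    labels of the training points lying within distance h of x (the naive,
    i.e. window, kernel), and the decision thresholds the estimate at 1/2,
    mimicking the Bayes decision.
    """
    # indicator of training points inside the window of radius h around x
    in_window = np.linalg.norm(X_train - x, axis=1) <= h
    if not in_window.any():
        eta_hat = 0.0  # illustrative convention for an empty window
    else:
        eta_hat = y_train[in_window].mean()
    return 1 if eta_hat > 0.5 else 0
```

The excess error probability of such a rule over the Bayes error is what the chapter decomposes into an approximation part (bias of the window average) and an estimation part (random fluctuation of the average), with the bandwidth h governing the trade-off.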

Original language: English
Title of host publication: Challenges in Computational Statistics and Data Mining
Publisher: Springer International Publishing
Number of pages: 21
ISBN (Print): 9783319187815, 9783319187808
Publication status: Published - Jul 7 2015



Keywords

  • Classification error probability
  • Kernel rule
  • Lower bound
  • Margin condition
  • Upper bound

ASJC Scopus subject areas

  • Computer Science(all)
  • Mathematics(all)

Cite this

Döring, M., Györfi, L., & Walk, H. (2015). Exact rate of convergence of kernel-based classification rule. In Challenges in Computational Statistics and Data Mining (Vol. 605, pp. 71-91). Springer International Publishing. https://doi.org/10.1007/978-3-319-18781-5_5