Exact rate of convergence of kernel-based classification rule

Maik Döring, László Györfi, Harro Walk

Research output: Contribution to journal › Article

Abstract

A binary classification problem is considered, where the posterior probability is estimated by the nonparametric kernel regression estimate with the naive kernel. The excess error probability of the corresponding plug-in classification rule over the error probability of the Bayes decision is studied and decomposed into an approximation error and an estimation error. A general formula is derived for the approximation error. Under a weak margin condition and various smoothness conditions, tight upper bounds are presented on the approximation error. Using a Berry-Esseen type central limit theorem, a general expression for the estimation error is derived.
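As a rough illustration of the estimator and the plug-in rule described in the abstract, the sketch below implements a Nadaraya-Watson kernel regression estimate with the naive (window) kernel and the resulting classifier. This is a minimal sketch under illustrative assumptions: the bandwidth `h`, the variable names, and the synthetic data are not taken from the paper and do not represent the authors' implementation.

```python
import numpy as np

def naive_kernel_regression(x, X, Y, h):
    """Nadaraya-Watson estimate of the posterior probability P(Y=1 | X=x)
    using the naive (window) kernel K(u) = 1{||u|| <= 1}."""
    # indicator of training points within distance h of the query point x
    in_window = np.linalg.norm(X - x, axis=1) <= h
    if not in_window.any():
        return 0.0  # convention for an empty window (an assumption here)
    return Y[in_window].mean()

def plug_in_rule(x, X, Y, h):
    """Plug-in classification rule: predict label 1 iff the estimated
    posterior probability exceeds 1/2, mimicking the Bayes decision."""
    return 1 if naive_kernel_regression(x, X, Y, h) > 0.5 else 0

# Illustrative usage with synthetic data (not from the paper)
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))                             # features in R^2
Y = (X[:, 0] + 0.3 * rng.standard_normal(200) > 0).astype(int)    # binary labels
print(plug_in_rule(np.array([0.5, 0.0]), X, Y, h=0.2))
```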

Original language: English
Pages (from-to): 71-91
Number of pages: 21
Journal: Studies in Computational Intelligence
Volume: 605
DOIs
Publication status: Published - Jan 1 2016

Keywords

  • Classification error probability
  • Kernel rule
  • Lower bound
  • Margin condition
  • Upper bound

ASJC Scopus subject areas

  • Artificial Intelligence
