Rate of convergence of k-nearest-neighbor classification rule

Maik Döring, L. Györfi, Harro Walk

Research output: Contribution to journal › Article

1 Citation (Scopus)

Abstract

A binary classification problem is considered. The excess error probability of the k-nearest-neighbor classification rule over the error probability of the Bayes decision is revisited via a decomposition of the excess error probability into approximation and estimation errors. Under a weak margin condition and either a modified Lipschitz condition or a local Lipschitz condition, tight upper bounds are presented that avoid the assumption that the feature vector is bounded. The modified Lipschitz condition is also applied to discrete distributions. As a consequence of both concepts, we present the rate of convergence of the L2 error of the corresponding nearest neighbor regression estimate.
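For orientation, here is a minimal sketch of the kind of decomposition the abstract refers to, in standard k-nearest-neighbor notation (the symbols below are generic, not taken from the paper): let η(x) = P{Y = 1 | X = x} be the regression function, η_n the k-NN regression estimate (the average of the labels of the k nearest neighbors of x), η̄_n(x) the average of η over those same k neighbors, g_n the plug-in classifier obtained by thresholding η_n at 1/2, and L* the error probability of the Bayes decision. Then

\[
\mathbb{E}\,L(g_n) - L^{*}
\;\le\; 2\,\mathbb{E}\bigl[\,|\eta_n(X)-\eta(X)|\,\bigr]
\;\le\; \underbrace{2\,\mathbb{E}\bigl[\,|\eta_n(X)-\bar{\eta}_n(X)|\,\bigr]}_{\text{estimation error}}
\;+\; \underbrace{2\,\mathbb{E}\bigl[\,|\bar{\eta}_n(X)-\eta(X)|\,\bigr]}_{\text{approximation error}}.
\]

This displayed inequality is only the standard starting point; the paper's contribution is to bound the two terms under the weak margin condition and the modified or local Lipschitz condition without assuming that the feature vector X is bounded.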

Original language: English
Pages (from-to): 1-16
Number of pages: 16
Journal: Journal of Machine Learning Research
Volume: 18
Publication status: Published - Jun 1 2018

Keywords

  • Classification
  • Error probability
  • K-nearest-neighbor rule
  • Rate of convergence

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence

Cite this

Rate of convergence of k-nearest-neighbor classification rule. / Döring, Maik; Györfi, L.; Walk, Harro.

In: Journal of Machine Learning Research, Vol. 18, 01.06.2018, p. 1-16.

Research output: Contribution to journal › Article

@article{f55b6fdcf4544c5a8734b671b09bb6e9,
title = "Rate of convergence of k-nearest-neighbor classification rule",
abstract = "A binary classification problem is considered. The excess error probability of the k-nearest-neighbor classification rule over the error probability of the Bayes decision is revisited via a decomposition of the excess error probability into approximation and estimation errors. Under a weak margin condition and either a modified Lipschitz condition or a local Lipschitz condition, tight upper bounds are presented that avoid the assumption that the feature vector is bounded. The modified Lipschitz condition is also applied to discrete distributions. As a consequence of both concepts, we present the rate of convergence of the L2 error of the corresponding nearest neighbor regression estimate.",
keywords = "Classification, Error probability, K-nearest-neighbor rule, Rate of convergence",
author = "Maik D{\"o}ring and L. Gy{\"o}rfi and Harro Walk",
year = "2018",
month = "6",
day = "1",
language = "English",
volume = "18",
pages = "1--16",
journal = "Journal of Machine Learning Research",
issn = "1532-4435",
publisher = "Microtome Publishing",

}

TY - JOUR

T1 - Rate of convergence of k-nearest-neighbor classification rule

AU - Döring, Maik

AU - Györfi, L.

AU - Walk, Harro

PY - 2018/6/1

Y1 - 2018/6/1

N2 - A binary classification problem is considered. The excess error probability of the k-nearest-neighbor classification rule over the error probability of the Bayes decision is revisited via a decomposition of the excess error probability into approximation and estimation errors. Under a weak margin condition and either a modified Lipschitz condition or a local Lipschitz condition, tight upper bounds are presented that avoid the assumption that the feature vector is bounded. The modified Lipschitz condition is also applied to discrete distributions. As a consequence of both concepts, we present the rate of convergence of the L2 error of the corresponding nearest neighbor regression estimate.

AB - A binary classification problem is considered. The excess error probability of the k-nearest-neighbor classification rule over the error probability of the Bayes decision is revisited via a decomposition of the excess error probability into approximation and estimation errors. Under a weak margin condition and either a modified Lipschitz condition or a local Lipschitz condition, tight upper bounds are presented that avoid the assumption that the feature vector is bounded. The modified Lipschitz condition is also applied to discrete distributions. As a consequence of both concepts, we present the rate of convergence of the L2 error of the corresponding nearest neighbor regression estimate.

KW - Classification

KW - Error probability

KW - K-nearest-neighbor rule

KW - Rate of convergence

UR - http://www.scopus.com/inward/record.url?scp=85052018548&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85052018548&partnerID=8YFLogxK

M3 - Article

AN - SCOPUS:85052018548

VL - 18

SP - 1

EP - 16

JO - Journal of Machine Learning Research

JF - Journal of Machine Learning Research

SN - 1532-4435

ER -