Margin maximizing discriminant analysis

András Kocsor, K. Kovács, Csaba Szepesvári

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

31 Citations (Scopus)

Abstract

We propose a new feature extraction method called Margin Maximizing Discriminant Analysis (MMDA), which seeks to extract features suitable for classification tasks. MMDA is based on the principle that an ideal feature should convey the maximum information about the class labels, and that it should depend only on the geometry of the optimal decision boundary and not on those parts of the input distribution that play no role in shaping this boundary. Further, distinct feature components should convey unrelated information about the data. Two methods are proposed for calculating the parameters of such a projection, and they are shown to yield equivalent results. The kernel mapping idea is used to derive non-linear versions. Experiments with several real-world, publicly available data sets demonstrate that the new method yields competitive results.
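
The abstract describes the method only at a high level. As a rough illustration of the two principles it states (margin maximization and orthogonal, unrelated components), the sketch below extracts each feature direction as the normal vector of a maximum-margin separating hyperplane and then deflates the data so that subsequent directions are orthogonal. This is a minimal sketch under assumptions, not the paper's exact formulation: scikit-learn's LinearSVC is used as a stand-in for the paper's own solvers, binary labels are assumed, the deflation scheme is illustrative, and the function name mmda_features is hypothetical. The paper also derives kernelized, non-linear variants that this linear sketch does not cover.

# Minimal sketch of the MMDA idea, assuming each feature direction is the
# normal of a maximum-margin hyperplane (LinearSVC as a stand-in solver)
# and orthogonality between components is enforced by deflation.
import numpy as np
from sklearn.svm import LinearSVC

def mmda_features(X, y, n_components=2, C=1.0):
    Xd = np.asarray(X, dtype=float).copy()
    directions = []
    for _ in range(n_components):
        # Fit a soft-margin linear SVM; its weight vector is the
        # margin-maximizing direction for the (deflated) data.
        w = LinearSVC(C=C).fit(Xd, y).coef_.ravel()
        # Re-orthogonalize against earlier directions for numerical safety.
        for v in directions:
            w = w - (w @ v) * v
        w = w / np.linalg.norm(w)
        directions.append(w)
        # Deflate: remove the data's component along w, so the next
        # direction conveys information unrelated to those found so far.
        Xd = Xd - np.outer(Xd @ w, w)
    W = np.stack(directions)   # shape: (n_components, n_features)
    return X @ W.T             # projected features

# Example usage: Z = mmda_features(X_train, y_train, n_components=3)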

Original language: English
Title of host publication: Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science)
Editors: J.-F. Boulicaut, F. Esposito, D. Pedreschi, F. Giannotti
Pages: 227-238
Number of pages: 12
Volume: 3201
Publication status: Published - 2004
Event: 15th European Conference on Machine Learning, ECML 2004 - Pisa, Italy
Duration: Sep 20, 2004 - Sep 24, 2004
Link (Scopus): http://www.scopus.com/inward/record.url?scp=22944457567&partnerID=8YFLogxK

Other

Other: 15th European Conference on Machine Learning, ECML 2004
Country: Italy
City: Pisa
Period: 9/20/04 - 9/24/04

Fingerprint

Discriminant analysis
Margin
Feature extraction
Labels
Geometry
Projection
Kernel
Experiments

ASJC Scopus subject areas

  • Hardware and Architecture
  • Computer Science (all)
  • Theoretical Computer Science

Cite this

Kocsor, A., Kovács, K., & Szepesvári, C. (2004). Margin maximizing discriminant analysis. In J.-F. Boulicaut, F. Esposito, D. Pedreschi, & F. Giannotti (Eds.), Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science) (Vol. 3201, pp. 227-238).

@inproceedings{ebe5f71d395f4b64979da6b79b819823,
  title     = "Margin maximizing discriminant analysis",
  author    = "Andr{\'a}s Kocsor and K. Kov{\'a}cs and Csaba Szepesv{\'a}ri",
  editor    = "J.-F. Boulicaut and F. Esposito and D. Pedreschi and F. Giannotti",
  booktitle = "Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science)",
  volume    = "3201",
  pages     = "227--238",
  year      = "2004",
  language  = "English",
}
