Keywords (12): Curse of Dimensionality, Dimensional Analysis, Gaussian Distribution, Linear Discriminant Analysis, Lower and Upper Bound, Machine Learning, Multivariate Normal Distribution, Normal Distribution, Probability Density Function, Probability of Error, Synthetic Data, Error Rate
A one-dimensional analysis for the probability of error of linear classifiers for normally distributed classes (Citations: 2)
Luis Rueda
Computing the probability of error is an important problem in evaluating classifiers. When dealing with normally distributed classes, this problem becomes intricate because there is no closed-form expression for integrating the probability density function. In this paper, we derive lower and upper bounds for the probability of error of a linear classifier, where the random vectors representing the underlying classes obey the multivariate normal distribution. The expression of the error is derived in the one-dimensional space, independently of the dimensionality of the original problem. Based on the two bounds, we propose an approximating expression for the error of a generic linear classifier. In particular, we derive the corresponding bounds and the expression for approximating the error of Fisher's classifier. Our empirical results on synthetic data, including samples of up to two hundred dimensions, show that the computations for the error are extremely fast and quite accurate; the approximation differs from the actual error by at most 0.0184340683. The scheme has also been successfully tested on real-life data sets drawn from the UCI machine learning repository.
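The one-dimensional reduction the abstract relies on is standard: for any linear rule, the projection w·x + b of a multivariate normal vector is itself univariate normal, so the error of the rule is an exact one-dimensional expression in the normal CDF. A minimal sketch of that idea (the function names and example parameters are illustrative, not the paper's own code or its bound/approximation formulas):

```python
import numpy as np
from math import erf, sqrt

def normal_cdf(z):
    # Standard normal CDF Phi(z), via the error function.
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def linear_classifier_error(w, b, mu1, S1, mu2, S2, p1=0.5):
    """Probability of error of the rule 'assign class 1 if w @ x + b > 0'
    for x | class1 ~ N(mu1, S1), x | class2 ~ N(mu2, S2), prior p1 on class 1.
    Under each class, w @ x + b is univariate normal with mean w @ mu_i + b
    and variance w @ S_i @ w, so the error is a one-dimensional integral."""
    m1, s1 = float(w @ mu1 + b), sqrt(float(w @ S1 @ w))
    m2, s2 = float(w @ mu2 + b), sqrt(float(w @ S2 @ w))
    # Class 1 errs when the projection falls at or below zero; class 2 errs above it.
    return p1 * normal_cdf(-m1 / s1) + (1.0 - p1) * (1.0 - normal_cdf(-m2 / s2))

# Example: two spherical Gaussian classes and a Fisher-style direction
# w = (S1 + S2)^{-1} (mu1 - mu2), with the threshold at the projected midpoint.
mu1, mu2 = np.array([0.0, 0.0]), np.array([2.0, 2.0])
S1, S2 = np.eye(2), np.eye(2)
w = np.linalg.solve(S1 + S2, mu1 - mu2)
b = -float(w @ (mu1 + mu2)) / 2.0
err = linear_classifier_error(w, b, mu1, S1, mu2, S2)
```

For this symmetric example the expression collapses to Phi(-sqrt(2)), roughly 0.079; the paper's contribution is bounding and approximating this quantity cheaply in the general (unequal-covariance) case, where no closed form exists.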
Journal: Pattern Recognition (PR), vol. 38, no. 8, pp. 1197-1207, 2005
DOI: 10.1016/j.patcog.2004.12.002
Citation Context (1)
...The complete proof of this result can be found in [9]...
...Adding (10) and (11), and rearranging, the result follows. The complete proof can be found in [9]...
...The algebraic expression for the error is stated in the following theorem, whose proof can be found in [9]...
...Also, using the algebraic analysis of the probability of error discussed in the previous subsection, bounds and approximations for the error for Fisher's classifier can be found (see [9])...
Luís G. Rueda. New Bounds and Approximations for the Error of Linear Classifiers
References (14)
Introduction to statistical pattern recognition (Citations: 4859)
K. Fukunaga. Published in 1990.
A PAC-Bayesian margin bound for linear classifiers (Citations: 15)
Ralf Herbrich, Thore Graepel. Journal: IEEE Transactions on Information Theory (TIT), vol. 48, no. 12, pp. 3140-3150, 2002
Bayes error evaluation of the Gaussian ML classifier (Citations: 21)
Chulhee Lee, Euisun Choi. Journal: IEEE Transactions on Geoscience and Remote Sensing, vol. 38, no. 3, pp. 1471-1475, 2000
A Linear Classifier for Gaussian Class Conditional Distributions with Unequal Covariance Matrices (Citations: 7)
Namrata Vaswani. Conference: International Conference on Pattern Recognition (ICPR), vol. 2, pp. 60-63, 2002
A novel method for Fisher discriminant analysis (Citations: 35)
Yong Xu, Jingyu Yang, Zhong Jin. Journal: Pattern Recognition (PR), vol. 37, no. 2, pp. 381-384, 2004
Citations (2)
Toward a tight upper bound for the error probability of the binary Gaussian classification problem (Citations: 4)
Moataz M. H. El Ayadi, Mohamed S. Kamel, Fakhri Karray. Journal: Pattern Recognition (PR), vol. 41, no. 6, pp. 2120-2132, 2008
New Bounds and Approximations for the Error of Linear Classifiers
Luís G. Rueda. Conference: Iberoamerican Congress on Pattern Recognition (CIARP), pp. 342-349, 2004