Keywords (5): Learning Curve, Majority Voting, PAC Learning, Statistical Mechanics, Uniform Distribution
NOISE TOLERANT LEARNING USING EARLY PREDICTORS
(Citations: 2)
Shai Fine, Ran Gilad-Bachrach, Eli Shamir, Naftali Tishby
In most PAC learning analyses, generalization starts only once the number of examples is comparable to the complexity of the class. Nevertheless, analysis of learning curves using statistical mechanics shows much earlier generalization (7). Here we introduce a gadget called an Early Predictor, which exists if a somewhat-better-than-random prediction of the label of an arbitrary instance can be obtained from the labels of a few random examples. We were able to show that by taking a majority vote over a committee of Early Predictors, strong and efficient learning is obtained. Moreover, this learning procedure is robust to persistent classification noise. A margin analysis of the vote is used to explain this result. We also compare the suggested method to Bagging (11) and Boosting (5) and connect it to the SQ model (10). A concrete example of an Early Predictor is constructed for learning linear separators under the uniform distribution. In this context we should mention the hardness result by Bartlett and Ben-David.
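The committee idea in the abstract can be illustrated with a small simulation (a sketch under assumed values, not the paper's actual construction): model each Early Predictor as a guesser that is correct with probability 1/2 + gamma for an assumed advantage gamma = 0.1, and observe that the accuracy of a majority vote over independent such predictors climbs rapidly with committee size, as Chernoff-style concentration predicts.

```python
import random

random.seed(0)

GAMMA = 0.1  # assumed advantage of a weak predictor over random guessing


def early_predictor(true_label):
    """Hypothetical weak predictor: returns the true binary label
    with probability 1/2 + GAMMA, otherwise the wrong label."""
    return true_label if random.random() < 0.5 + GAMMA else 1 - true_label


def committee_vote(true_label, size):
    """Majority vote over an odd-sized committee of independent
    weak predictors queried on the same instance."""
    votes = sum(early_predictor(true_label) for _ in range(size))
    return 1 if 2 * votes > size else 0


TRIALS = 10_000
accuracy = {}
for size in (1, 11, 101):
    hits = sum(committee_vote(1, size) == 1 for _ in range(TRIALS))
    accuracy[size] = hits / TRIALS
    print(f"committee of {size:3d}: accuracy {accuracy[size]:.3f}")
```

A single predictor stays near 0.6, while a committee of 101 votes its way above 0.95; the paper's contribution is showing that such a vote also yields efficient learning and tolerates persistent classification noise, which this toy simulation does not model.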
Published in 1999.
Citation Context (1)
...In [14], the noise sensitivity of some popular machine learning algorithms was empirically studied, and some noise-tolerant learning algorithms are proposed in [9, 10, 15]...
...The third baseline algorithm is known as the Statistical Query (SQ) model [9]...
Zeyu Zheng, et al. A Novel Contrast Co-learning Framework for Generating High Quality Training Data
References (13)
A decision-theoretic generalization of on-line learning and an application to boosting (Citations: 3774)
Yoav Freund, Robert E. Schapire
Conference: European Conference on Computational Learning Theory - EuroCOLT, pp. 23-37, 1995
Efficient Learning from Faulty Data (Citations: 3)
S. E. Decatur
Published in 1995.
An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization (Citations: 722)
Thomas G. Dietterich
Journal: Machine Learning - ML, vol. 40, no. 2, pp. 139-157, 2000
Learning with Queries Corrupted by Classification Noise (Citations: 15)
Jeffrey C. Jackson, Eli Shamir, Clara Shwartzman
Journal: Discrete Applied Mathematics - DAM, vol. 92, no. 2-3, pp. 157-175, 1999
Hardness Results for Neural Network Approximation Problems (Citations: 23)
Peter L. Bartlett, Shai Ben-David
Conference: European Conference on Computational Learning Theory - EuroCOLT, pp. 50-62, 1999
Citations (2)
A Novel Contrast Co-learning Framework for Generating High Quality Training Data
Zeyu Zheng, Jun Yan, Shuicheng Yan, Ning Liu, Zheng Chen, Ming Zhang
Conference: IEEE International Conference on Data Mining - ICDM, pp. 649-658, 2010
Knowledge Acquisition in Statistical Learning Theory (Citations: 1)
Shai Fine
Published in 1999.