Keywords (3)
Learning Algorithm
PAC Learning
Sample Complexity
Related Publications (20)
Soft Margins for AdaBoost
A decision-theoretic generalization of on-line learning and an application to boosting
Learning Nested Differences in the Presence of Malicious Noise
Learning r-of-k Functions by Boosting
A Polynomial-Time Algorithm for Learning Noisy Linear Threshold Functions
Smooth Boosting and Learning with Malicious Noise (Citations: 50)
Rocco A. Servedio
We describe a new boosting algorithm which generates only smooth distributions, which do not assign too much weight to any single example. We show that this new boosting algorithm can be used to construct efficient PAC learning algorithms which tolerate relatively high rates of malicious noise. In particular, we use the new smooth boosting algorithm to construct malicious noise tolerant versions of the PAC-model p-norm linear threshold learning algorithms described by Servedio (2002). The bounds on sample complexity and malicious noise tolerance of these new PAC algorithms closely correspond to known bounds for the online p-norm algorithms of Grove, Littlestone and Schuurmans (1997) and Gentile and Littlestone (1999). As special cases of our new algorithms we obtain linear threshold learning algorithms which match the sample complexity and malicious noise tolerance of the online Perceptron and Winnow algorithms. Our analysis reveals an interesting connection between boosting and noise tolerance in the PAC setting.
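The smoothness property described in the abstract (the booster's distribution never concentrates too much mass on any one example) can be sketched in a few lines: each example carries a measure capped at 1, and the booster's distribution is the normalized measure, so no example's probability can exceed 1 over the total measure. The following toy is an illustrative sketch, not the paper's exact pseudocode; the threshold-stump weak learner and the parameters `gamma` and `theta` are assumptions made for the example.

```python
# Illustrative sketch of a smooth boosting loop: a capped per-example
# measure keeps the induced distribution from concentrating on any
# single (possibly noise-corrupted) example.  NOT the paper's exact
# algorithm; stump learner, gamma, theta are illustrative choices.

def stump_learner(xs, ys, dist):
    """Weak learner: best single-threshold classifier under dist."""
    best = None
    for t in sorted(set(xs)):
        for sign in (1, -1):
            h = lambda x, t=t, s=sign: s if x >= t else -s
            # weighted advantage of h over random guessing
            adv = sum(d * y * h(x) for x, y, d in zip(xs, ys, dist))
            if best is None or adv > best[0]:
                best = (adv, h)
    return best[1]

def smooth_boost(xs, ys, rounds=20, gamma=0.1, theta=0.05):
    m = len(xs)
    margins = [0.0] * m          # cumulative margin of each example
    hyps = []
    for _ in range(rounds):
        # Capped measure: every example's measure is at most 1 ...
        measure = [min(1.0, (1.0 - gamma) ** (n / 2.0)) for n in margins]
        total = sum(measure)
        dist = [mu / total for mu in measure]
        # ... so no example can receive more than 1/total of the mass.
        assert max(dist) <= 1.0 / total + 1e-12
        h = stump_learner(xs, ys, dist)
        hyps.append(h)
        # correctly classified examples gain margin, errors lose it
        margins = [n + y * h(x) - theta
                   for n, x, y in zip(margins, xs, ys)]
    def final(x):
        return 1 if sum(h(x) for h in hyps) >= 0 else -1
    return final

# toy 1-d data: label is positive iff x >= 5
xs = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
ys = [-1, -1, -1, -1, -1, 1, 1, 1, 1, 1]
f = smooth_boost(xs, ys)
print(sum(f(x) == y for x, y in zip(xs, ys)))  # correct predictions
```

The in-loop assertion is the point of the sketch: because the measure never exceeds 1, the distribution is automatically "smooth" in the sense the abstract describes, which is what limits the damage a small fraction of maliciously corrupted examples can do.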
Conference: Computational Learning Theory - COLT, pp. 473–489, 2001
DOI: 10.1007/3-540-44581-1_31
View Publication
The following links allow you to view full publications. These links are maintained by other sources not affiliated with Microsoft Academic Search.
(www.springerlink.com)
Citation Context (31)
...Their boosting algorithm is based on a smooth version of AdaBoost [11] by Domingo and Watanabe [7] and Servedio [32] and uses an equivalent of our "balancing" step...
...Smoothness property is crucial in a number of applications of boosting algorithms such as learning DNF over the uniform distribution [19], learning with malicious noise [32] and the connection to hard-core set constructions [27]...
Vitaly Feldman. Distribution-Specific Agnostic Boosting
...Indeed, known algorithms for learning halfspaces [25, 11] or even simpler target functions [18] with malicious noise typically make strong assumptions about the underlying distribution D, and can learn to accuracy 1 − ε only for noise rates η much smaller than ε...
...However, by using a “smooth” boosting algorithm [25] that skews the distribution as little as possible, we are able to control these undesirable effects and make the analysis go through...
...We note that our approach of using smooth boosting is reminiscent of [23, 25], but the current algorithm goes well beyond that earlier work...
...[23] did not consider a noisy scenario, and [25] only considered the averaging algorithm without any outlier removal as the weak learner (and thus could only handle quite low rates of malicious noise, at most ε/√n in our isotropic log-concave setting)...
...The following lemma from [25] (similar results can be readily found elsewhere, see e.g...
Adam R. Klivans, et al. Learning Halfspaces with Malicious Noise
...The important improvements were carried on the modification of the weight of examples [19], [18], [1], [20], [14], [8], the modification of the margin [9], [20], [17], the modification of the classifiers’ weight [15], the choice of weak learning [5], [24] and the speed of convergence [22], [13], [18]...
...The important improvements were carried on the modification of the weight of examples [19], [18], [1], [20], [14], [8], the modification of the margin [9], [20], [17], the modification of the classifiers’ weight [15], the choice of weak learning [5], [24] and the speed of convergence [22], [13], [18]...
...An approach, which tries to reduce the effect of overfitting by imposing limitations on the distribution produced during the process of boosting, is used in SmoothBoost [20]...
...explanation is that even if all the examples of training are already well classified, Boosting tends to maximize the margins [20]...
Emna Bahri, et al. A Hybrid Approach of Boosting Against Noisy Data
...It has also been related to boosting in the PAC model of learning [22, 23]...
John Dunagan, et al. A simple polynomial-time rescaling algorithm for solving linear progra...
...‡ This work was partially supported by NSF Grants CNS 0721532, EFRI-0735905, and the Canada Research Chairs Program (e.g., [BEK02, ACB98, CBDF+99, Ser03]) is on the sample complexity of learning in such setups and on particularly bad data sources...
Constantine Caramanis, et al. Learning in the Limit with Adversarial Disturbances
References (31)
An adaptive version of the boost by majority algorithm (Citations: 116)
Yoav Freund
Conference: Computational Learning Theory - COLT, pp. 102–113, 1999
Sample-efficient strategies for learning in the presence of noise (Citations: 10)
Nicolò Cesa-Bianchi, Eli Dichterman, Paul Fischer, Eli Shamir, Hans-Ulrich Simon
Journal: Journal of the ACM - JACM, vol. 46, no. 5, pp. 684–719, 1999
Learning Conjunctions with Noise under Product Distributions (Citations: 9)
Yishay Mansour, Michal Parnas
Journal: Information Processing Letters - IPL, vol. 68, no. 4, pp. 189–196, 1998
Specification and simulation of statistical query algorithms for efficiency and noise tolerance (Citations: 25)
Javed A. Aslam, Scott E. Decatur
Conference: Computational Learning Theory - COLT, pp. 437–446, 1995
Learning Disjunctions of Conjunctions (Citations: 186)
Leslie G. Valiant
Conference: International Joint Conference on Artificial Intelligence - IJCAI, pp. 560–566, 1985
Citations (50)
Random classification noise defeats all convex potential boosters (Citations: 1)
Philip M. Long, Rocco A. Servedio
Journal: Machine Learning - ML, vol. 78, no. 3, pp. 287–304, 2010
LPBoost with Strong Classifiers
Yu K. Fang, Yan Fu, Chong J. Sun, Jun. L. Zhou
Published in 2010.
Distribution-Specific Agnostic Boosting (Citations: 2)
Vitaly Feldman
Journal: Computing Research Repository - CoRR, vol. abs/0909.2, pp. 241–250, 2009
Learning Halfspaces with Malicious Noise (Citations: 1)
Adam R. Klivans, Philip M. Long, Rocco A. Servedio
Journal: Journal of Machine Learning Research - JMLR, vol. 10, pp. 609–621, 2009
A Hybrid Approach of Boosting Against Noisy Data
Emna Bahri, Stéphane Lallich, Nicolas Nicoloyannis, Mondher Maddouri
Published in 2009.