Smooth Boosting and Learning with Malicious Noise

Rocco A. Servedio. DOI: 10.1007/3-540-44581-1_31

We describe a new boosting algorithm which generates only smooth distributions, which do not assign too much weight to any single example. We show that this new boosting algorithm can be used to construct efficient PAC learning algorithms which tolerate relatively high rates of malicious noise. In particular, we use the new smooth boosting algorithm to construct malicious noise tolerant versions of the PAC-model p-norm linear threshold learning algorithms described by Servedio (2002). The bounds on sample complexity and malicious noise tolerance of these new PAC algorithms closely correspond to known bounds for the online p-norm algorithms of Grove, Littlestone and Schuurmans (1997) and Gentile and Littlestone (1999). As special cases of our new algorithms we obtain linear threshold learning algorithms which match the sample complexity and malicious noise tolerance of the online Perceptron and Winnow algorithms. Our analysis reveals an interesting connection between boosting and noise tolerance in the PAC setting.
Conference: Computational Learning Theory - COLT, pp. 473-489, 2001
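The smoothness idea in the abstract — capping each example's weight so that no single point can dominate the boosting distribution — can be sketched as follows. This is an illustrative simplification rather than the paper's exact algorithm: the function names, the (1 − γ)^(N/2) capped-weight rule, and the stopping condition are assumptions drawn from standard presentations of SmoothBoost-style boosting.

```python
def smoothboost(examples, labels, weak_learner, gamma, eps, max_rounds=100):
    """SmoothBoost-style loop (illustrative sketch): each example i keeps a
    cumulative shifted margin N[i], and its weight is capped at 1.  Because
    every weight is at most 1 and the loop continues only while the total
    weight is at least eps * m, the normalized distribution D never assigns
    more than 1/(eps * m) mass to any single example -- the smoothness
    property described in the abstract."""
    m = len(examples)
    N = [0.0] * m                      # cumulative (shifted) margins
    hypotheses = []
    for _ in range(max_rounds):
        # Capped weights: examples with large positive margin get small weight.
        w = [min(1.0, (1.0 - gamma) ** (N[i] / 2.0)) for i in range(m)]
        total = sum(w)
        if total < eps * m:            # little weight left on "hard" examples
            break
        D = [wi / total for wi in w]   # smooth: max(D) <= 1/(eps * m)
        h = weak_learner(examples, labels, D)
        hypotheses.append(h)
        for i in range(m):
            # Margin of h on example i, shifted down by the target edge gamma.
            N[i] += labels[i] * h(examples[i]) - gamma

    def final(x):
        # Unweighted majority vote over the collected weak hypotheses.
        return 1 if sum(h(x) for h in hypotheses) >= 0 else -1
    return final
```

Plugging in any weak learner that beats random guessing by γ under each supplied distribution (e.g., a decision stump selected to minimize D-weighted error) drives the total weight down, and the weight cap is what keeps the induced distributions smooth throughout.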
    • ...Their boosting algorithm is based on a smooth version of Adaboost [11] by Domingo and Watanabe [7] and Servedio [32] and uses an equivalent of our "balancing" step...
    • ...Smoothness property is crucial in a number of applications of boosting algorithms such as learning DNF over the uniform distribution [19], learning with malicious noise [32] and the connection to hard-core set constructions [27]...

    Vitaly Feldman. Distribution-Specific Agnostic Boosting

    • ...Indeed, known algorithms for learning halfspaces [25, 11] or even simpler target functions [18] with malicious noise typically make strong assumptions about the underlying distribution D, and can learn to accuracy 1 − ε only for noise rates η much smaller than ε...
    • ...However, by using a “smooth” boosting algorithm [25] that skews the distribution as little as possible, we are able to control these undesirable effects and make the analysis go through...
    • ...We note that our approach of using smooth boosting is reminiscent of [23, 25], but the current algorithm goes well beyond that earlier work...
    • ...[23] did not consider a noisy scenario, and [25] only considered the averaging algorithm without any outlier removal as the weak learner (and thus could only handle quite low rates of malicious noise, at most ε/√n in our isotropic log-concave setting)...
    • ...The following lemma from [25] (similar results can be readily found elsewhere, see e.g...

    Adam R. Klivanset al. Learning Halfspaces with Malicious Noise

    • ...The important improvements were carried on the modification of the weight of examples [19], [18], [1], [20], [14], [8], the modification of the margin [9], [20], [17], the modification of the classifiers’ weight [15], the choice of weak learning [5], [24] and the speed of convergence [22], [13], [18]...
    • ...An approach, which tries to reduce the effect of overfitting by imposing limitations on the distribution produced during the process of boosting is used in SmoothBoost [20]...
    • ...explanation is that even if all the examples of training are already well classified, Boosting tends to maximize the margins [20]...

    Emna Bahriet al. A Hybrid Approach of Boosting Against Noisy Data

    • ...It has also been related to boosting in the PAC model of learning [22,23]...

    John Dunaganet al. A simple polynomial-time rescaling algorithm for solving linear progra...

    • ...(e.g., [BEK02, ACB98, CBDF+99, Ser03]) is on the sample complexity of learning in such setups and on particularly bad data sources...

    Constantine Caramaniset al. Learning in the Limit with Adversarial Disturbances
