NOTE: The following materials are presented for timely
dissemination of academic and technical work. Copyright and all other rights
therein are reserved by the authors and/or other copyright holders. Personal
use of the following materials is permitted; however, anyone using
the materials or information is expected to adhere to the terms and
constraints invoked by the relevant copyrights.
Regularized Boost for Semi-Supervised Learning
ABSTRACT
Semi-supervised inductive learning concerns how to learn a
decision rule from a data set containing both labeled and unlabeled data.
Several boosting algorithms have been extended to semi-supervised learning
with various strategies. To our knowledge, however, none of them takes local
smoothness constraints among data into account during ensemble learning.
In this paper, we introduce a local smoothness regularizer
to semi-supervised boosting algorithms based on the universal
optimization framework of margin cost functionals.
Our regularizer is applicable to existing semi-supervised
boosting algorithms to improve their generalization and speed up their
training. Comparative results on synthetic, benchmark, and real-world tasks
demonstrate the effectiveness of our local smoothness regularizer.
We discuss relevant issues and relate our regularizer to previous work.
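The paper's exact regularizer is defined in the full text, not in this abstract. As a rough illustration of the general idea of a local smoothness penalty, the hypothetical sketch below (the function name, the Gaussian affinity weights, and the k-nearest-neighbor construction are all assumptions, not the paper's formulation) penalizes disagreement between predictions at nearby points:

```python
import numpy as np

def local_smoothness_penalty(X, f, k=3):
    """Illustrative smoothness penalty (not the paper's exact regularizer):
    sum over each point i and its k nearest neighbors j of
    w_ij * (f_i - f_j)^2, with Gaussian affinity weights w_ij."""
    n = X.shape[0]
    # Pairwise squared Euclidean distances between all points.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    penalty = 0.0
    for i in range(n):
        # Indices of the k nearest neighbors of point i (excluding itself).
        nbrs = np.argsort(d2[i])[1:k + 1]
        w = np.exp(-d2[i, nbrs])  # closer neighbors get larger weights
        penalty += float((w * (f[i] - f[nbrs]) ** 2).sum())
    return penalty

# Predictions that are constant over a tight cluster incur a smaller
# penalty than predictions that flip sign between nearby points.
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [5.0, 5.0]])
f_smooth = np.array([1.0, 1.0, 1.0, -1.0])
f_rough = np.array([1.0, -1.0, 1.0, -1.0])
assert local_smoothness_penalty(X, f_smooth) < local_smoothness_penalty(X, f_rough)
```

In a semi-supervised setting such a penalty can be evaluated on unlabeled points as well as labeled ones, which is what makes smoothness constraints useful when labels are scarce.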
Click nips07.pdf for the full text and Slides for the presentation slides.