# Surrogate maximization/minimization algorithms for AdaBoost and the logistic regression model

Zhihua Zhang, James T. Kwok, Dit-Yan Yeung

**Abstract:**
Surrogate maximization (or minimization) (SM) algorithms are a
family of algorithms that can be regarded as a generalization of
expectation-maximization (EM) algorithms. There are three major
approaches to the construction of surrogate functions, all relying
on the convexity of some function. In this paper, we apply SM
algorithms to the optimization problems underlying boosting.
Specifically, for AdaBoost, we derive an SM
algorithm that can be shown to be identical to the algorithm
proposed by [Collins 2002] based on Bregman distance.
More importantly, for LogitBoost (or logistic boosting), we use
several methods to construct different surrogate functions, which
result in different SM algorithms. By combining multiple methods,
we are able to derive an SM algorithm that is also the same as an
algorithm derived by [Collins 2002]. Our approach
based on SM algorithms is much simpler and convergence results
follow naturally.
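To illustrate the general idea behind SM algorithms (the details for AdaBoost and LogitBoost are in the paper itself), the following is a minimal sketch of surrogate minimization for the logistic regression loss. It uses the standard quadratic upper bound on the logistic loss (Böhning's bound, in which the Hessian is dominated by \(X^T X / 4\)); all names and the toy data are illustrative, not taken from the paper.

```python
import numpy as np

def logistic_loss(w, X, y):
    # Negative log-likelihood of the logistic model, labels y in {0, 1}.
    z = X @ w
    return np.sum(np.log1p(np.exp(z)) - y * z)

def sm_logistic(X, y, iters=200):
    # Surrogate minimization: the Hessian of the logistic loss is
    # dominated by B = X^T X / 4, so at each step we exactly minimize
    # the quadratic surrogate
    #   Q(w) = f(w_t) + g_t^T (w - w_t) + (1/2)(w - w_t)^T B (w - w_t),
    # which upper-bounds f and touches it at w_t. Each update therefore
    # decreases (or leaves unchanged) the true loss.
    n, d = X.shape
    B = X.T @ X / 4.0
    B_inv = np.linalg.inv(B + 1e-8 * np.eye(d))  # small ridge for stability
    w = np.zeros(d)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))  # predicted probabilities
        grad = X.T @ (p - y)                # gradient of the loss at w
        w = w - B_inv @ grad                # minimizer of the surrogate
    return w
```

Because the surrogate upper-bounds the objective and agrees with it at the current iterate, the loss is monotonically non-increasing, which is the same mechanism that yields the convergence results discussed in the paper.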
*Proceedings of the Twenty-First International Conference on Machine Learning
(ICML-2004)*, pp. 927-934, Banff, Alberta, Canada, July 2004.

PDF:
http://www.cs.ust.hk/~jamesk/papers/icml04.pdf
