# Block-quantized kernel matrix for fast spectral embedding

Kai Zhang, James T. Kwok

**Abstract:**
Eigendecomposition of the kernel matrix is an indispensable procedure in
many learning and vision tasks. However, its cubic complexity
$O(N^3)$, where $N$ is the data size, is impractical for large
problems. In this paper, we propose an efficient approach to the
eigendecomposition of the kernel matrix $W$. The idea is to
approximate $W$ with a matrix $\bar{W}$ composed of $m^2$ constant
blocks. The eigenvectors of $\bar{W}$, which can be solved in $O(m^3)$
time, are then used to recover the eigenvectors of the original
kernel matrix. The complexity of our method is only $O(mN+m^3)$,
which scales more favorably than state-of-the-art low-rank
approximation and sampling-based approaches ($O(m^2N+m^3)$), and the
approximation quality can be controlled conveniently. Our method
demonstrates encouraging scaling behavior in experiments on image
segmentation (by spectral clustering) and kernel principal
component analysis.
*Proceedings of the Twenty-Third International Conference on Machine Learning
(ICML-2006)*, pp. 1097-1104, Pittsburgh, USA, June 2006.
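The abstract's recipe can be sketched in a few lines. The sketch below is a minimal illustration of the general block-quantization idea, not the authors' exact algorithm: the quantizer (plain k-means), the RBF kernel, and all parameter names are assumptions. With $E$ the $N \times m$ cluster-membership indicator and $K_c$ the $m \times m$ kernel between cluster centers, the block-constant matrix is $\bar{W} = E K_c E^\top$, whose orthonormal eigenvectors are obtained from the small matrix $D^{1/2} K_c D^{1/2}$ (where $D = E^\top E$) in $O(m^3)$ time and lifted back to length $N$ in $O(mN)$ time.

```python
import numpy as np

def block_quantized_eig(X, m, gamma=1.0, n_iter=20, seed=0):
    """Illustrative sketch of block-quantized spectral embedding.

    1. Quantize the N points into at most m clusters (plain k-means;
       an assumed choice of quantizer).
    2. An RBF kernel between the cluster centers defines an implied
       N x N matrix W_bar that is constant on each of the m^2 blocks.
    3. Eigendecompose a small m x m matrix and lift the eigenvectors
       back to length N, for a total cost of O(mN + m^3).
    """
    rng = np.random.default_rng(seed)
    N = X.shape[0]
    # step 1: k-means quantization into m codewords
    centers = X[rng.choice(N, size=m, replace=False)]
    for _ in range(n_iter):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d2.argmin(axis=1)
        for k in range(m):
            pts = X[labels == k]
            if len(pts):
                centers[k] = pts.mean(axis=0)
    counts = np.bincount(labels, minlength=m)
    keep = counts > 0                       # drop any empty clusters
    centers, counts = centers[keep], counts[keep]
    labels = (np.cumsum(keep) - 1)[labels]  # relabel into 0..m_eff-1
    # step 2: m x m RBF kernel between the cluster centers
    c2 = ((centers[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    Kc = np.exp(-gamma * c2)
    # step 3: with E the N x m indicator matrix and D = E^T E = diag(counts),
    # W_bar = E Kc E^T shares its nonzero spectrum with M = D^{1/2} Kc D^{1/2}
    s = np.sqrt(counts.astype(float))
    M = (s[:, None] * Kc) * s[None, :]
    lam, U = np.linalg.eigh(M)              # small m x m problem: O(m^3)
    V = U[labels] / s[labels, None]         # lift to N rows: O(mN)
    return lam, V, labels, Kc
```

The columns of `V` are orthonormal eigenvectors of the implied block-constant matrix `Kc[labels][:, labels]`, so the leading ones can serve directly as an approximate spectral embedding of the original data.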

PDF:
http://www.cs.ust.hk/~jamesk/papers/icml06c.pdf
