ECCV 2016
 
Semi-supervised Learning based on Joint Diffusion of Graph Functions and Laplacians

Abstract
We use the distances between estimated function outputs at data points to construct an anisotropic graph Laplacian which, through an iterative process, is itself regularized. Our algorithm is instantiated as a discrete regularizer on a graph's diffusivity operator. This idea is grounded in the theory that regularizing the diffusivity operator corresponds to regularizing the metric on Riemannian manifolds, which in turn corresponds to regularizing the anisotropic Laplace-Beltrami operator. We show that our discrete regularization framework is consistent in the sense that it converges to (continuous) regularization on the underlying data-generating manifolds. In semi-supervised learning experiments across ten standard datasets, our Laplacian-diffusion approach achieves the lowest average error rate among eight established and state-of-the-art approaches, demonstrating the promise of our method.
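As a rough illustration only, the Python sketch below alternates label propagation on a weighted graph with re-weighting of the edges using distances between the current function estimates. It is a simplified stand-in for the joint diffusion idea described in the abstract, not the paper's algorithm; the Gaussian kernel, the Laplacian-regularized least-squares solve, and all parameter names and values (sigma, gamma, beta, n_iters) are assumptions made for illustration.

# Minimal sketch of joint diffusion of graph functions and Laplacians.
# NOT the paper's exact method: we alternate (1) estimating f by
# Laplacian-regularized least squares and (2) anisotropically re-weighting
# edges from distances between the estimated outputs. All parameters are
# illustrative assumptions.
import numpy as np

def gaussian_weights(dist2, sigma):
    """Gaussian affinity from squared distances; zero on the diagonal."""
    W = np.exp(-dist2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    return W

def graph_laplacian(W):
    """Unnormalized graph Laplacian L = D - W."""
    return np.diag(W.sum(axis=1)) - W

def joint_diffusion(X, y, labeled, sigma=1.0, gamma=0.1, beta=1.0, n_iters=5):
    """Alternate label propagation with output-based edge re-weighting."""
    n = X.shape[0]
    dist2_x = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise feature distances
    W = gaussian_weights(dist2_x, sigma)

    J = np.diag(labeled.astype(float))        # indicator of labeled points
    y_obs = np.where(labeled, y, 0.0)         # observed targets, zero elsewhere

    f = np.zeros(n)
    for _ in range(n_iters):
        # Step 1: Laplacian-regularized least squares on the current graph.
        L = graph_laplacian(W)
        f = np.linalg.solve(J + gamma * L, J @ y_obs)

        # Step 2: anisotropic re-weighting -- edges between points whose
        # estimated outputs differ are attenuated, sharpening the graph
        # along the (estimated) decision boundary.
        dist2_f = (f[:, None] - f[None, :]) ** 2
        W = gaussian_weights(dist2_x, sigma) * np.exp(-beta * dist2_f)
    return f

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two Gaussian blobs with one labeled point per class.
    X = np.vstack([rng.normal(-2, 0.5, (50, 2)), rng.normal(2, 0.5, (50, 2))])
    y = np.hstack([-np.ones(50), np.ones(50)])
    labeled = np.zeros(100, dtype=bool)
    labeled[[0, 50]] = True
    f = joint_diffusion(X, y, labeled)
    print("accuracy:", (np.sign(f) == y).mean())

In this toy setting the re-weighting step shrinks edges that cross the estimated class boundary, so the graph becomes increasingly anisotropic over the iterations; the paper develops this idea as a principled regularizer on the diffusivity operator rather than the heuristic re-weighting used here.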

 
@inproceedings{kim2016:ECCV,
author = {Kwang In Kim},
title = {Semi-supervised learning based on joint diffusion of graph functions and Laplacians},
booktitle = {Proc. ECCV},
pages = {713--729},
year = {2016},
}
   
Paper: PDF (0.5 MB)
Poster: PDF (0.4 MB)

Acknowledgements
This research was funded by the EPSRC grant project Personalized Exploration of Imagery Database (EP/M00533X/1).