A kernel-independent FMM in general dimensions

WB March, B Xiao, S Tharakan, CD Yu… - Proceedings of the …, 2015 - dl.acm.org
We introduce a general-dimensional, kernel-independent, algebraic fast multipole method and apply it to kernel regression. The motivation for this work is the approximation of kernel matrices, which appear in mathematical physics, approximation theory, non-parametric statistics, and machine learning. Existing fast multipole methods are asymptotically optimal, but the underlying constants scale quite badly with the ambient space dimension. We introduce a method that mitigates this shortcoming; it only requires kernel evaluations and scales well with the problem size, the number of processors, and the ambient dimension---as long as the intrinsic dimension of the dataset is small. We test the performance of our method on several synthetic datasets. As a highlight, our largest run was on an image dataset with 10 million points in 246 dimensions.
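For context on the cost the paper targets, a minimal sketch of naive dense kernel regression, whose quadratic scaling in the number of points motivates fast multipole approximation. The Gaussian kernel and Nadaraya-Watson estimator here are illustrative assumptions, not necessarily the kernels or regression scheme used in the paper's experiments:

```python
import numpy as np

def gaussian_kernel_matrix(X, Y, bandwidth=1.0):
    """Dense Gaussian kernel matrix K[i, j] = exp(-||x_i - y_j||^2 / (2 h^2)).

    Forming K explicitly costs O(N * M * d) time and O(N * M) memory;
    this dense evaluation is the bottleneck an FMM-style method avoids.
    """
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * bandwidth ** 2))

def kernel_regression(X_train, y_train, X_query, bandwidth=1.0):
    """Nadaraya-Watson kernel regression via the dense kernel matrix."""
    K = gaussian_kernel_matrix(X_query, X_train, bandwidth)
    return (K @ y_train) / K.sum(axis=1)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))   # 200 points in 3 ambient dimensions
y = np.sin(X[:, 0])             # toy regression targets
pred = kernel_regression(X, y, X[:5])
```

An FMM-based method replaces the explicit matrix-vector product `K @ y_train` with a hierarchical approximation, which is what allows runs at the scale the abstract reports (10 million points in 246 dimensions).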