
Support Vector Machines

Kernel machines trained on N samples require computation and memory that grow at least quadratically in N. This cost is significant even for moderate-size problems and prohibitive for large datasets. We present an approximation technique based on the improved fast Gauss transform that reduces the computation to linear order, and we give an error bound for the approximation.
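To illustrate the underlying idea (a simplified sketch, not the authors' full algorithm), the quadratic cost comes from evaluating a Gaussian kernel sum G(y_j) = sum_i q_i exp(-(y_j - x_i)^2 / h^2) at M targets over N sources. The fast Gauss transform factorizes the kernel with a truncated Taylor expansion about an expansion center, so source and target loops decouple and the cost drops from O(NM) to O(p(N + M)) for p retained terms. The one-dimensional, single-center version below is hypothetical illustrative code (function names and parameters are mine); the full method clusters the sources and uses several centers.

```python
import numpy as np
from math import factorial

def direct_gauss_sum(x, q, y, h):
    """O(N*M) direct evaluation of G(y_j) = sum_i q_i exp(-(y_j - x_i)^2 / h^2)."""
    return np.array([np.sum(q * np.exp(-((yj - x) ** 2) / h ** 2)) for yj in y])

def fgt_gauss_sum(x, q, y, h, c, p):
    """O(p*(N+M)) approximation via a p-term Taylor expansion about center c.

    Uses the identity
      exp(-(y-x)^2/h^2) = exp(-b^2) exp(-a^2) sum_n (2 a b)^n / n!
    with a = (x-c)/h, b = (y-c)/h, truncated after p terms.
    """
    a = (x - c) / h
    b = (y - c) / h
    # Source-side moments, computed once: C_n = sum_i q_i exp(-a_i^2) 2^n a_i^n / n!
    coeffs = [np.sum(q * np.exp(-a ** 2) * (2.0 ** n) * a ** n / factorial(n))
              for n in range(p)]
    # Target-side evaluation of the truncated series
    return np.exp(-b ** 2) * sum(coeffs[n] * b ** n for n in range(p))
```

With sources and targets in a cluster of radius r about c, the truncation error decays like (2r^2/h^2)^p / p!, so a small p already gives near machine-precision accuracy when r is comparable to the bandwidth h.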

References

  1. C. Yang, R. Duraiswami, and L. Davis. Efficient Kernel Machines Using the Improved Fast Gauss Transform. In Advances in Neural Information Processing Systems 16, 2004.
  2. C. Yang, R. Duraiswami, N. Gumerov, and L. Davis. Improved Fast Gauss Transform and Efficient Kernel Density Estimation. In IEEE International Conference on Computer Vision, pages 464-471, 2003.
  3. C. Yang, R. Duraiswami, and N. Gumerov. Improved Fast Gauss Transform. Technical Report CS-TR-4495, Dept. of Computer Science, University of Maryland, College Park, 2003. Also submitted to SIAM J. Sci. Comput.


Changjiang Yang 2004-04-29