Theresa Wagner, TU Chemnitz
Kernel matrices are crucial in many learning tasks and are typically dense and large-scale. Depending on the dimension of the feature space, even the computation of all their entries in reasonable time...
The Arnoldi process can be applied to inexpensively approximate matrix functions of the form f(A)v and matrix functionals of the form v^*(f(A))^* g(A)v, where A is a large square non-Hermitian matrix, v is a vector, and the superscript ^* denotes transposition and complex conjugation. Here f and g are analytic functions that are defined on suitable regions of the complex plane. This paper reviews available approximation methods and describes new ones that provide higher accuracy for essentially the same computational effort by exploiting available, but generally unused, moment information.
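As a point of reference, the standard Krylov subspace approach the paper builds on can be sketched as follows: run m steps of the Arnoldi process to obtain an orthonormal basis V_m of the Krylov subspace and a small Hessenberg matrix H_m, then approximate f(A)v ≈ ||v|| V_m f(H_m) e_1. This is a minimal NumPy/SciPy sketch with f = exp as a concrete choice; the function names are illustrative, and the paper's moment-exploiting refinements are not reproduced here.

```python
import numpy as np
from scipy.linalg import expm


def arnoldi(A, v, m):
    # Arnoldi process: builds an orthonormal basis V of the Krylov
    # subspace K_m(A, v) and an (m+1) x m upper Hessenberg matrix H
    # satisfying A V_m = V_{m+1} H.
    n = v.shape[0]
    V = np.zeros((n, m + 1), dtype=complex)
    H = np.zeros((m + 1, m), dtype=complex)
    V[:, 0] = v / np.linalg.norm(v)
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):  # modified Gram-Schmidt orthogonalization
            H[i, j] = np.vdot(V[:, i], w)
            w = w - H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-14:
            # Breakdown: the Krylov subspace is invariant, so the
            # approximation below is exact with this smaller basis.
            return V[:, : j + 1], H[: j + 1, : j + 1]
        V[:, j + 1] = w / H[j + 1, j]
    return V, H


def arnoldi_fAv(A, v, m, f=expm):
    # Standard Krylov approximation: f(A) v ≈ ||v|| V_m f(H_m) e_1.
    V, H = arnoldi(A, v, m)
    k = min(H.shape)           # basis size actually reached
    Hm = H[:k, :k]
    e1 = np.zeros(k)
    e1[0] = 1.0
    return np.linalg.norm(v) * (V[:, :k] @ (f(Hm) @ e1))
```

From the same decomposition one also gets the usual approximation of the functional: v^*(f(A))^* g(A)v ≈ (f(A)v)^*(g(A)v), evaluated with the two Krylov approximations above. The methods reviewed and proposed in the paper improve on this baseline by using moment information that this plain scheme discards.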
Numerical experiments show that in some cases the proposed modifications of the Arnoldi decomposition can improve the accuracy of v^*(f(A))^* g(A)v by about as much as performing an additional step of the Arnoldi process.