Sequentializing kernels with state-space representations

Prof. Lyudmila Grigoryeva (University of St. Gallen)

Date: Wednesday 2nd April at 2PM

Abstract

A universal kernel is constructed whose sections approximate any causal and time-invariant filter in the fading memory category with inputs and outputs in a finite-dimensional Euclidean space. This kernel is built using the reservoir functional associated with a state-space representation of the Volterra series expansion available for any analytic fading memory filter, and is hence called the Volterra reservoir kernel. Even though the state-space representation and the corresponding reservoir feature map are defined on an infinite-dimensional tensor algebra space, the kernel map is characterized by explicit recursions that are readily computable for specific data sets when employed in estimation problems using the representer theorem. We illustrate the performance of this kernel on popular benchmarks and present further directions for sequentializing static kernels using state-space representations.
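To make the representer-theorem workflow concrete, here is a minimal sketch of kernel ridge regression over input sequences. The recursive sequence kernel below is an illustrative toy (a truncated product-of-inner-products recursion, positive semidefinite by construction), not the actual Volterra reservoir kernel recursion from the talk; the function names, the decay parameter `lam`, and the regularizer `reg` are all assumptions made for this example.

```python
import numpy as np

def seq_kernel(x, y, lam=0.5):
    """Toy recursive sequence kernel (illustrative only, NOT the
    Volterra reservoir kernel from the talk): evaluates
    k_t = 1 + lam * <x_t, y_t> * k_{t-1} over two equal-length
    sequences of vectors, so the kernel is computed by a cheap
    recursion rather than an explicit feature map."""
    k = 1.0
    for xt, yt in zip(x, y):
        k = 1.0 + lam * float(np.dot(xt, yt)) * k
    return k

def kernel_ridge_fit(sequences, targets, kernel, reg=1e-3):
    """Representer-theorem estimator: the fitted functional is a
    linear combination of kernel sections at the training sequences,
    with coefficients solving the regularized linear system."""
    n = len(sequences)
    G = np.array([[kernel(a, b) for b in sequences] for a in sequences])
    alpha = np.linalg.solve(G + reg * np.eye(n), np.asarray(targets))

    def predict(x):
        # Evaluate the kernel section at each training sequence.
        return sum(a * kernel(x, s) for a, s in zip(alpha, sequences))

    return predict
```

Only the Gram matrix of pairwise kernel values is needed, which is why an explicit recursion for the kernel map suffices even when the underlying feature space is infinite-dimensional.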

Joint work with Lukas Gonon (University of St. Gallen, Switzerland) and Juan-Pablo Ortega (NTU, Singapore)


Back to: Institute for Financial and Actuarial Mathematics