Date: Wednesday 20 November, 2 PM - Room 210, Mathematical Sciences Building
Abstract: Parametrized state-space models in the form of recurrent neural networks are often used in machine learning to learn from data streams exhibiting temporal dependencies. I will talk about a framework for the rigorous analysis of a class of such state-space models known as Echo State Networks (ESNs). In particular, I will show that there is a deep connection between the state space of an ESN and the notion of feature space in kernel machines (a well-known class of machine learning methods). This viewpoint leads to several rather surprising results linking the structure of the dynamical coupling in an ESN with the properties of the associated temporal feature space. I will also present universality results showing that ESNs can approximate a large class of fading memory filters with minimal structural complexity.
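For orientation, an ESN is typically written in the following state-space form (a generic sketch; the notation here is illustrative and may differ from that used in the talk):

\[
x_t = \sigma(A\,x_{t-1} + C\,u_t), \qquad y_t = W\,x_t,
\]

where $u_t$ is the input at time $t$, $x_t$ is the reservoir state, $\sigma$ is a fixed componentwise nonlinearity, $A$ and $C$ are randomly generated (untrained) coupling matrices, and only the linear readout $W$ is learned from data. The map from input histories $\ldots, u_{t-1}, u_t$ to states $x_t$ plays the role of the feature map in a kernel machine.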