Latent ordinary differential equations for irregularly-sampled time series

Y Rubanova, RTQ Chen… - Advances in Neural Information Processing Systems, 2019 - proceedings.neurips.cc
Abstract
Time series with non-uniform intervals occur in many applications, and are difficult to model using standard recurrent neural networks (RNNs). We generalize RNNs to have continuous-time hidden dynamics defined by ordinary differential equations (ODEs), a model we call ODE-RNNs. Furthermore, we use ODE-RNNs to replace the recognition network of the recently-proposed Latent ODE model. Both ODE-RNNs and Latent ODEs can naturally handle arbitrary time gaps between observations, and can explicitly model the probability of observation times using Poisson processes. We show experimentally that these ODE-based models outperform their RNN-based counterparts on irregularly-sampled data.
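To make the ODE-RNN recurrence described in the abstract concrete, the sketch below is a minimal, hypothetical implementation and not the authors' code: between observations the hidden state is evolved under a learned ODE (a fixed-step Euler integrator stands in here for the adaptive solvers used in the paper), and at each observation the state is updated with a standard GRU cell. All names (ODERNNCell, n_euler_steps, evolve) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ODERNNCell(nn.Module):
    """Minimal ODE-RNN sketch: the hidden state evolves continuously
    between observations via an ODE and is updated discretely at each
    observation. Euler integration is an assumption made for simplicity;
    the paper uses adaptive ODE solvers."""

    def __init__(self, input_dim, hidden_dim, n_euler_steps=10):
        super().__init__()
        # Neural network parameterizing dh/dt.
        self.ode_func = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim), nn.Tanh(),
            nn.Linear(hidden_dim, hidden_dim),
        )
        self.gru = nn.GRUCell(input_dim, hidden_dim)
        self.n_euler_steps = n_euler_steps
        self.hidden_dim = hidden_dim

    def evolve(self, h, delta_t):
        # Integrate dh/dt = f(h) across an arbitrary (irregular) time gap.
        dt = delta_t / self.n_euler_steps
        for _ in range(self.n_euler_steps):
            h = h + dt * self.ode_func(h)
        return h

    def forward(self, x_seq, t_seq):
        # x_seq: (T, batch, input_dim); t_seq: (T,) observation times.
        h = torch.zeros(x_seq.size(1), self.hidden_dim)
        prev_t, hidden_states = t_seq[0], []
        for x, t in zip(x_seq, t_seq):
            h = self.evolve(h, t - prev_t)  # continuous-time dynamics between observations
            h = self.gru(x, h)              # discrete RNN update at the observation
            hidden_states.append(h)
            prev_t = t
        return torch.stack(hidden_states)
```

In the Latent ODE setting described above, an ODE-RNN of this form, run over the observation sequence, serves as the recognition (encoder) network that produces the approximate posterior over the initial latent state.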