Reduction of Markov chains using a value-of-information-based approach
IJ Sledge, JC Príncipe - Entropy, 2019 - mdpi.com
In this paper, we propose an approach to obtain reduced-order models of Markov chains. Our approach is composed of two information-theoretic processes. The first is a means of comparing pairs of stationary chains on different state spaces, which is done via the negative, modified Kullback–Leibler divergence defined on a model joint space. Model reduction is achieved by solving a value-of-information criterion with respect to this divergence. Optimizing the criterion leads to a probabilistic partitioning of the states in the high-order Markov chain. A single free parameter that emerges through the optimization process dictates both the partition uncertainty and the number of state groups. We provide a data-driven means of choosing the ‘optimal’ value of this free parameter, which sidesteps the need to know, a priori, the number of state groups in an arbitrary chain.
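The abstract does not reproduce the paper's update equations, but an information-bottleneck-style soft aggregation of states gives a rough picture of how a single free parameter (written `beta` below) can trade off partition uncertainty against fidelity to the original chain's transition structure. The following is a minimal sketch under that assumption; the function name, the use of the KL divergence between transition rows, and the annealing-style alternating update are illustrative choices, not the exact value-of-information criterion from Sledge and Príncipe (2019).

```python
import numpy as np

def soft_partition_markov_chain(P, n_groups, beta, n_iters=200, seed=0):
    """Illustrative soft aggregation of Markov-chain states.

    P        : (n, n) row-stochastic transition matrix of the original chain.
    n_groups : number of candidate state groups.
    beta     : free trade-off parameter; larger values yield harder,
               more certain partitions (fewer effectively occupied groups
               remain when beta is small).

    Returns q, an (n, n_groups) matrix of probabilistic state-to-group
    assignments. NOTE: this is a sketch of an information-bottleneck-style
    update, not the paper's exact criterion.
    """
    rng = np.random.default_rng(seed)
    n = P.shape[0]
    eps = 1e-12

    # Stationary distribution of the original chain
    # (left Perron eigenvector of P, normalized to sum to one).
    evals, evecs = np.linalg.eig(P.T)
    pi = np.real(evecs[:, np.argmax(np.real(evals))])
    pi = pi / pi.sum()

    # Random soft initialization of state-to-group assignments q(g | i).
    q = rng.random((n, n_groups))
    q /= q.sum(axis=1, keepdims=True)

    for _ in range(n_iters):
        # Group priors p(g) and group-conditioned next-state distributions.
        p_g = pi @ q                                   # shape (n_groups,)
        P_hat = (q * pi[:, None]).T @ P                # shape (n_groups, n)
        P_hat /= (p_g[:, None] + eps)

        # KL divergence between each state's transition row P(. | i)
        # and each group's aggregated distribution P_hat(. | g).
        log_ratio = np.log(P[:, None, :] + eps) - np.log(P_hat[None, :, :] + eps)
        D = np.sum(P[:, None, :] * log_ratio, axis=2)  # shape (n, n_groups)

        # Soft reassignment: beta dictates how sharply states commit to groups.
        q = p_g[None, :] * np.exp(-beta * D)
        q /= q.sum(axis=1, keepdims=True)

    return q
```

Sweeping `beta` from small to large values softens or sharpens the assignments `q`, mirroring the role the abstract attributes to its single free parameter; the paper's own data-driven rule for choosing that parameter is not reproduced in this sketch.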
