A matrix $X$ is called a linear matrix if its entries are affine forms, i.e., degree one polynomials in $n$ variables. What is a minimal-sized representation of a given matrix $F$ as a product of linear matrices? Finding such a minimal representation is closely related to finding an optimal way to compute a given polynomial via an algebraic branching program. Here we devise an efficient algorithm for an average-case version of this problem. Specifically, given $w,d,n \in \mathbb{N}$ and blackbox access to the $w^2$ entries of a matrix product $F = X_1 \cdots X_d$, where each $X_i$ is a $w \times w$ linear matrix over a given finite field $\mathbb{F}_q$, we wish to recover a factorization $F = Y_1 \cdots Y_{d'}$, where every $Y_i$ is also a linear matrix over $\mathbb{F}_q$ (or a small extension of $\mathbb{F}_q$). We show that if $n \geq 4w^2$, $\mathrm{char}(\mathbb{F}_q) = (ndw)^{\Omega(1)}$, and the input $F$ is sampled by choosing random linear matrices $X_1, \ldots, X_d$ over $\mathbb{F}_q$ independently and taking their product, then an equivalent factorization $F = Y_1 \cdots Y_d$ can be recovered in (randomized) time $(wdn \log q)^{O(1)}$. In fact, we give a (worst-case) polynomial-time randomized algorithm to factor any non-degenerate, or pure, matrix product (a notion we define in the paper) into linear matrices; a matrix product $F = X_1 \cdots X_d$ is pure with high probability when the $X_i$'s are chosen independently at random. We also show that, in this setting, if we are given only a single entry of $F$ rather than its $w^2$ correlated entries, then the recovery can be done in (randomized) time $(d^{w^3} n \log q)^{O(1)}$.
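As a small illustration of the objects involved (our own example, with hypothetical entries not taken from the paper), take width $w = 2$ and a product of $d = 2$ linear matrices in the variables $x_1, \ldots, x_n$; each entry of the product, such as the $(1,1)$ entry below, is exactly a polynomial computed by a width-$2$ algebraic branching program with $d$ layers of edges labeled by affine forms:
$$
X_1 = \begin{pmatrix} x_1 + 1 & x_2 \\ x_3 & x_4 + 2 \end{pmatrix}, \qquad
X_2 = \begin{pmatrix} x_2 & x_1 + x_3 \\ x_4 & 1 \end{pmatrix}, \qquad
(X_1 X_2)_{1,1} = (x_1 + 1)\,x_2 + x_2 x_4 .
$$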