Oct 4, 2019 · Abstract—Recurrent Neural Networks (RNNs) can be difficult to deploy on resource-constrained devices due to their size. As a result, there is a need for compression techniques. This paper introduces a method to compress RNNs for resource-constrained environments using the Kronecker product (KP).
Oct 9, 2019 · KPs can compress RNN layers by 16-38x with minimal accuracy loss. We show that KP can beat the task accuracy achieved by other state-of-the-art compression techniques.
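To make the compression mechanics concrete, here is a minimal NumPy sketch with made-up sizes (a single 256x256 weight matrix factored as two 16x16 Kronecker factors; the 16-38x figures above refer to whole layers). The key point is that the matrix-vector product can be computed from the small factors directly, via the identity (A ⊗ B) vec(X) = vec(B X Aᵀ), so the full matrix is never formed:

```python
import numpy as np

def kron_matvec(A, B, x):
    """Compute (A kron B) @ x without materializing the full matrix,
    using the identity (A kron B) vec(X) = vec(B @ X @ A.T)."""
    m1, n1 = A.shape
    m2, n2 = B.shape
    X = x.reshape(n1, n2).T          # inverse of column-major vec
    Y = B @ X @ A.T                  # shape (m2, m1)
    return Y.T.reshape(-1)           # column-major vec of the result

rng = np.random.default_rng(0)
# Hypothetical sizes: dense storage would need 256*256 = 65,536
# parameters; the two KP factors need only 2 * 16*16 = 512.
A = rng.standard_normal((16, 16))
B = rng.standard_normal((16, 16))
x = rng.standard_normal(256)

assert np.allclose(kron_matvec(A, B, x), np.kron(A, B) @ x)
```

The same identity is what keeps inference cheap: the product costs two small matrix multiplies instead of one large one.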
A recurrent neural network is trained on the first 96 MB of enwik8 and tested on the last 4 MB, achieving compression of 1.42 bpc on the training set and 1.33 bpc on the test set.
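For reference, bits per character (bpc) is the model's average negative log2-probability of the character that actually occurred; lower means better compression. A toy sketch of the computation (the probabilities below are invented):

```python
import math

def bits_per_character(predicted_probs):
    """Average -log2(p) over the probabilities the model assigned
    to the characters that actually occurred; lower is better."""
    return sum(-math.log2(p) for p in predicted_probs) / len(predicted_probs)

# Hypothetical per-character probabilities from a trained model:
probs = [0.5, 0.25, 0.9, 0.4]
print(f"{bits_per_character(probs):.2f} bpc")  # ~1.12 bpc for this toy input
```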
Nov 25, 2019 · We demonstrate faster execution of an RNN model by reducing the number of RNN computations, without retraining the original RNN model.
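The snippet does not say which mechanism is used, so the sketch below shows one generic way to cut RNN computations without retraining: reuse the previous hidden state whenever the input has barely changed. The threshold and the vanilla cell are illustrative stand-ins, not the method from this result.

```python
import numpy as np

def rnn_step(h, x, W_hh, W_xh):
    """One vanilla RNN update."""
    return np.tanh(W_hh @ h + W_xh @ x)

def run_with_skipping(xs, W_hh, W_xh, threshold=0.1):
    """Run an RNN over a sequence, skipping the update whenever the
    input moved less than `threshold` (a generic heuristic, not the
    technique from the Nov 25, 2019 result)."""
    h = np.zeros(W_hh.shape[0])
    prev_x, steps_computed = None, 0
    for x in xs:
        if prev_x is None or np.linalg.norm(x - prev_x) > threshold:
            h = rnn_step(h, x, W_hh, W_xh)
            steps_computed += 1
        prev_x = x
    return h, steps_computed

rng = np.random.default_rng(0)
xs = np.cumsum(rng.standard_normal((100, 8)) * 0.01, axis=0)  # slowly drifting inputs
W_hh = rng.standard_normal((16, 16)) * 0.1
W_xh = rng.standard_normal((16, 8)) * 0.1
h, computed = run_with_skipping(xs, W_hh, W_xh)
print(f"computed {computed} of {len(xs)} RNN steps")
```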
May 21, 2015 · A glaring limitation of Vanilla Neural Networks (and also Convolutional Networks) is that their API is too constrained: they accept a fixed-sized vector as input and produce a fixed-sized vector as output.
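The contrast drawn there is that a recurrent net carries a hidden state, so the same interface can consume a sequence of any length one vector at a time. A minimal vanilla cell in NumPy (shapes and initialization chosen arbitrarily for illustration):

```python
import numpy as np

class VanillaRNN:
    """Minimal recurrent cell: the hidden state lets it accept inputs
    one at a time, so sequences of any length fit the same interface."""
    def __init__(self, input_size, hidden_size, output_size, seed=0):
        rng = np.random.default_rng(seed)
        self.W_xh = rng.standard_normal((hidden_size, input_size)) * 0.01
        self.W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.01
        self.W_hy = rng.standard_normal((output_size, hidden_size)) * 0.01
        self.h = np.zeros(hidden_size)

    def step(self, x):
        # The state is updated from both the new input and its own past.
        self.h = np.tanh(self.W_xh @ x + self.W_hh @ self.h)
        return self.W_hy @ self.h

rnn = VanillaRNN(input_size=8, hidden_size=32, output_size=4)
inputs = np.random.default_rng(1).standard_normal((5, 8))  # any sequence length works
outputs = [rnn.step(x) for x in inputs]
```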
A new compressed RNN cell implementation called Hybrid Matrix Decomposition (HMD) is explored that results in faster inference runtime than pruning and better accuracy than matrix factorization.
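The snippet leaves the structure of HMD unstated; the sketch below is one plausible reading of the name, assuming a hybrid of a small dense block plus a low-rank factorization for the remaining rows, and should not be taken as the paper's exact scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed HMD-style layout (an illustration based on the name, not a
# confirmed description of the paper): the first k rows of a weight
# matrix stay dense, the remaining rows are a rank-r product U @ V.
rows, cols, k, r = 256, 256, 64, 8
dense_top = rng.standard_normal((k, cols))
U = rng.standard_normal((rows - k, r))
V = rng.standard_normal((r, cols))

def hmd_matvec(x):
    """Matrix-vector product with the hybrid matrix; the low-rank
    block costs two skinny multiplies instead of one dense one."""
    return np.concatenate([dense_top @ x, U @ (V @ x)])

# Parameters: 64*256 + 192*8 + 8*256 = 19,968 vs. 65,536 dense (~3.3x smaller).
y = hmd_matvec(rng.standard_normal(cols))
```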