Feb 25, 2018. Inspired by the recent tensor ring factorization, we introduce Tensor Ring Networks (TR-Nets), in which both the fully connected layers and the convolutional layers of a deep neural network are significantly compressed using tensor ring factorization.
Our results show that TR-Nets can compress LeNet-5 by 11× without losing accuracy, and can compress the state-of-the-art Wide ResNet by 243× ...
The key is that the tensor train representation fixes its border ranks to one, forcing larger interior ranks, which limits an efficient representation, while in a tensor ring all ranks could ...
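To make the rank structure concrete, here is a minimal NumPy sketch of reconstructing a full tensor from tensor ring cores, where every core shares the same rank r (unlike tensor train, whose border ranks are fixed to one). The shapes, rank, and example dimensions below are illustrative assumptions, not the paper's code.

```python
import numpy as np

def tr_reconstruct(cores):
    """Reconstruct a full tensor from tensor ring cores.

    Each core G_k has shape (r_k, n_k, r_{k+1}); the ring closes, so
    r_{d+1} = r_1, and each entry is a trace over the ring:
    T[i1, ..., id] = trace(G_1[:, i1, :] @ ... @ G_d[:, id, :]).
    """
    dims = [g.shape[1] for g in cores]
    full = np.zeros(dims)
    for idx in np.ndindex(*dims):
        mat = cores[0][:, idx[0], :]
        for k in range(1, len(cores)):
            mat = mat @ cores[k][:, idx[k], :]
        full[idx] = np.trace(mat)
    return full

# Illustrative example: an 8x9x10 tensor with uniform TR-rank 3.
r, dims = 3, (8, 9, 10)
cores = [np.random.randn(r, n, r) for n in dims]
T = tr_reconstruct(cores)
n_params = sum(g.size for g in cores)  # 3*8*3 + 3*9*3 + 3*10*3 = 243
print(T.shape, n_params, np.prod(dims))
```

Here the factorized form stores 243 parameters versus 720 for the dense tensor, and because all ranks are equal there is no inflated interior rank as in the tensor train format.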
Paper "Wide Compression: Tensor Ring Nets" accepted to CVPR. Wenqi Wang, Yifan Sun, Brian Eriksson, Wenlin Wang, and Vaneet Aggarwal, "Wide Compression: Tensor Ring Nets".