S$^3$: Sign-Sparse-Shift Reparametrization for Effective Training of Low-bit Shift Networks

X Li, B Liu, Y Yu, W Liu, C Xu… - Advances in Neural Information Processing Systems, 2021 - proceedings.neurips.cc
Abstract
Shift neural networks reduce computation complexity by removing expensive multiplication operations and quantizing continuous weights into low-bit discrete values; they are fast and energy-efficient compared to conventional neural networks. However, existing shift networks are sensitive to weight initialization and suffer degraded performance caused by the vanishing gradient and weight sign freezing problems. To address these issues, we propose S$^3$ re-parameterization, a novel technique for training low-bit shift networks. Our method decomposes a discrete parameter in a sign-sparse-shift 3-fold manner. This way, it efficiently learns a low-bit network whose weight dynamics are similar to those of full-precision networks and which is insensitive to weight initialization. Our proposed training method pushes the boundaries of shift neural networks and shows that 3-bit shift networks compete with their full-precision counterparts in terms of top-1 accuracy on ImageNet.
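
To make the 3-fold decomposition concrete, here is a minimal PyTorch sketch of the idea as the abstract describes it: each discrete weight is the product of a binary sign, a binary sparsity gate, and a power-of-two magnitude, each driven by a continuous latent tensor through a straight-through estimator. The layer name `S3Conv2d`, the helper `ste_binarize`, the latent initialization, the number of shift bits, and the negated exponent are all illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def ste_binarize(x):
    """Heaviside step in the forward pass, identity gradient in the
    backward pass (straight-through estimator)."""
    return x + ((x >= 0).float() - x).detach()

class S3Conv2d(nn.Module):
    """Illustrative sign-sparse-shift reparametrization of a conv layer.

    Each discrete weight is the product of three learned binary factors:
      sign   in {-1, +1}  from a continuous latent w_sign
      sparse in { 0,  1}  from a continuous latent w_sparse
      shift: 2**(-p), where p sums binarized bits, one latent per bit
    """
    def __init__(self, in_ch, out_ch, kernel_size,
                 num_shift_bits=2, stride=1, padding=0):
        super().__init__()
        shape = (out_ch, in_ch, kernel_size, kernel_size)
        self.w_sign = nn.Parameter(torch.randn(shape) * 0.01)
        self.w_sparse = nn.Parameter(torch.randn(shape) * 0.01)
        # One latent tensor per shift bit; p is the sum of binarized bits.
        self.w_shift = nn.Parameter(torch.randn(num_shift_bits, *shape) * 0.01)
        self.stride, self.padding = stride, padding

    def forward(self, x):
        sign = 2.0 * ste_binarize(self.w_sign) - 1.0   # values in {-1, +1}
        sparse = ste_binarize(self.w_sparse)           # values in {0, 1}
        p = ste_binarize(self.w_shift).sum(dim=0)      # integer in [0, num_shift_bits]
        w = sign * sparse * torch.pow(2.0, -p)         # power-of-two magnitudes
        return F.conv2d(x, w, stride=self.stride, padding=self.padding)

# Smoke test: a 3x3 S3 conv over a random batch.
layer = S3Conv2d(3, 16, kernel_size=3, padding=1)
out = layer(torch.randn(2, 3, 32, 32))  # -> shape (2, 16, 32, 32)
```

Because gradients flow through the continuous latents rather than the discrete values, the binary factors can keep flipping throughout training, which is the intuition behind avoiding the weight sign freezing problem the abstract mentions.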