Jul 7, 2021 · These two algorithms yield state-of-the-art results for network pruning and optimization with lower computational overhead relative to existing ...
Nov 9, 2021 · We provide efficient matrix-free approximations for inverse-Hessian vector products, with applications in optimization and neural network pruning.
Efficiently approximating local curvature information of the loss function is a key tool for optimization and compression of deep neural networks.
Jul 8, 2021 · We propose two new algorithms as part of a framework called M-FAC: the first algorithm is tailored towards network compression and can compute ...
The Automatic Second-order Differentiation Library (ASDL) is an extension library for PyTorch that offers various implementations and a ...
This work investigates matrix-free, linear-time approaches for estimating Inverse-Hessian Vector Products (IHVPs) for the case when the Hessian can be ...
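The rank-one case above can be made concrete with the classic Sherman-Morrison identity: if the Hessian is approximated as A = λI + Σᵢ gᵢgᵢᵀ (e.g., an empirical-Fisher-style sum of gradient outer products), then A⁻¹v can be computed without ever forming A. The sketch below is a generic illustration of that identity, not the paper's own M-FAC algorithm; the names `build_ihvp` and `apply_ihvp` are illustrative.

```python
import numpy as np

def build_ihvp(grads, lam):
    """Precompute Sherman-Morrison factors for A = lam*I + sum_i g_i g_i^T.

    grads: list of m 1-D arrays of length n (rank-one factors).
    lam:   positive damping constant.
    Returns (ps, ds), where ps[i] = A_{i-1}^{-1} g_i and
    ds[i] = 1 + g_i^T ps[i]. Setup cost is O(m^2 n); no n x n matrix
    is ever materialized.
    """
    ps, ds = [], []
    for g in grads:
        # Apply the inverse built from the previous rank-one updates to g.
        u = g / lam
        for p, gj, d in zip(ps, grads, ds):  # zip stops at the shorter ps
            u = u - p * (gj @ u) / d
        ps.append(u)
        ds.append(1.0 + g @ u)
    return ps, ds

def apply_ihvp(v, grads, ps, ds, lam):
    """Return A^{-1} v in O(m n) time, matrix-free."""
    u = v / lam
    for p, g, d in zip(ps, grads, ds):
        u = u - p * (g @ u) / d
    return u
```

Each query after the one-time setup costs only O(mn), which is the linear-time property the snippet refers to.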
M-FAC: Efficient Matrix-Free Approximations of Second-Order Information. Elias Frantar, Eldar Kurtic, Dan Alistarh. Keywords: deep learning, optimization.