Vincent Gripon's website

Blog about my research and teaching

Rethinking Weight Decay For Efficient Neural Network Pruning

H. Tessier, V. Gripon, M. Léonardon, M. Arzel, T. Hannagan and D. Bertrand, "Rethinking Weight Decay For Efficient Neural Network Pruning," in Journal of Imaging, Volume 8, Number 3, March 2022.

Introduced in the late 1980s for generalization purposes, pruning has now become a staple for compressing deep neural networks. Despite the many innovations of the last decades, pruning approaches still face core issues that hinder their performance or scalability. Drawing inspiration from early work in the field, and especially the use of weight decay to achieve sparsity, we introduce Selective Weight Decay (SWD), which realizes efficient continuous pruning throughout training. Our approach, theoretically grounded in Lagrangian smoothing, is versatile and can be applied to multiple tasks, networks and pruning structures. We show that SWD compares favorably to state-of-the-art approaches in terms of performance/parameters ratio on the CIFAR-10, Cora and ImageNet ILSVRC2012 datasets.
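To illustrate the core idea, here is a minimal NumPy sketch of selective weight decay on a toy quadratic problem: standard weight decay is applied to all weights, while a much stronger, gradually increasing penalty is applied only to the weights currently targeted for pruning (here, a magnitude-based criterion). The function names, penalty schedule, and targeting rule are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def swd_step(w, grad, lr, base_decay, swd_coeff, keep_k):
    """One gradient step with selective weight decay (sketch).

    All weights receive the standard decay `base_decay`; the weights
    targeted for pruning (all but the keep_k largest in magnitude)
    additionally receive the much stronger penalty `swd_coeff`.
    """
    idx = np.argsort(np.abs(w))                # ascending magnitude
    target = np.zeros_like(w, dtype=bool)
    target[idx[:-keep_k]] = True               # all but the keep_k largest
    decay = base_decay + swd_coeff * target    # per-weight decay coefficient
    return w - lr * (grad + decay * w)

# Toy regression target: three "important" weights, three small ones.
w_star = np.array([2.0, -1.5, 1.0, 0.3, -0.2, 0.1])
w = np.zeros_like(w_star)

a = 1e-3                                       # selective penalty, ramped up
for step in range(2000):
    grad = w - w_star                          # gradient of 0.5 * ||w - w_star||^2
    a = min(a * 1.02, 15.0)                    # gradual ramp, as in continuous pruning
    w = swd_step(w, grad, lr=0.05, base_decay=1e-4, swd_coeff=a, keep_k=3)

mags = np.sort(np.abs(w))
print(mags)  # targeted weights are driven near zero, survivors stay near w_star
```

Because the penalty ramps up smoothly during training rather than being applied as a hard mask, the targeted weights are pushed continuously toward zero and can be removed at the end with little accuracy loss; the three surviving weights remain close to their unpenalized values.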

Download the manuscript.

Bibtex
@article{TesGriLéArzHanBer2022,
  author = {Hugo Tessier and Vincent Gripon and
Mathieu Léonardon and Matthieu Arzel and Thomas
Hannagan and David Bertrand},
  title = {Rethinking Weight Decay For Efficient
Neural Network Pruning},
  journal = {Journal of Imaging},
  year = {2022},
  volume = {8},
  number = {3},
  month = {March},
}




