Vincent Gripon's Homepage

Research and Teaching Blog

Élagage de réseaux profonds de neurones par dégradation sélective des pondérations

H. Tessier, V. Gripon, M. Léonardon, M. Arzel, T. Hannagan and D. Bertrand, "Élagage de réseaux profonds de neurones par dégradation sélective des pondérations," in GRETSI, 2022.

Deep neural networks have become the standard in machine learning. However, to reach their best performance, they require millions of trainable parameters, which makes them computationally and memory intensive and therefore ill suited to certain application contexts, such as embedded systems. Pruning parameters during training is a widely used way to reduce these costs, but it introduces problems of its own, such as a sudden collapse in performance at high pruning rates and discontinuities between training phases. In this paper we introduce Selective Weight Decay (SWD), a method inspired by Lagrangian smoothing that prunes networks progressively and continuously during training. We show on standard datasets that this method achieves the best performance, especially at the highest pruning rates.
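As a rough illustration of the idea summarized above, the sketch below adds to the training loss an extra L2 penalty applied only to the weights currently targeted for pruning, with a coefficient that grows over training. This is a minimal PyTorch sketch under stated assumptions, not the authors' implementation: the function name swd_penalty, the global magnitude criterion used to select weights, and the alpha schedule are illustrative choices.

import torch

def swd_penalty(model, prune_ratio, alpha):
    # Extra L2 penalty on the weights currently targeted for pruning.
    # Illustrative assumption: a global magnitude criterion, i.e. the
    # prune_ratio fraction of weights with the smallest absolute value.
    with torch.no_grad():
        all_w = torch.cat([p.detach().abs().flatten()
                           for p in model.parameters() if p.requires_grad])
        k = int(prune_ratio * all_w.numel())
        if k == 0:
            return torch.zeros((), device=all_w.device)
        threshold = torch.kthvalue(all_w, k).values
    penalty = torch.zeros((), device=all_w.device)
    for p in model.parameters():
        if p.requires_grad:
            mask = p.detach().abs() <= threshold  # weights selected for pruning
            penalty = penalty + (p[mask] ** 2).sum()
    return alpha * penalty

In a training loop, this term would simply be added to the task loss, for example loss = criterion(model(x), y) + swd_penalty(model, prune_ratio, alpha), with alpha ramped up from a small to a large value over the course of training so that the selected weights are driven towards zero before being removed at the end.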

Download manuscript.

BibTeX
@inproceedings{TesGriLéArzHanBer2022,
  author    = {Hugo Tessier and Vincent Gripon and Mathieu Léonardon and Matthieu Arzel and Thomas Hannagan and David Bertrand},
  title     = {Élagage de réseaux profonds de neurones par dégradation sélective des pondérations},
  booktitle = {GRETSI},
  year      = {2022},
}



