Site de Vincent Gripon

Blog on my research and teaching

Introducing Graph Smoothness Loss for Training Deep Learning Architectures

M. Bontonou, C. Lassance, G. B. Hacene, V. Gripon, J. Tang and A. Ortega, "Introducing Graph Smoothness Loss for Training Deep Learning Architectures," in Data Science Workshop, pp. 160--164, June 2019.

We introduce a novel loss function for training deep learning architectures to perform classification. It consists of minimizing the smoothness of label signals on similarity graphs built at the output of the architecture. Equivalently, it can be seen as maximizing the distances between the network function images of training inputs from distinct classes. As such, only distances between pairs of examples in distinct classes are taken into account in the process, and the training does not prevent inputs from the same class from being mapped to distant locations in the output domain. We show that this loss leads to classification performance similar to that of architectures trained using the classical cross-entropy, while offering interesting degrees of freedom and properties. We also demonstrate the ability of the proposed loss to increase the robustness of trained architectures to deviations of the inputs.
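As a rough illustration of the idea described above, here is a minimal sketch of such a loss: a similarity graph is built from the network outputs (a Gaussian kernel is assumed here for illustration; the paper may use a different similarity), and the loss measures the smoothness of the one-hot label signal on that graph. Since same-class pairs have identical label vectors, only pairs of examples from distinct classes contribute. All function and parameter names are hypothetical.

```python
import numpy as np

def graph_smoothness_loss(outputs, labels, sigma=1.0):
    """Sketch of a graph smoothness loss (assumed Gaussian similarity).

    outputs: (n, d) array of network outputs for a batch.
    labels:  (n,) array of integer class labels.
    Returns the smoothness of the one-hot label signal on the
    similarity graph built from the outputs.
    """
    # Pairwise squared distances between output vectors
    d2 = ((outputs[:, None, :] - outputs[None, :, :]) ** 2).sum(-1)
    # Edge weights: closer outputs get larger similarity
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    # One-hot label signal on the graph vertices
    y = np.eye(labels.max() + 1)[labels]
    # Label-signal smoothness: sum_{i,j} w_ij * ||y_i - y_j||^2 / 2.
    # ||y_i - y_j||^2 is zero for same-class pairs, so only
    # distinct-class pairs contribute, as stated in the abstract.
    dy2 = ((y[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return 0.5 * (w * dy2).sum()
```

Minimizing this quantity pushes examples of distinct classes apart (driving their similarity weights toward zero), while leaving same-class pairs unconstrained.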

Download the manuscript.

Bibtex
@inproceedings{BonLasHacGriTanOrt20196,
  author = {Myriam Bontonou and Carlos Lassance and
Ghouthi Boukli Hacene and Vincent Gripon and Jian Tang
and Antonio Ortega},
  title = {Introducing Graph Smoothness Loss for
Training Deep Learning Architectures},
  booktitle = {Data Science Workshop},
  year = {2019},
  pages = {160--164},
  month = {June},
}



