Companion Losses for Deep Neural Networks
- David Díaz-Vico 1
- Angela Fernández 1
- José R. Dorronsoro 1

1 Universidad Autónoma de Madrid
Publication details
Editors:
- Hugo Sanjurjo González
- Iker Pastor López
- Pablo García Bringas
- Héctor Quintián
- Emilio Corchado
Publisher: Springer International Publishing AG
ISBN: 978-3-030-86271-8, 978-3-030-86270-1
Year of publication: 2021
Pages: 538-549
Conference: Hybrid Artificial Intelligent Systems (HAIS), 16th edition, Bilbao, 2021
Type: Conference contribution
Abstract
Modern Deep Neural Network backends allow great flexibility in defining network architectures. In particular, a network can have multiple outputs, each with its own loss, which can make it better suited to particular goals. In this work we explore this possibility for classification networks that combine the categorical cross-entropy loss, typical of softmax probabilistic outputs; the categorical hinge loss, which extends the hinge loss standard in SVMs; and a novel Fisher loss, which seeks to concentrate class members near their centroids while keeping the centroids themselves apart.
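As a rough illustration of the first two losses mentioned above (the novel Fisher loss is the paper's own contribution and is not reproduced here), the following NumPy sketch computes the categorical cross-entropy, the Keras-style categorical hinge, and a weighted combination; the function names and the equal default weighting are our own assumptions, not taken from the paper.

```python
import numpy as np

def categorical_cross_entropy(y_true, probs, eps=1e-12):
    # Mean negative log-likelihood of the true class under softmax probabilities.
    # y_true is one-hot encoded, probs are rows summing to 1.
    true_class_probs = (probs * y_true).sum(axis=1)
    return -np.mean(np.log(np.clip(true_class_probs, eps, None)))

def categorical_hinge(y_true, scores):
    # Multi-class hinge: max(0, 1 - true_score + best_wrong_score),
    # applied to raw (pre-softmax) scores and averaged over the batch.
    pos = (scores * y_true).sum(axis=1)
    neg = np.where(y_true == 1, -np.inf, scores).max(axis=1)
    return np.mean(np.maximum(0.0, 1.0 + neg - pos))

def companion_loss(y_true, probs, scores, weights=(1.0, 1.0)):
    # Hypothetical weighted sum of the two companion losses; in a deep
    # learning framework each term would be attached to its own output head.
    w_ce, w_h = weights
    return (w_ce * categorical_cross_entropy(y_true, probs)
            + w_h * categorical_hinge(y_true, scores))
```

In a framework such as Keras, the same idea is expressed by giving the model one output head per loss and passing a list (or dict) of losses and loss weights to `compile`, so the optimizer minimizes their weighted sum.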