Eprint first made available on ORBilu (E-prints, Working papers and Research blog)
Tropical Backpropagation
Ceyhan, Özgür; Lucchetti, Federico


Full Text
Author preprint (335.69 kB)


Abstract :
[en] This work introduces tropicalization, a novel technique that delivers tropical neural networks as tropical limits of deep ReLU networks. Tropicalization transfers the initial weights from real numbers to the tropical semiring while maintaining the underlying graph of the network. After verifying that tropicalization does not affect the classification capacity of deep neural networks, this study introduces a tropical reformulation of backpropagation via tropical linear algebra. Tropical arithmetic replaces multiplication operations in the network with additions and addition operations with max, and therefore, theoretically, reduces the algorithmic complexity during the training and inference phases. We demonstrate the latter by simulating the tensor multiplication underlying the feed-forward process of state-of-the-art trained neural network architectures and compare the standard forward pass of the models with the tropical ones. Our benchmark results show that tropicalization speeds up inference by 50%. Hence, we conclude that tropicalization bears the potential to reduce the training times of large neural networks drastically.
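The max-plus substitution the abstract describes (multiplications become additions, additions become max) can be sketched for a single linear layer as follows. This is a toy illustration of tropical matrix-vector arithmetic only, not the authors' implementation; the function names are ours:

```python
import numpy as np

def standard_forward(W, x):
    """Ordinary linear layer: y_i = sum_j W[i, j] * x[j]."""
    return W @ x

def tropical_forward(W, x):
    """Max-plus (tropical) counterpart: y_i = max_j (W[i, j] + x[j]).
    Each multiply is replaced by an add, and the sum by a max."""
    return np.max(W + x, axis=1)  # broadcast x across columns, row-wise max

W = np.array([[1.0, -2.0],
              [0.5,  3.0]])
x = np.array([2.0, 1.0])

print(standard_forward(W, x))  # [0. 4.]
print(tropical_forward(W, x))  # [3. 4.]
```

The tropical pass performs only additions and comparisons, which is the source of the complexity reduction claimed in the abstract; how this interacts with ReLU activations and backpropagation is developed in the paper itself.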
Disciplines :
Computer science
Author, co-author :
Ceyhan, Özgür ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > CritiX
Lucchetti, Federico ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > CritiX
Language :
English
Title :
Tropical Backpropagation
Publication date :
Focus Area :
Computational Sciences
Available on ORBilu :
since 06 April 2023

