This work introduces tropicalization, a novel technique that delivers tropical neural networks as tropical limits of deep ReLU networks. Tropicalization transfers the initial weights from the real numbers to the tropical semiring while preserving the underlying graph of the network. After verifying that tropicalization does not affect the classification capacity of deep neural networks, this study introduces a tropical reformulation of backpropagation via tropical linear algebra. Tropical arithmetic replaces the network's multiplication operations with additions and its addition operations with maxima, and therefore, in theory, reduces the algorithmic complexity of both the training and inference phases. We demonstrate the latter by simulating the tensor multiplication underlying the feed-forward pass of state-of-the-art trained neural network architectures and comparing the standard forward pass of each model with its tropical counterpart. Our benchmark results show that tropicalization speeds up inference by 50%. Hence, we conclude that tropicalization has the potential to drastically reduce the training times of large neural networks.
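To make the operation substitution concrete, the sketch below contrasts a standard dense ReLU layer with a max-plus counterpart, in which every product becomes a sum and every sum becomes a maximum. This is an illustrative example only, not the authors' implementation; the function names and shapes are hypothetical.

```python
# Minimal sketch (assumed, not from the paper): a standard dense ReLU layer
# versus a tropical (max-plus) analogue where * -> + and + -> max.
import numpy as np

def standard_layer(W, b, x):
    # Ordinary affine map plus ReLU: y_i = max(0, sum_j W_ij * x_j + b_i)
    return np.maximum(0.0, W @ x + b)

def tropical_layer(W, b, x):
    # Max-plus "matrix-vector product": y_i = max(b_i, max_j (W_ij + x_j))
    # Multiplications are replaced by additions, the sum by a maximum.
    return np.maximum(b, np.max(W + x[None, :], axis=1))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W = rng.standard_normal((4, 3))
    b = rng.standard_normal(4)
    x = rng.standard_normal(3)
    print("standard:", standard_layer(W, b, x))
    print("tropical:", tropical_layer(W, b, x))
```

Under this assumed formulation, the tropical layer performs only additions and comparisons, which is the source of the complexity reduction claimed for the tropicalized forward pass.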
Disciplines:
Computer science
Author, co-author:
CEYHAN, Özgür ; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > CritiX
LUCCHETTI, Federico ; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > CritiX