Eprint already available on another site (E-prints, Working papers and Research blog)
FOCIL: Finetune-and-Freeze for Online Class Incremental Learning by Training Randomly Pruned Sparse Experts
Onur Yildirim, Murat; Ceren Gok Yildirim, Elif; MOCANU, Decebal Constantin et al.
2024
 

Files


Full Text
2403.14684v1.pdf
Author preprint (643.92 kB)



Details



Keywords :
Computer Science - Computer Vision and Pattern Recognition; Sparse Neural Networks; Continual Learning
Abstract :
[en] Class incremental learning (CIL) in an online continual learning setting strives to acquire knowledge on a series of novel classes from a data stream, using each data point only once for training. This is more realistic than offline modes, where it is assumed that all data from the novel class(es) is readily available. Current online CIL approaches store a subset of the previous data, which creates heavy overhead costs in terms of both memory and computation, as well as privacy issues. In this paper, we propose a new online CIL approach called FOCIL. It fine-tunes the main architecture continually by training a randomly pruned sparse subnetwork for each task. Then, it freezes the trained connections to prevent forgetting. FOCIL also determines the sparsity level and learning rate per task adaptively and ensures (almost) zero forgetting across all tasks without storing any replay data. Experimental results on 10-Task CIFAR100, 20-Task CIFAR100, and 100-Task TinyImagenet demonstrate that our method outperforms the SOTA by a large margin. The code is publicly available at https://github.com/muratonuryildirim/FOCIL.
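The mechanism the abstract describes — drawing a random sparse subnetwork over the still-free weights for each task, training only that subnetwork, then freezing it — can be sketched in a few lines. The sketch below is a toy illustration, not the paper's implementation: the single weight matrix, the squared-output loss, the 0.2 density, and the function names are all assumptions made for the example; see the linked repository for the actual code.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_task_mask(frozen: np.ndarray, density: float, rng) -> np.ndarray:
    """Randomly pick a `density` fraction of the still-unfrozen weights
    as the current task's sparse subnetwork (boolean mask)."""
    scores = rng.random(frozen.shape)
    scores[frozen] = -1.0                      # frozen weights can never be picked
    k = int(density * (~frozen).sum())         # this task's trainable-weight budget
    mask = np.zeros(frozen.size, dtype=bool)
    mask[np.argpartition(scores.ravel(), -k)[-k:]] = True  # k highest scores
    return mask.reshape(frozen.shape)

# One 8x8 weight matrix stands in for the whole backbone.
weight = rng.standard_normal((8, 8)) * 0.1
frozen = np.zeros((8, 8), dtype=bool)

for task in range(3):                          # three toy "tasks"
    mask = random_task_mask(frozen, density=0.2, rng=rng)
    for _ in range(5):                         # a few online (single-pass) updates
        x = rng.standard_normal((4, 8))
        # gradient of mean((x @ W)**2) w.r.t. W — a placeholder task loss
        grad = 2 * x.T @ (x @ weight) / x.shape[0] / weight.shape[1]
        weight -= 0.1 * grad * mask            # update only this task's subnetwork
    frozen |= mask                             # freeze it to prevent forgetting
```

Because each new mask is drawn only from weights that are not yet frozen, the per-task subnetworks are disjoint, and earlier tasks' connections are never overwritten — which is how (almost) zero forgetting is achieved without any replay buffer.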
Disciplines :
Computer science
Author, co-author :
Onur Yildirim, Murat;  Eindhoven University of Technology
Ceren Gok Yildirim, Elif;  Eindhoven University of Technology
MOCANU, Decebal Constantin  ;  University of Luxembourg > Faculty of Science, Technology and Medicine (FSTM) > Department of Computer Science (DCS)
Vanschoren, Joaquin;  Eindhoven University of Technology
Language :
English
Title :
FOCIL: Finetune-and-Freeze for Online Class Incremental Learning by Training Randomly Pruned Sparse Experts
Publication date :
13 March 2024
Focus Area :
Computational Sciences
Development Goals :
9. Industry, innovation and infrastructure
Available on ORBilu :
since 09 May 2024

