Jinyong Cheng, Mengyun Chen, Baoyu Du, Min Guo

Key Laboratory of Computing Power Network and Information Security, Ministry of Education, Shandong Computer Science Center (National Supercomputer Center in Jinan), Qilu University of Technology (Shandong Academy of Sciences), Jinan, China & Shandong Provincial Key Laboratory of Computing Power Internet and Service Computing, Shandong Fundamental Research Center for Computer Science, Jinan, China
ADOCIL: Enhancing Image Classification with Attention Distillation for Online Class-Incremental Learning
Keywords: Catastrophic forgetting, class-incremental learning, two-stage sampling, ADVC, attention distillation
Catastrophic forgetting is a major challenge in online class-incremental learning. Existing replay-based methods achieve a certain degree of effectiveness, but they neglect both the quality of the replayed samples and the key semantic information in a single-pass data stream. To address these issues, we propose ADOCIL, a framework for Online Class-Incremental Learning based on Attention Distillation, which consists of three parts. First, a two-stage sampling method is used in the replay stage to improve the quality of the samples drawn. Second, we introduce Attention-based Dual-View Consistency (ADVC), which enables the model to fully exploit the critical semantic information within a single-pass data stream. Third, to further mitigate catastrophic forgetting, we introduce attention distillation, which maps the attention maps of the teacher model onto the student model and thereby preserves knowledge of historical tasks. Extensive experiments demonstrate the effectiveness of ADOCIL.
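The attention-distillation idea mentioned above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the squared-activation spatial pooling in `attention_map` and the MSE objective in `attention_distillation_loss` are common choices assumed here for concreteness.

```python
import numpy as np

def attention_map(features):
    """Collapse a (C, H, W) feature tensor into a flat, L2-normalized
    spatial attention map by summing squared activations over channels."""
    amap = np.sum(features ** 2, axis=0)          # (H, W) spatial map
    flat = amap.reshape(-1)
    return flat / (np.linalg.norm(flat) + 1e-8)   # unit-norm for scale invariance

def attention_distillation_loss(teacher_feats, student_feats):
    """Mean squared error between the normalized teacher and student
    attention maps; penalizes the student for attending elsewhere."""
    t = attention_map(teacher_feats)
    s = attention_map(student_feats)
    return float(np.mean((t - s) ** 2))

# Identical features yield zero loss; differing features yield a positive loss.
rng = np.random.default_rng(0)
f_teacher = rng.standard_normal((8, 4, 4))
f_student = rng.standard_normal((8, 4, 4))
print(attention_distillation_loss(f_teacher, f_teacher))      # 0.0
print(attention_distillation_loss(f_teacher, f_student) > 0)  # True
```

In practice such a term would be added to the classification loss during training on new tasks, with the teacher's weights frozen from the previous task.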
reference: Vol. 44, 2025, No. 5, pp. 1202–1228