Memory-efficient Continual Learning with Prototypical Exemplar Condensation

ArXiv cs.AI

arXiv:2603.13804v2 · Announce Type: replace-cross

Abstract: Rehearsal-based continual learning (CL) mitigates catastrophic forgetting by maintaining a subset of samples from previous tasks for replay. Existing studies primarily focus on optimizing memory storage through coreset selection strategies. While effective, these methods typically require storing a substantial number of samples per class (SPC), often more than 20, to maintain satisfactory performance. In this work, we propose to
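To make the rehearsal setup concrete, here is a minimal sketch of a class-balanced exemplar buffer with a fixed samples-per-class (SPC) budget. This is a hypothetical illustration, not the paper's method: it uses simple per-class reservoir sampling in place of the coreset selection or prototypical condensation strategies the abstract refers to, and all names (`ExemplarMemory`, `spc`, `add`, `sample`) are assumptions for the sketch.

```python
import random
from collections import defaultdict

class ExemplarMemory:
    """Class-balanced rehearsal buffer keeping at most `spc` samples per class.

    Illustrative sketch only: uses per-class reservoir sampling, a much
    simpler selection rule than coreset or condensation strategies.
    """

    def __init__(self, spc=20, seed=0):
        self.spc = spc
        self.rng = random.Random(seed)
        self.buffer = defaultdict(list)   # class label -> stored exemplars
        self.seen = defaultdict(int)      # class label -> samples observed so far

    def add(self, x, y):
        """Offer one (sample, label) pair from the current task's stream."""
        self.seen[y] += 1
        if len(self.buffer[y]) < self.spc:
            self.buffer[y].append(x)
        else:
            # Reservoir sampling: keep each seen sample with prob spc / seen[y]
            j = self.rng.randrange(self.seen[y])
            if j < self.spc:
                self.buffer[y][j] = x

    def sample(self, k):
        """Draw a replay mini-batch of up to k stored (sample, label) pairs."""
        pool = [(x, y) for y, xs in self.buffer.items() for x in xs]
        return self.rng.sample(pool, min(k, len(pool)))
```

During training on a new task, replay batches drawn via `sample()` would be interleaved with the task's own batches to mitigate forgetting; the memory cost scales with `spc` times the number of classes, which is exactly the budget the abstract argues is too large (often over 20 SPC) for selection-based methods.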

Published 13 Apr 2026