Memory Bank Compression for Continual Adaptation of Large Language Models
Proposed MBC, a memory-augmented continual learning model that compresses stored representations through codebook optimization and online resetting mechanisms, reducing the memory bank footprint by 99.7%. Improved QA accuracy by 11.84% (EM) and 12.99% (F1) over state-of-the-art baselines, enabling efficient knowledge updates in LLMs without catastrophic forgetting. A minimal sketch of the core compression idea follows.
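The sketch below is only an illustration of the general idea behind codebook-based memory compression with resetting of unused codes; the function name, codebook size, iteration count, and reset rule are assumptions for the example, not the published MBC design.

```python
import numpy as np

def compress_memory_bank(memory, num_codes=256, iters=10, seed=0):
    """Quantize a bank of stored vectors against a small learned codebook.

    Illustrative sketch only: a k-means-style codebook with a simple
    re-seeding rule for unused ("dead") codes, loosely mirroring codebook
    optimization plus online resetting. Not the actual MBC implementation.
    """
    rng = np.random.default_rng(seed)
    # Initialize codes from randomly chosen memory entries.
    codebook = memory[rng.choice(len(memory), num_codes, replace=False)].copy()

    for _ in range(iters):
        # Squared L2 distance between every memory vector and every code.
        d2 = (
            (memory ** 2).sum(1, keepdims=True)
            - 2.0 * memory @ codebook.T
            + (codebook ** 2).sum(1)
        )
        assign = d2.argmin(axis=1)

        counts = np.bincount(assign, minlength=num_codes)
        for k in range(num_codes):
            if counts[k] > 0:
                # Codebook optimization step: move the code to the mean of
                # the memory vectors currently assigned to it.
                codebook[k] = memory[assign == k].mean(axis=0)
            else:
                # Reset step (assumed rule): re-seed an unused code with a
                # random memory entry so the codebook keeps covering the bank.
                codebook[k] = memory[rng.integers(len(memory))]

    return codebook, assign

if __name__ == "__main__":
    # Toy demo: 10,000 stored 64-d representations replaced by
    # 256 codes plus one 1-byte index per entry.
    memory = np.random.randn(10_000, 64).astype(np.float32)
    codebook, codes = compress_memory_bank(memory)
    original_bytes = memory.nbytes
    compressed_bytes = codebook.nbytes + codes.astype(np.uint8).nbytes
    print(f"footprint: {original_bytes} -> {compressed_bytes} bytes")
```

In this toy setting the bank shrinks from the full matrix of float32 vectors to a small codebook plus per-entry indices; the savings grow with the number of stored entries, which is the effect the footprint reduction above refers to.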