The500Feed.Live

Everything going on in AI - updated daily from 500+ sources

📄 Research · May 14, 2026

Octopus: History-Free Gradient Orthogonalization for Continual Learning in Multimodal Large Language Models

Continual learning in multimodal large language models (MLLMs) aims to acquire knowledge sequentially while mitigating catastrophic forgetting, yet existing methods face inherent limitations: architecture-based approaches incur additional computational overhead and often generalize poorly to new tasks…
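The abstract's title refers to gradient orthogonalization as a way to reduce interference between tasks. The paper's specific "history-free" Octopus method is not described in this excerpt; the sketch below only illustrates the generic idea behind gradient orthogonalization: projecting a new task's gradient so it has no component along a reference direction (assumed here to be important to an earlier task), so the update perturbs that direction less.

```python
# Minimal sketch of gradient orthogonalization (NOT the paper's Octopus
# algorithm). We remove from a new-task gradient its component along a
# hypothetical reference direction associated with an earlier task.

def dot(u, v):
    """Inner product of two equal-length vectors given as lists."""
    return sum(a * b for a, b in zip(u, v))

def orthogonalize(grad, ref):
    """Return `grad` minus its projection onto `ref` (no-op if ref is zero)."""
    denom = dot(ref, ref)
    if denom == 0.0:
        return list(grad)
    coef = dot(grad, ref) / denom
    return [g - coef * r for g, r in zip(grad, ref)]

grad = [1.0, 1.0]   # hypothetical new-task gradient
ref = [1.0, 0.0]    # hypothetical direction important to an earlier task
proj = orthogonalize(grad, ref)
print(proj)         # -> [0.0, 1.0], orthogonal to ref
```

After projection the update no longer moves the parameters along `ref`, which is the intuition behind orthogonalization-based continual learning; how Octopus chooses its reference directions without storing task history is what the full paper addresses.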


Source

http://arxiv.org/abs/2605.14938v1