CodaRAG: Connecting the Dots with Associativity Inspired by Complementary Learning
📰 ArXiv cs.AI
arXiv:2604.10426v1 Announce Type: cross

Abstract: Large Language Models (LLMs) struggle with knowledge-intensive tasks due to hallucinations and fragmented reasoning over dispersed information. While Retrieval-Augmented Generation (RAG) grounds generation in external sources, existing methods often treat evidence as isolated units, failing to reconstruct the logical chains that connect these dots. Inspired by Complementary Learning Systems (CLS), we propose CodaRAG, a framework that evolves retr