MoEITS: A Green AI approach for simplifying MoE-LLMs

📰 ArXiv cs.AI

arXiv:2604.10603v1 Announce Type: cross Abstract: Large language models are transforming all areas of academia and industry, attracting the attention of researchers, professionals, and the general public. In the quest for more powerful architectures, Mixture-of-Experts models, inspired by ensemble methods, have emerged as one of the most effective approaches. However, they impose a high computational burden for both training and inference. To reduce the impact on computing and memory footprint as we…

Published 14 Apr 2026