Resource Consumption Threats in Large Language Models

📰 ArXiv cs.AI

arXiv:2603.16068v3 Announce Type: replace-cross Abstract: Given limited and costly computational infrastructure, resource efficiency is a key requirement for large language models (LLMs). Efficient LLMs increase service capacity for providers and reduce latency and API costs for users. Recent resource-consumption threats induce excessive generation, degrading model efficiency and harming both service availability and economic sustainability. This survey presents a systematic review of threats to …

Published 14 Apr 2026