Transformer Approximations from ReLUs

📰 arXiv cs.AI

arXiv:2604.24878v1 (cross-listed) Abstract: We provide a systematic recipe for translating ReLU approximation results to the softmax attention mechanism. The recipe covers many common approximation targets and, importantly, yields target-specific, economical resource bounds that go beyond universal approximation statements. We showcase the recipe on multiplication, reciprocal computation, and min/max primitives. These results provide new analytical tools for analyzing softmax transformer models.
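The abstract does not spell out the paper's construction, but one standard ingredient in arguments of this kind is that softmax with a sharpness parameter acts as a soft maximum, so a single attention head can emulate the max primitive the abstract mentions. The sketch below illustrates only that well-known ingredient; the function name `soft_max_attention` and the parameter `beta` are illustrative assumptions, not the paper's API.

```python
# Minimal sketch (an assumption, not the paper's construction): softmax
# attention with score scale beta behaves as a soft max over its inputs.
import numpy as np

def softmax(z):
    z = z - np.max(z)          # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def soft_max_attention(x, beta):
    """Softmax-weighted average of x with scores beta * x.

    As beta grows, the weights concentrate on argmax(x), so the output
    converges to max(x). A crude bound: each weight p_i <= exp(-beta * g_i)
    with gap g_i = max(x) - x_i, so the error sum(p_i * g_i) is at most
    n / (e * beta) for a length-n sequence.
    """
    w = softmax(beta * x)      # attention weights over the sequence
    return w @ x               # attended value, approx. max(x) for large beta

x = np.array([0.2, 0.9, 0.5, 0.7])
for beta in (1.0, 10.0, 100.0):
    print(beta, soft_max_attention(x, beta))  # approaches max(x) = 0.9
```

Transporting a ReLU-based hard-max result to attention this way incurs a beta-dependent error, which is presumably where target-specific resource bounds of the kind the abstract describes come in.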

Published 29 Apr 2026