TouchAI: Exploring human-AI perceptual alignment in touch through language model representations

📰 ArXiv cs.AI

arXiv:2406.06587v2 Announce Type: replace-cross Abstract: Aligning large language model (LLM) behaviour with human intent is critical for future AI. An important yet often overlooked aspect of this alignment is perceptual alignment. Perceptual modalities like touch are more multifaceted and nuanced than other sensory modalities such as vision. This work investigates how well LLMs align with human touch experiences using the "textile hand" task. We created a "Guess What Textile" inte…

Published 29 Apr 2026