Why it *feels* like AI can reason
AI models don't actually reason, so why does it feel like they do?
Sometimes AI says something so insightful, so on point, that it's hard not to believe it's thinking. But when you peek under the hood, what's really going on is just prediction. No understanding. No awareness. No reasoning. Just... pattern matching.
In this video, we're diving into the weird, blurry line between real reasoning and what AI models appear to do. We'll talk about:
- What "reasoning" even means
- Why LLMs can get the right answer without understanding
- How prediction isn't the same as thought
- Why embo…
Watch on YouTube →
DeepCamp AI