Positional Encoding in Transformers | Deep Learning

Learn With Jay · Beginner · 🧠 Large Language Models · 1y ago
➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
Transformation Matrix resource - https://blog.timodenk.com/linear-relationships-in-the-transformers-positional-encoding/
Code - https://github.com/Coding-Lane/Positional-Encoding/blob/main/Positional%20Encoding.ipynb
➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
Follow my entire Transformers playl…

Chapters (8)

0:00 Intro
0:42 Problem with Self-attention
2:30 Positional Encoding Derivation
11:32 Positional Encoding Formula
13:04 How does it capture relative positions?
19:06 Concatenate or Add Positional Encoding?
21:39 How Positional Encodings Avoid Interfering with Word Embeddings
25:04 Outro
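
For reference alongside the chapters above, here is a minimal NumPy sketch of the standard sinusoidal positional encoding formula that the 11:32 chapter covers (PE(pos, 2i) = sin(pos / 10000^(2i/d)), PE(pos, 2i+1) = cos(pos / 10000^(2i/d))). This is an illustrative implementation, not the video's own notebook; the function name and parameters are chosen here for clarity.

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    """Build the sinusoidal positional encoding matrix of shape (seq_len, d_model).

    PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    """
    positions = np.arange(seq_len)[:, np.newaxis]                       # (seq_len, 1)
    div_terms = np.power(10000.0, np.arange(0, d_model, 2) / d_model)   # (d_model/2,)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(positions / div_terms)  # even dimensions get sine
    pe[:, 1::2] = np.cos(positions / div_terms)  # odd dimensions get cosine
    return pe

# These encodings are added to (not concatenated with) the word
# embeddings, which is the design choice the 19:06 chapter discusses.
pe = positional_encoding(seq_len=50, d_model=128)
print(pe.shape)  # (50, 128)
```

Each position gets a unique pattern of sines and cosines at geometrically spaced frequencies; because shifting a sinusoid by a fixed offset is a linear (rotation-like) transformation, the encoding of position pos+k is a fixed linear function of the encoding of pos, which is how relative positions become learnable (see the linked Transformation Matrix resource).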