Positional Encoding Explained - Sin, Cos, Encoding, Transformer - Advantages | Variants
In this video, we understand Positional Encoding in Transformers in a simple and intuitive way. This concept is very important because Transformers process data in parallel and, unlike RNNs or LSTMs, do not naturally understand the order of words.
Here is the GitHub repo link:
https://github.com/switch2ai
You can download all the code, scripts, and documents from the above GitHub repository.
We begin by understanding why positional encoding is needed.
In language, word order is very important.
Example:
Ind beats NZ
NZ beats Ind
Both sentences contain the same words but have completely different meanings.
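Since word order changes meaning, the Transformer adds a position signal to each token embedding. A minimal sketch of the sinusoidal scheme from the original Transformer paper (sin on even dimensions, cos on odd dimensions; function name and shapes are illustrative, not from the video):

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding: PE[pos, 2i]   = sin(pos / 10000^(2i/d_model))
                                       PE[pos, 2i+1] = cos(pos / 10000^(2i/d_model))"""
    positions = np.arange(seq_len)[:, np.newaxis]                 # (seq_len, 1)
    # frequency term for each even dimension index 2i
    div_terms = np.exp(-np.log(10000.0) * np.arange(0, d_model, 2) / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(positions * div_terms)   # even dimensions use sine
    pe[:, 1::2] = np.cos(positions * div_terms)   # odd dimensions use cosine
    return pe

# e.g. encodings for a 3-word sentence like "Ind beats NZ" with model size 8
pe = sinusoidal_positional_encoding(seq_len=3, d_model=8)
print(pe.shape)  # (3, 8)
```

These vectors are simply added to the word embeddings, so the same word at different positions gets a different input representation.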
DeepCamp AI