Relative Self-Attention Explained
In this video, we dive into an interesting topic: relative self-attention.
First, we will look at the differences between relative and absolute position embeddings, and then we will cover two algorithms for incorporating relative embeddings into self-attention.
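The video covers the details, but as a minimal sketch of the general idea, here is single-head self-attention with relative position embeddings added on the key side, in the style of Shaw et al. (2018). All names, shapes, and the clipping distance `max_rel` are illustrative assumptions, not the exact algorithms from the video:

```python
import numpy as np

def relative_self_attention(x, Wq, Wk, Wv, rel_emb, max_rel=2):
    """Single-head self-attention with relative position embeddings
    added to the keys (Shaw et al., 2018 style) -- illustrative sketch.

    x:        (seq_len, d_model) input sequence
    Wq/Wk/Wv: (d_model, d_head) projection matrices
    rel_emb:  (2*max_rel + 1, d_head) one embedding per clipped
              relative distance in [-max_rel, max_rel]
    """
    seq_len, _ = x.shape
    d_head = Wq.shape[1]
    q, k, v = x @ Wq, x @ Wk, x @ Wv

    # Clipped relative distances j - i, shifted to index into rel_emb.
    idx = np.arange(seq_len)
    rel = np.clip(idx[None, :] - idx[:, None], -max_rel, max_rel) + max_rel
    a = rel_emb[rel]  # (seq_len, seq_len, d_head)

    # Content term q_i . k_j plus relative-position term q_i . a_ij.
    scores = (q @ k.T + np.einsum('id,ijd->ij', q, a)) / np.sqrt(d_head)

    # Row-wise softmax, then weighted sum of values.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v
```

Unlike absolute position embeddings, which are added to the inputs once, the relative embeddings here depend only on the distance `j - i`, so the same lookup table is reused at every position.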
#transformers #deeplearning
Watch on YouTube ↗
DeepCamp AI