LoRA - Low-Rank Adaptation explained || A paper from Microsoft that made LLMs more efficient.
Hi,
Today we are reviewing the paper - LoRA: Low-Rank Adaptation - a matrix decomposition technique for fine-tuning LLMs.
Link to the paper - https://arxiv.org/pdf/2106.09685
Do listen at 2x speed to get the most out of the video in the shortest amount of time possible.
I would also recommend diving deep into the mathematical details.
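If you want to see the core trick before diving in, here is a minimal PyTorch sketch of the idea (my own illustration, not the paper's reference code): the pretrained weight W0 stays frozen, and the weight update is factored as delta_W = B @ A with a small rank r, so only A and B are trained.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Hypothetical illustration of the LoRA idea: a frozen base weight W0
    plus a trainable low-rank update delta_W = B @ A."""
    def __init__(self, in_features, out_features, r=8, alpha=16):
        super().__init__()
        # Pretrained weight W0 - kept frozen during fine-tuning.
        self.base = nn.Linear(in_features, out_features, bias=False)
        self.base.weight.requires_grad = False
        # Low-rank factors: per the paper, A is Gaussian-initialized and
        # B starts at zero, so delta_W = B @ A is zero before training.
        self.A = nn.Parameter(torch.randn(r, in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(out_features, r))
        self.scaling = alpha / r  # alpha/r scaling factor from the paper

    def forward(self, x):
        # h = W0 x + (alpha / r) * B A x
        return self.base(x) + self.scaling * (x @ self.A.T @ self.B.T)
```

For a 4096 x 4096 layer with r = 8, this trains roughly 65K parameters instead of about 16.8M, which is where the efficiency mentioned in the title comes from.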
Some more resources:
Explained by the author himself - https://www.youtube.com/watch?v=DhRoTONcyZE