ModernBERT - Modern Replacement for BERT | RAG, Embeddings, Classification, Reranking
ModernBERT is a drop-in replacement for the original BERT encoder model. It was trained on a much larger dataset, borrows training techniques from modern LLMs, and outperforms most encoders while being faster and more memory-efficient. But what made all of this possible?
Blog post: https://huggingface.co/blog/modernbert
Weights: https://huggingface.co/answerdotai/ModernBERT-base
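A minimal sketch of trying the released weights for masked-token prediction with the Hugging Face `transformers` fill-mask pipeline (the `predict_masked` helper name is mine, and a recent `transformers` release is assumed; the first call downloads the weights):

```python
MODEL_ID = "answerdotai/ModernBERT-base"

def predict_masked(text: str, top_k: int = 5):
    """Fill a [MASK] token in `text` using ModernBERT-base."""
    # Imported inside the function so the sketch can be read without
    # transformers installed; the pipeline downloads weights on first use.
    from transformers import pipeline

    fill = pipeline("fill-mask", model=MODEL_ID)
    return fill(text, top_k=top_k)

if __name__ == "__main__":
    for candidate in predict_masked("Paris is the [MASK] of France."):
        print(candidate["token_str"], candidate["score"])
```

Because ModernBERT is a drop-in replacement, the same checkpoint can also be loaded with `AutoModel` to pool hidden states into embeddings for RAG, classification, or reranking.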
AI Academy: https://mlexpert.io/
Work with me: https://mlexpert.io/consulting
LinkedIn: https://www.linkedin.com/in/venelin-valkov/
Follow me on X: https://twitter.com/venelin_valkov
Discord: https://discord.gg/UaNPxVD6tv
Subscribe: http://bit.ly/venelin-subscribe
GitHub repository: https://github.com/curiousily/AI-Bootcamp
👍 Don't Forget to Like, Comment, and Subscribe for More Tutorials!
Join this channel to get access to the perks and support my work:
https://www.youtube.com/channel/UCoW_WzQNJVAjxo4osNAxd_g/join
#bert #transformers #huggingface #pytorch