Falcon 2: An 11B parameter pretrained language model and VLM, trained on over 5,000B tokens in 11 languages

📰 Hugging Face Blog
Published 24 May 2024