The weights are live: 37M-parameter Gemma 4–style model on Hugging Face
Yesterday, I shared that I built one of the smallest Gemma 4–style language models from scratch: 37M parameters, trained end-to-end in…
DeepCamp AI