h2oGPT: Another open-source large language model by the H2O.ai team
The H2O.ai team has released another fully open-source large language model, available in 12B and 20B parameter variants and trained on the open-source Pile dataset, with the following features:
- Open-source repository with fully permissive, commercially usable code, data and models
- Code for preparing large open-source datasets as instruction datasets for fine-tuning of large language models (LLMs), including prompt engineering
- Code for fine-tuning large language models (currently up to 20B parameters) on commodity hardware and enterprise GPU servers (single or multi node)
- Code for enabling LoRA (low-rank adaptation) for parameter-efficient fine-tuning (see the sketch after this list)
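To make the LoRA item concrete, here is a minimal, hedged sketch of parameter-efficient fine-tuning with Hugging Face `transformers` and `peft`. It is not h2oGPT's actual training code; the base model name, the `instructions.json` file, the prompt template, and all hyperparameters are illustrative placeholders.

```python
# Illustrative LoRA fine-tuning sketch (not the h2oGPT repo's training script).
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments, Trainer
from peft import LoraConfig, get_peft_model

base_model = "EleutherAI/pythia-1.4b"  # placeholder; h2oGPT targets larger Pile-trained models
tokenizer = AutoTokenizer.from_pretrained(base_model)
tokenizer.pad_token = tokenizer.eos_token

model = AutoModelForCausalLM.from_pretrained(base_model)

# LoRA: freeze the base weights and train small low-rank adapter matrices instead.
lora_config = LoraConfig(r=8, lora_alpha=16, lora_dropout=0.05, task_type="CAUSAL_LM")
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of total parameters

# Toy instruction dataset: each record holds an "instruction" and a "response" field.
data = load_dataset("json", data_files="instructions.json")["train"]

def tokenize(example):
    # Simple instruction-style prompt template (illustrative, not h2oGPT's exact format).
    text = f"### Instruction:\n{example['instruction']}\n### Response:\n{example['response']}"
    tokens = tokenizer(text, truncation=True, max_length=512, padding="max_length")
    tokens["labels"] = tokens["input_ids"].copy()
    return tokens

data = data.map(tokenize, remove_columns=data.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="lora-out",
        per_device_train_batch_size=2,
        gradient_accumulation_steps=8,
        num_train_epochs=1,
        learning_rate=2e-4,
        logging_steps=10,
    ),
    train_dataset=data,
)
trainer.train()
model.save_pretrained("lora-out")  # saves only the small adapter weights, not the full model
```

Because only the low-rank adapter weights are trained and saved, this kind of setup is what lets models in the 12B-20B range be fine-tuned on commodity GPUs, as the feature list above describes.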