h2oGPT: Another open-source large language model by the H2O.ai team

650 AI Lab · Advanced · 🧠 Large Language Models · 2y ago
Another fully open-source large language model from the H2O.ai team, with 12B and 20B parameter variants trained on the open-source Pile dataset, has been released with the following features:

- Open-source repository with fully permissive, commercially usable code, data, and models
- Code for preparing large open-source datasets as instruction datasets for fine-tuning of large language models (LLMs), including prompt engineering
- Code for fine-tuning large language models (currently up to 20B parameters) on commodity hardware and enterprise GPU servers (single or multi-node)
- Code for enabling LoRA (low-ran…
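The LoRA technique mentioned in the last bullet can be illustrated with a minimal sketch of the low-rank update idea: instead of fine-tuning a full weight matrix, a small pair of trainable matrices is learned while the pretrained weight stays frozen. This is a NumPy illustration of the concept, not code from the h2oGPT repository; all names and dimensions here are illustrative.

```python
import numpy as np

# LoRA idea: instead of updating a full weight matrix W (d_out x d_in),
# learn a low-rank update B @ A with rank r << min(d_out, d_in).
d_out, d_in, r = 8, 8, 2
rng = np.random.default_rng(0)
W = rng.normal(size=(d_out, d_in))       # frozen pretrained weight
A = rng.normal(size=(r, d_in)) * 0.01    # trainable, small random init
B = np.zeros((d_out, r))                 # trainable, zero init -> no change at start

def lora_forward(x):
    # Base (frozen) path plus the low-rank adapter path.
    return W @ x + B @ (A @ x)

x = rng.normal(size=d_in)
# At initialization B is zero, so the LoRA output equals the base output.
assert np.allclose(lora_forward(x), W @ x)

# Trainable parameter count: r*(d_in + d_out) for LoRA vs d_out*d_in
# for full fine-tuning of this one matrix.
print(r * (d_in + d_out), "vs", d_out * d_in)
```

Because only `A` and `B` are trained, memory and compute for fine-tuning drop sharply, which is what makes fine-tuning on commodity hardware feasible.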