MiniCPM-o 4.5: Watching, Listening and Proactive Speaking Simultaneously
Ready to build with MiniCPM-o 4.5? We've made it #developer-ready with full support for llama.cpp, ollama, vLLM, SGLang, LLaMA-Factory, and more.
We also open-source a new high-performance llama.cpp-omni engine together with an interactive demo, bringing the full-duplex omni experience directly to local devices like MacBook!
Try it on @Hugging Face now:
huggingface.co/openbmb/MiniCPM-o-4_5