How to Blend Real Footage + AI to Direct a Winter Fox
Today on Fine-Tuning Friday, cinematographer Matt Uhry joins us for a behind-the-scenes look at how he blended real-world footage of his dog into a film of a Winter Fox. A combination of WAN 2.1 VACE, depth maps, ComfyUI, fine-tuning, and LoRAs gave him the directability he wanted.
We also touch on LTX-2.3, a newer model with properties similar to the one he used six months ago.
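The episode itself doesn't walk through code, but the depth-map step in the workflow above can be sketched. Assuming per-frame depth has already been estimated with some monocular depth model (the episode doesn't name a specific estimator), a common preprocessing step before feeding depth into a control pipeline like WAN 2.1 VACE in ComfyUI is normalizing the raw depth values into an 8-bit grayscale control frame. A minimal sketch, with the function name and normalization choice being our own assumptions:

```python
import numpy as np

def depth_to_control_frame(depth: np.ndarray) -> np.ndarray:
    """Normalize a raw depth map to an 8-bit grayscale control frame.

    `depth` is a 2-D array of arbitrary-scale depth values (one per pixel).
    Returns a uint8 image where the nearest/lowest value maps to 0 and the
    farthest/highest maps to 255, suitable as a depth control image.
    """
    d = depth.astype(np.float32)
    rng = d.max() - d.min()
    if rng == 0:
        # Flat depth map: return mid-gray to avoid division by zero.
        return np.full(d.shape, 128, dtype=np.uint8)
    d = (d - d.min()) / rng
    return np.round(d * 255).astype(np.uint8)
```

In practice ComfyUI's depth preprocessor nodes handle this normalization internally; the sketch just makes explicit what the grayscale control frames represent.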
Links + Notes 📝 https://www.oxen.ai/blog
Join Fine-Tune Fridays 🔧 https://oxen.ai/community
Discord 🗿 https://discord.com/invite/s3tBEn7Ptg
Use Oxen AI 🐂 https://oxen.ai/
Oxen.ai offers one-click fine-tuning, or will fine-tune models for you! Built on top of the world's best data-versioning tool, we offer tools to automate model evals, generate synthetic data, and effortlessly fine-tune models.