Creating an uber-realistic video animation from an avatar with Stable Diffusion
This tutorial guides you through creating an avatar with ReadyPlayerMe, animating it in Mixamo, building a 3D scene around it in Blender, and feeding the rendered scene into Stable Diffusion (Automatic1111) to create a video animation, using an uber-realistic custom model together with the Deforum and ControlNet extensions.
To watch more great music videos created with Stable Diffusion, Unreal Engine, and Blender, visit our YouTube music channel:
https://www.youtube.com/@-vero-
----------------------------
Chapters:
0:00 Intro
0:20 Creating an avatar with ReadyPlayerMe
1:45 Preparing the avatar for uploading it to Mixamo
3:00 Animating the avatar in Mixamo and importing it into Blender
5:41 Importing a 3D background scene from Sketchfab into Blender and rendering the scene
10:19 Preparing Automatic1111 for a hyper-realistic render
12:13 Using the Deforum and ControlNet extensions to render the animation
14:47 Creating the final video
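As a rough illustration of the rendering step covered in the later chapters, frames exported from Blender can also be pushed through Automatic1111's txt2img API with a ControlNet unit attached. The sketch below only assembles the request payload; the prompt, sampler settings, ControlNet model name, and server address are assumptions for illustration, not values taken from the video, so adapt them to your local setup:

```python
# Sketch: build a txt2img request that conditions Stable Diffusion on a
# Blender render frame via a ControlNet unit. All concrete values here
# (prompt, steps, model name) are placeholder assumptions.
import base64
import json

def build_txt2img_payload(control_image_b64: str) -> dict:
    """Assemble an Automatic1111 txt2img payload with one ControlNet unit."""
    return {
        "prompt": "photorealistic portrait, cinematic lighting",  # example prompt
        "negative_prompt": "cartoon, blurry, low quality",
        "steps": 25,
        "cfg_scale": 7,
        "width": 512,
        "height": 512,
        "alwayson_scripts": {
            "controlnet": {
                "args": [
                    {
                        "input_image": control_image_b64,  # base64 of the Blender frame
                        "module": "depth",                 # preprocessor (assumed)
                        "model": "control_v11f1p_sd15_depth",  # assumed model name
                        "weight": 1.0,
                    }
                ]
            }
        },
    }

# A frame exported from Blender would normally be read and base64-encoded:
# with open("frame_0001.png", "rb") as f:
#     frame_b64 = base64.b64encode(f.read()).decode()
# and the payload POSTed to the local web UI, e.g.
# http://127.0.0.1:7860/sdapi/v1/txt2img (default address, assumed).
payload = build_txt2img_payload("<base64-encoded frame>")
print(json.dumps(payload, indent=2)[:120])
```

In the tutorial itself this per-frame conditioning is handled by the Deforum extension rather than by hand-written API calls; the sketch is only meant to show what a single ControlNet-guided generation request looks like.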
DeepCamp AI