Replies: 7 comments 8 replies
-
This is a super interesting project! Fine-tuning SD for depth. The integration also doesn't look too challenging.
-
Wow, and their results are phenomenal for a wide range of scales and scenes.
-
Is https://github.com/zhyever/PatchFusion the same thing?
-
Came here to post this!
-
Here is the start... Most likely I will only get back to this after a considerable amount of time.
-
I am currently using a 2D plane with a displacement map in Blender, together with Blender's VR mode, to view this Marigold depth map output in VR, and it's very good. With a Quest 3 and Virtual Desktop, if you set a green background in your AI generation and turn on Virtual Desktop's passthrough mode, the objects appear to be in your room in mixed reality. There seems to be about a 150-degree view around the person or object. Blender also has full motion controls for moving and rotating the result, and the Blender VR plugin seems to work pretty well now for images.

Currently I'm trying to convert this VR Unity project, which just added Marigold in the beta (though I can't get Marigold working at the moment due to .whl errors): https://github.com/parkchamchi/DepthViewer/releases The source Unity project is available for download there; for whatever reason it also has no positional tracking or motion controls, but I'm working on that. It does real-time 2D-video-to-VR conversion, which is pretty good with the basic MiDaS when I checked it out with a Stable Diffusion video I made. I'm not sure whether Marigold can do this in real time, though.

My idea here is to add automatic1111 --api access to the Unity project, so you can type a prompt in VR and have the AI images appear in 3D in front of you in mixed reality, right in the room. Right now it takes me about 3 minutes to create an image with a depth map and then update it in Blender for viewing in VR; a VR Unity build automating all of this would be so much better. Seeing this stuff in full 3D in VR with the Marigold update is pretty mind-blowing compared to just looking at your images on a flat screen.

Edit: Kijai also added a ComfyUI workflow, so you can update the depth map in real time by turning on auto-queue: https://github.com/kijai/ComfyUI-Marigold Would love to see something like this in the extension here for auto1111, if the dev gets time and it's feasible.
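The automatic1111 --api idea above can be sketched with a plain HTTP call to the webui's txt2img endpoint (the webui must be launched with the `--api` flag). This is a minimal sketch, not the Unity-side implementation the comment describes: the helper names `build_txt2img_payload`, `decode_first_image`, and `generate` are hypothetical, and the localhost URL and green-background prompt suffix are assumptions.

```python
import base64
import json
import urllib.request

WEBUI_URL = "http://127.0.0.1:7860"  # assumed local automatic1111 instance

def build_txt2img_payload(prompt: str, width: int = 1024, height: int = 1024,
                          steps: int = 25) -> dict:
    # Append a solid green background so the subject can be isolated
    # against the room when Virtual Desktop passthrough is enabled.
    return {
        "prompt": f"{prompt}, solid green background",
        "width": width,
        "height": height,
        "steps": steps,
    }

def decode_first_image(response_json: dict) -> bytes:
    # automatic1111's /sdapi/v1/txt2img returns generated images as
    # base64-encoded PNG strings under the "images" key.
    return base64.b64decode(response_json["images"][0])

def generate(prompt: str) -> bytes:
    # POST the payload and decode the first returned image.
    req = urllib.request.Request(
        f"{WEBUI_URL}/sdapi/v1/txt2img",
        data=json.dumps(build_txt2img_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return decode_first_image(json.load(resp))
```

A Unity (or Blender) front end would call `generate("a ceramic teapot")` and hand the returned PNG bytes to the depth-estimation step, replacing the manual 3-minute round trip.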
-
One really big thing I've found with Marigold is that by editing the initial noise you can get much less flickering. I was trying to inpaint, but I think I messed up a few aspects, so I got this (the outside is supposed to be static): out1.mp4

This gives a depth video with much less flickering, which is something a lot of people seem interested in: sadtalker.mp4
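The trick described above, reusing the same initial noise across frames, can be illustrated in isolation: if every frame of a video starts denoising from identical latents, the per-frame predictions no longer jitter from fresh random draws. This is a minimal NumPy sketch of that one idea, not Marigold's actual internals; the function name `make_initial_latents` and the latent shape are illustrative assumptions.

```python
import numpy as np

def make_initial_latents(seed: int, shape=(4, 64, 64)) -> np.ndarray:
    # Draw the diffusion starting noise from a fixed seed so every
    # frame of a video begins denoising from the same latents.
    rng = np.random.default_rng(seed)
    return rng.standard_normal(shape).astype(np.float32)

# Two frames sharing a seed start from identical noise; frames with
# independent seeds (the default per-call behaviour) do not.
shared_a = make_initial_latents(42)
shared_b = make_initial_latents(42)
independent = make_initial_latents(7)

print(np.array_equal(shared_a, shared_b))      # True: same start, no noise-induced flicker
print(np.array_equal(shared_a, independent))   # False: fresh noise per frame
```

In a real pipeline the fixed latents would be passed as the starting sample for each frame's denoising loop, so the only frame-to-frame variation comes from the conditioning image rather than the noise.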
-
Would this be of interest for this plug-in?
https://github.com/prs-eth/Marigold