3D scene reconstruction from sparse data by filling in missing views using diffusion with rgbd-diffusion
Generative Scene Synthesis via Incremental View Inpainting using RGBD Diffusion Models
arXiv paper abstract https://arxiv.org/abs/2212.05993
arXiv PDF paper https://arxiv.org/pdf/2212.05993.pdf
Project page https://jblei.site/project-pages/rgbd-diffusion.html
... address the challenge of recovering an underlying scene geometry and colors from a sparse set of RGBD view observations.
... present a new solution that sequentially generates novel RGBD views along a camera trajectory, and the scene geometry is simply the fusion result of these views.
... maintain an intermediate surface mesh used for rendering new RGBD views, whose missing regions are then completed by an inpainting network; each completed RGBD view is later back-projected as a partial surface and merged into the intermediate mesh.
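The back-projection step above lifts each completed RGBD view into a colored partial surface before fusion. A minimal sketch of that lifting under a standard pinhole camera model (function name and array conventions are illustrative, not the paper's code):

```python
import numpy as np

def backproject_rgbd(depth, rgb, K):
    """Lift an RGBD image to a colored 3D point set with a pinhole model.

    depth : (H, W) depth map in meters (0 marks invalid pixels)
    rgb   : (H, W, 3) per-pixel colors
    K     : (3, 3) camera intrinsics matrix
    Returns (N, 3) camera-space points and their (N, 3) colors.
    """
    H, W = depth.shape
    # pixel coordinate grids: u varies along width, v along height
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    valid = depth > 0
    z = depth[valid]
    # invert the pinhole projection: x = (u - cx) * z / fx, etc.
    x = (u[valid] - K[0, 2]) * z / K[0, 0]
    y = (v[valid] - K[1, 2]) * z / K[1, 1]
    points = np.stack([x, y, z], axis=-1)
    return points, rgb[valid]
```

In the full pipeline these camera-space points would still be transformed by the view's camera-to-world pose before being supplemented into the intermediate mesh.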
The use of an intermediate mesh and camera projection helps solve the persistent problem of multi-view inconsistency.
... practically implement the RGBD inpainting network as a versatile RGBD diffusion model, of the kind previously used for 2D generative modeling; ... make a modification to its reverse diffusion process to enable ... use.
... experiments on the ScanNet dataset demonstrate the superiority of ... approach over existing ones ...
If you enjoyed this post, please like and share it using the buttons at the bottom!
Stay up to date. Subscribe to my posts https://morrislee1234.wixsite.com/website/contact
Web site with my other posts by category https://morrislee1234.wixsite.com/website