Radiometrically Consistent Gaussian Surfels for Inverse Rendering
Kyu Beom Han, Jaeyoon Kim, Woo Jae Kim, Jinhwan Seo, Sung-eui Yoon
Radiometric consistency for Gaussian surfels provides accurate indirect illumination for inverse rendering
Abstract
Inverse rendering with Gaussian Splatting has advanced rapidly, but accurately disentangling material properties from complex global illumination effects, particularly indirect illumination, remains a major challenge. Existing methods often query indirect radiance from Gaussian primitives pre-trained for novel-view synthesis. However, these pre-trained primitives are supervised only from a limited set of training viewpoints and thus lack supervision for modeling indirect radiance from unobserved views. To address this issue, we introduce radiometric consistency, a novel physically-based constraint that supervises unobserved views by minimizing the residual between each Gaussian primitive's learned radiance and its physically-based rendered counterpart. Minimizing this residual for unobserved views establishes a self-correcting feedback loop between physically-based rendering and novel-view synthesis, enabling accurate modeling of inter-reflection. We then propose Radiometrically Consistent Gaussian Surfels (RadioGS), an inverse rendering framework that efficiently integrates radiometric consistency using Gaussian surfels and 2D Gaussian ray tracing. We further propose a finetuning-based relighting strategy that adapts Gaussian surfel radiances to new illuminations within minutes at low rendering cost (<10 ms). Extensive experiments on existing inverse rendering benchmarks show that RadioGS outperforms existing Gaussian-based methods in inverse rendering while retaining computational efficiency.
RadioGS introduces radiometric consistency supervision for inverse rendering to accurately model indirect illumination in Gaussian-based representations.
- Radiometric consistency principle providing physically-based supervision for unobserved views
- RadioGS framework efficiently integrating radiometric consistency using Gaussian surfels and 2D ray tracing
- Fast relighting strategy adapting surfel radiances to new illuminations within minutes
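The core idea behind the contributions above is a residual between each primitive's learned radiance and a physically-based shading of that primitive. A minimal sketch of such a loss is shown below; the function and parameter names are hypothetical, and a simple Lambertian (diffuse) shading model stands in for the paper's actual physically-based renderer, which uses Gaussian surfels and 2D Gaussian ray tracing.

```python
import numpy as np

def pbr_radiance(albedo, normal, light_dirs, light_radiance):
    """Diffuse physically-based radiance estimate (Monte Carlo over directions):
    L_o = (albedo / pi) * sum_i L_i * max(0, n . w_i) * d_omega.
    Shapes: albedo (3,), normal (3,), light_dirs (N, 3), light_radiance (N, 3)."""
    cos_terms = np.clip(light_dirs @ normal, 0.0, None)   # (N,) clamped cosines
    d_omega = 4.0 * np.pi / len(light_dirs)               # uniform solid-angle weight
    return (albedo / np.pi) * np.sum(light_radiance * cos_terms[:, None] * d_omega, axis=0)

def radiometric_consistency_loss(learned_radiance, albedo, normal,
                                 light_dirs, light_radiance):
    """Mean squared residual between a primitive's learned outgoing radiance
    and its physically-based counterpart (hypothetical loss form)."""
    residual = learned_radiance - pbr_radiance(albedo, normal, light_dirs, light_radiance)
    return float(np.mean(residual ** 2))
```

In an actual training loop, this residual would be evaluated per Gaussian primitive (including for directions unobserved in training) and minimized jointly with the novel-view synthesis objective, which is what creates the self-correcting feedback loop the abstract describes.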
- Gaussian splatting
- Inverse rendering
- Physically-based rendering
- Ray tracing
- Radiometric consistency
Limitations (from the paper): currently supports only dielectric materials; extending to anisotropic or highly reflective surfaces would be interesting.
Future work (from the paper): extend radiometric consistency to more complex materials such as anisotropic or highly reflective surfaces.
Author keywords
- Radiometric Consistency
- Indirect Illumination
- Gaussian Splatting
- Inverse Rendering
Related orals
Improving Diffusion Models for Class-imbalanced Training Data via Capacity Manipulation
Capacity manipulation improves diffusion models' handling of class-imbalanced data by reserving capacity for minority classes via low-rank decomposition.
Depth Anything 3: Recovering the Visual Space from Any Views
DA3 predicts spatially consistent 3D geometry from arbitrary camera views using a plain transformer and depth-ray targets.
Text-to-3D by Stitching a Multi-view Reconstruction Network to a Video Generator
VIST3A stitches text-to-video models with 3D reconstruction systems and aligns them via reward finetuning for high-quality text-to-3D generation.
True Self-Supervised Novel View Synthesis is Transferable
Presents XFactor, the first geometry-free self-supervised model for transferable novel view synthesis without 3D inductive biases.
Locality-aware Parallel Decoding for Efficient Autoregressive Image Generation
Introduces parallel decoding for autoregressive image generation with flexible ordering, achieving a 3.4x latency reduction.