3D reconstruction from unconstrained image collections is challenging due to varying appearances and transient occlusions. In this paper, we introduce Micro-macro Wavelet-based Gaussian Splatting (MW-GS), a novel approach that enhances 3D reconstruction by disentangling scene representations into global, refined, and intrinsic components. The method features two key innovations: Micro-macro Projection, which lets each Gaussian point sample details from feature maps at multiple scales, enriching the diversity of the captured appearance cues; and Wavelet-based Sampling, which leverages frequency-domain information to refine feature representations and significantly improve the modeling of scene appearance. In addition, a Hierarchical Residual Fusion Network seamlessly integrates these features. Extensive experiments demonstrate that MW-GS achieves state-of-the-art rendering quality, surpassing existing methods.
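As a rough illustration of the two mechanisms named above, the sketch below (PyTorch) projects Gaussian centers onto feature maps at multiple scales and refines each sample with a single-level Haar wavelet decomposition. This is a minimal sketch under stated assumptions, not the authors' implementation: the helpers `haar_decompose` and `sample_multiscale`, the choice of Haar filters, and the feature dimensions are all illustrative assumptions.

```python
# Illustrative sketch (not the MW-GS implementation): multi-scale ("micro-macro")
# feature sampling at projected Gaussian centers, refined by a single-level Haar
# wavelet decomposition of each feature map. All names and shapes are assumptions.

import torch
import torch.nn.functional as F


def haar_decompose(feat: torch.Tensor):
    """Single-level Haar DWT of a feature map (B, C, H, W) -> (LL, LH, HL, HH)."""
    b, c, h, w = feat.shape
    # Four 2x2 Haar analysis filters, applied per channel via a grouped convolution.
    ll = feat.new_tensor([[0.5, 0.5], [0.5, 0.5]])
    lh = feat.new_tensor([[0.5, 0.5], [-0.5, -0.5]])
    hl = feat.new_tensor([[0.5, -0.5], [0.5, -0.5]])
    hh = feat.new_tensor([[0.5, -0.5], [-0.5, 0.5]])
    kernels = torch.stack([ll, lh, hl, hh]).unsqueeze(1)       # (4, 1, 2, 2)
    kernels = kernels.repeat(c, 1, 1, 1)                       # (4C, 1, 2, 2)
    out = F.conv2d(feat, kernels, stride=2, groups=c)          # (B, 4C, H/2, W/2)
    out = out.view(b, c, 4, h // 2, w // 2)
    return out[:, :, 0], out[:, :, 1], out[:, :, 2], out[:, :, 3]


def sample_multiscale(feature_maps, uv: torch.Tensor) -> torch.Tensor:
    """Gather per-Gaussian features from several scales and their wavelet bands.

    feature_maps: list of (B, C, H, W) maps at different resolutions.
    uv: (B, N, 2) projected Gaussian centers in normalized [-1, 1] image coords.
    Returns (B, N, C * num_scales * 2): low-frequency (LL) features concatenated
    with aggregated high-frequency detail, per scale.
    """
    grid = uv.unsqueeze(2)                                     # (B, N, 1, 2)
    gathered = []
    for feat in feature_maps:
        ll, lh, hl, hh = haar_decompose(feat)
        high = lh + hl + hh                                    # aggregate high-frequency detail
        for band in (ll, high):
            sampled = F.grid_sample(band, grid, align_corners=True)   # (B, C, N, 1)
            gathered.append(sampled.squeeze(-1).permute(0, 2, 1))     # (B, N, C)
    return torch.cat(gathered, dim=-1)


if __name__ == "__main__":
    maps = [torch.randn(1, 16, 64, 64), torch.randn(1, 16, 32, 32)]   # two scales
    uv = torch.rand(1, 1000, 2) * 2 - 1                               # 1000 Gaussian centers
    per_gaussian = sample_multiscale(maps, uv)
    print(per_gaussian.shape)                                         # torch.Size([1, 1000, 64])
```

In this toy setup, the concatenated low- and high-frequency samples would then be fused (e.g., by a residual fusion network, as the abstract describes) into per-Gaussian appearance features; how the actual MW-GS fusion is structured is not specified here.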