Acquiring three-dimensional profiles of biological tissues allows interventions to be performed with greater speed and accuracy, driving the development of next-generation image-guided therapy. However, current three-dimensional reconstruction techniques that rely on feature detection and matching struggle with tissues lacking distinct features, yielding only sparse reconstructions. In this paper, we propose a data-driven method for reconstructing three-dimensional surfaces from a single polarimetric image, guided by physics-based priors. We built a calibrated imaging system, consisting of a polarization camera and a 3D scanner, to collect polarization information together with ground-truth 3D data. Using this system, we created a dataset of organ models, capturing polarization images, depth maps, and surface normal maps under different lighting conditions. We then designed a deep neural network based on the U-Net architecture. The network takes the polarization image and prior physical parameter maps (phase angle, degree of polarization, and unpolarized intensity) as inputs and is trained to output the surface normal map and relative depth map of the organ. Experiments on the tissue phantom dataset demonstrate that our method generates dense reconstructions, even in regions lacking distinct features. Furthermore, we validated the robustness of the method to changes in the light-source direction, showing that it handles variations in lighting conditions. Overall, the proposed data-driven approach offers a promising solution for dense three-dimensional reconstruction from a single polarimetric image by combining physics-based priors and deep learning.
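The prior physical parameter maps mentioned above (phase angle, degree of polarization, and unpolarized intensity) can be derived from the linear Stokes parameters of a polarization camera's four intensity channels. The sketch below illustrates the standard derivation; the function name and the exact preprocessing are assumptions, not the paper's implementation.

```python
import numpy as np

def polarization_cues(i0, i45, i90, i135, eps=1e-8):
    """Derive physics-based prior maps from four polarizer-angle images
    (hypothetical helper; the paper's exact preprocessing may differ).

    i0..i135: intensity images captured behind linear polarizers at
    0, 45, 90, and 135 degrees, e.g. from a division-of-focal-plane
    polarization camera."""
    # Linear Stokes parameters
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
    s1 = i0 - i90
    s2 = i45 - i135
    # Degree of linear polarization in [0, 1]
    dolp = np.sqrt(s1**2 + s2**2) / (s0 + eps)
    # Angle (phase) of linear polarization in [-pi/2, pi/2]
    aolp = 0.5 * np.arctan2(s2, s1)
    # Unpolarized intensity component
    i_unpol = s0 * (1.0 - dolp)
    return aolp, dolp, i_unpol
```

These three maps, stacked with the polarization image itself, would form the multi-channel input tensor of the network.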