Structured light stereo vision depth perception is a key method in 3D contour reconstruction and human-computer interaction. When the target under test moves rapidly, motion artifacts prevent the raw array-encoded image from being accurately localized; to address this problem, we propose a linear sparse depth perception method based on artifact analysis. To perceive the artifact parameters of a moving target effectively, we design a heterogeneous coding model that combines a grid-line pattern with an array coding pattern. To obtain the artifact width parameters of fast-moving objects under binocular vision, we extract the artifact parameters with a gradient ridge extraction method. Finally, to obtain the artifact analysis function required for depth perception of fast-moving targets, we fit a global artifact analysis function to the discrete artifact parameters. Linear sparse depth information is then recovered from the artifact parameter information in the binocular images, improving the speed and robustness of depth perception.

Keywords: structured light; stereo vision; image segmentation; artifact analysis.
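The abstract does not give the form of the global artifact analysis function. A minimal sketch of the final fitting step, assuming the discrete artifact widths are fitted with a low-order polynomial over image position (the sample values, polynomial degree, and variable names below are illustrative assumptions, not the authors' method):

```python
import numpy as np

# Hypothetical discrete artifact measurements: image column of each
# grid line (px) and the artifact width measured there via gradient
# ridge extraction. All values are illustrative placeholders.
ridge_positions = np.array([120.0, 260.0, 410.0, 555.0, 700.0])
artifact_widths = np.array([6.2, 7.1, 8.3, 9.0, 10.4])

# Fit a global artifact analysis function w(x) as a least-squares
# polynomial over the discrete samples (degree 2 is an assumption).
coeffs = np.polyfit(ridge_positions, artifact_widths, deg=2)
artifact_fn = np.poly1d(coeffs)

# Evaluate the fitted function at any column to interpolate the
# artifact width between the measured grid lines.
print(artifact_fn(300.0))
```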
Virtual-real fusion based on augmented reality has become a research hotspot and is widely used in fields such as cultural relics exhibition and medical care. The spatial projection mapping matrix is the basis on which projection equipment projects a prefabricated image onto the target surface. In practice, however, the relative pose between the projection equipment and the actual scene must be determined through complex spatial target calibration. This paper aims to solve the problem of misaligned projection information and to realize real-time tracking projection. We design a high-precision center localization method based on the cross-ratio invariance of concentric circles, and compute the mapping matrix from the projection equipment to the target with the PnP method. Finally, the lens distortion parameters are used to generate a projection pattern that offsets the projection distortion, optimizing the coincidence between the projected pattern and the real object and achieving efficient, high-precision virtual-real fusion projection.
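A minimal sketch of the PnP step, assuming the concentric-circle centers have already been localized in the image; the point coordinates, intrinsics, and distortion coefficients below are illustrative placeholders, and OpenCV's solvePnP stands in for whatever PnP solver the paper uses:

```python
import numpy as np
import cv2

# Known 3D positions of the concentric-circle centers on the target
# (object frame, e.g. millimeters). Placeholder values.
object_points = np.array([
    [0.0,   0.0,  0.0],
    [100.0, 0.0,  0.0],
    [100.0, 80.0, 0.0],
    [0.0,   80.0, 0.0],
], dtype=np.float64)

# Corresponding circle centers located in the captured image (px),
# e.g. via the cross-ratio-based center localization. Placeholders.
image_points = np.array([
    [412.3, 305.7],
    [958.1, 311.2],
    [951.4, 742.8],
    [405.9, 736.5],
], dtype=np.float64)

# Assumed intrinsics and lens distortion of the calibrated device.
K = np.array([[1200.0, 0.0, 640.0],
              [0.0, 1200.0, 480.0],
              [0.0,    0.0,   1.0]])
dist = np.zeros(5)  # plug in the calibrated distortion coefficients

# Solve the PnP problem for the rotation/translation of the target,
# which together give the spatial projection mapping matrix.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
R, _ = cv2.Rodrigues(rvec)  # 3x3 rotation matrix
print("R =", R, "\nt =", tvec.ravel())
```

The same distortion coefficients can then be applied in reverse to pre-warp the projected pattern, as the abstract describes.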
OSR (optical solar reflector) sheet is a photosensitive material commonly used in aerospace photovoltaic arrays, and its extremely high specular reflectivity makes quality inspection of OSR sheets difficult. Specular targets are common in industrial production, such as the patches on the surface of photovoltaic cells, whose compact and neat arrangement affects the photosensitive quality; automatic perception and measurement of the gaps between them is therefore of great significance. This paper proposes a binocular-vision gap measurement method for specular targets, taking the gap of a high-reflectivity OSR sheet as the measurement target. First, the binocular camera is calibrated to determine its relative geometric position and structural parameters. Then, a Canny detector that substitutes bilateral filtering for Gaussian filtering extracts the edge information in the image, and the LSD line detection algorithm fits straight lines to the extracted edges. Finally, a disparity map or depth map is obtained from the disparity between the binocular views, virtual points are synthesized, and a fixed-height-constrained binocular matching method combined with triangulation resolves the periodic ambiguity in boundary registration, from which the gap width of the high-reflectivity specular target is calculated. Binocular visual quality inspection experiments were designed to verify the feasibility and practicality of the proposed method.
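A minimal sketch of the edge-extraction stage, assuming OpenCV (the LSD detector below requires OpenCV >= 4.5.1; the file path, filter settings, and Canny thresholds are illustrative assumptions, not the paper's parameters):

```python
import cv2

# Load one view of the high-reflectivity OSR surface (the path is a
# placeholder).
img = cv2.imread("osr_left.png", cv2.IMREAD_GRAYSCALE)

# Bilateral filtering in place of the usual Gaussian pre-smoothing:
# it suppresses specular noise while preserving the gap edges.
smoothed = cv2.bilateralFilter(img, d=9, sigmaColor=75, sigmaSpace=75)

# Canny edge detection on the edge-preserving result.
edges = cv2.Canny(smoothed, threshold1=50, threshold2=150)
cv2.imwrite("osr_edges.png", edges)

# Fit straight segments with the LSD detector. Note that OpenCV's LSD
# operates on the grayscale image rather than the binary edge map.
lsd = cv2.createLineSegmentDetector()
lines, widths, precisions, nfa = lsd.detect(smoothed)

vis = lsd.drawSegments(cv2.cvtColor(smoothed, cv2.COLOR_GRAY2BGR), lines)
cv2.imwrite("osr_segments.png", vis)
```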