Open Access | 30 November 2021

Development of an ultracompact endoscopic three-dimensional scanner with flexible imaging fiber optics

Zhongjie Long, Hengbing Guo, Kouki Nagamune, Yunbo Zuo
Abstract

We developed an ultracompact three-dimensional (3D) endoscopic scanner that utilizes an electromagnetic-based stereo imaging method (EBSIM) for shape measurements. The method was used to investigate the applicability of imaging fiber optics to featureless surface profiles in confined spaces. The EBSIM is combined with a coherent flexible fiber bundle for structured light pattern projection, which allows the scanner head to achieve a final diameter of merely 3.4 mm and a baseline length of 1.4 mm. Compared with existing methods, the scanner is designed to have a minimum focal distance of 2 mm to address the intraoperative evaluation problem. System registration and complex calibration are unnecessary before measurements, and the scanner can be operated from any position and orientation. The scanner is capable of acquiring 3D video at 30 frames per second. The experimental results demonstrate the feasibility of using the EBSIM for 3D surface profile measurements of featureless cylindrical surfaces. A prototype scanner is presented in detail, and the average error over a series of known depth distances is found to be 0.15 mm. The implementation shows that the 3D scanner can potentially be applied to endoscopic inspections and intraoperative evaluations in minimally invasive surgeries.

1.

Introduction

Over the last decade, laser-based three-dimensional (3D) shape-measurement methods have been developed and applied to industrial production, quality assurance, biotechnology, and medical technology.1 The test objects can be as small as coins or as large as buildings. Other techniques have also been developed for 3D shape measurement. Among them, the mainstream 3D shape-measurement methods are noncontact approaches that use optical measurement devices owing to their convenience and high speed. The 3D shape-measurement optical systems (such as 3D scanners) designed during the past 5 years have uniaxial configurations with beam splitters.2–4 Uniaxial configurations are characterized by laser illumination and imaging on the same axis. This arrangement allows the use of small scanning tips because the illumination and the charge-coupled device camera are placed behind the scanning tip, enabling the use of complex imaging optics. Consequently, these optical systems can achieve accuracies below 100 μm. However, the maximum measurable depth range of these systems is 1.1 mm, and the maximum lateral range is merely 3 mm × 4 mm. Hence, such measurement systems are commonly used as microscopy systems intended for medical and industrial applications with a fixed desktop scanner.

Research aiming to develop more compact endoscopic designs has been continuously conducted in the field of 3D endoscopic scanning. To this end, certain medical-application-oriented 3D endoscopic measurement systems have been developed.5,6 These systems are based on fringe projection and a monocular endoscope. However, measurement with multiple pattern projections in a limited space is challenging because scattering caused by surface roughness and surface curvature has a significant impact on the phase contrast of interference fringe patterns. To ensure satisfactory phase contrast and phase-shifting results, these measurement systems are currently used for smooth-surface measurements or measurements over a relatively large working distance. Moreover, these endoscopic 3D measurement systems are still based on a fixed desktop measuring platform, which is not portable for real medical applications. Aiming at clinical research on tissue surface measurement, a dual-modality endoscopic system has been proposed.7 Its 3D reconstruction is based on a light pattern formed of randomly distributed spots of different colors. The system setup was quite distinctive, which enabled the authors to reconstruct tissue surfaces within 15- to 40-mm working distances with a baseline of 6 mm. This system can be used for tissue depth measurement through a small surgical access port.

Motivated by the aforementioned endoscopic 3D measurement systems, in this work we developed an endoscopic 3D scanner for orthopedists, who face difficulties with joint surface perception. Intraoperative inspection and evaluation of the knee-joint surface are critical to osteochondral transplantation surgery. To avoid a potential pumping action and negative pressure in the depths at the recipient site, osteochondral grafts should be inserted flush with the knee-joint surface (perpendicular insertion) to maximize surface congruency.8,9 Currently, intraoperative inspections and evaluations within the knee-joint cavity are conducted (after a small incision is made, see Fig. 1) based on the dexterity and expertise of the surgeon, yielding nonuniform results. Decision-making regarding perpendicular insertion would be easier if a 3D model of the knee-joint surface were available to the surgeon. However, no studies have been conducted on 3D profile measurements of knee-joint surfaces, especially as these pertain to in vivo applications.

Fig. 1

Osteochondral transplantation surgery.10


In this paper, we developed an ultracompact 3D endoscopic scanner using an electromagnetic-based stereo imaging method (EBSIM) to measure the 3D profiles of complete joint surfaces. The main highlights of this paper are as follows:

  • (1) Complex camera calibration and preoperative system registration are unnecessary in the simple setup of the endoscopic scanner with regular components. Hence, the systematic error component that depends on the axial setup can be reduced.

  • (2) Unlike other scanning systems, a unique characteristic of the proposed scanner is its capability for underwater shape measurement. In addition, the small size of the endoscope and its scanning depth range of 2 to 12 mm meet the needs of medical inspection during minimally invasive surgeries.11

  • (3) The characteristic curve and error propagation of the prototype are analyzed, providing precision guidelines for other researchers designing 3D measurement devices.

  • (4) This work serves as a first attempt to demonstrate the applicability of the EBSIM to the development of a compact, flexible endoscopic system.

The remainder of this paper is organized as follows:

In Sec. 2, we describe the details of the system setup and detection method. The experimental results and discussion are given in Sec. 3. Section 4 presents the conclusions.

2.

Materials and Methods

2.1.

Basic Principle of Laser Light Section Method

The laser light section method (LLSM), shown in Fig. 2, is an extensively used alternative to stereo vision, in which one of the two binocular cameras is replaced by a light source. One or multiple structured light patterns are projected onto a test object and captured by a camera. The intersection of the camera ray with the corresponding light plane yields the 3D coordinates of point P, provided the imaging geometry is calibrated. This method lends itself to miniaturization of the projection hardware, so the prototype can be designed as a compact unit for applications in confined spaces. A minimal triangulation sketch follows Fig. 2.

Fig. 2

Principle of the LLSM. A light plane from the projector illuminates the object and is captured by a camera. The 3D coordinates of point P relative to the camera system can be computed after calibration.

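To make the geometry concrete, the following minimal Python sketch (our illustration, not the authors' implementation) intersects the camera ray through a pixel with a calibrated light plane $n \cdot X = d$ expressed in camera coordinates; the intrinsic matrix and plane parameters are hypothetical values.

```python
# Minimal LLSM triangulation sketch (assumed notation, not the authors' code):
# the calibrated light plane is n . X = d in camera coordinates, and the
# camera ray through pixel (u, v) follows the pinhole model.
import numpy as np

def llsm_point(u, v, K, n, d):
    """Intersect the camera ray through pixel (u, v) with the light plane.

    K : 3x3 intrinsic matrix; n, d : plane normal and offset (n . X = d).
    Returns the 3D point P in the camera coordinate system.
    """
    r = np.linalg.inv(K) @ np.array([u, v, 1.0])  # ray direction, z-normalized
    t = d / (n @ r)                               # ray parameter at intersection
    return t * r

# Example with hypothetical calibration values:
K = np.array([[250.0, 0.0, 200.0],
              [0.0, 250.0, 200.0],
              [0.0, 0.0, 1.0]])
n, d = np.array([0.0, 0.9806, 0.1961]), 1.4       # assumed plane parameters
print(llsm_point(180.0, 120.0, K, n, d))
```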

Sensitivity to depth is generally determined by the geometry of triangulation in the LLSM, especially by the length of the baseline.12,13 Our previous study showed that a shorter baseline indicates a larger systematic error associated with the detection of the distance.14

Additional beam scanning with one- or two-dimensional (2D) relative motion between the detector and test object is needed to measure the 3D shape of the object. However, if test objects have deep holes or are located in confined spaces, the relative motion cannot be easily measured. Consequently, a unique 3D measurement/imaging method is necessary for acquiring the 3D shape of an object with an ultracompact system configuration. Notably, a compact configuration leads to a short baseline, and this reduction in baseline increases the systematic error. Hence, designing a compact optical configuration with acceptable precision is one of the highlights of this study. In addition, measuring the 3D shape of an object alone is insufficient in some surgical inspections. The focus should also be on the needs of surgeons, such as shape rotation, zooming ability, and even the relative positions of the endoscope's distal tip and other intraoperative tools.

2.2.

Endoscope Design

2.2.1.

Imaging fiber-based optical configuration

Figure 3 shows the optical configuration of the endoscopic 3D scanner using the LLSM. The optical configuration mainly consists of three parts: the light-source projection, the coupling optics, and the scanner's distal tip. For the first part, we adopted a helium–neon laser projector supplied by Daheng Optics (DH-HN, Beijing, China) as the scanning light source. The projected laser has a wavelength of 632.8 nm, since knee bone has the highest reflection at this wavelength. The diameter of the laser line must remain as small as possible to achieve high resolving power. Hence, the laser line is first focused using lens L1 and then projected onto the proximal end surface of the imaging fiber, where the coupling optics consist of two lenses. As a result, the laser line is delivered via the flexible imaging fiber and transmitted to the reference plane through the probe imaging lens mounted on the distal-end surface of the optical fiber. The optical fiber was manufactured by Asahi Kasei Corporation (SPN, Tokyo, Japan) and comprises 13,000 pixel elements with a distal-end diameter of 1.0 mm. Baseline b is the distance between the distal-end surface of the optical fiber and optical axis z of the camera; in this case, b was an extremely short distance of 1.4 mm.

Fig. 3

Optical configuration of the developed endoscopic 3D scanner based on the LLSM (L1–L3, lenses; PIL, probe imaging lens; b, baseline; RP, reference planes, with the middle one as the focal plane).


2.2.2.

Ultracompact camera

The applicability of the proposed 3D endoscopic scanner in a confined space cannot be ensured using a commercially available, normal-sized camera, which offers neither a compact design nor high precision. For example, in Ref. 15, the image sequence captured by a normal camera exhibited an extremely high noise level and periodic pulsation. If such a system operated in an underwater environment, the camera would not be able to acquire any images effectively. Therefore, a customized ultracompact camera with a diameter of 1.8 mm and its control circuit were designed for the proposed fiber-based endoscopic imaging, as shown in Fig. 4. To avoid illumination artifacts, four white micro-LEDs were mounted radially around the front of the camera. This camera is well suited to medical applications owing to its wide field-of-view and ultrashort focal distance. Specifications of the camera are listed in Table 1.

Fig. 4

Photograph of (a) the distal tip of the camera and the LEDs and (b) the customized control circuit (size: 7 cm × 2.5 cm × 0.5 cm).


Table 1

Specifications of the ultracompact camera used.

Parameter                 Value
Cube chip sensor          OVM6946
Resolution                400 × 400 pixels
Optical size              1/18 in.
Minimum focal distance    2 mm
Video frame rate          160 kpixels at 30 frames per second
Field-of-view             120 deg (horizontal) × 60 deg (vertical)
Pixel size                1.75 μm × 1.75 μm
Output interface          USB 3.0
Illumination              Four LEDs

According to the optical configuration shown in Fig. 3, the camera coordinate system and the image plane we defined are both shown in Fig. 5. A point $P$ is captured by the camera and projected through the camera's projection center onto a corresponding point $(x, y)$ on the image plane. Based on laser triangulation with the pinhole camera model, the 3D coordinates $(x_c, y_c, z_c)$ of point $P$ in the camera coordinate system can be written as follows:

Eq. (1)

\[
\begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = z_c \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}^{-1} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix},
\]
where $(c_x, c_y)$ is the principal point, and $f_x$ and $f_y$ are the focal lengths in the $x$ and $y$ directions, respectively, which are obtained from camera calibration.

Fig. 5

Definition of the camera coordinate system.


As the imaging fiber is located on top of the camera (Fig. 3), the imaged laser line appears only on the top half of the image plane. Based on the camera coordinates, if light spots are captured successfully, then $0 \le y \le c_y$ and $z_c \ge 0$. In addition, imaged light spots located on the top half of the image plane have negative $y_c$, and those on the left half of the image plane have negative $x_c$ whereas those on the right half have positive $x_c$, as shown in Fig. 5. Thus, the depth distance can be expressed as follows:

Eq. (2)

\[
z_c = \frac{b f_y}{\left| y - c_y \right|} = \frac{b f_y}{c_y - y},
\]
where $b$ is the baseline length with a value of 1.4 mm.
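Equations (1) and (2) together recover a full 3D point: Eq. (2) gives the depth from the spot's $y$-coordinate, and Eq. (1) back-projects the pixel. The following minimal sketch (ours, not the authors' code) uses the baseline from Sec. 2.2.1 and the intrinsic values reported later in Eq. (10); the pixel coordinates are hypothetical.

```python
# Sketch of Eqs. (1) and (2): recover depth from the y-coordinate of an
# imaged light spot, then back-project the pixel to camera coordinates.
# Values of b, fx, fy, cx, cy follow the paper; the pixel is hypothetical.
import numpy as np

b = 1.4                      # baseline (mm)
fx, fy = 250.6389, 250.6611  # focal lengths (pixels), from Eq. (10)
cx, cy = 206.6649, 197.9362  # principal point (pixels), from Eq. (10)

def depth_from_spot(y):
    """Eq. (2): z_c = b * fy / (cy - y) for a spot on the top half (y < cy)."""
    return b * fy / (cy - y)

def back_project(x, y, zc):
    """Eq. (1): P = zc * K^{-1} [x, y, 1]^T in camera coordinates (mm)."""
    return zc * np.array([(x - cx) / fx, (y - cy) / fy, 1.0])

zc = depth_from_spot(150.0)          # spot imaged ~48 pixels above cy
print(zc, back_project(180.0, 150.0, zc))
```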

2.3.

Electromagnetic-Based Stereo Imaging Method

Similar to many monocular structured light systems, the scanning result of the current frame (e.g., frame $n$) is immediately replaced by that of frame $n+1$ as the endoscope moves, because the camera is not equipped with functionality to save previous data. Furthermore, a common problem in monocular endoscopic imaging is initialization, because the depth information of a scene cannot be recovered from a single image frame.16 Hence, a unique 3D imaging method is essential in our design to save the complete scanned data and display the shape of objects accurately.

We propose a real-time stereo imaging method for 3D shape measurement using a state-of-the-art electromagnetic tracking system (LIBERTY, Polhemus, United States). The system consists of a transmitter and a microsensor. The transmitter produces an electromagnetic field that acts as an accurate reference for position and orientation measurements of the microsensor. The microsensor (with a diameter of 1.6 mm) offers high-fidelity continuous tracking with six degrees-of-freedom and a maximum update rate of 240 Hz. Furthermore, the microsensor is assembled with the camera and imaging fiber, which keeps the distal tip of the endoscope small, with a diameter of 3.4 mm. All three parts were fixed in a simple axially aligned configuration comprising the distal tip.

The difference between our proposed approach and the feature-based tracking method presented in previous 3D imaging studies for minimally invasive surgery15,17 is shown in Fig. 6, which illustrates the different coordinate systems used during object imaging. The endoscope is located in the camera coordinate system and aimed at the virtual object (the bone) to be displayed. Figure 6(a) shows the coordinate system of the feature-based tracking method. Once the feature is detected and tracked, the object is imaged in the feature coordinate system. Assuming $P_f$ is a 3D point $(x_f, y_f, z_f)$ on the bone surface, it can be transformed into a 2D point $(x, y)$ in the endoscopic camera's view using the following equation:

Eq. (3)

\[
\begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} R_{fc} & T_{fc} \end{bmatrix} \begin{bmatrix} x_f \\ y_f \\ z_f \\ 1 \end{bmatrix},
\]
where $T_{fc}$ is a $(3 \times 1)$ translation vector and $R_{fc}$ is a $(3 \times 3)$ rotation matrix.

Fig. 6

The comparison of the (a) marker-less tracking method and (b) our proposed EBSIM tracking method.


In our proposed EBSIM, as seen in Fig. 6(b), we add a microsensor local space $S$ as an agent, which serves as the intermediary and is incrementally built from the point cloud sensed in the environment, enabling significant robustness in data collection. Let $P_s = [x_s, y_s, z_s]^T$ describe the same point $P$ with respect to the microsensor coordinate system as follows:

Eq. (4)

\[
\begin{bmatrix} P_s \\ 1 \end{bmatrix} = \begin{bmatrix} R_{cs} & T_{cs} \\ 0 & 1 \end{bmatrix} \begin{bmatrix} x_c \\ y_c \\ z_c \\ 1 \end{bmatrix},
\]
where $T_{cs}$ is the $(3 \times 1)$ translation vector that locates the camera's origin with respect to the microsensor's coordinate system, and $R_{cs}$ is the $(3 \times 3)$ rotation matrix that transforms a point from the camera frame to the microsensor frame.

The Euler angles azimuth, elevation, and roll are represented by $\alpha$, $\beta$, and $\gamma$, respectively. These angles represent an azimuth-primary sequence of counterclockwise frame rotations that defines the current orientation of the microsensor relative to the zero-orientation state of the transmitter. Thus, the rotation matrix of the current orientation of the sensor is expressed as follows:

Eq. (5)

\[
R_{st} = R_{ro} \cdot R_{el} \cdot R_{az},
\]
where
\[
R_{ro} = \begin{bmatrix} \cos\gamma & \sin\gamma & 0 \\ -\sin\gamma & \cos\gamma & 0 \\ 0 & 0 & 1 \end{bmatrix}, \quad
R_{el} = \begin{bmatrix} \cos\beta & 0 & -\sin\beta \\ 0 & 1 & 0 \\ \sin\beta & 0 & \cos\beta \end{bmatrix}, \quad
R_{az} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\alpha & \sin\alpha \\ 0 & -\sin\alpha & \cos\alpha \end{bmatrix}.
\]

Equations (4) and (5) show that the 3D coordinates of point P with respect to the transmitter coordinate system are denoted as

Eq. (6)

\[
\begin{bmatrix} P_t \\ 1 \end{bmatrix} = \begin{bmatrix} R_{st} & T_{st} \\ 0 & 1 \end{bmatrix} \begin{bmatrix} P_s \\ 1 \end{bmatrix},
\]
where $T_{st}$ is the position vector from the sensor to the transmitter. Using the transmitter space as the terminal frame, we solved two important issues in monocular imaging: (i) no precaptured images or initialization are needed, which saves time and avoids information overlay; and (ii) the endoscope can scan from any position and orientation, which is convenient for repeated extraction and insertion of the endoscope.
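A minimal sketch of this transform chain is given below (not the authors' code): it composes Eq. (5) from tracked Euler angles and chains Eqs. (4) and (6) to map a camera-frame point into the transmitter frame. The rigid camera-to-sensor offset and the pose values are assumptions; in the prototype, $R_{cs}$ and $T_{cs}$ follow from the fixed mechanical assembly (see Sec. 3.1).

```python
# Sketch of the EBSIM transform chain, Eqs. (4)-(6): a camera-frame point is
# mapped into the microsensor frame and then into the transmitter frame using
# the tracked pose (position T_st and Euler angles alpha, beta, gamma).
# R_cs and T_cs come from the fixed mechanical assembly; values here are assumed.
import numpy as np

def rot_from_euler(az, el, ro):
    """Eq. (5): R_st = R_ro @ R_el @ R_az (angles in radians)."""
    ca, sa = np.cos(az), np.sin(az)
    cb, sb = np.cos(el), np.sin(el)
    cg, sg = np.cos(ro), np.sin(ro)
    R_az = np.array([[1, 0, 0], [0, ca, sa], [0, -sa, ca]])
    R_el = np.array([[cb, 0, -sb], [0, 1, 0], [sb, 0, cb]])
    R_ro = np.array([[cg, sg, 0], [-sg, cg, 0], [0, 0, 1]])
    return R_ro @ R_el @ R_az

def camera_to_transmitter(P_c, R_cs, T_cs, R_st, T_st):
    """Eq. (4) then Eq. (6): P_t = R_st @ (R_cs @ P_c + T_cs) + T_st."""
    P_s = R_cs @ P_c + T_cs
    return R_st @ P_s + T_st

# Hypothetical rigid offset (e = 1.7 mm along z) and a sample tracked pose:
R_cs, T_cs = np.eye(3), np.array([0.0, 0.0, 1.7])
R_st = rot_from_euler(np.radians(10), np.radians(-5), np.radians(2))
P_t = camera_to_transmitter(np.array([0.5, -1.2, 7.3]), R_cs, T_cs,
                            R_st, np.array([120.0, 35.0, -40.0]))
print(P_t)
```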

Real-time reconstruction can be performed frame by frame on the 3D coordinates of a specimen's surface scanned by the camera using Eq. (6) and the OpenGL technique. In addition, the reconstructed 3D shapes support rotation, zooming, and offline observation based on the unified transmitter coordinate system during online processing (Sec. 3.3).

2.4.

Characteristic Curve of the Proposed Endoscopic Scanner

2.4.1.

System constants

The characteristic curve of the proposed endoscopic scanner is formulated in this section. The curve describes the relationship between the depth distance and the location of the imaged light spot on the camera detector. Using Eq. (2), the corresponding point, $y$, can be expressed as follows:

Eq. (7)

\[
y = b f_y \frac{1}{z_c} + c_y = A \frac{1}{z_c} + B,
\]
where $A = b f_y$ and $B = c_y$ are the system constants of the endoscopic scanner. The values of the system constants can be determined from the calibration results in Sec. 3.1.

Equation (7) indicates that point $y$ does not vary uniformly but has an inverse relationship with depth distance $z_c$. This nonuniformity complicates the operation of a 3D endoscopic scanner. Furthermore, any change in the corresponding point $y$ results in a systematic error; the larger the change, the larger the error. Therefore, the corresponding point is an important parameter when designing a 3D endoscopic scanner.

To reveal the relationship between point $y$ and depth distance $z_c$, a spot was projected vertically onto a white flat board by the scanner while it was mounted on a moving stage. The component settings and operations were the same as those in the flat-plane evaluation experiment described in Sec. 3.2. By doing so, we obtained the relationship between a corresponding point and the illumination distance. Figure 7 shows the ideal characteristic curve of the proposed endoscopic scanner. However, the actual working characteristics of the endoscopic scanner must be considered thoroughly. A series of illuminated spots $z_c$ and their corresponding points $y$ were measured, as shown in Fig. 7, for comparison with the results obtained by Eq. (7). The measured positions are consistent with the ideal characteristic curve. In addition, the measuring range of the endoscopic scanner over the $z_c$-coordinate range of 2 to 21 mm is imaged onto a span of roughly 100 pixels along the detector's $y$-coordinate in this case. The measurement ranges for the $x_c$ and $y_c$ coordinates correspond to the fixed fields-of-view of 120 deg and 60 deg, respectively, regardless of the measurement distance.

Fig. 7

Relationship between the coordinate of image spot y and the position of light spot zc on the specimen of the prototype.


2.4.2.

Error propagation

The displacement of the illuminated spot $z_c$ by an amount $dz_c$ leads to a displacement of the corresponding image point by $dy$, which depends on the illuminated distance as a consequence of the nonlinear characteristic curve of Eq. (7). We therefore consider the derivative of the function $y = y(z_c)$ from Eq. (7) with respect to $z_c$: $dy/dz_c$.

According to the law of error propagation, the standard deviation of image spot coordinates σy on the image plane is defined as follows:

Eq. (8)

\[
\sigma_y^2 = a^2 \sigma_{z_c}^2 \quad \left( a = \frac{dy}{dz_c} \right),
\]
where $\sigma_{z_c}$ is the standard deviation of the illuminated spot $z_c$, and $a$ is the gradient of the relationship between the illuminated spot $z_c$ and the $y$-coordinate of the image spot. It can be rewritten as

Eq. (9)

\[
\sigma_y = |a| \, \sigma_{z_c}.
\]

Figure 8 shows the relationship between the gradient and the standard deviation of the image spot coordinate, $\sigma_y$, as a function of the illuminated spot coordinate $z_c$. Gradient $a_1$ is considerably smaller in magnitude than $a_2$ in this figure. In this case, standard deviation $\sigma_{y1}$ is smaller than standard deviation $\sigma_{y2}$ when the standard deviations at $z_{c1}$ and $z_{c2}$ are both $\sigma_{z_c}$. Therefore, the gradient is a vital factor when considering the measurement accuracy and range of a 3D endoscopic scanner. Moreover, scanning specimens near the maximum working distance of 12 mm can achieve high precision.

Fig. 8

Relationship between the gradient and standard deviation of image spot coordinates σy based on illuminated distance zc.

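Differentiating Eq. (7) gives $a = dy/dz_c = -b f_y / z_c^2$, so the gradient magnitude falls off quadratically with distance. The following minimal sketch (our illustration, with an assumed depth uncertainty $\sigma_{z_c} = 0.1$ mm) evaluates the characteristic curve and propagates the uncertainty according to Eq. (9):

```python
# Sketch of Eqs. (7)-(9): the characteristic curve y(z_c) = A / z_c + B and
# the gradient a = dy/dz_c = -A / z_c^2 that propagates depth uncertainty to
# the image plane. A = b * fy and B = cy use the paper's calibration values.
import numpy as np

b, fy, cy = 1.4, 250.6611, 197.9362
A, B = b * fy, cy                 # system constants of Eq. (7)

z = np.array([2.0, 8.0, 12.0])    # illuminated distances (mm)
y = A / z + B                     # characteristic curve, Eq. (7)
a = -A / z**2                     # gradient dy/dz_c (pixels per mm)
sigma_z = 0.1                     # assumed depth standard deviation (mm)
sigma_y = np.abs(a) * sigma_z     # Eq. (9): image-spot standard deviation

for zi, ai, si in zip(z, a, sigma_y):
    print(f"z_c = {zi:5.1f} mm  gradient = {ai:8.3f} px/mm  sigma_y = {si:.3f} px")
```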

3.

Results

3.1.

Experimental Configuration

The 3D shape measurement was performed using the hardware setup shown in Fig. 9. The control circuit mainly consists of a decoding circuit for the OVM6946 chip sensor and a driver for the light-emitting diode (LED) illumination with adjustable brightness. The microsensor is connected to the electronic unit by a cable for data exchange. The transmitter is placed at an optimal distance of 1 m from the microsensor during data collection. Figure 9 also shows the front view of the endoscope tip. The camera, fiber, and microsensor were fixed in a tube of diameter 3.4 mm. The distance $e = 1.7$ mm is a user-defined offset between the camera and the microsensor, which yields the translation vector $T_{cs}$ given in Eq. (4).

Fig. 9

Hardware setup of the measurement system.


The pinhole camera was calibrated prior to verification of the proposed endoscopic scanner using Zhang's method.18 The employed chessboard has a pattern accuracy of 0.01 mm. A comparison of chessboard captures before and after calibration is shown in Fig. 10. The radial lens distortion shown in Fig. 10(a) is large owing to the large field-of-view of the camera. Figure 10(b) shows the precise correction of the distortion after camera calibration. The intrinsic camera matrix obtained by calibration was as follows:

Eq. (10)

\[
I = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix} = \begin{bmatrix} 250.6389 & 0 & 206.6649 \\ 0 & 250.6611 & 197.9362 \\ 0 & 0 & 1 \end{bmatrix}.
\]

Fig. 10

Comparison of image capture (a) before and (b) after camera calibration. The size of the chessboard is 18 mm × 13.5 mm with a square size of 1.5 mm.

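The following hedged sketch shows how this calibration step could be reproduced with OpenCV, whose calibrateCamera routine implements Zhang's method (Ref. 18). The image file names and the 11 × 8 inner-corner grid are assumptions for illustration (an 18 mm × 13.5 mm board with 1.5-mm squares has 12 × 9 squares, hence 11 × 8 inner corners).

```python
# Hedged sketch of the calibration step using OpenCV's implementation of
# Zhang's method (Ref. 18). Board geometry matches Fig. 10 (1.5-mm squares);
# image file names and the corner grid are assumptions for illustration.
import glob
import cv2
import numpy as np

pattern = (11, 8)                               # assumed inner-corner grid
square = 1.5                                    # square size (mm)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_pts, img_pts = [], []
for fname in glob.glob("chessboard_*.png"):     # hypothetical captures
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)

rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)
print("RMS reprojection error:", rms)
print("Intrinsic matrix K:\n", K)               # compare with Eq. (10)

undistorted = cv2.undistort(gray, K, dist)      # correction as in Fig. 10(b)
```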

3.2.

Flat-Plane Evaluation

3D scanners are commonly evaluated with procedures that use artifacts of common geometry, such as planes, spheres, and cones.19 Liquid crystal displays were used as flat-plane specimens to evaluate the precision of the endoscopic scanner.20 First, the test plane was positioned in front of the scanner. Second, the scanner was mounted on a handwheel translational stage (GCM-83, Daheng Optics, Beijing, China) and moved in a controlled fashion along its z-axis, which was parallel to the normal direction of the reference plate. Thus, the position of the scanner was changed precisely with a positioning resolution of 0.01 mm.

Reference planes were scanned from 2 to 12 mm at intervals of 2 mm. This range was chosen based on the distances over which surgeons usually conduct observations in actual medical cases.10 The reference planes were placed perpendicular to the optical axis of the endoscope to obtain the optimal focus of the laser. Table 2 lists the average measured depths and the measurement errors between the plane positions and the measured results. These results indicate that the measurement error of the flat plane was <0.18 mm, with a standard deviation of <0.018 mm, and that the error improves slightly at distances beyond 8.00 mm.

Table 2

Measured results of reference planes.

Test group                  1#      2#      3#      4#      5#      6#
Given z position (mm)       2.00    4.00    6.00    8.00    10.00   12.00
Measured depth (mm)         1.82    3.82    5.84    7.86    9.90    11.87
Measurement error (mm)      0.18    0.18    0.16    0.14    0.10    0.13
Standard deviation (mm)     0.018   0.018   0.014   0.012   0.011   0.012
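Table 2 aggregates repeated static measurements at each stage position. As an illustration of how such a flat-plane evaluation can be computed, the following minimal sketch fits a least-squares plane to a point cloud and reports the mean depth and residual standard deviation; the points here are synthetic stand-ins for measured data.

```python
# Sketch of a flat-plane evaluation: fit a least-squares plane to the
# scanned points and report the mean depth and its deviation, in the spirit
# of Table 2. The point cloud here is synthetic for illustration.
import numpy as np

rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(-2, 2, 500),            # x (mm)
                       rng.uniform(-1, 1, 500),            # y (mm)
                       8.0 + rng.normal(0, 0.012, 500)])   # z near 8 mm

# Least-squares plane z = a*x + b*y + c:
Amat = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
(a, b, c), *_ = np.linalg.lstsq(Amat, pts[:, 2], rcond=None)

residual = pts[:, 2] - Amat @ np.array([a, b, c])
print(f"mean depth = {pts[:, 2].mean():.2f} mm")
print(f"plane-fit std = {residual.std():.4f} mm")
```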

The captured image of the laser beam and the extracted center lines are shown in Fig. 11. As the imaged laser beam was very short, a locally magnified view is provided for better observation. Figure 12 shows the depth distribution results of the flat plane at a scanning distance of 8 mm.

Fig. 11

Captured image sample of (a) laser beam, (b) area detection of the laser beam, and (c) extracted center line of the laser.


Fig. 12

Measurement results of the flat plane.


3.3.

Cylindrical Surface Evaluation

A cylindrical wood specimen with a diameter of 40.05 mm, shown in Fig. 13(a), was used to test the performance of the endoscopic scanner system. The shape was machined via CNC milling. A quarter of the cylindrical surface was radially and manually scanned at a distance of <12 mm with the endoscopic sensor. Figure 13(b) depicts the 3D data of the obtained surface. The original 3D raw point measurement data contained 49,778 points. However, standard evaluation criteria and standard specimens for endoscopic 3D point cloud data are lacking. We therefore employed a geometric algorithm to fit the set of 3D points to the tested cylinder.21

Fig. 13

Measured results of the cylindrical surface: (a) photograph of the measured wood and (b) point-by-point 3D representation of the scanned surface.


Figure 14 shows the representation of a cylinder. The cylinder is specified by an axis containing a point $C$ and having unit-length direction $W$, with $r$ being the radius of the cylinder. Two more unit-length vectors $U$ and $V$ are defined such that $\{U, V, W\}$ is a right-handed orthonormal set. Thus, any 3D point $P$ can be written uniquely as

Eq. (11)

\[
P = C + y_0 U + y_1 V + y_2 W = C + R Y,
\]
where $R$ is a rotation matrix with columns $U$, $V$, and $W$, and $Y$ is a column vector with rows $y_0$, $y_1$, and $y_2$. To be on the cylinder, we need

Eq. (12)

\[
r^2 = |P - C|^2.
\]

Fig. 14

Representation of a cylinder.


Let $\{P_i\}_{i=1}^{n}$ be the cylindrical point set. An error function for the cylinder is expressed as $E = \sum_{i=1}^{n} (r_i^2 - r^2)^2$. Setting the partial derivative of the error function with respect to the squared radius to zero, we then obtained the constraint

Eq. (13)

\[
0 = \sum_{i=1}^{n} \left( r_i^2 - r^2 \right),
\]
where the obtained parameters $r_i$ yield the radius of the cylinder

Eq. (14)

\[
r = \sqrt{\frac{1}{n} \sum_{i=1}^{n} r_i^2}.
\]
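Assuming the axis (point $C$, direction $W$) has already been estimated (the full fit is covered in Ref. 21), the least-squares radius of Eq. (14) reduces to a few lines, as in the following sketch; the quarter-cylinder test points are synthetic, with the ground-truth diameter of 40.05 mm from this section.

```python
# Sketch of Eqs. (11)-(14): with the axis (point C, direction W) known, the
# distance of each point to the axis gives r_i, and the least-squares
# radius is r = sqrt(mean(r_i^2)). Axis estimation itself follows Ref. 21.
import numpy as np

def cylinder_radius(points, C, W):
    """points: (n, 3) array; C: point on axis; W: unit axis direction."""
    d = points - C                                  # vectors from axis point
    axial = d @ W                                   # components along the axis
    radial_sq = np.sum(d * d, axis=1) - axial**2    # r_i^2 = |P - C|^2 - (d.W)^2
    return np.sqrt(radial_sq.mean())                # Eq. (14)

# Synthetic quarter-cylinder test (radius 20.025 mm, diameter 40.05 mm):
theta = np.random.default_rng(1).uniform(0, np.pi / 2, 1000)
z = np.random.default_rng(2).uniform(0, 10, 1000)
pts = np.column_stack([20.025 * np.cos(theta), 20.025 * np.sin(theta), z])
print(2 * cylinder_radius(pts, np.zeros(3), np.array([0.0, 0.0, 1.0])))  # ~40.05
```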

The experimental results indicate that the diameter of the fitted cylindrical surface is 39.81 mm, a deviation of 0.24 mm from the true value with a standard deviation of 0.031 mm. This error is approximately twice that of the flat plane, mainly because the flat-plane measurement was static, whereas the cylindrical test was based on dynamic manual scanning. Although unrelated to the original intention of the endoscopic development (manual scanning), the measurement precision could be improved slightly if the experiment were designed to use a micro universal stage to scan cylindrical surfaces.

Keeping the endoscope perpendicular to the surface is difficult because of the variable scanning distance; poor focus of the projected structured light results in weaker detection and data loss. Nonuniform movement during manual scanning is another factor that must be considered. Figure 15 shows the six degree-of-freedom position and orientation tracking of the endoscope during the measurement of the cylindrical surface. The tracking curves for the x- and y-positions are clearly not as smooth as those of the 3D orientation. An excessively fast scan causes data loss, whereas an extremely slow scan can result in repeated scans at the same location and duplicated data. The image data of our previous endoscope were characterized by nonperiodic fluctuations in the detected light intensity, caused by exposure changes or frame drops of the camera; these situations often occur in web cameras. This artifact was eliminated by locking the exposure parameters, which were set in the customized control circuit.

Fig. 15

Six degree-of-freedom tracking of the endoscope.


The endoscopic scanner software was developed on the Windows platform based on the EBSIM combined with a rendering algorithm.22 The software interface is shown in Fig. 16. The left pane shows the 3D raw point measurement data for the surface; the two other panes provide shape rotation, tilting, and zooming of the scanned surface. Figure 17 presents the rendered surface from different viewpoints. The recovered surface quality is highly promising.

Fig. 16

User interface of the 3D point cloud measurement of surface and observation software.


Fig. 17

Observation effect of the rendered surface at different viewing angles: (a) front and (b) left views.


3.4.

Knee-Joint Test

An ex vivo knee joint from a chicken bone sample with an approximate size of 1.5 mm × 1 mm was used to demonstrate the functionality of the endoscopic system and obtain a 3D model (see Fig. 18). To simulate intraoperative inspection of the knee joint, this measurement test was conducted in an underwater environment. The measurement was completed in <2 s, and the collected data consisted of 103,672 points. The point density is significantly higher (more than two times) than that of the cylindrical surface owing to the better light reflection of the joint surface. The experimental results showed that the endoscopic scanner works even on biological surfaces that are not easily scanned owing to volume scattering and highlights. The data quality is promising because the curvature of the complete joint can be recovered. However, in a microscopic environment, measuring tissue surfaces with high reflectivity is inevitable. Reflective highlights from the water caused some interferential data, as indicated by the black arrow in Fig. 19. We attribute this to two possible causes. First, the material of the knee surface was smooth and highly reflective. Second, measuring a surface under water at a short scan distance made the camera more sensitive to the light intensity. Two main types of techniques may optimize the measurement results for such problems. One is based on changing the intensity of the illumination light source or on camera exposure techniques;23–25 the other is to place a filter in front of the camera tip. The former is more complicated to operate because it requires fine-tuning the projection and exposure parameters several times before measurement to find the best effect; moreover, the exposure settings of the camera cannot be adjusted in some cases, and low light intensity may lead to worse detection of the structured light. In contrast, the latter method is easier to implement if economic cost is disregarded.

Fig. 18

Ex vivo knee-joint chicken bone sample.


Fig. 19

3D point cloud measurement data for knee-joint surface of a chicken.


To quantitatively evaluate the surface congruency of the measured knee-joint, an assessment method that is based on the measured 3D point sets must be designed for free-form surfaces. This will be considered in our future work.

4.

Conclusions

A flexible 3D endoscopic scanner based on the EBSIM with a final diameter of 3.4 mm has been proposed in this study. To the best of our knowledge, this is the first study to investigate a 3D endoscope based on imaging-fiber structured light combined with an electromagnetic tracking strategy and the smallest possible distal-tip configuration. Complex calibration and system registration are unnecessary for the proposed endoscope, and objects can be scanned from any position and orientation. The characteristic curve and error propagation of the endoscope were analyzed, and the results showed that the working distance is an important parameter for the precision of a 3D endoscope using the EBSIM. Two experiments, with a flat plane and a cylindrical surface, demonstrated the endoscope's precision: a remarkable accuracy of 0.2 mm was obtained for an extremely short baseline of 1.4 mm. A knee joint of a chicken bone was used as an ex vivo example, where the joint was measured and the 3D profile was accurately reconstructed. The proposed 3D endoscope is expected to aid surgeons' decision-making by providing a 3D surface shape for qualitative analysis.

Acknowledgments

This research was supported by Beijing Natural Science Foundation (Grant Nos. 3204039 and 4204113), the National Natural Science Foundation of China (Grant No. 52005046), and Beijing Municipal Commission of Education (Grant No. KM201911232021).

References

1. A. Donges and R. Noll, Laser Measurement Technology: Fundamentals and Applications, 1st ed., Springer-Verlag, Berlin, Heidelberg (2015).
2. G. A. P. Escamilla, F. Kobayashi, and Y. Otani, "Three-dimensional surface measurement based on the projected defocused pattern technique using imaging fiber optics," Opt. Commun. 390, 57–60 (2017). https://doi.org/10.1016/j.optcom.2016.12.057
3. H. M. Park and K.-N. Joo, "Endoscopic precise 3D surface profiler based on continuously scanning structured illumination microscopy," Curr. Opt. Photonics 2(2), 172–178 (2018). https://doi.org/10.1364/COPP.2.000172
4. M. Chen et al., "Three-dimensional surface profile measurement of a cylindrical surface using a multi-beam angle sensor," Precis. Eng. 62, 62–70 (2020). https://doi.org/10.1016/j.precisioneng.2019.11.009
5. J. Schlobohm, A. Posch, and E. Reithmeier, "A raspberry pi based portable endoscopic 3D measurement system," Electronics 5(3), 43 (2016). https://doi.org/10.3390/electronics5030043
6. S. Pulwer et al., "Dynamic pattern generation by single-mode fibers for endoscopic 3D measurement systems," Proc. SPIE 11293, 112930F (2020). https://doi.org/10.1117/12.2543526
7. J. Lin et al., "Dual-modality endoscopic probe for tissue surface shape reconstruction and hyperspectral imaging enabled by deep neural networks," Med. Image Anal. 48, 162–176 (2018). https://doi.org/10.1016/j.media.2018.06.004
8. S. G. Pearce et al., "An investigation of 2 techniques for optimizing joint surface congruency using multiple cylindrical osteochondral autografts," J. Arthrosc. Relat. Surg. 17(1), 50–55 (2001). https://doi.org/10.1053/jars.2001.19966
9. L. Hangody et al., "Osteochondral plugs: autogenous osteochondral mosaicplasty for the treatment of focal chondral and osteochondral articular defects," Oper. Tech. Orthop. 7(4), 312–322 (1997). https://doi.org/10.1016/S1048-6666(97)80035-3
10. H. Robert, "Chondral repair of the knee joint using mosaicplasty," Orthop. Traumatol.: Surg. Res. 97(4), 418–429 (2011). https://doi.org/10.1016/j.otsr.2011.04.001
11. P. Di Benedetto et al., "Arthroscopic mosaicplasty for osteochondral lesions of the knee: computer-assisted navigation versus freehand technique," Arthroscopy 28(9), 1290–1296 (2012). https://doi.org/10.1016/j.arthro.2012.02.013
12. H. Haneishi, T. Ogura, and Y. Miyake, "Profilometry of a gastrointestinal surface by an endoscope with laser beam projection," Opt. Lett. 19(9), 601–603 (1994). https://doi.org/10.1364/OL.19.000601
13. L. Maier-Hein et al., "Optical techniques for 3D surface reconstruction in computer-assisted laparoscopic surgery," Med. Image Anal. 17(8), 974–996 (2013). https://doi.org/10.1016/j.media.2013.04.003
14. Z. Long and K. Nagamune, "Underwater 3D imaging using a fiber-based endoscopic system for arthroscopic surgery," J. Adv. Comput. Intell. Inf. 20(3), 448–454 (2016). https://doi.org/10.20965/jaciii.2016.p0448
15. N. Haouchine et al., "Impact of soft tissue heterogeneity on augmented reality for liver surgery," IEEE Trans. Vis. Comput. Graphics 21(5), 584–597 (2015). https://doi.org/10.1109/TVCG.2014.2377772
16. L. Chen et al., "SLAM-based dense surface reconstruction in monocular minimally invasive surgery and its application to augmented reality," Comput. Methods Prog. Biomed. 158, 135–146 (2018). https://doi.org/10.1016/j.cmpb.2018.02.006
17. J. H. Kim et al., "Tracking by detection for interactive image augmentation in laparoscopy," Lect. Notes Comput. Sci. 7359, 246–255 (2012). https://doi.org/10.1007/978-3-642-31340-0_26
18. Z. Zhang, "A flexible new technique for camera calibration," IEEE Trans. Pattern Anal. Mach. Intell. 22(11), 1330–1334 (2000). https://doi.org/10.1109/34.888718
19. P. Rachakonda, B. Muralikrishnan, and D. Sawyer, "Sources of errors in structured light 3D scanners," Proc. SPIE 10991, 1099106 (2019). https://doi.org/10.1117/12.2518126
20. M. Fujigaki, T. Sakaguchi, and Y. Murata, "Development of a compact 3D shape measurement unit using the light-source-stepping method," Opt. Lasers Eng. 85, 9–17 (2016). https://doi.org/10.1016/j.optlaseng.2016.04.016
21. D. Eberly, "Least squares fitting of data by linear or quadratic structures: 7 fitting a cylinder to 3D points," (1999). https://www.geometrictools.com/Documentation/LeastSquaresFitting.pdf
22. Z. Long and K. Nagamune, "A marching cubes algorithm: application for three-dimensional surface reconstruction based on endoscope and optical fiber," Information 18(4), 1425–1437 (2015).
23. Z. Cai et al., "Structured light field 3D imaging," Opt. Express 24(18), 20324–20334 (2016). https://doi.org/10.1364/OE.24.020324
24. Z. Song and S. Yau, "High dynamic range scanning technique," Opt. Eng. 48(3), 033604 (2009). https://doi.org/10.1117/1.3099720
25. H. Zhao et al., "Rapid in-situ 3D measurement of shiny object based on fast and high dynamic range digital fringe projector," Opt. Lasers Eng. 54, 170–174 (2014). https://doi.org/10.1016/j.optlaseng.2013.08.002

Biography

Zhongjie Long received his BE degree in vehicle engineering from South China University of Technology in 2010, his ME degree in mechanical engineering from Beijing Information Science and Technology University in 2013, and his PhD in advanced interdisciplinary science and technology from the University of Fukui, Japan, in 2016. Currently, he is an associate professor at Beijing Information Science and Technology University. His research interests include computer-assisted surgery systems and 3D endoscopic imaging.

Hengbing Guo received his MD degree in clinical medicine from the Health Science Center of Xi’an Jiaotong University in 2001. He is now an associate chief physician at the Orthopedics Rehabilitation Center, Beijing Rehabilitation Hospital of Capital Medical University. He is engaged in the treatment and rehabilitation of sports medicine and sports trauma and is experienced in arthroscopic minimally invasive surgery. He is a member of the Shoulder and Elbow Surgery Professional Committee of CMEA.

Kouki Nagamune received his PhD in computer engineering from Himeji Institute of Technology, Japan, in 2004. He worked as a lecturer at Kobe University Graduate School of Medicine from 2006 to 2007 and relocated to the University of Fukui in 2007, where he is currently an associate professor. He has been an IEEE senior member since 2017 and is a member of IEICE, JSMBE, JSCB, and JSFTII. His research interests include medical imaging.

Yunbo Zuo received his PhD in mechanical manufacture and automation from Beijing Institute of Technology in 2008. He is now an associate fellow at the Key Laboratory of Modern Measurement and Control Technology, Ministry of Education, Beijing Information Science and Technology University. His research interests include robot vision recognition, image object detection, image target tracking, and state monitoring of mechanical systems.

CC BY: © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
Zhongjie Long, Hengbing Guo, Kouki Nagamune, and Yunbo Zuo "Development of an ultracompact endoscopic three-dimensional scanner with flexible imaging fiber optics," Optical Engineering 60(11), 114108 (30 November 2021). https://doi.org/10.1117/1.OE.60.11.114108
Received: 12 June 2021; Accepted: 16 November 2021; Published: 30 November 2021
KEYWORDS: 3D scanning, endoscopy, 3D metrology, cameras, laser scanners, scanners, endoscopes
