The plenoptic camera is a camera architecture that records the intensity, color, and direction of light by adding a microlens array in front of the image sensor. Starting from the basic concepts of the focused plenoptic camera, we first analyze its imaging characteristics and then present a design method. We further derive a depth resolution model in detail, and analyze the imaging characteristics of a focused plenoptic camera with a novel microlens array containing four types of focal lengths. Simulation results show that the designed camera offers a large depth of field and high depth resolution. The method proposed in this paper can serve as a reference for designing plenoptic cameras for specific application scenarios.
In this paper, we propose and demonstrate an improved plenoptic imaging configuration for high-resolution, wide-field-of-view imaging through atmospheric turbulence. In the improved configuration, the plenoptic imaging system is paired with a high-resolution conventional imaging system and is used only to measure the wavefront distortion of the imaging beams. Using the wavefront distortions measured by the plenoptic system and the blurred images captured by the conventional system, high-resolution images are recovered by deconvolution of the blurred images. Numerical simulations and experimental results show that the improved configuration can successfully restore near-diffraction-limited images of objects. Compared with either a conventional imaging system or a plenoptic imaging system alone, the improved configuration combines the advantages of wavefront distortion correction, high-resolution imaging, and a wide field of view. The technology proposed in this paper has wide potential applications in photoelectric theodolites and large telescopes.
Detection of light-spot centroid positions is one of the major error sources in Hartmann-Shack (H-S) wavefront sensors. Double images (ghosting) in some sub-apertures, caused by multiple reflections from optical elements or by paraxial stray light from the operating environment, produce centroid detection errors and degrade the precision of wavefront aberration reconstruction. The conventional threshold method can keep ghosting under effective control, but it often discards useful light-spot information. In this paper, an improved method for automatic ghost detection and local removal is proposed by combining several algorithms, including nonlinear processing, autocorrelation, convolution, and local filtering. The method is verified on a numerical simulation platform built by the authors, and the results show that it is effective.
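To make the error source concrete, the sketch below shows a conventional thresholded center-of-mass centroid for one sub-aperture, together with a simple ghost-flagging heuristic (distance between the brightest pixel and the centroid). The heuristic is an illustrative stand-in, not the autocorrelation/convolution procedure of the paper, and all names are assumptions.

```python
import numpy as np

def subaperture_centroid(spot, threshold_frac=0.1):
    """Thresholded center-of-mass centroid of one H-S sub-aperture.

    Pixels below threshold_frac * max are zeroed first; this is the
    conventional threshold step the abstract refers to.
    """
    img = spot.astype(float).copy()
    img[img < threshold_frac * img.max()] = 0.0
    ys, xs = np.indices(img.shape)
    total = img.sum()
    return (xs * img).sum() / total, (ys * img).sum() / total

def has_ghost(spot, tol=2.0):
    """Flag a likely double spot: with two comparable spots the centroid
    lies between them, far from the brightest pixel. A simple heuristic
    for illustration only."""
    cx, cy = subaperture_centroid(spot)
    py, px = np.unravel_index(np.argmax(spot), spot.shape)
    return float(np.hypot(px - cx, py - cy)) > tol
```

A sub-aperture flagged by such a test would then be handled by the local removal step rather than fed directly to the reconstructor.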
A full-aperture non-interferometric phase retrieval system, requiring only a single shot, can overcome the low signal-to-noise ratio encountered under weak illumination from an extended beacon. Owing to its robustness and practicality, the technique has been widely applied in industrial inspection. In most situations, however, it is limited by the speed and accuracy of the phase retrieval algorithm. Based on phase-space optics, an analytical relationship can be established between the phase of the quasi-coherent light field from an extended beacon with a small field of view and three adjacent intensity distributions, and this relationship can be solved quickly: the unknown phase equals the convolution of the sign function with the partial derivative, with respect to the rotation angle of phase space, of the difference of the three intensities. This paper presents a design and realization that accomplishes this using a specially designed chromatic-aberration lens and a 3CCD camera, so that three high-resolution images of the beacon are captured in a single shot. Numerical simulation results show that the method can accurately recover aberrations of more than 10 orders.
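The reconstruction rule stated in the abstract (phase = sign function convolved with the angle derivative of the intensity difference) can be sketched numerically in one dimension. The discretization, grids, and normalization below are purely illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def phase_from_intensities(I_minus, I_0, I_plus, dtheta):
    """Schematic 1-D sketch: phase ~ sign-kernel (*) d(Delta I)/d(theta).

    I_minus, I_0, I_plus : 1-D intensity profiles at three adjacent
    rotation angles of phase space, separated by dtheta.
    """
    # central-difference estimate of the derivative w.r.t. rotation angle
    dI_dtheta = (I_plus - I_minus) / (2.0 * dtheta)
    n = dI_dtheta.size
    x = np.arange(n) - n // 2
    kernel = np.sign(x)  # the sign-function kernel named in the text
    # convolution; dividing by the mean intensity is a stand-in
    # normalization, not taken from the paper
    return np.convolve(dI_dtheta, kernel, mode="same") / max(I_0.mean(), 1e-12)
```

In the hardware described above, the three profiles would come from the three color channels of the 3CCD camera behind the chromatic-aberration lens, so all three are captured in one exposure.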
Polarization imaging plays an important role in many fields, especially skylight navigation and target identification, whose imaging systems are typically required to offer high resolution, broad band, and a single-lens structure. This paper describes such an imaging system based on the light field 2.0 camera structure, which can compute the polarization state and the depth from a reference plane for every object point in a single shot. The structure, comprising a modified main lens, a multi-quadrant polarizer, a honeycomb-like microlens array, and a high-resolution CCD, is equivalent to an "eye array" with three or more polarization "glasses" in front of each "eye". Depth is therefore calculated by matching the relative offsets of corresponding patches seen by neighboring "eyes", while the polarization state follows from their relative intensity differences, and the two resolutions are approximately equal. An application to navigation under clear sky shows that the method has high accuracy and strong robustness.
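For the "three or more" polarizer quadrants, one standard choice is linear polarizers at 0°, 45°, and 90°, from which the linear Stokes parameters follow in closed form. The sketch below assumes ideal polarizers; the paper's actual quadrant layout may differ.

```python
import numpy as np

def linear_stokes(I0, I45, I90):
    """Linear Stokes parameters from intensities behind ideal linear
    polarizers at 0, 45 and 90 degrees, using the Malus-law relation
    I(t) = (S0 + S1*cos 2t + S2*sin 2t) / 2."""
    S0 = I0 + I90               # total intensity
    S1 = I0 - I90               # 0/90-degree preference
    S2 = 2.0 * I45 - I0 - I90   # +/-45-degree preference
    dolp = np.sqrt(S1**2 + S2**2) / np.maximum(S0, 1e-12)
    aop = 0.5 * np.arctan2(S2, S1)  # angle of polarization, radians
    return S0, S1, S2, dolp, aop
```

For skylight navigation, the angle of polarization (`aop`) computed per "eye" is the quantity of interest, since the clear-sky polarization pattern encodes the solar azimuth.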
Lucky imaging is widely used in astronomical imaging systems because of its low cost and good performance. However, the probability of capturing an excellent lucky image is low, especially for a large-aperture telescope. We therefore propose a method of adaptive image partition that extracts every lucky part of an image, increasing the probability of obtaining a lucky image. The system consists of a telescope and three cameras running synchronously at the image plane, the front defocus plane, and the back defocus plane; the two defocused cameras have the same defocus distance. Our algorithm selects each lucky part of the object picture, i.e. each region little affected by atmospheric turbulence, based on the difference between the pictures obtained by the front and back defocus cameras. Image stitching is then used to assemble the entire sharp picture.
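The partition-and-select step can be sketched as follows: for each patch location, pick the exposure whose front- and back-defocus frames differ least there (a small difference serving as a proxy for weak turbulence), then assemble the corresponding in-focus patches. The selection metric and patching scheme are assumptions for illustration; the abstract does not specify them.

```python
import numpy as np

def lucky_patches(front_stack, back_stack, image_stack, patch=16):
    """Assemble a sharp image from the 'luckiest' patch of each exposure.

    front_stack, back_stack, image_stack : (n, H, W) synchronized frames
    from the front-defocus, back-defocus, and in-focus cameras.
    """
    n, H, W = image_stack.shape
    out = np.zeros((H, W))
    for y in range(0, H, patch):
        for x in range(0, W, patch):
            sl = (slice(y, y + patch), slice(x, x + patch))
            # turbulence score per exposure for this patch: mean absolute
            # front/back defocus difference (smaller = luckier)
            scores = [np.abs(front_stack[k][sl] - back_stack[k][sl]).mean()
                      for k in range(n)]
            best = int(np.argmin(scores))
            out[sl] = image_stack[best][sl]
    return out
```

A real implementation would blend overlapping patches when stitching rather than butting them edge to edge, to avoid seams.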
To exploit a large-diameter telescope for high-resolution imaging of extended targets, the wavefront aberrations induced by atmospheric turbulence must be detected and compensated. Data recorded by a plenoptic camera can be used to extract the wavefront phases associated with atmospheric turbulence in astronomical observation. Tomographic recovery of the wavefront phase urgently demands a method for simultaneous, large field-of-view (FOV), multi-perspective wavefront detection, and the plenoptic camera possesses exactly this advantage. This paper focuses on the capability of the plenoptic camera to extract wavefronts from different perspectives simultaneously. We built the corresponding theoretical model and simulation system to study the wavefront measurement characteristics of the plenoptic camera used as a wavefront sensor, and we evaluated its performance for different types of wavefront aberration corresponding to different applications. Finally, we performed multi-perspective wavefront sensing with the plenoptic camera in simulation. This study of the wavefront measurement characteristics of the plenoptic camera helps in selecting and designing camera parameters when it is used as a multi-perspective, large-FOV wavefront sensor, which is expected to solve the problem of large-FOV wavefront detection and can be used for adaptive optics in giant telescopes.
Traditional stereo imaging does not work for dynamic translucent media, because such media show no obvious characteristic patterns and multiple cameras are not permitted in most cases. Phase-space optics can solve this problem by extracting depth information directly from the "space-spatial frequency" distribution of the target, obtained by a plenoptic sensor with a single lens. This paper discusses how depth information is represented in phase-space data and presents calculation algorithms for different transparencies. A 3D imaging example of a waterfall is given at the end.
The phase distribution of a light field at a given location can be calculated from two closely spaced intensity measurements along the propagation direction in a first-order optical system, and can be regarded as a wavelet transform coefficient of the pupil light field. Based on this theory, a new phase retrieval algorithm using several intensities at different layers is presented in this paper; it first quickly computes the low-frequency phase and then gradually increases the resolution by adding further intensity measurements to the calculation.
An extended beacon produces a partially coherent light field, and phase-space optics is an effective tool for analyzing such fields, particularly for calculating multi-angle and multi-layer phases. Because direct detection means have been lacking, research on this theory was hampered for a long time. In this paper, an optical structure is presented that obtains the spectrogram, a kind of phase-space distribution; the associated resolution problem and a possible improvement are also discussed, and two new methods for obtaining high-quality astronomical images emerge from this algorithm.
To overcome the shortcomings of the Shack-Hartmann wavefront sensor, we developed a light-field wavefront detection system that completes large field-of-view, multi-perspective wavefront detection in a single photographic exposure. The system comprises an imaging primary mirror, a lenslet array, and a photosensitive device: the lenslet array is located at the imaging plane of the primary mirror, and the photosensitive device at the focal plane of the lenslet array. In this system, each lenslet reimages the aperture and forms a low-resolution image of it. Compared with the Shack-Hartmann sensor, this light-field measurement method obtains image arrays from different perspectives. By comparing the array information with reference information, we obtain the slope matrix of the wavefront in different perspectives and restore the wavefront over a large field of view. Based on Fourier optics, we built the corresponding theoretical model and simulation system, and using a Meade telescope, a turbulent phase screen, a lenslet array, and a CCD camera, we assembled an experimental light-field wavefront measurement system. Numerical simulations and experiments show that this method effectively retrieves wavefront aberration information. The method realizes multi-perspective wavefront measurement, which is expected to solve the problem of large field-of-view wavefront detection, and can be used for adaptive optics in giant telescopes.
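Once the slope matrix is measured, the standard next step is a least-squares modal fit of the wavefront, which applies per perspective here. The sketch below assumes a generic modal basis (e.g. Zernike derivatives sampled on the sub-apertures); the basis choice and layout are assumptions, not taken from the paper.

```python
import numpy as np

def reconstruct_wavefront(slopes_x, slopes_y, basis_dx, basis_dy):
    """Least-squares modal reconstruction from measured wavefront slopes.

    slopes_x, slopes_y : (n,) measured x/y slopes, one pair per sub-aperture
    basis_dx, basis_dy : (n, m) x/y derivatives of m basis modes sampled
                         at the same sub-apertures
    Returns the m modal coefficients of the wavefront.
    """
    A = np.vstack([basis_dx, basis_dy])       # (2n, m) design matrix
    s = np.concatenate([slopes_x, slopes_y])  # (2n,) measurement vector
    coeffs, *_ = np.linalg.lstsq(A, s, rcond=None)
    return coeffs
```

Running this fit independently for each perspective's slope matrix yields the multi-perspective wavefronts that a tomographic step could then combine.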
An efficient nitrogen-diluted HF/DF laser with cryogenic adsorption provides a possible route to a compact, relatively high-power mid-infrared laser system. Driven by only one DC-discharge tube, 117 W of laser power and a total electro-optical efficiency of 6.6% in the HF case, representing a four- to five-fold increase, were achieved with nitrogen dilution using a carefully designed supersonic nozzle-array gain generator. Spectra, gain distribution, and chemiluminescence were further investigated to explain the enhancement mechanism.