Photoluminescence lifetime imaging of upconverting nanoparticles features prominently in recent progress in optical thermometry. Despite remarkable advances in photoluminescent temperature indicators, existing optical instruments lack the ability to perform wide-field photoluminescence lifetime imaging in real time and thus fall short in dynamic temperature mapping. Here, we have developed single-shot photoluminescence lifetime imaging thermometry (SPLIT), built on a compressed-sensing ultrahigh-speed imaging paradigm. Using core/shell NaGdF4:Er3+,Yb3+/NaGdF4 upconverting nanoparticles as lifetime-based temperature indicators, we apply SPLIT to longitudinal wide-field temperature monitoring beneath a thin scattering medium. SPLIT also enables video-rate temperature mapping of a moving biological sample at single-cell resolution.
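As background to the lifetime-based thermometry named above, the sketch below illustrates the general principle only: fit a mono-exponential decay at each pixel and map the recovered lifetime to temperature through a calibration curve. The SPLIT compressed-sensing reconstruction itself is not reproduced, and the calibration parameters are hypothetical placeholders.

```python
# Minimal sketch of lifetime-based thermometry (not the SPLIT reconstruction):
# per-pixel mono-exponential fitting followed by a hypothetical linear
# lifetime-to-temperature calibration.
import numpy as np
from scipy.optimize import curve_fit

def mono_exp(t, a, tau):
    """Mono-exponential photoluminescence decay model."""
    return a * np.exp(-t / tau)

def lifetime_map(frames, t):
    """Fit a decay lifetime at every pixel.

    frames : (T, H, W) time-resolved intensity stack
    t      : (T,) time axis, same units as the returned lifetimes
    """
    T, H, W = frames.shape
    taus = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            decay = frames[:, i, j]
            p0 = (decay.max(), t[-1] / 3)          # rough initial guess
            (a, tau), _ = curve_fit(mono_exp, t, decay, p0=p0, maxfev=2000)
            taus[i, j] = tau
    return taus

def temperature_map(taus, tau_ref=0.8, slope=-50.0, T_ref=20.0):
    """Hypothetical linear calibration: T = T_ref + slope * (tau - tau_ref)."""
    return T_ref + slope * (taus - tau_ref)
```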
In this paper, we report a dispersion-eliminated coded-aperture light field (DECALF) imaging system based on digital micromirror devices (DMDs). Using a dual-DMD design to compensate for dispersion across the entire visible spectrum, the DECALF imaging system captures 1280×1024×5×5 (x, y, θ, φ) color light field images at 20 Hz. Using three-dimensional (3D) color scenes, we experimentally demonstrate multi-perspective viewing and digital refocusing with the DECALF imaging system.
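The digital refocusing demonstrated above can be illustrated with the standard shift-and-add operation on a 4D light field. The sketch below assumes the captured data are arranged as a (U, V, H, W) stack of sub-aperture views (5×5 angular samples, as reported); the array names, the slope parameter alpha, and the integer-pixel shift are illustrative simplifications, not the authors' implementation.

```python
# Minimal shift-and-add refocusing sketch for a 4D light field of
# sub-aperture views; alpha selects the synthetic focal plane.
import numpy as np

def refocus(light_field, alpha):
    """Shift each sub-aperture view by alpha*(u - uc, v - vc) pixels, then average."""
    U, V, H, W = light_field.shape
    uc, vc = (U - 1) / 2.0, (V - 1) / 2.0
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            dy = int(round(alpha * (u - uc)))
            dx = int(round(alpha * (v - vc)))
            # Integer-pixel shift for brevity; subpixel interpolation is
            # typically used in practice.
            out += np.roll(light_field[u, v], (dy, dx), axis=(0, 1))
    return out / (U * V)

# Example: sweep alpha to build a focal stack from one captured light field.
# lf = np.random.rand(5, 5, 1024, 1280)            # placeholder data
# stack = [refocus(lf, a) for a in np.linspace(-2, 2, 9)]
```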
KEYWORDS: Photography, Optical imaging, CMOS cameras, 3D scanning, Reconstruction algorithms, Real time imaging, Physics, Materials science, Laser scanners, Imaging systems
Single-shot real-time ultra-high-speed imaging is of great significance in capturing transient phenomena. Existing techniques fall short of simultaneously delivering satisfactory imaging speed, sequence depth, and pixel count. To overcome these limitations, we have developed compressed optical-streaking ultra-high-speed photography (COSUP), which records a scene (x, y, t) by applying the operations of spatial encoding, temporal shearing, and spatiotemporal integration. The COSUP system possesses an imaging speed of 1.5 million frames per second (fps), a sequence depth of 500 frames, and a pixel count of 0.5 megapixels per frame. COSUP is demonstrated by imaging single laser pulses passing through transmissive targets and by tracking a fast-moving object. We envision COSUP being applied in widespread applications in biomedicine and materials science.
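The three operations named in the abstract (spatial encoding, temporal shearing, spatiotemporal integration) can be written as a simple forward model, sketched below under illustrative assumptions: a pseudo-random binary mask for the encoding and a one-pixel-per-frame shear. The compressed-sensing reconstruction that inverts this model is not shown.

```python
# Minimal sketch of a COSUP-style forward model: encode, shear, integrate.
import numpy as np

def cosup_forward(scene, mask, shear_px=1):
    """scene: (T, H, W) dynamic scene; mask: (H, W) binary code.

    Returns a single (H, W + shear_px*(T-1)) snapshot.
    """
    T, H, W = scene.shape
    snapshot = np.zeros((H, W + shear_px * (T - 1)))
    for t in range(T):
        encoded = scene[t] * mask                  # spatial encoding
        s = shear_px * t                           # temporal shearing
        snapshot[:, s:s + W] += encoded            # spatiotemporal integration
    return snapshot

# rng = np.random.default_rng(0)
# mask = (rng.random((256, 256)) > 0.5).astype(float)
# movie = rng.random((500, 256, 256))              # 500-frame sequence depth
# snapshot = cosup_forward(movie, mask)
```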
KEYWORDS: Holography, 3D acquisition, 3D displays, Cameras, Speckle, Wavefronts, Integral imaging, Image resolution, Computer generated holography, 3D image processing
It is well known that holographic displays can present 3D scenes with continuous viewpoints and are free of the accommodation-convergence conflict. So far, most research in this area has focused on the display end, leaving the acquisition end barely explored. For holographic content acquisition, one needs to capture the scene in 3D. Ways to do this include traditional optical holography and integral imaging; however, optical holography suffers from serious speckle, while integral imaging has a long way to go to increase its resolution. In this paper, we propose a technique based on a variation of the transport of intensity equation to calculate the "phase" information of a scene from its defocused intensity captured by a color camera under white-light illumination. With the defocused phase and intensity data at hand, we can calculate the in-focus wavefront of the scene and further encode it into a computer-generated hologram for subsequent holographic display. We demonstrate the proposed technique with simulation and experimental results. Compared with existing 3D acquisition techniques for holographic display, our method may provide a better viewing experience because the acquisition stage is free of speckle and the resolution is not limited by the microlens array.
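One step mentioned above is propagating the recovered defocused wavefront back to the in-focus plane before hologram encoding. The sketch below shows a generic angular-spectrum propagation that could serve that step; it assumes the defocused amplitude and phase are already available, and the wavelength, pixel pitch, and distance values are illustrative rather than taken from the paper.

```python
# Minimal angular-spectrum propagation sketch (negative z back-propagates).
import numpy as np

def angular_spectrum(field, wavelength, pitch, z):
    """Propagate a complex field sampled with pixel pitch 'pitch' by distance z."""
    H, W = field.shape
    fy = np.fft.fftfreq(H, d=pitch)
    fx = np.fft.fftfreq(W, d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))   # evanescent terms dropped
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * z))

# defocused_field = amplitude * np.exp(1j * phase)   # from the TIE-based step
# infocus_field = angular_spectrum(defocused_field, 532e-9, 3.45e-6, -5e-3)
```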
Phase contains important information about the diffraction or scattering properties of an object, and therefore phase imaging is vital to many applications, including biomedicine and metrology, to name just a few. However, due to the limited bandwidth of image sensors, it is not possible to directly detect the phase of an optical field. Many methods, including the transport of intensity equation (TIE), have been well demonstrated for quantitative, non-interferometric phase imaging. The TIE offers an experimentally simple technique for computing phase quantitatively from two or more defocused images. Usually, the defocused images are obtained experimentally by shifting the camera along the optical axis in small increments. Note that light field imaging can produce an image stack focused at different depths by digitally refocusing the captured light field of a scene. In this paper, we propose to combine light field microscopy and the TIE method for phase imaging, taking advantage of the digital-refocusing capability of light field microscopy. We demonstrate the proposed technique with simulation results. Compared with the traditional camera-shifting technique, light field imaging allows the defocused images to be captured without any mechanical instability and therefore offers an advantage in practical applications.
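The TIE inversion described above is commonly carried out with an FFT-based Poisson solver. The sketch below shows that standard approach under a uniform-intensity assumption, using a finite difference of two defocused images for the axial derivative; the function name, the regularization constant, and all parameter values are illustrative, not the paper's implementation.

```python
# Minimal FFT-based TIE inversion sketch (uniform-intensity approximation):
# solves I0 * Laplacian(phi) = -k * dI/dz in the Fourier domain.
import numpy as np

def tie_phase(I_minus, I_plus, dz, wavelength, pitch, I0=None, eps=1e-9):
    """Recover phase from two images defocused by -dz and +dz."""
    k = 2 * np.pi / wavelength
    dIdz = (I_plus - I_minus) / (2 * dz)           # axial derivative estimate
    if I0 is None:
        I0 = 0.5 * (I_plus + I_minus).mean()       # uniform-intensity approx.
    H, W = dIdz.shape
    fy = np.fft.fftfreq(H, d=pitch)
    fx = np.fft.fftfreq(W, d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    freq_sq = (2 * np.pi) ** 2 * (FX**2 + FY**2)
    # Inverse Laplacian in Fourier space; eps regularizes the DC term.
    phi_hat = np.fft.fft2(-k * dIdz / I0) / -(freq_sq + eps)
    return np.real(np.fft.ifft2(phi_hat))
```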