A STEM outreach tool for demonstrating the sensing and compensation of atmospheric turbulence

Casey J. Pellizzari, Douglas E. Thornton, Connor Vikupitz, Matthew Cooper, and Mark F. Spencer

Proc. SPIE 11143, Fifteenth Conference on Education and Training in Optics and Photonics: ETOP 2019, 111431H (2 July 2019); https://doi.org/10.1117/12.2523799
Presented at ETOP 2019, Quebec City, Quebec, Canada.
Abstract
The US Air Force (USAF) conducts research involving the sensing and compensation of atmospheric turbulence, which acts to blur images and make laser-beam propagation more challenging. As such, USAF scientists and engineers (S&Es) often face the challenging task of explaining this research to audiences without relevant technical expertise. These audiences range widely, from upper military leadership to K-12 students participating in science, technology, engineering, and mathematics (STEM) outreach activities. Previously, a team of USAF S&Es developed a table-top setup for the demonstration of digital-holography (DH) technology. This technology enables the measurement of the complex-optical field, which in turn enables a plethora of applications that involve imaging and wavefront sensing. In this paper, we extend the table-top setup to illustrate both the effects of atmospheric turbulence on the imaging and wavefront-sensing process and the digital-signal processing required to estimate and mitigate these effects. The enhanced demonstration provides a visual-learning aid to help explain the complicated concepts associated with imaging through atmospheric turbulence. Specifically, we show that we can introduce aberrations into the DH system and use digital-image correction to refocus the resultant blurry images. This paper discusses the overall system design, improvements, and lessons learned.

1. INTRODUCTION

The US Air Force (USAF) is heavily invested in optics and laser research to help maintain a technological edge in air and space. A major thrust of this research focuses on mitigating the effects of atmospheric turbulence on optical-beam propagation. Temperature variations in the atmosphere cause perturbations in the index of refraction, which alter the phase of light and cause it to spread nonuniformly and distort. These phase variations degrade the performance of imaging systems used for reconnaissance and surveillance and of directed-energy systems, two major components of the USAF's technological portfolio. Therefore, significant effort goes into developing methods to sense these so-called phase errors and correct them.

Not unlike other career fields, scientists and engineers (S&Es) leading research in atmospheric-turbulence mitigation must advocate for and educate others about their work. This includes explaining the importance of their research to organizational leadership to obtain the necessary resources. Another example is bolstering interest in the optics and laser career field through science, technology, engineering, and mathematics (STEM) outreach activities. These activities help ensure that a future talent pool exists from which to recruit new S&Es. Unfortunately, these audiences often lack the relevant technical expertise needed to fully grasp the science behind sensing and mitigating atmospheric turbulence. This particular science is multidisciplinary, spanning atmospheric physics, optics, electronics, control systems, and signal processing. Thus, explaining this research to non-technical audiences is challenging.

With this in mind, the SPIE Student Chapter at the Air Force Institute of Technology (AFIT) developed two novel table-top demonstration tools used to help explain the science behind atmospheric turbulence and optical-beam propagation. The first tool, the laser propagation demonstration (LPD), is designed to interactively display the optical effects associated with coherent-light propagation through atmospheric turbulence.1 The LPD design sends a collimated laser source through a phase screen and an amplitude mask. The resulting field is imaged onto a camera to produce a distorted image that changes as the phase screen rotates. The second demonstration tool, the digital holography demonstration (DHD), is an extension of the LPD designed to highlight a wider range of optics and photonics concepts.2 It uses spatial-heterodyne interferometry to demonstrate the wave nature of light and the principles of both coherent illumination and coherent detection. The system measures the amplitude and phase of light propagated through a transmissive mask.

In this work, we build upon the DHD design to include the ability to generate, sense, and digitally correct phase errors in the optical path. Specifically, we implement the model-based iterative reconstruction (MBIR) algorithm from Pellizzari et al.3 to estimate the phase errors and the object image. This allows us to demonstrate concepts related to atmospheric sensing and turbulence mitigation in the form of an interactive table-top tool. In addition to this extension, we add the capability to collect and display data in near-real time. This allows us to obstruct the beam propagation and introduce dynamic phase errors that change or degrade the holographic image, and to immediately show the results. The near-real-time processing also simplifies system alignment, an important feature for a portable optics table.

In the following sections, we provide an overview of the DHD system and discuss the additions of MBIR and near-real-time imaging. Additionally, we discuss lessons learned from recent demonstrations.

2. DHD DESIGN OVERVIEW

The DHD uses spatial-heterodyne detection to sense both the amplitude and phase of a signal. Figure 1 shows an annotated image of the system, and a full parts list can be found in Ref. 2. For our optical source, we use a single, eye-safe, visible laser diode with a 532 nm wavelength to illuminate an object mask and to act as a reference for coherent detection. After the source is expanded and collimated, we use a beam splitter and mirrors to split the light into signal and reference paths. We place a transmissive object mask, the USAF 1951 bar chart, in the signal path, then image the resulting field onto a camera using a set of two lenses. We also image the collimated reference signal onto the camera using a set of lenses identical to those in the signal path. Note that we use adjustable mirrors to induce tilt on the reference path, which spatially modulates the signal. This allows us to isolate the desired signal digitally in post processing.4 In both legs, we use neutral density (ND) filters to control the relative strength of the two paths. The resulting interference pattern, or hologram, incident on the camera is digitized and saved as a .tif file. Finally, we display and process the digital hologram using a graphical user interface (GUI) developed in Ref. 2 using MATLAB. The DHD GUI, shown in Fig. 2, displays three images: the hologram, the magnitude of the hologram spectrum, and either the resulting amplitude or phase image.
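To make the demodulation step concrete, the following MATLAB sketch shows how a spatial-heterodyne hologram can be processed in principle. This is not the GUI's exact code; the file name, lobe center (rowC, colC), and window size W are illustrative assumptions.

```matlab
% Minimal sketch of spatial-heterodyne demodulation (illustrative, not
% the exact GUI code). The reference-beam tilt shifts the signal away
% from the DC term in the Fourier plane, so we isolate and crop it.
holo = double(imread('hologram.tif'));   % recorded hologram (real-valued)

% Fourier transform the hologram; the spectrum contains a DC
% autocorrelation term plus two conjugate signal lobes offset by the tilt.
H = fftshift(fft2(holo));

% Crop a window around one signal lobe (center found manually or via a
% peak search; rowC, colC, and W here are assumptions).
rowC = 400; colC = 600; W = 128;
sig  = H(rowC-W/2 : rowC+W/2-1, colC-W/2 : colC+W/2-1);

% Inverse transform the cropped lobe to recover the complex-valued field.
field      = ifft2(ifftshift(sig));
ampImage   = abs(field);     % amplitude image
phaseImage = angle(field);   % wrapped phase image

imagesc(ampImage); axis image; colormap gray; title('Amplitude');
```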

Figure 1. Annotated image of the DHD system showing the optical elements and the paths for both the signal and the reference.


Figure 2. Screenshot of the main DHD GUI.


3. DHD UPGRADES

In this work, we added two major upgrades to the DHD system: near-real-time hologram processing and the ability to impart, sense, and digitally correct optical aberrations.

3.1 Addition of Near-Real-Time Processing

For the first major upgrade, we added the ability to capture holograms and demodulate the images at refresh rates of 5 to 10 Hz within the primary DHD GUI. To initiate near-real-time processing, we added the green play button on the top menu bar, shown in Fig. 2. We also enabled control of the camera gain and exposure time to maximize the signal-to-noise ratio in the hologram. These new features are located in the bottom-left portion of Fig. 2. Previously, the demonstrator (the person demonstrating the DHD) had to capture the hologram using the external camera software and load the recorded hologram frame into the GUI. This process took upwards of 30 seconds and created an unnecessary break in the demonstration. With near-real-time processing, this break is eliminated, making it easier to keep an observer's attention. Additionally, the new system provides near-instantaneous feedback, which allows us to demonstrate shifting interference fringes, signal drops, and, in the future, a time-varying optical aberration. This has made the DHD a more engaging tool for audiences.
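A minimal sketch of what such a capture-and-demodulate loop can look like in MATLAB (Image Acquisition Toolbox) is shown below. The camera adapter, the property names for gain and exposure, and the demodulate helper (the FFT/crop/IFFT step sketched in Sec. 2) are assumptions that depend on the specific hardware; this is not the GUI's actual implementation.

```matlab
% Minimal sketch of a near-real-time capture loop (illustrative; adapter
% name, video format, and source property names vary by camera).
vid = videoinput('gentl', 1, 'Mono8');
src = getselectedsource(vid);
src.ExposureTime = 5000;   % microseconds; property name is camera-specific
src.Gain = 0;              % raise to maximize SNR without saturating

hFig = figure; hAx = axes('Parent', hFig);
while ishandle(hFig)                   % loop until the window is closed
    holo  = double(getsnapshot(vid));  % grab one hologram frame
    field = demodulate(holo);          % hypothetical helper: FFT/crop/IFFT
    imagesc(hAx, abs(field)); axis(hAx, 'image'); drawnow;  % ~5-10 Hz
end
delete(vid);
```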

3.2 Addition of MBIR

For the second DHD upgrade, we include the ability to introduce, sense, and digitally correct optical aberrations, ϕ, using the MBIR algorithm described in Refs. 3 and 5. MBIR uses a Bayesian framework to jointly estimate the real-valued object reflectance, r, and any phase errors, ϕ, from coherent data. In most coherent imaging scenarios, we measure the field reflected off an object and model the measured data as y = Aϕg + w, where Aϕ is the sensor's measurement matrix with unknown phase errors, ϕ, and w is additive white complex Gaussian noise with variance σw².3,6 Here, g is the reflection coefficient, a zero-mean, circularly-symmetric complex normal random variable, with variance E[|gs|²|r] = rs, that produces a speckled image. The randomness of g models rough-surface scattering. We obtain the complex-valued measurements, y, by taking a Fourier transform of the real-valued hologram and isolating the signal of interest. For the DHD, the resulting measurement matrix, A, is simply a two-dimensional discrete Fourier transform (DFT) matrix, D. Note that the aperture function that appears in Ref. 3 is simply the identity matrix for the DHD. To form an image, we use the MBIR algorithm to jointly compute the maximum a posteriori (MAP) estimates of r and ϕ, given the data, y. For the prior models of r and ϕ, we use variants of Gaussian Markov random fields, which help enforce smoothness in the image and phase functions.
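Stated schematically, the joint MAP estimation described above takes the following form (a generic Bayesian statement only; the specific likelihood and prior models are detailed in Ref. 3):

(r̂, ϕ̂) = argmax over (r, ϕ) of p(r, ϕ | y) = argmin over (r, ϕ) of {−log p(y | r, ϕ) − log p(r) − log p(ϕ)},

where the first term measures agreement with the data, y, and the last two terms are the Gaussian Markov random field priors that enforce smoothness in r and ϕ.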

There are two important assumptions inherent in the MBIR algorithm that we must consider when applying it to the DHD. First, we are sensing the field that is transmitted through a transmissive object rather than reflected off a reflective object. Therefore, for the DHD setup, g represents the field passing through the object, with |gs|² = rs, and does not produce speckled images. However, we found that the MBIR algorithm still works well in this case. Additionally, we discovered that using an optical diffuser, placed just prior to the object mask, generates a g that is more consistent with the reflective scenario, i.e., it produces random phase variations resulting in a speckled image. Using the diffuser also allows us to demonstrate concepts related to rough-surface scattering and the signal processing required to mitigate the speckle variations. Future work may include modifying the MBIR model for the transmissive scenario.
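To illustrate why the diffuser matters, here is a small MATLAB sketch contrasting the two cases. It uses a simple uniform-random-phase model for the diffuser and an assumed circular aperture to band-limit the field (speckle appears when a random-phase field is spatially filtered); the test image and aperture size are arbitrary stand-ins, not part of the DHD code.

```matlab
% Illustrative contrast of transmissive (no diffuser) vs. diffuse fields.
r = im2double(imread('cameraman.tif'));  % stand-in transmittance/reflectance

% Transmissive mask, no diffuser: deterministic field, |g|^2 = r exactly.
g_trans = sqrt(r);

% With a diffuser (or rough-surface reflection): random phase at each
% point, so E[g] = 0 while |g|^2 = r, and the filtered image speckles.
g_rough = sqrt(r) .* exp(1i*2*pi*rand(size(r)));

% Band-limit each field through a centered circular aperture in the
% Fourier plane to simulate a finite-resolution measurement.
[N, M] = size(r);
[u, v] = meshgrid(-M/2:M/2-1, -N/2:N/2-1);
ap = double(u.^2 + v.^2 < (M/8)^2);          % assumed aperture radius
lp = @(g) ifft2(ifftshift(ap .* fftshift(fft2(g))));

subplot(1,2,1); imagesc(abs(lp(g_trans)).^2); axis image; title('No diffuser');
subplot(1,2,2); imagesc(abs(lp(g_rough)).^2); axis image; title('With diffuser');
colormap gray;
```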

The second assumption inherent in the basic MBIR algorithm is that ϕ is isoplanatic. This means that the point spread function (PSF) in the image domain is shift invariant. Therefore, in order to use MBIR with the DHD, we must induce isoplanatic aberrations. A typical process to generate arbitrary isoplanatic aberrations involves placing a phase screen in the Fourier plane of the image. Unfortunately, the current DHD design does not include a Fourier plane. Adding a phase screen at any point in the signal path will generate severely anisoplanatic aberrations, resulting in a shift-varying PSF in the image domain.

To overcome this limitation, we developed two methods to generate isoplanatic phase errors. For the first method, we adjust the position of the object mask until the image is out of focus. The resulting blur is a function of the distance between the mask and the image plane, a quantity that is identical for all points in the image. Thus, the blurring kernel is shift invariant and we can use MBIR. For the second method, we generate isoplanatic phase errors digitally through the GUI. Specifically, we produce ϕ with Kolmogorov statistics using techniques described in Ref. 7, then generate aberrated image data according to y′ = 𝒟(e^jϕ)y. Here, 𝒟(·) denotes an operator that produces a diagonal matrix from its vector argument.
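The following MATLAB sketch shows one way to carry out this second method, generating a Kolmogorov phase screen with the FFT-based filtering approach from Ref. 7 and applying it to the demodulated data. The grid size, aperture dimension, and variable names are illustrative assumptions (y is assumed to be the N × N array of demodulated measurements), and the exact GUI code may differ.

```matlab
% Minimal sketch: Kolmogorov phase screen via FFT filtering (Ref. 7),
% then y' = D(exp(j*phi)) * y applied to the demodulated data.
N         = 256;             % grid size in pixels (assumed)
D_ap      = 10e-3;           % grid side length [m] (assumed)
delta     = D_ap/N;          % grid spacing [m]
D_over_r0 = 5;               % turbulence strength, e.g., the GUI's D/r0
r0        = D_ap/D_over_r0;  % Fried parameter [m]

% Spatial-frequency grid [1/m]
df = 1/(N*delta);
fx = (-N/2 : N/2-1)*df;
[FX, FY] = meshgrid(fx);
f = sqrt(FX.^2 + FY.^2);

% Kolmogorov phase power spectral density (Ref. 7)
PSD_phi = 0.023 * r0^(-5/3) * f.^(-11/3);
PSD_phi(N/2+1, N/2+1) = 0;   % remove the infinite (unphysical) DC term

% Filter complex white Gaussian noise by the square root of the PSD
cn  = (randn(N) + 1i*randn(N)) .* sqrt(PSD_phi) * df;
phi = real(ifftshift(ifft2(ifftshift(cn)))) * N^2;   % phase screen [rad]

% Apply the screen as a diagonal operator: an elementwise product when
% the measurements y are stored as an N x N array.
y_aberrated = exp(1i*phi) .* y;
```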

The first blurring method described above provides a hands-on way to demonstrate the effects of optical aberrations that observers can witness directly using the near-real-time processing. However, this method is limited to the quadratic phase errors that result from the object being out of focus. Conversely, the second method provides an opportunity to implement and discuss more complex aberrations that typically result from atmospheric turbulence. However, observers are not able to see and interact with the aberrating process. In future work, we plan to modify the DHD design to include a Fourier plane in the signal path. This will allow us to introduce time-varying isoplanatic phase errors using a rotating phase wheel, similar to that used in Ref. 4.

We trigger the MBIR algorithm by selecting the multi-color phase-screen button located on the top tool bar in the main DHD GUI. When selected, the software passes the most recent hologram to an MBIR script. Upon receipt of a hologram, the MBIR script displays a second GUI window, shown in Fig. 3, for the user to set reconstruction parameters. A full list and description of the MBIR parameters can be found in Ref. 3. Here, the phase and image regularization sliders control the amount of smoothness to enforce in the phase and image functions, respectively. The Signal-Center parameters tell the algorithm where to locate the signal in the Fourier plane, Image Size sets the size of the final reconstructed image, measured in pixels, and Bootstrapping Iterations controls the number of iterations used for estimating the phase function prior to the final image reconstruction. The Screen Bin parameter allows users to reconstruct the phase function at lower resolutions, a feature useful in low-SNR scenarios. Finally, D/r0 allows users to digitally add aberrations, as discussed above.

Figure 3. Screenshot of the MBIR GUI.


We designed the MBIR GUI so that it initializes with a set of default parameters that we found to work well for the DHD. The user may change these default values or simply click the run button, located in the bottom right of Fig. 3. While the algorithm is running, a new figure window, shown in Fig. 4, displays the original blurry image and the current estimates of r and ϕ.

Figure 4. Figure window showing the MBIR reconstruction process. Here, the original corrupted image is shown on the left, the refined image is in the middle, and the estimated phase errors are on the right.


4. DHD LESSONS LEARNED

So far, we have demonstrated the DHD at several STEM outreach events and learned valuable lessons that improve the process. First, unsurprisingly, the system tends to require minor realignment after transportation. Thus, the user should plan for at least 30 minutes of setup time prior to an event. In most cases, this setup simply requires adjusting the tip and tilt of the laser and the translation of the initial lens to ensure light makes it through the filtering pinhole. If this alignment is not done prior to the demonstration, it may result in a poor signal-to-noise ratio, which can detract from the experience.

During the demonstration, we found it helpful to have a poster, shown in Fig. 5, to support the presentation. The poster provides a guide that the demonstrator can use to structure the presentation. It also allows us to present the mathematical concepts related to wave interference and the MBIR algorithm, something that would be difficult to explain otherwise.

Figure 5. Poster used in conjunction with the DHD demo.


Finally, we have found that observers are most intrigued by two particular parts of the demonstration. The first is when we reconstruct an image under weak-signal conditions. Observers are surprised because they can no longer see the image in the hologram; however, after demodulation, the image reappears. The second is the difference between a corrupted image and the MBIR reconstruction. Again, observers are surprised by the improvement in image quality. From our experience, these two features work well for gaining and maintaining observers' attention. In particular, it may be useful to start by highlighting these results, then explaining the process used to obtain them.

5. CONCLUSION

This paper provides an overview of the DHD system, discusses recent improvements including near-real-time processing and digital image correction, and provides lessons learned from our recent demonstrations. We find the system to be a useful tool for conveying the complex ideas associated with optics, lasers, wave interference, atmospheric-turbulence sensing and mitigation, and signal processing. In addition to sharing specific details about the DHD, we hope to inspire other S&Es to take a similar approach for explaining their research and generating interest in their career fields through interactive STEM outreach demonstrations.

REFERENCES

[1] Spencer, M. F., Steinbock, M. J., Hyde, M. W., and Marciniak, M. A., "The laser propagation demonstration: a STEM-based outreach project," Proc. SPIE 9188, 91880D (2014).

[2] Thornton, D. E., Spencer, M. F., Plimmer, B. T., and Mao, D., "The digital holography demonstration: a table-top setup for STEM-based outreach events," Proc. SPIE 10741, 107410J (2018).

[3] Pellizzari, C. J., Spencer, M. F., and Bouman, C. A., "Phase-error estimation and image reconstruction from digital-holography data using a Bayesian framework," JOSA A 34(9), 1659-1669 (2017). https://doi.org/10.1364/JOSAA.34.001659

[4] Spencer, M. F., Raynor, R. A., Banet, M. T., and Marker, D. K., "Deep-turbulence wavefront sensing using digital-holographic detection in the off-axis image plane recording geometry," Optical Engineering 56(3), 031213 (2017). https://doi.org/10.1117/1.OE.56.3.031213

[5] Pellizzari, C., Banet, M. T., Spencer, M. F., and Bouman, C. A., "Demonstration of single-shot digital holography using a Bayesian framework," JOSA A (2017).

[6] Pellizzari, C. J., Spencer, M. F., and Bouman, C. A., "Imaging through distributed-volume aberrations using single-shot digital holography," JOSA A 36(2), A20 (2019). https://doi.org/10.1364/JOSAA.36.000A20

[7] Schmidt, J. D., Numerical Simulation of Optical Wave Propagation, with Examples in MATLAB, SPIE Press (2010).
© 2019 Society of Photo-Optical Instrumentation Engineers (SPIE).
Keywords: Holograms, Atmospheric turbulence, Atmospheric optics, Digital holography, Laser optics, Atmospheric propagation, Image processing
