The GOES-17 Advanced Baseline Imager (ABI) has an anomaly with its on-board cooling system that prevents it from maintaining its Focal Plane Modules (FPMs) at their cold, optimal temperatures. Because of this, during certain times of the year the FPMs and their detectors warm and cool throughout the day. Changing a detector's temperature changes its response to incoming radiance, which leads to calibration errors over time and degrades the imagery. Numerous mitigation strategies have been implemented to reduce the solar insolation on the instrument and to mitigate image degradation, including semi-annual yaw flips and changing the integration time of the detectors twice daily. These and other mitigations all work with the baseline calibration algorithms currently in place on the GOES-R Ground System. To reduce the image degradation even further, the ABI vendor designed a new calibration scheme that predicts key parameters forward in time to account for the drifting FPM temperatures. These parameters, the linear gain term and the dark current scene, are nominally updated on orbit every 5 minutes and every 30 seconds, respectively. However, even over these relatively short intervals the detectors can change temperature, rendering the parameters inaccurate. By projecting these parameters forward in time, the radiometric bias is reduced and image quality improves. This Predictive Calibration modification was deployed to operations on July 25, 2019, following several months of extensive testing and optimization by the GOES-R science teams. During this time several parameters and thresholds were tuned to ensure Predictive Calibration was turning on and off at the optimal times. Since it went into operations, users have seen a noticeable improvement in the imagery and its calibration. This paper will discuss the fundamental assumptions behind the baseline equations and highlight the changes introduced by Predictive Calibration.
Results will show the improvements to the calibration of the operational L1b products and the reduction in image degradation.
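As a rough illustration of the projection idea (the actual vendor algorithm is more sophisticated, and the function and variable names here are hypothetical), a first-order extrapolation of the gain term between calibration looks might look like:

```python
import numpy as np

def predict_gain(t_samples, gain_samples, t_now):
    """Linearly extrapolate the per-detector linear gain term to the
    current frame time.  A first-order fit over the two most recent
    calibration looks; a simplification of the on-orbit algorithm."""
    slope = (gain_samples[-1] - gain_samples[-2]) / (t_samples[-1] - t_samples[-2])
    return gain_samples[-1] + slope * (t_now - t_samples[-1])

# Hypothetical example: gain drifting as the focal plane warms
t = np.array([0.0, 300.0])          # calibration look times [s]
g = np.array([1.000, 1.002])        # measured linear gains
g_pred = predict_gain(t, g, 450.0)  # halfway to the next look
```

The sketch assumes a linear drift between looks; the value of the approach is that the projected gain tracks the warming focal plane instead of holding the last measured value constant.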
Two flight models of the Advanced Baseline Imager (ABI) are in orbit on the GOES-16 and GOES-17 geostationary satellites, with two more planned for launch on GOES-T (2021) and GOES-U (2024). The ABI is the primary Earth-viewing weather imaging instrument on the GOES-R Series, producing Level 1b (L1b) radiances and Cloud and Moisture Imagery (CMI) data products. The ABI L1b product is the source for all the ABI Level 2+ (L2+) products, including CMI, which makes the maturity process for these two products important. CMI is the only key performance parameter (KPP) of the GOES-R Series mission and thus takes precedence over the other ABI L2+ products; as the KPP, CMI follows the same maturity schedule as the ABI L1b product. For the ABI L1b and CMI data products to be declared operational, they must pass through a series of calibration and validation tests and analyses, with the peer-reviewed results showing that the instruments and products have achieved each level of maturity consistent with mission success. This paper describes the assessment process, the definitions of the product validation maturity levels, and an overview of the product performance for each instrument at each validation level. Additionally, this paper will describe planned programmatic changes aimed at streamlining the maturity process for the upcoming GOES-T and GOES-U satellites.
The Advanced Baseline Imager (ABI) on the Geostationary Operational Environmental Satellite (GOES)-R Series is a great improvement over the legacy GOES imager: for example, it offers more spectral bands at improved spatial resolution and more frequent imagery. The vast majority of the images generated by the ABIs are free of visual defects, well calibrated, and produced in a timely fashion. Yet there are rare times when visual artifacts, or anomalies, occur. Our study highlights and explains a number of these artifacts, some of which are traditional imagery defects for imagers, such as striping and stray light, along with colorfully named artifacts such as “caterpillar tracks” and “shark fins.” In addition, multiple resources are presented for more information about image quality and near-real-time image monitoring.
The GOES-16 Advanced Baseline Imager (ABI) is the first of four of NOAA's new generation of Earth imagers. The ABI uses large focal plane arrays (hundreds to thousands of detectors per channel), a significant increase over the heritage GOES O-P imagers (2 to 8 detectors per channel). With the increased number of detectors comes an increased risk of image striping in the L1b and L2+ products. To support post-launch striping risk mitigation strategies, customized ABI special scans (ABI North-South Scans, NSS) were developed and implemented in the post-launch checkout validation plan. ABI NSS collections navigate each detector of a given channel over the same Earth target, enabling detector-level performance characterization. These scans were used to collect data over several Earth targets to understand detector-to-detector uniformity as a function of a broad set of targets. This effort focuses on the analysis of a limited set of NSS data (ABI Ch. 1) to demonstrate the fundamental methodology and the ability to conduct post-launch detector-level performance characterization and advanced relative calibrations using such data. These collections and results provide critical insight for the development of the striping risk mitigation strategies needed in the GOES-R era to ensure L1b data quality for the GOES user community.
A primary objective of the GOES-16 post-launch airborne science field campaign was to provide an independent validation of the SI traceability of the Advanced Baseline Imager (ABI) spectral radiance observations for all detectors post-launch. The GOES-16 field campaign conducted sixteen validation missions (March to May 2017), three of which served as the primary ABI validation missions and are the focus of this work. These validation missions were conducted over ideal Earth targets with an integrated set of well-characterized hyperspectral reference sensors aboard a high-altitude NASA ER-2 aircraft. These missions required ABI special collections (to scan all detectors over the Earth targets), unique aircraft maneuvers, coordinated ground validation teams, and a diplomatic flight clearance with the Mexican government. This effort presents a detector-level deep-dive analysis of data from the targeted sites using novel geospatial database and image abstraction techniques to select and process matching pixels between ABI and the reference instruments. The ABI reflective solar bands (ABI bands 1-3 and 5-6) were found to have biases within 5% in radiance for all bands except band 2, and the ABI thermal emissive bands were found to have biases within 1 K for all bands. Additional intercomparison results using targeted ABI special collections with the low-Earth-orbit reference sensor S-NPP/VIIRS will also be discussed. The reference data collected from the campaign have demonstrated that the ABI SI traceability has been validated post-launch and have established a new performance benchmark for NOAA's next-generation geostationary Earth observing instrument products.
The first satellite of the Geostationary Operational Environmental Satellite-R series (GOES-R), the next generation of NOAA geostationary environmental satellites, was launched November 19, 2016. This satellite, GOES-16, carries six instruments dedicated to the study of the Earth’s weather (ABI), lightning mapping (GLM), solar observations (EXIS and SUVI), and space weather monitoring (SEISS and MAG). Each of these six instruments is going through a series of specialized calibration plans to achieve its product quality requirements. In this review paper we describe the overall status of the on-orbit calibration program, the path forward to Full product validation status, and any changes that may occur in the cal/val plans for GOES-S, which is planned for launch in early 2018.
Jon Fulbright, Elizabeth Kline, David Pogorzala, Wayne MacKenzie, Ryan Williams, Kathryn Mozer, Dawn Carter, Randall Race, Jamese Sims, Matthew Seybold
The Geostationary Operational Environmental Satellite-R series (GOES-R) will be the next generation of NOAA geostationary environmental satellites. The first satellite in the series is planned for launch in November 2016. The satellite will carry six instruments dedicated to the study of the Earth’s weather, lightning mapping, solar observations, and space weather monitoring. Each of the six instruments requires specialized calibration plans to achieve its product quality requirements. In this talk we will describe the overall on-orbit calibration program and data product release schedule of the GOES-R program, as well as an overview of the strategies of the individual instrument science teams. The Advanced Baseline Imager (ABI) is the primary Earth-viewing weather imaging instrument on GOES-R. Compared to the present on-orbit GOES imagers, ABI will provide three times the spectral bands, four times the spatial resolution, and operate five times faster. The increased data demands and product requirements necessitate an aggressive and innovative calibration campaign. The Geostationary Lightning Mapper (GLM) will provide continuous rapid lightning detection information covering the Americas and nearby ocean regions. The frequency of lightning activity points to the intensification of storms and may improve tornado warning lead time. The calibration of GLM will involve intercomparisons with ground-based lightning detectors, an airborne field campaign, and a ground-based laser beacon campaign. GOES-R also carries four instruments dedicated to the study of the space environment. The Solar Ultraviolet Imager (SUVI) and the Extreme Ultraviolet and X-Ray Irradiance Sensors (EXIS) will study solar activity that may affect power grids, communication, and spaceflight. The Space Environment In-Situ Suite (SEISS) and the Magnetometer (MAG) study the in-situ space weather environment.
These instruments follow a calibration and validation (cal/val) program that relies on intercomparisons with other space-based sensors and utilizes special spacecraft maneuvers. Given the importance of cal/val to the success of GOES-R, the mission is committed to a long-term effort. This commitment enhances our knowledge of the long-term data quality and builds user confidence. The plan is a collaborative effort among the National Oceanic and Atmospheric Administration (NOAA), the National Institute of Standards and Technology (NIST), and the National Aeronautics and Space Administration (NASA). It is being developed based on the experience and lessons learned from the heritage GOES and Polar-orbiting Operational Environmental Satellite (POES) systems, as well as other programs. The methodologies described in the plan encompass both traditional approaches and the current state-of-the-art in cal/val.
KEYWORDS: Data modeling, Computer simulations, Image resolution, 3D modeling, Remote sensing, Visualization, Atmospheric modeling, Digital imaging, Databases, Scene simulation
Simulated imagery has been and will continue to be a great resource to the remote sensing community. It not only fills in the gaps when real imagery is not available, but also allows the user to know and control every aspect of the scene. Over the last 20 years we have seen its value in algorithm development, systems-level design trade studies, and phenomenology investigation. The realism of these data is often linked to their radiometric accuracy. The Rochester Institute of Technology's Digital Imaging and Remote Sensing (DIRS) Laboratory has for years done extensive work on making simulations more realistic while developing our in-house image generator, DIRSIG. In the past we have invested hundreds of man-hours to painstakingly build large-scale scenes of real locations with manual methods. Recently, new procedural tools and open-source geometry repositories have allowed the creation of similar scenes with improved scene clutter in significantly less time. It is now possible to assemble and build large, city-scale scene geometries with a more automated workflow over the course of a few hours. Even with these advances, an observer viewing these high-resolution, complex, spectrally and spatially textured simulated images is still visually aware that they are nothing but simulations, albeit radiometrically and spectrally accurate ones. This paper will investigate this concern by looking at the utility, evolution, and future of image simulations.
The current extent of publicly available space-based imagery and data products is unprecedented. Data from research missions and operational environmental programs provide a wealth of information to global users, and in many cases the data are accessible in near real-time. The availability of such data provides a unique opportunity to investigate how information can be cascaded through multiple spatial, spectral, radiometric, and temporal scales. A hierarchical image classification approach is developed using multispectral data sources to rapidly produce large-area land-use identification and change detection products. The approach derives training pixels from a coarser-resolution classification product to autonomously develop a classification map at improved resolution. The methodology also accommodates parallel processing to facilitate analysis of large amounts of data.

Previous work successfully demonstrated this approach using a global MODIS 500 m land-use product to construct a 30 m Landsat-based classification map. This effort extends the previous approach to high-resolution U.S. commercial satellite imagery. An initial validation study is performed to document the performance of the algorithm and identify limitations in the process. Results indicate this approach is scalable and broadly applicable to target and anomaly detection. In addition, discussion is focused on how information is preserved throughout the processing chain, as well as situations where the data integrity could break down. This work is part of a larger effort to deduce practical, innovative, and alternative ways to leverage and exploit the extensive low-resolution global data archives to address relevant civil, environmental, and defense objectives.
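The coarse-to-fine training idea above can be sketched in a few lines. A nearest-centroid rule stands in for the actual classifier, and the array shapes and function name are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def coarse_to_fine_classify(fine_img, coarse_labels, scale):
    """Derive training pixels from a coarse-resolution label map and
    classify a finer-resolution image with a nearest-centroid rule.
    (A simplified stand-in for the hierarchical classifier.)"""
    # Upsample coarse labels onto the fine grid by block replication
    fine_labels = np.kron(coarse_labels, np.ones((scale, scale), dtype=int))
    classes = np.unique(fine_labels)
    h, w, bands = fine_img.shape
    pixels = fine_img.reshape(-1, bands)
    # Per-class mean spectrum derived from the upsampled training labels
    centroids = np.stack([pixels[fine_labels.ravel() == c].mean(axis=0)
                          for c in classes])
    # Assign each fine pixel to the nearest class centroid
    d = np.linalg.norm(pixels[:, None, :] - centroids[None, :, :], axis=2)
    return classes[np.argmin(d, axis=1)].reshape(h, w)

# Hypothetical 2x2 coarse map refined to a 4x4 classification
coarse = np.array([[0, 1], [1, 0]])
fine = np.zeros((4, 4, 3))
fine[:2, :2] = [0.1, 0.1, 0.1]; fine[2:, 2:] = [0.1, 0.1, 0.1]
fine[:2, 2:] = [0.9, 0.9, 0.9]; fine[2:, :2] = [0.9, 0.9, 0.9]
result = coarse_to_fine_classify(fine, coarse, 2)
```

The design choice of interest is that no fine-resolution ground truth is needed: the coarse product supplies the training labels, which is what makes the approach scalable to large areas.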
Simulation of moving vehicle tracking has been demonstrated using hyperspectral and polarimetric imagery (HSI/PI). Synthetic HSI/PI image cubes of an urban scene containing moving vehicle content were generated using the Rochester Institute of Technology's Digital Imaging and Remote Sensing Image Generation (DIRSIG) Megascene #1 model. Video streams of sensor-reaching radiance frames collected from a virtual orbiting aerial platform's imaging sensor were used to test adaptive sensor designs in a target tracking application. A hybrid division-of-focal-plane imaging sensor boasting an array of 2×2 superpixels containing both micromirrors and micropolarizers was designed for co-registered HSI/PI aerial remote sensing. Pixel-sized aluminum wire-grid linear polarizers were designed and simulated to measure transmittance, extinction ratio, and diattenuation responses in the presence of an electric field. Wire-grid spacings of 500 nm and 80 nm were designed for lithographic deposition and etching processes. Both micromirror-relayed panchromatic imagery and micropolarizer-collected PI were orthorectified and then processed by Numerica Corporation's feature-aided target tracker to perform multimodal, adaptive, performance-driven sensing of moving vehicle targets. Hyperspectral responses of selected target pixels were measured using micromirror-commanded slits to bolster track performance. Unified end-to-end track performance case studies were completed using both panchromatic and degree-of-linear-polarization sensor modes.
A novel multi-object spectrometer (MOS) is being explored for use as an adaptive performance-driven sensor that tracks moving targets. Developed originally for astronomical applications, the instrument utilizes an array of micromirrors to reflect light to a panchromatic imaging array. When an object of interest is detected, the individual micromirrors imaging the object are tilted to reflect the light to a spectrometer to collect a full spectrum. This paper will present example sensor performance from empirical data collected in laboratory experiments, as well as our approach to designing optical and radiometric models of the MOS channels and the micromirror array. Simulation of moving vehicles in a high-fidelity hyperspectral scene is used to generate a dynamic video input for the adaptive sensor. Performance-driven algorithms for feature-aided target tracking and modality selection exploit multiple electromagnetic observables to track moving vehicle targets.
Remote sensing often utilizes models to predict the ability of an optical system to collect data optimally prior to costly sensor testing and manufacturing. Significant effort is required to create an accurate model, and therefore most designs focus on either radiometric or spatial precision rather than a combination of the two. We present a case study in which a model has been created to satisfy both radiometric and spatial fidelity requirements. Terrain, vegetation, targets, and other components of the model were designed with high precision. Hyperspectral imagery was generated using the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model based on numerous spectral and spatial ground-truth measurements. These included spectral reflectance of targets and the environment, atmospheric variables, and the geometry and distribution of objects within the scene. Imagery was collected by airborne systems for accuracy assessment. The generated data have been validated by qualitative evaluation of the spectral characteristics and by comparison of results from a principal component (PC) transform and the RX anomaly detection algorithm. Validation results indicate that the model achieved the desired level of accuracy.
As the interest in polarization-sensitive imaging systems increases, the modeling tools used to perform instrument trade studies and to generate data for algorithm testing must be adapted to correctly predict polarization signatures. The incorporation of polarization into the image chain simulated by these tools must address the modeling of the natural illuminants (e.g. Sun, Moon, sky), background sources (e.g. adjacent objects), the polarized Bidirectional Reflectance Distribution Function (pBRDF) of surfaces, atmospheric propagation (extinction, scattering, and self-emission), and sensor effects (e.g. optics, filters). Although each of these links in the image chain may utilize unique modeling approaches, they must be integrated under a framework that addresses important aspects such as a unified coordinate space and a common polarization state convention. This paper presents a modeling framework for the prediction of polarized signatures within a natural scene. The proposed image chain utilizes community-developed modeling tools, including an experimental version of MODTRAN and BRDF models that have been either derived or extended for polarization (e.g. Beard-Maxwell, Priest-Germer, etc.). This description also includes the theory utilized in the modeling tools incorporated into the image chain model to integrate these links into a full signature prediction capability. Analytical and experimental lab studies are presented to demonstrate the correct implementation and integration of the described image chain framework within the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model.
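As a minimal illustration of one link in such a chain (not the DIRSIG implementation itself), polarized light is conventionally represented by a Stokes vector and each element of the chain by a Mueller matrix. The standard Mueller matrix of an ideal linear polarizer applied to unpolarized light looks like:

```python
import numpy as np

def linear_polarizer_mueller(theta):
    """Mueller matrix of an ideal linear polarizer at angle theta (radians)."""
    c, s = np.cos(2 * theta), np.sin(2 * theta)
    return 0.5 * np.array([[1,     c,     s,  0],
                           [c, c * c, c * s,  0],
                           [s, c * s, s * s,  0],
                           [0,     0,     0,  0]])

# Unpolarized light as a Stokes vector [I, Q, U, V]
s_in = np.array([1.0, 0.0, 0.0, 0.0])
# Propagate through a horizontal polarizer (theta = 0):
# half the intensity is transmitted, fully polarized along +Q
s_out = linear_polarizer_mueller(0.0) @ s_in
```

A full image chain multiplies such matrices along the path (surface pBRDF, atmosphere, sensor optics), which is why a common polarization state convention across all links is essential.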
Comparisons have been made showing that modeled multi- and hyperspectral imagery can approach the complexity of real data, and the use of modeled data for algorithm testing and sensor modeling is well established. With growing interest in the acquisition and exploitation of polarimetric imagery, there is a need to perform similar comparisons for this imaging modality.

This paper will describe the efforts to reproduce polarimetric imagery acquired of a real-world scene in a synthetic image generation environment. Real data were collected with the Wildfire Airborne Sensor Program-Lite (WASP-Lite) imaging system, which uses three separate cameras to acquire three polarization orientations simultaneously. Modeled data were created using the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model. This model utilizes existing tools such as polarized bidirectional reflectance distribution functions (pBRDF), polarized atmospheric models, and polarization-sensitive sensor models to recreate polarized imagery. Results will show comparisons between the real and synthetic imagery, highlighting successes in the model as well as areas where improved fidelity is required.
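Assuming the three cameras sample the conventional 0°, 45°, and 90° polarizer orientations (the abstract does not state the actual angles), the linear Stokes components and the degree of linear polarization can be recovered per pixel as:

```python
import numpy as np

def stokes_from_three(i0, i45, i90):
    """Recover linear Stokes components from intensities measured
    behind linear polarizers at 0, 45, and 90 degrees."""
    s0 = i0 + i90        # total intensity
    s1 = i0 - i90        # horizontal vs. vertical preference
    s2 = 2.0 * i45 - s0  # +45 vs. -45 degree preference
    dolp = np.sqrt(s1 ** 2 + s2 ** 2) / s0
    return s0, s1, s2, dolp

# Hypothetical pixel: fully horizontally polarized light of unit intensity
# (Malus's law gives i45 = 0.5 for this case)
s0, s1, s2, dolp = stokes_from_three(1.0, 0.5, 0.0)
```

This is the standard reduction used for division-of-amplitude polarimeters; comparing real and synthetic imagery in Stokes or DoLP space, rather than raw camera counts, removes the dependence on the individual camera orientations.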
Identification of constituent gases in effluent plumes is performed using linear least-squares regression techniques. Airborne thermal hyperspectral imagery is used for this study. Synthetic imagery is employed as the test case for algorithm development; the synthetic images are generated by the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model. The use of synthetic data provides a direct measure of the success of the algorithm through comparison with truth-map outputs. In the image test cases, plumes emanating from factory stacks have already been identified using a separate detection algorithm, and the gas identification algorithm developed in this work is performed only on pixels determined to contain the plume. Constrained stepwise linear regression is used in this study. Results indicate that the ability of the algorithm to correctly identify plume gases is directly related to the concentration of the gas. Previous concerns that the algorithm is hindered by spectral overlap were eliminated through the use of constraints on the regression.
Identification of constituent gases in effluent plumes is performed using linear least-squares regression techniques. Overhead thermal hyperspectral imagery is used for this study. Synthetic imagery is employed as the test case for algorithm development; the synthetic images are generated by the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model. The use of synthetic data provides a direct measure of the success of the algorithm through comparison with truth-map outputs. In the image test cases, plumes emanating from factory stacks have already been identified using a separate detection algorithm, and the gas identification algorithm developed in this work is then used only on pixels determined to contain the plume. Stepwise linear regression is considered in this study; it is attractive for this application because only those gases truly in the plume should be present in the final model. Preliminary results show that stepwise regression is successful at correctly identifying the gases present in a plume. Analysis of the results indicates that the spectral overlap of absorption features in different gas species leads to false identifications.
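A minimal sketch of the constrained stepwise idea described above, assuming a forward-selection rule that rejects any candidate whose least-squares fit yields a negative (unphysical) gas abundance; the library, tolerance, and function names are illustrative, not the papers' implementation:

```python
import numpy as np

def stepwise_nonneg(signatures, plume, n_steps=3, tol=1e-8):
    """Forward stepwise selection of gas absorption spectra from a library,
    rejecting steps that produce negative abundances.  A simplified
    stand-in for constrained stepwise linear regression."""
    selected = []
    residual = plume.copy()
    for _ in range(n_steps):
        best = None
        for j in range(signatures.shape[1]):
            if j in selected:
                continue
            A = signatures[:, selected + [j]]
            coef = np.linalg.lstsq(A, plume, rcond=None)[0]
            if np.any(coef < -tol):      # enforce nonnegative gas amounts
                continue
            err = np.linalg.norm(plume - A @ coef)
            if best is None or err < best[1]:
                best = (j, err, coef)
        if best is None or best[1] >= np.linalg.norm(residual) - tol:
            break                        # no candidate improves the fit
        selected.append(best[0])
        residual = plume - signatures[:, selected] @ best[2]
    return selected

# Hypothetical spectral library: three candidate gas signatures;
# the observed plume is 2x gas 0 plus 1x gas 2, with gas 1 absent
lib = np.array([[1., 0., 0.],
                [0., 1., 0.],
                [1., 0., 1.],
                [0., 0., 1.]])
obs = 2.0 * lib[:, 0] + 1.0 * lib[:, 2]
gases = stepwise_nonneg(lib, obs)
```

The nonnegativity check is what suppresses the false identifications from spectral overlap: an overlapping but absent gas can only improve the fit by taking a negative abundance, and such steps are rejected.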