Spitzer Warm Mission operations have remained robust and exceptionally efficient since the cryogenic mission ended in
mid-2009. The distance to the observatory now exceeds 1 AU, making telecommunications increasingly difficult; however,
analysis has shown that two-way communication could be maintained through at least 2017 with minimal loss in
observing efficiency. The science program continues to emphasize the characterization of exoplanets, time domain
studies, and deep surveys, all of which can impose interesting scheduling constraints. Recent changes have significantly
improved on-board data compression, which both enables certain high volume observations and reduces Spitzer's
demand for competitive Deep Space Network resources.
Following the successful dynamic planning and implementation of IRAC Warm Instrument Characterization activities,
transition to Spitzer Warm Mission operations has gone smoothly. Operations team procedures and processes required
minimal adaptation, and the overall composition of the Mission Operations System retained the same functionality it had
during the Cryogenic Mission. While the warm mission scheduling has been simplified because all observations are
now being made with a single instrument, several other differences have increased the complexity. The bulk of the
observations executed to date have been from ten large Exploration Science programs that, combined, have more
complex constraints, more observing requests, and more exoplanet observations with durations of up to 145 hours.
Communication with the observatory is also becoming more challenging: Spitzer's DSN antenna allocations have
been reduced from two tracking passes per day to a single pass, impacting both uplink and downlink activities. While
IRAC now operates with only two channels, its data collection rate is roughly 60% of the four-channel rate, leaving a
somewhat higher average volume collected between the less frequent passes. Also, the maximum downlink data rate is
decreasing as the distance to Spitzer increases, requiring longer passes. Nevertheless, with well over 90% of the time
spent on science observations, efficiency has equaled or exceeded that achieved during the cryogenic mission.
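The interaction between the reduced data rate and the less frequent passes can be illustrated with a short back-of-the-envelope calculation. The values below are normalized for illustration only (the 60% rate and one-versus-two passes per day come from the abstract; the absolute units are invented):

```python
# Illustrative arithmetic, not actual Spitzer telemetry figures.
four_channel_rate = 1.0                      # cryo-mission data rate, normalized
two_channel_rate = 0.6 * four_channel_rate   # warm mission: ~60% of four-channel rate

cryo_pass_interval = 12.0   # hours between passes (two tracking passes per day)
warm_pass_interval = 24.0   # hours between passes (one tracking pass per day)

# Volume accumulated on board between consecutive downlink passes:
cryo_volume_per_pass = four_channel_rate * cryo_pass_interval   # 12.0 units
warm_volume_per_pass = two_channel_rate * warm_pass_interval    # 14.4 units

ratio = warm_volume_per_pass / cryo_volume_per_pass
print(f"Volume per pass (warm/cryo): {ratio:.2f}")  # 1.20
```

Even though the instantaneous data rate dropped by 40%, halving the pass cadence leaves about 20% more data to downlink at each pass, consistent with the "somewhat higher average volume" noted above.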
KEYWORDS: Large Synoptic Survey Telescope, Data modeling, Astronomy, Calibration, Observatories, Cameras, Data processing, Telescopes, Image quality, Point spread functions
LSST will have a Science Data Quality Assessment (SDQA) subsystem to assess the data products produced during the
course of its 10-year survey. The LSST will produce unprecedented volumes of astronomical data as
it surveys the accessible sky every few nights. The SDQA subsystem will enable comparisons of the science data with
expectations from prior experience and models, and with established requirements for the survey. While analogous
systems have been built for previous large astronomical surveys, SDQA for LSST must meet a unique combination of
challenges. Chief among them will be the extraordinary data rate and volume, which restricts the bulk of the quality
computations to the automated processing stages, as revisiting the pixels for a post-facto evaluation is prohibitively
expensive. The identification of appropriate scientific metrics is driven by the breadth of the expected science, the scope
of the time-domain survey, the need to tap the widest possible pool of scientific expertise, and the historical tendency of
new quality metrics to be crafted and refined as experience grows. Prior experience suggests that contemplative, off-line
quality analyses are essential to distilling new automated quality metrics, so the SDQA architecture must support
integrability with a variety of custom and community-based tools, and be flexible to embrace evolving QA demands.
Finally, the time-domain nature of LSST means every exposure may be useful for some scientific purpose, so the model
of quality thresholds must be sufficiently rich to reflect the quality demands of diverse science aims.
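One way to picture a threshold model rich enough for diverse science aims is a per-aim set of limits, so that the same exposure can fail one program's cuts while remaining usable for another. The sketch below is purely hypothetical; the program names, metric names, and limits are invented for illustration and are not LSST's actual SDQA schema:

```python
# Hypothetical per-science-aim quality thresholds (names and limits invented),
# illustrating why a single pass/fail flag is too coarse for a time-domain survey.
THRESHOLDS = {
    "weak_lensing":     {"max_psf_fwhm_arcsec": 0.9, "max_sky_mag_excess": 0.5},
    "transient_alerts": {"max_psf_fwhm_arcsec": 1.5, "max_sky_mag_excess": 2.0},
}

def usable_for(metrics, aim):
    """Return True if an exposure's measured metrics meet every limit for `aim`."""
    limits = THRESHOLDS[aim]
    # Each limit key "max_<metric>" is compared against metrics["<metric>"].
    return all(metrics[key.removeprefix("max_")] <= limit
               for key, limit in limits.items())

exposure = {"psf_fwhm_arcsec": 1.2, "sky_mag_excess": 1.0}
print(usable_for(exposure, "weak_lensing"))      # False: PSF too broad for lensing
print(usable_for(exposure, "transient_alerts"))  # True: adequate for alert generation
```

Under such a model, an exposure rejected for precision shape measurement still feeds the transient pipeline, reflecting the point that every exposure may serve some scientific purpose.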
KEYWORDS: Databases, Data archive systems, Data processing, Space operations, Data centers, Astronomy, Calibration, Space telescopes, Infrared astronomy, Image processing
Data Quality Analysis (DQA) for astronomical infrared maps and spectra acquired by NASA's Spitzer Space Telescope is one of the important functions performed in routine science operations at the Spitzer Science Center of the California Institute of Technology. A DQA software system has been implemented to display, analyze and grade Spitzer science data. This supports the project requirement that the science data be verified after calibration and before archiving and subsequent release to the astronomical community. The software has an interface for browsing the mission data and for visualizing images and spectra. It accesses supporting data in the operations database and updates the database with DQA grading information. The system has worked very well since the beginning of the Spitzer observatory's routine phase of operations, and can be regarded as a model for DQA operations in future space science missions.
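The grade-and-record workflow described above can be sketched minimally as follows. This is an assumed illustration of the pattern (validate a grade, build a record, then persist it), not the actual Spitzer operations schema; the class, field names, and grade scale are hypothetical, and the database write is omitted:

```python
from dataclasses import dataclass

# Hypothetical grade scale; the real Spitzer DQA grading scheme may differ.
VALID_GRADES = {"A", "B", "C", "D", "F"}

@dataclass
class DQARecord:
    """Illustrative record of one quality grade, as it might be written
    back to an operations database (field names are invented)."""
    product_id: str
    instrument: str
    grade: str
    comment: str = ""

def grade_product(product_id, instrument, grade, comment=""):
    """Validate the grade and build the record; the actual database
    update step is omitted from this sketch."""
    if grade not in VALID_GRADES:
        raise ValueError(f"unknown grade: {grade!r}")
    return DQARecord(product_id, instrument, grade, comment)

record = grade_product("IRAC-0001234", "IRAC", "A", "nominal map, no artifacts")
print(record.grade)  # A
```

The key design point mirrored here is that grading is a gate between calibration and archiving: a record must be validated and stored before the product is released to the community.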