The Dark Energy Camera (DECam) has been installed on the V. M. Blanco telescope at Cerro Tololo Inter-American Observatory in Chile. This major upgrade to the facility has required numerous modifications to the telescope and improvements to observatory infrastructure. The telescope prime focus assembly has been entirely replaced, and the f/8 secondary change procedure has been radically revised. The heavier instrument has required significant changes to the telescope's balance, and the telescope control system has been upgraded. NOAO has established a data transport system to move DECam's output efficiently to NCSA for processing. The observatory has integrated the DECam high-pressure, two-phase cryogenic cooling system into its operations and converted the Coudé room into an environmentally controlled instrument handling facility incorporating a high-quality cleanroom. New procedures to ensure the safety of personnel and equipment have been introduced.
The V. M. Blanco 4-m telescope at Cerro Tololo Inter-American Observatory is undergoing a number of improvements
in preparation for the delivery of the Dark Energy Camera. The program includes upgrades with the potential to deliver
gains in image quality and stability. To this end, we have renovated the support structure of the primary mirror,
incorporating innovations to improve both the radial support performance and the registration of the mirror and telescope
top end. The resulting opto-mechanical condition of the telescope is described. We also describe some improvements to
the environmental control. Upgrades to the telescope control system and measurements of the dome environment are
described in separate papers in this conference.
KEYWORDS: Large Synoptic Survey Telescope, Observatories, Astronomy, Solar system, Image processing, Cameras, Data centers, Telescopes, Databases, Space telescopes
The astronomical time domain is entering an era of unprecedented growth. LSST will join current and future surveys at diverse wavelengths in exploring variable and transient celestial phenomena, characterizing astrophysical domains from the solar system to the edge of the observable universe. Beyond the large but relatively well-defined processing load of a project on the scale of the Large Synoptic Survey Telescope, there will be many challenging issues in handling the dynamic, empirical interplay between LSST and contingent follow-up facilities worldwide. We discuss concerns unique to this telescope, while exploring consequences common to emerging observational time-domain paradigms.
The Large Synoptic Survey Telescope (LSST) will continuously image the entire sky visible from Cerro Pachon
in northern Chile every 3-4 nights throughout the year. The LSST will provide data for a broad range of science
investigations that require better than 1% photometric precision across the sky (repeatability and uniformity)
and a similar accuracy of measured broadband color. The fast and persistent cadence of the LSST survey
will significantly improve the temporal sampling rate with which celestial events and motions are tracked. To
achieve these goals, and to optimally utilize the observing calendar, it will be necessary to obtain excellent
photometric calibration of data taken over a wide range of observing conditions - even those not normally
considered "photometric". To achieve this it will be necessary to routinely and accurately measure the full
optical passband that includes the atmosphere as well as the instrumental telescope and camera system. The
LSST mountain facility will include a new monochromatic dome illumination projector system to measure the
detailed wavelength dependence of the instrumental passband for each channel in the system. The facility will
also include an auxiliary spectroscopic telescope dedicated to measurement of atmospheric transparency at all
locations in the sky during LSST observing. In this paper, we describe these systems and present laboratory
and observational data that illustrate their performance.
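The calibration scheme described above can be illustrated with a minimal numerical sketch: the monochromatic dome projector yields the instrumental response of one channel as a function of wavelength, the auxiliary telescope yields the atmospheric transparency, and their product is the full optical passband. All numerical shapes and values below are invented for illustration and are not LSST measurements or project code.

```python
import numpy as np

# Wavelength grid for one band's monochromatic scan (nm); illustrative only.
wavelength = np.linspace(500.0, 700.0, 201)

# Instrumental response per channel, as would be measured by stepping the
# monochromatic dome projector across the band (made-up Gaussian shape).
instrument = np.exp(-0.5 * ((wavelength - 600.0) / 40.0) ** 2)

# Atmospheric transparency from the auxiliary spectroscopic telescope
# (made-up smooth extinction curve, dimensionless transmission).
atmosphere = 0.9 - 1e-4 * (700.0 - wavelength)

# The full optical passband is the product of the two.
passband = instrument * atmosphere

# A flat-spectrum source's synthetic broadband signal is proportional to the
# integral of its SED weighted by the passband; here the SED is constant
# (1.0), so the integral is the passband's effective width in nm.
dlam = np.diff(wavelength)
signal = np.sum(0.5 * (passband[1:] + passband[:-1]) * dlam)
print(f"effective width of the full passband: {signal:.1f} nm")
```

Repeating this per channel and per pointing, as the text describes, gives the detailed wavelength dependence of the total throughput under the observing conditions of each exposure.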
The Small and Moderate Aperture Research Telescope System (SMARTS)* consists of four telescopes atop Cerro Tololo Inter-American Observatory (CTIO): the 0.9m, 1.0m, 1.3m, and 1.5m. A consortium of twelve institutions and universities began funding operations in February 2003. Time allocation for these facilities is as
follows: ~65% to consortium members, ~25% to the general community, and ~10% to Chilean researchers. Thus, resources remain available to the community while consortium members gain a unique opportunity: high-temporal-cadence monitoring coupled with long-baseline monitoring. Indeed, a number of member programs have benefited from such a scheme. Furthermore, two of the four telescopes are scheduled in a queue mode in which observations are collected by service observers. Queue-mode investigators
have access to spectroscopic observations (both RC and echelle) as well as direct imaging (both optical and near-IR simultaneously). Of the remaining two telescopes, the 1.0m is almost exclusively operated in user mode and contains a 20'×20' FOV optical imager, and the 0.9m is operated both in user and service mode in equal allotments and also has a dedicated optical imager. The latter facilities are frequently used for hands-on student training under the superb sky conditions afforded at CTIO.
Currently, three of the partner universities are responsible for managing telescope scheduling and data handling, while one additional university is responsible for some of the instruments. In return, these universities receive additional telescope time. Operations are run by a small team: six personnel at the four supporting universities and seven dedicated personnel in Chile (five observers, one observer-support engineer,
and one postdoctoral appointee). Thus far, this model has proven to be both an efficient and an effective method for operating the small telescopes at CTIO.
The Dark Energy Survey Collaboration is building the Dark Energy Camera (DECam), a 3 square degree, 520
Megapixel CCD camera which will be mounted on the Blanco 4-meter telescope at CTIO. DECam will be used to
perform the 5000 sq. deg. Dark Energy Survey with 30% of the telescope time over a 5 year period. During the
remainder of the time, and after the survey, DECam will be available as a community instrument. Construction of
DECam is well underway. Integration and testing of the major system components has already begun at Fermilab and
the collaborating institutions.
KEYWORDS: Large Synoptic Survey Telescope, Data centers, Data archive systems, Astronomy, Data communications, Cameras, Telescopes, Lanthanum, Imaging systems, Control systems
The Large Synoptic Survey Telescope (LSST) is an 8.4m (6.5m effective), wide-field (9.6 degree2), ground-based
telescope with a 3.2 GPixel camera. It will survey over 20,000 degree2 with 1,000 re-visits over 10 years in six visible
bands, and is scheduled to begin full scientific operations in 2016. The Data Management System will acquire and
process the images, issue transient alerts, and build the world's largest database of optical astronomical data. Every 24
hours, 15 terabytes of raw data will be transferred via redundant 10 Gbps fiber optics down from the mountain summit at
Cerro Pachon, Chile to the Base Facility in La Serena for transient alert processing. Simultaneously, the data will be
transferred at 2.5 Gbps over fiber optics to the Archive Center in Champaign, Illinois, for archiving, further scientific processing, and the creation of scientific data catalogs. Finally, the Archive Center will distribute the processed data and catalogs at 10 Gbps to a number of Data Access Centers for scientific, educational, and public access. Redundant storage and network bandwidth are built into the design of the system. The current networking acquisition strategy involves leveraging existing dark fiber for the links within Chile, between Chile and the U.S., and within the U.S. A significant number of carriers and networks are involved, and the acquisition, deployment, and operation of this capability must be coordinated among them.
Advanced protocols are being investigated during our Research and Development phase to address anticipated
challenges in effective utilization. We describe the data communications requirements, architecture, and acquisition
strategy in this paper.
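As a back-of-the-envelope check on the figures quoted above (a sketch using only those numbers, not project engineering analysis), one can verify that moving 15 TB of raw data per 24 hours fits comfortably within the quoted link capacities:

```python
# Unit conventions: decimal terabytes and gigabits, as is usual for
# storage and network capacity figures.
TB = 1e12   # bytes
Gbps = 1e9  # bits per second

daily_raw_bytes = 15 * TB
seconds_per_day = 24 * 3600

# Average sustained rate needed to move one day's raw data in one day.
avg_rate_gbps = daily_raw_bytes * 8 / seconds_per_day / Gbps
print(f"average required rate: {avg_rate_gbps:.2f} Gbps")

# Fraction of the 2.5 Gbps Chile-to-Illinois archive link this consumes
# on average (peak rates during the night will be higher).
utilization = avg_rate_gbps / 2.5
print(f"average utilization of the 2.5 Gbps archive link: {utilization:.0%}")
```

The average rate is roughly 1.4 Gbps, i.e. a little over half of the 2.5 Gbps archive link on average, which is one reason the design carries redundant bandwidth headroom for bursts and retransmission.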
The Dark Energy Survey (DES; operations 2009-2015) will address the nature of dark energy using four independent and complementary techniques: (1) a galaxy cluster survey over 4000 deg2 in collaboration with the South Pole Telescope Sunyaev-Zel'dovich effect mapping experiment, (2) a cosmic shear measurement over 5000 deg2, (3) a galaxy angular clustering measurement within redshift shells to redshift = 1.35, and (4) distance measurements to 1900 Type Ia supernovae. The DES will produce 200 TB of raw data in four bands. These data will be processed into science-ready images and catalogs and co-added into deeper, higher-quality images and catalogs. In total, the DES dataset will exceed 1 PB, including a 100 TB catalog database that will serve as a key science analysis tool for the astronomy/cosmology community. The data rate, volume, and duration of the survey require a new type of data management (DM) system that (1) offers a high degree of automation and robustness and (2) leverages existing high-performance computing infrastructure to meet the project's DM targets. The DES DM system consists of (1) grid-enabled, flexible, and scalable middleware developed at NCSA for the broader scientific community, (2) astronomy
modules that build upon community software, and (3) a DES archive to support automated processing and to serve DES catalogs and images to the collaboration and the public. In the recent DES Data Challenge 1 we deployed and tested the first version of the DES DM system, successfully reducing 700 GB of raw simulated images into 5 TB of reduced data products and cataloguing 50 million objects with calibrated astrometry and photometry.
Historically, few astronomical measurements have required sub-percent accuracy in photometry. Measuring SNIa fluxes
in order to determine cosmological parameters, however, often requires the comparison of images from different
telescopes and at different redshifts. This can introduce myriad sources of error. Conventional methods of data reduction are intrinsically flawed: they either make assumptions about the wavelength dependence of the system's response function or, when K-corrections are not performed, neglect it altogether. We consider the advantages of a method that uses a direct, spectrally resolved measurement of the entire system's response function relative to a calibrated photodiode.
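The idea can be sketched in equations (the notation below is ours, for illustration, and is not taken from the paper):

```latex
% Broadband counts from a source with SED f_\lambda(\lambda), observed
% through the full system response S(\lambda):
n \;\propto\; \int f_\lambda(\lambda)\, S(\lambda)\, \frac{\lambda}{hc}\, d\lambda
% Stepping a monochromatic source through the band, S(\lambda) follows from
% the ratio of the instrument signal I_{\mathrm{inst}} to the signal
% I_{\mathrm{diode}} of a photodiode with known calibrated response
% S_{\mathrm{diode}}:
S(\lambda) \;=\; \frac{I_{\mathrm{inst}}(\lambda)}{I_{\mathrm{diode}}(\lambda)}\, S_{\mathrm{diode}}(\lambda)
```

Because S(λ) is measured directly rather than assumed, the wavelength dependence that conventional reductions approximate or neglect enters the flux integral explicitly.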
KEYWORDS: Astronomy, Databases, Telescopes, Image processing, Stars, Point spread functions, Space telescopes, Data processing, Data acquisition, Calibration
The era of large survey datasets has arrived, and the era of large survey telescope projects is upon us. Many of these new telescope projects will not only produce large datasets, they will produce datasets that require real-time astronomical analysis, including object detection, photometry, and classification. These datasets promise to open new horizons in the exploration of the time domain in astrophysical systems on large scales. But to fulfill this promise, the projects must design and develop data management systems on a much larger scale (many Terabytes per day continuously) than has previously been achieved in astronomy. Working together, NOAO and the University of Washington are developing prototype pipeline systems to explore the issues involved in real-time time-variability analysis. These efforts are not simply theoretical exercises, but rather are driven by NOAO Survey programs which are generating large data flows. Our survey projects provide a science-driven testbed of data management strategies needed for future initiatives such as the Large Synoptic Survey Telescope and other large-scale astronomical data production systems.
KEYWORDS: Data archive systems, Astronomy, Telescopes, Data storage, Large Synoptic Survey Telescope, Data processing, Data modeling, Observatories, Calibration, Mining
The NOAO Data Products Program (DPP) is a new program aimed at identifying scientifically interesting datasets from ground-based O/IR telescopes and making them available to the astronomical community, together with the tools for exploring them. The program coordinates NOAO projects that are data intensive, including the handling, pipeline processing, analysis, and archiving of data. These datasets, and the facilities for mining them, will form a significant component of the resources of the National Virtual Observatory, and will be an important part of NOAO’s participation in that endeavor. In the longer term, this activity will lead to a data management role in the Large-aperture Synoptic Survey Telescope, a facility that will produce one petabyte of imaging data per year.
The Large-aperture Synoptic Survey Telescope will repeatedly image a large fraction of the visible sky in multiple optical passbands in a way that will sample temporal phenomena over a large range of time scales. This will enable a suite of synoptic investigations that range in temporal sampling requirements from the detection of near Earth asteroids (minutes), through discovery and followup of supernovae to long period monitoring of QSOs, AGN and LPVs (years). Additionally, the data must be obtained in a way to support programs aimed at building up deep static images of part or all of the sky.
Here we examine some of the issues involved in crafting an observing scheme that serves these goals. The problem has several parts: a) what is the optimal time-sampling strategy that best serves the desired temporal range? b) how can a chosen time-sampling sequence be packed into an observing scheme that accommodates all pointings and 'whiteout' windows (daytime, lunation period)? c) how vulnerable is such an observing plan to realistic models of disruption by poor observing conditions and weather? d) how does one build in the most economical contingency/redundancy to i) mitigate such disruption and ii) reserve time for recovery and followup of transient phenomena (e.g. gamma-ray bursts, supernovae)?
In this article we touch upon several of these issues, and come to an understanding of some of the limitations, as well as areas in which scientific priorities and trade-offs will have to be made.
In July of 1998 the National Optical Astronomy Observatories (NOAO) successfully upgraded MOSAIC I, an 8192 by 8192 pixel array using eight Scientific Imaging Technologies, Inc. (SITe) ST-002A thinned, backside-illuminated 2k by 4k charge-coupled devices (CCDs). In July of 1999 MOSAIC II, a clone of MOSAIC I, was commissioned, also using eight SITe ST-002A CCDs. Additionally, in December of 1998 NOAO implemented Mini-MOSAIC, a 4096 by 4096 pixel array using two SITe ST-002A thinned CCDs. This report will discuss the performance, characterization, and capabilities of the three wide-field imagers now in operation at NOAO's Kitt Peak National Observatory, Cerro Tololo Inter-American Observatory, and the WIYN Consortium 3.5-meter telescope on Kitt Peak.