Digital Electronics and Optical Navigation Camera (DE-ONC) is an edge computing node of the asteroid probe HAYABUSA2. DE-ONC was developed to provide real-time image recognition performance for optical navigation. Light weight, low power consumption, and miniaturization were realized to overcome resource restrictions, while the unit also satisfies the high reliability and safety requirements of HAYABUSA2 missions. These reliability and safety requirements are both static and dynamic. The static requirements are met by adding redundancy that combines functional distribution with time-division redundancy under the resource constraints: a functional distribution mode, a standby redundancy mode, and a hot redundancy mode were realized with the same device configuration. The real-time performance of optical navigation exploiting the unit's image recognition functions was demonstrated through the interplanetary cruising phase, as well as the touchdown on and takeoff from the asteroid Ryugu. DE-ONC is required to operate continuously during critical operation phases. In addition, it must always satisfy latency requirements, completing processing within a predetermined duration to guarantee hard real-time performance. To satisfy these requirements, the image processing unit of DE-ONC adopts a unified language processing system and a distributed memory model with reference to a parallel inference machine, a so-called second-generation artificial intelligence technology. Its image processing module integrates a radiation-hardened micro-controller unit (MCU) and field-programmable gate arrays (FPGAs) with the language processing system and the distributed object model. We report the evaluation results for the reliability and safety, together with the real-time performance, of the unit's architecture.
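The three operating modes named in the abstract can be pictured with a toy sketch. The Python below is purely our own illustration (none of these names come from the flight software) and contrasts hot redundancy (all units run in parallel, a majority vote picks the result) with standby redundancy (a spare takes over only when the primary fails):

```python
from collections import Counter

def hot_redundancy(outputs):
    """Hot redundancy: all units compute in parallel; a majority vote decides."""
    value, votes = Counter(outputs).most_common(1)[0]
    if votes * 2 <= len(outputs):
        raise RuntimeError("no majority among redundant units")
    return value

def standby_redundancy(units, data):
    """Standby redundancy: only the primary runs; on failure, fall over to a spare."""
    for unit in units:
        try:
            return unit(data)
        except Exception:
            continue  # switch to the next standby unit
    raise RuntimeError("all redundant units failed")
```

Hot redundancy masks a faulty unit immediately at the cost of running every unit at once, while standby redundancy saves power (the constraint the abstract emphasizes) at the cost of a fail-over delay.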
We have been researching and developing a CMOS image sensor that has 2.8 μm x 2.8 μm pixels, 33-Mpixel resolution
(7680 horizontal pixels x 4320 vertical pixels), 120-fps frame rate, and 12-bit analog-to-digital converter for “8K Super
Hi-Vision.” In order to improve its sensitivity, we used a 0.11-μm nanofabricated process and attempted to increase the
conversion gain from an electron charge to a voltage in the pixel. The prototyped image sensor shows a sensitivity of 2.4
V/lx•s, which is 1.6 times higher than that of a conventional image sensor. This image sensor also achieved input-referred random noise as low as 2.1 e-rms.
We have been working on developing an image sensor with three stacked organic photoconductive films (OPFs)
sensitive to only one primary color component (red—R, green—G, or blue—B); each OPF has a signal readout circuit.
This type of stacked sensor is advantageous for the manufacture of compact color cameras with high-quality pictures,
since color separation systems, such as prisms or color filter arrays, are eliminated because of the color selectivity of
OPFs. To achieve a high-resolution stacked sensor, its total thickness should be reduced to less than 10 μm. In this study,
we fabricated a color image sensor with R and G-sensitive OPFs by applying amorphous In-Ga-Zn-O thin-film transistor
(TFT) readout circuits. A 10 μm-thick interlayer insulator separated the R and G-sensitive layers. The entire fabrication
process for the device was implemented below 150°C to avoid damaging the OPFs. Output signals were successfully
read from each OPF through the TFT circuit, and multi-color images were reproduced from the fabricated sensor.
We have developed a back-side-illuminated image sensor with a burst capturing speed of 5.2 Tpixels per second. Its
sensitivity was 252 V/lux·s (12.7 times that of a front-side-illuminated image sensor) in an evaluation. Sensitivity of a
camera system was 2,000 lux F90. The increased sensitivity resulted from optical and time aperture ratios of 100% and from a higher optical utilization ratio. The ultrahigh-speed shooting resulted from the use of an in-situ storage image sensor. Reducing the wiring resistance and dividing the image area into eight blocks increased the
maximum frame rate to 16.7 million frames per second. The total pixel count was 760 horizontally and 411 vertically.
The product of the pixel count and maximum frame rate is often used as a figure of merit for high-speed imaging devices,
and in this case, 312,360 multiplied by 16.7 million yields 5.2 Tpixels per second. The burst capturing speed is thus 5.2
Tpixels per second, which is the highest speed achieved in high-speed imaging devices to date.
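The figure-of-merit arithmetic stated above can be checked directly; a minimal sketch using only the numbers from the abstract:

```python
# Burst capturing speed = pixel count x maximum frame rate (numbers from the abstract).
h_pixels, v_pixels = 760, 411        # 760 horizontal x 411 vertical pixels
max_frame_rate = 16.7e6              # 16.7 million frames per second

pixel_count = h_pixels * v_pixels    # 312,360 pixels
burst_speed = pixel_count * max_frame_rate

print(f"{burst_speed / 1e12:.1f} Tpixels/s")  # 5.2 Tpixels/s
```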
We have developed an ultrahigh-speed CCD camera that can capture instantaneous phenomena not visible to the human
eye and impossible to capture with a regular video camera. The ultrahigh-speed CCD was specially constructed so that
the CCD memory between the photodiode and the vertical transfer path of each pixel can store 144 frames. For
every one-frame shot, the electric charges generated from the photodiodes are transferred in one step to the memory of
all the parallel pixels, making ultrahigh-speed shooting possible. Earlier, we experimentally manufactured a 1M-fps
ultrahigh-speed camera and tested it for broadcasting applications. Through those tests, we learned that there are cases
that require shooting speeds (frame rate) of more than 1M fps; hence we aimed to develop a new ultrahigh-speed camera
that will enable much faster shooting speeds than what is currently possible. Since shooting at speeds of more than
200,000 fps results in decreased image quality and abrupt heating of the image sensor and drive circuit board, faster
speeds cannot be achieved merely by increasing the drive frequency. We therefore had to improve the image sensor
wiring layout and the driving method to develop a new 2M-fps, 300k-pixel ultrahigh-speed single-chip color camera for
broadcasting purposes.
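The in-pixel memory described above behaves like a per-pixel ring buffer: on every frame, all pixels transfer their charge to local storage in one parallel step, and the most recent 144 frames survive the trigger. A toy Python model (our own illustration, with hypothetical class names, not the actual device interface):

```python
from collections import deque

class BurstSensor:
    """Toy model of in-situ storage: each pixel keeps its most recent frames
    in a local ring buffer of fixed depth (144 on the real CCD)."""

    def __init__(self, n_pixels, depth=144):
        self.pixels = [deque(maxlen=depth) for _ in range(n_pixels)]

    def capture_frame(self, charges):
        # One parallel step: every pixel's charge moves to its own memory;
        # the oldest stored frame is overwritten once the buffer is full.
        for memory, charge in zip(self.pixels, charges):
            memory.append(charge)

    def read_out(self):
        # After the trigger, the last `depth` frames are read from each pixel.
        return [list(memory) for memory in self.pixels]
```

Because the transfer is local to each pixel, the recording rate is independent of the readout bandwidth, which is what makes the ultrahigh frame rates possible.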
We developed a 300,000-pixel ultrahigh-speed CCD with a maximum frame rate of 2,000,000 frames per second. This
shooting speed was made possible by directly connecting CCD memories, which record video images, to the
photodiodes of individual pixels. The simultaneous parallel recording operation of all pixels results in the ultimate frame
rate. We analyzed a voltage wave pattern in the equivalent circuit model of the ultrahigh-speed CCD by using a SPICE
simulator to estimate the maximum frame rate. The pixel area consisted of 410 vertical and 720 horizontal pixels and
was divided into 8 blocks for parallel driving. An equivalent circuit of one block was constructed from an RC
circuit with 410 × 90 pixels. The voltage wave pattern at the final stage of an equivalent circuit was calculated when a
square wave pulse was input. Results showed that the square wave pulse became blunt when the driving speed was
increased. Based on this estimation, we designed the layout of the new ultrahigh-speed CCD V6 and fabricated the device.
Results of an image-capturing experiment indicated that a saturation signal level of 100% was maintained up to 300,000
frames per second; a saturation signal level of 50% was observed at 1,000,000 frames per second and of 13% at
2,000,000 frames per second. We showed that the maximum frame rate is limited by the drop in the saturation signal
level caused by the blunting of the driving voltage waveform.
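The waveform-blunting argument can be illustrated with a first-order RC model: a drive electrode charged through resistance R only reaches a fraction 1 − exp(−T/2RC) of the full pulse swing within one half-period T/2, so the usable saturation level falls as the frame rate rises. The sketch below is our own simplification (a single RC stage with an assumed time constant and one transfer pulse per frame), not the SPICE model from the paper:

```python
import math

def saturation_fraction(frame_rate_fps, rc_seconds):
    """Fraction of the full drive swing reached within one half-period of a
    square drive pulse, for a single first-order RC stage (illustrative only)."""
    half_period = 1.0 / (2.0 * frame_rate_fps)
    return 1.0 - math.exp(-half_period / rc_seconds)

rc = 0.7e-6  # assumed time constant of the transfer-electrode wiring
for fps in (300_000, 1_000_000, 2_000_000):
    print(f"{fps:>9} fps: {saturation_fraction(fps, rc):.0%}")
```

With this assumed time constant the model gives roughly 91%, 51%, and 30%: it reproduces the downward trend, though not the exact measured 100%, 50%, and 13%, which the full distributed-RC SPICE model captures.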
Our group has been developing a new type of image sensor overlaid with three organic photoconductive films, which are
individually sensitive to only one of the primary color components (blue (B), green (G), or red (R) light), with the aim
of developing a compact, high resolution color camera without any color separation optical systems. In this paper, we
first describe the unique characteristics of organic photoconductive films. The photoconductive properties of a film can
be tuned simply by the choice of organic material; in particular, the wavelength selectivity can be made sharp enough to
divide the incident light into the three primary colors. Color separation with vertically stacked organic films was also
demonstrated. In addition, a resolution of the organic photoconductive films sufficient for high-definition television
(HDTV) was confirmed in a shooting experiment using a camera tube. Secondly, as a step toward our goal, we fabricated a stacked organic image
sensor with G- and R-sensitive organic photoconductive films, each of which had a zinc oxide (ZnO) thin film transistor
(TFT) readout circuit, and demonstrated image pickup at a TV frame rate. A color image with a resolution corresponding
to the pixel number of the ZnO TFT readout circuit was obtained from the stacked image sensor. These results show the
potential for the development of high-resolution prism-less color cameras with stacked organic photoconductive films.
KEYWORDS: Cameras, Charge-coupled devices, CCD cameras, Signal processing, Digital signal processing, Video, Field programmable gate arrays, Image processing, Photodiodes, Eye
We have developed an ultrahigh-speed, high-sensitivity portable color camera with a new 300,000-pixel single CCD.
The 300,000-pixel CCD, which has four times the number of pixels of our initial model, was developed by seamlessly
joining two 150,000-pixel CCDs. A green-red-green-blue (GRGB) Bayer filter is used to realize a color camera with the
single-chip CCD. The camera is capable of ultrahigh-speed video recording at up to 1,000,000 frames/sec, and small
enough to be handheld. We also developed a technology for dividing the CCD output signal to enable parallel, high-speed
readout and recording in external memory; this makes possible long continuous shots at up to 1,000 frames/second.
In an experiment, video footage was captured at an athletics meet; thanks to the high-speed shooting, even
detailed movements of the athletes' muscles were captured. This camera can capture clear slow-motion videos, enabling
previously impossible live footage for various TV broadcasting programs.
We are developing an ultrahigh-speed, high-sensitivity broadcast camera that is capable of capturing clear, smooth slow-motion videos even where lighting is limited, such as at professional baseball games played at night. In earlier work, we developed an ultrahigh-speed broadcast color camera1) using three 80,000-pixel ultrahigh-speed, high-sensitivity CCDs2). This camera had about ten times the sensitivity of standard high-speed cameras, and enabled an entirely new style of presentation for sports broadcasts and science programs. Increasing the pixel count, in particular, is crucial for applying ultrahigh-speed, high-sensitivity CCDs to HDTV broadcasting. This paper summarizes our experimental development aimed at improving the resolution of the CCD even further: a new ultrahigh-speed, high-sensitivity CCD that increases the pixel count four-fold to 300,000 pixels.
We developed an ultrahigh-speed, high-sensitivity, color camera that captures moving images of phenomena too fast to be perceived by the human eye. The camera operates well even under restricted lighting conditions. It incorporates a special CCD device that is capable of ultrahigh-speed shots while retaining its high sensitivity. Its ultrahigh-speed shooting capability is made possible by directly connecting CCD storages, which record video images, to photodiodes of individual pixels. Its large photodiode area together with the low-noise characteristic of the CCD contributes to its high sensitivity. The camera can clearly capture events even under poor light conditions, such as during a baseball game at night. Our camera can record the very moment the bat hits the ball.
An image sensor for an ultra-high-speed video camera was developed. The maximum frame rate, the pixel count, and the number of consecutive frames are 1,000,000 fps, 720 x 410 (= 295,200) pixels, and 144 frames. A micro lens array will be attached to the chip, which increases the fill factor to about 50%. In addition to the ultra-high-speed image-capturing operation that stores image signals in the in-situ storage area adjacent to each pixel, a standard parallel readout operation at 1,000 fps for full-frame readout is also introduced with sixteen readout taps, for which the image signals are transferred to and stored in a large-capacity storage device outside the sensor. The aspect ratio of the frame is about 16 : 9, which is equal to that of the HDTV format. Therefore, a video camera with four ISIS-V4 sensors, arranged to form the Bayer color filter array, realizes an ultra-high-speed video camera of a semi-HDTV format.
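The continuous-readout figures above imply a per-tap data rate that is easy to check; a minimal sketch using only the numbers stated in the abstract:

```python
pixels_per_frame = 720 * 410   # 295,200 pixels per frame
full_frame_fps = 1_000         # continuous full-frame readout rate
readout_taps = 16              # parallel readout taps

per_tap_rate = pixels_per_frame * full_frame_fps // readout_taps
print(f"{per_tap_rate / 1e6:.2f} Mpixel/s per tap")  # 18.45 Mpixel/s per tap
```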
KEYWORDS: Microwave radiation, Amorphous silicon, Particles, Hydrogen, Resonance enhancement, Plasma enhanced chemical vapor deposition, Temperature metrology, Silicon films, Chemical vapor deposition, Chemical species
The effects of rf-bias applied to the substrate on the defects of hydrogenated amorphous silicon deposited by electron cyclotron resonance plasma-enhanced chemical vapor deposition were investigated. Measurements by the constant photocurrent method showed that the defect density decreased as the rf-power increased. The decrease of the defect density by rf-bias did not depend strongly on the microwave power. The deposition rate did not depend on the rf-power, whereas it increased with increasing microwave power. Photoconductivity was shown to increase with rf-power, corresponding to the decrease of the defect density. Surface roughness measurements indicated that the surface flatness was improved by the rf-bias voltage independently of the deposition temperature. It is considered that this elimination of defects was induced by the increase in the number of mobile particles on the surface due to the rf-bias.