To improve the lightweight design of space optical remote sensors and the quality of remote sensing images, a novel method was designed for in-orbit remote sensing image restoration based on modulation transfer function compensation (MTFC). The principle of in-orbit MTFC is given, and the in-orbit control system for MTFC was designed. The modulation transfer function curve was obtained by direct measurement in the laboratory, and image restoration was realized with a constrained least squares filter. Indexes including the mean, standard deviation, and edge intensity were used to evaluate the quality of the restored remote sensing images. The results show that the evaluation indicators of the restored image are better than those of the original image, and that the MTF at the Nyquist frequency increases from 0.1501 to 0.1635. The method fully satisfies the requirement for real-time in-orbit remote sensing image restoration and significantly improves image quality.
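The restoration step above relies on a constrained least squares filter. A minimal frequency-domain sketch of that classical filter is given below; the Laplacian smoothness constraint and the parameter name `gamma` are conventional choices for this filter family, not details taken from the paper:

```python
import numpy as np

def cls_restore(blurred, psf, gamma=0.01):
    """Constrained least squares (CLS) restoration in the frequency domain.

    F_hat = conj(H) * G / (|H|^2 + gamma * |P|^2), where H is the OTF of
    the degradation PSF and P is the spectrum of a Laplacian smoothness
    constraint; gamma trades noise suppression against sharpness.
    """
    rows, cols = blurred.shape
    H = np.fft.fft2(psf, s=(rows, cols))      # OTF, zero-padded to image size
    lap = np.array([[0, -1, 0], [-1, 4, -1], [0, -1, 0]], dtype=float)
    P = np.fft.fft2(lap, s=(rows, cols))      # smoothness-constraint spectrum
    G = np.fft.fft2(blurred)
    F_hat = np.conj(H) * G / (np.abs(H) ** 2 + gamma * np.abs(P) ** 2)
    return np.real(np.fft.ifft2(F_hat))
```

In practice `gamma` would be tuned (or solved for iteratively) against the noise level; a larger value smooths more and restores less.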
A color transfer method is presented to give fused multiband nighttime imagery a natural daytime color appearance in a
simple and efficient way. Instead of using the traditional nonlinear lαβ space, the proposed method transfers the color
distribution of the target image (daylight color image) to the source image (fused multiband nighttime imagery) in the
linear YCbCr color space. The YCbCr transformation is simpler and better suited to image fusion than the lαβ conversion, and it can be extended into a general formalism. The paper mathematically proves that, for color transfer, color spaces conforming to this general YCbCr framework produce the same recoloring results as the YCbCr space itself. Experimental results demonstrate that the YCbCr-based color transfer
method works surprisingly well for transferring the natural color characteristics of daylight color images to false color fused multiband nighttime imagery, and can also be successfully applied to recoloring a variety of color images.
We present a computationally efficient color image fusion algorithm for merging infrared and visible images. At the core
of the proposed method is a color transfer technique based on the linear YCbCr space. The method directly uses the grayscale fused image and the difference signals of the input images to construct the source YCbCr components, and then uses the statistical color transfer technique to form a color fused image that takes on the target image's color characteristics.
Two different strategies, which respectively employ the pixel averaging fusion scheme and the multiresolution fusion
scheme as the grayscale image fusion solution, are proposed to fulfill different user needs. The simple strategy using the
pixel averaging fusion scheme meets the need for easy implementation and fast execution, while the complex strategy using the multiresolution fusion scheme meets the need for high-quality fused products. In addition, we also
describe some useful theories about color-transfer-based image fusion. Experimental results show that the proposed
color image fusion algorithm can effectively produce a natural-appearing "daytime-like" color fused image, and that even implementing the grayscale fusion with the pixel averaging scheme provides a pleasing result.
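One plausible reading of the source-component construction described above (pixel-averaged luminance, signed difference signals as chroma) can be sketched as follows. The exact channel assignments and the BT.601-style matrix are illustrative assumptions, not the paper's verbatim formulation:

```python
import numpy as np

RGB2YCC = np.array([[ 0.299,  0.587,  0.114],
                    [-0.169, -0.331,  0.500],
                    [ 0.500, -0.419, -0.081]])

def fuse_ir_visible(ir, vis, target_rgb):
    """Color fusion sketch: build source YCbCr components from the inputs,
    then recolor them toward the daylight target's statistics."""
    Y = 0.5 * (ir + vis)       # grayscale fusion by pixel averaging
    Cb = vis - ir              # difference signals as chroma
    Cr = ir - vis              # (channel assignments are assumptions)
    src = np.stack([Y, Cb, Cr], axis=-1).reshape(-1, 3)
    tgt = target_rgb.reshape(-1, 3) @ RGB2YCC.T
    out = (src - src.mean(0)) / (src.std(0) + 1e-8) * tgt.std(0) + tgt.mean(0)
    return (out @ np.linalg.inv(RGB2YCC).T).reshape(ir.shape + (3,))
```

For the complex strategy, the pixel-averaged `Y` would simply be replaced by a multiresolution (e.g. pyramid or wavelet) grayscale fusion of the two inputs; the color transfer step is unchanged.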
We propose a color transfer method to give fused multiband nighttime imagery a natural daytime color appearance in a
simple and efficient way. Instead of using the traditional lαβ space, the proposed method transfers the color distribution of
the target image (daylight color image) to the source image (fused multiband nighttime imagery) in a linear color space
named IUV. The transformation between the RGB and IUV spaces is simpler than that between the RGB and lαβ spaces; moreover, the IUV space is better suited to image fusion. The IUV transform can be extended into a general formalism. We prove that color spaces conforming to this general IUV framework produce the same recoloring results as the IUV space. Our experiments on infrared and visual images show that the IUV-based color transfer method works surprisingly
well for transferring natural color characteristics of daylight color images to false color fused multiband nighttime
imagery. We also demonstrate that this method can be successfully applied to a variety of images. The images generated
indicate the potential utility of IUV space in color image processing domains.
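The claimed equivalence across spaces conforming to the general framework can be checked numerically for one simple case, per-axis rescalings of a linear luminance-chrominance matrix. The matrices below are illustrative stand-ins, not the paper's IUV definition:

```python
import numpy as np

def transfer(src_rgb, tgt_rgb, M):
    """Mean/std color transfer in the linear space defined by matrix M
    (rows: one luminance axis and two zero-centred chroma axes)."""
    s = src_rgb.reshape(-1, 3) @ M.T
    t = tgt_rgb.reshape(-1, 3) @ M.T
    out = (s - s.mean(0)) / (s.std(0) + 1e-12) * t.std(0) + t.mean(0)
    return (out @ np.linalg.inv(M).T).reshape(src_rgb.shape)

# Two spaces in the same linear family: M2 rescales each axis of M1.
# The rescaling cancels out of the mean/std matching and of the inverse
# transform, so both matrices yield the identical recolored image.
M1 = np.array([[ 0.299,  0.587,  0.114],
               [-0.169, -0.331,  0.500],
               [ 0.500, -0.419, -0.081]])
M2 = np.diag([2.0, 0.5, 3.0]) @ M1
```

Running `transfer` with `M1` and `M2` on the same image pair produces results equal to floating-point precision, which is the behaviour the general-framework proof predicts for this class of transforms.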
We propose a contrast enhanced fusion (CEF) method for merging infrared and color visible images. The CEF method
can be carried out in two ways: the standard CEF method and the fast CEF method. The standard method transforms the
original RGB color visible image into a linear luminance-chrominance color space in order to treat the achromatic and
chromatic components separately. The achromatic component and infrared image are combined by a grayscale fusion
scheme, and the original achromatic component is replaced by the grayscale fused image. Before the data are transformed back into the RGB color space, the mean and variance of the grayscale fused image are matched to those of the original achromatic component by a linear luminance remapping. This remapping effectively enhances the contrast of the final color fused image. The standard CEF method can be implemented efficiently by the fast CEF
method that has the same fusion performance as the standard approach but manipulates images directly in RGB color
space. We used the proposed method to merge long wave infrared and color TV images. The experimental results show
that the CEF method can effectively produce a high-contrast color fused image with natural color characteristics similar to those of the original color visible image. In addition, we have also illustrated that the hybrid simple and complex CEF methods can be applied as a region-of-interest (ROI) image fusion solution, which allows ROIs to be fused with better quality than the rest of the image.
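A compact sketch of the standard CEF pipeline as described, assuming a BT.601-style linear luminance-chrominance transform and a caller-supplied grayscale fusion scheme (both assumptions; the paper's exact transform and fusion rule may differ):

```python
import numpy as np

def cef_fuse(vis_rgb, ir, grayscale_fuse):
    """Standard-CEF-style sketch: split the color image into luminance and
    chrominance, fuse the luminance with the IR image, remap the fused
    luminance to the original luminance's mean/variance, then invert."""
    M = np.array([[ 0.299,  0.587,  0.114],    # assumed linear
                  [-0.169, -0.331,  0.500],    # luminance-chrominance
                  [ 0.500, -0.419, -0.081]])   # transform (BT.601-like)
    ycc = vis_rgb.reshape(-1, 3) @ M.T
    Y = ycc[:, 0].reshape(ir.shape)            # achromatic component
    F = grayscale_fuse(Y, ir)                  # e.g. pixel averaging
    # Linear luminance remapping: match F's mean/variance to Y's
    F = (F - F.mean()) / (F.std() + 1e-8) * Y.std() + Y.mean()
    ycc[:, 0] = F.ravel()
    return (ycc @ np.linalg.inv(M).T).reshape(vis_rgb.shape)
```

The remapping keeps the fused luminance in the same statistical range as the original visible luminance, which is what preserves the contrast and natural color appearance; the chrominance channels pass through untouched.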