Colormap Development for Night Vision from Synthetic Imagery

Corresponding Author: S.P. Kozaitis, Department of ECE, Florida Institute of Technology, Melbourne, USA. Email: kozaitis@fit.edu

Abstract: We presented a method for colorizing fused imagery using a synthetic image as the color source. Imagery acquired at night from two sensors with different spectral bands was fused into a single image. We used a color transfer method based on a look-up table approach to change the false color appearance of the fused image to a natural appearance. Because the resulting multiband fused image is highly dependent on the colors in a reference image, we generated a synthetic reference image. We showed that this approach can lead to more realistic color representation for images acquired in dark environments.


Introduction
Night vision images are often represented in shades of a single color. However, it is widely accepted that full color images are more desirable. The human eye can distinguish only about 100 shades of gray, compared to more than 400 hues and about 20 saturation levels per hue (Haq et al., 2010). Therefore, a color representation of night vision imagery may lead to better scene recognition and object detection, and may improve human performance and reduce reaction time (Hogervorst and Toet, 2008).
Thermal Infrared (IR) and Low-Light-Level (LLL) visible cameras are popular night imaging sensors that are widely used for surveillance and security applications (Zhang et al., 2009). A thermal camera provides information on objects radiating thermal energy and is useful in visibly dark conditions, such as seeing through fog. However, it is not as useful at capturing background information such as trees, leaves and grass in natural scenes. On the other hand, an LLL visible camera can often capture such background information in great detail by sensing reflected visible and near-infrared light (Gu et al., 2008; Liu and Huang, 2010).
Thermal IR and LLL visible cameras are not sufficiently capable for night vision when used individually (Wang et al., 2004). Their outputs are usually fused into a single image, which can provide more detail than either sensor alone; unfortunately, the result often has an unnatural color appearance. Therefore, a color transfer technique is required to change the false color appearance to a natural one.
Reinhard et al. (2001) introduced a method for correcting colors in a target image by borrowing the color characteristics of a reference image. In this approach, colors are transferred between two images by matching the means and standard deviations of the target and reference images in the decorrelated lαβ color space (Ruderman et al., 1998). Welsh et al. (2002) introduced a general technique to colorize a grayscale image by borrowing colors from a reference daytime image. They used the same color transfer concept, but because a grayscale image is represented by a one-dimensional distribution, they matched only the luminance channels of the reference color image and the grayscale target image. Toet (2005) applied Welsh's idea to give single-band intensified night vision imagery a natural daytime color appearance and showed that a color transfer method can transfer the natural color characteristics of a daytime color image to a fused multiband night vision image (Toet, 2003; Reinhard et al., 2001).
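To make the statistical transfer concrete, the following is a minimal NumPy sketch of the approach of Reinhard et al. (2001), assuming float RGB arrays with values in (0, 1]. The matrix values follow Ruderman et al. (1998); the function names and the small clipping constants are our own illustrative choices, not from the original implementation.

    import numpy as np

    # Ruderman et al. (1998) RGB -> LMS matrix and its inverse.
    RGB2LMS = np.array([[0.3811, 0.5783, 0.0402],
                        [0.1967, 0.7244, 0.0782],
                        [0.0241, 0.1288, 0.8444]])
    LMS2RGB = np.array([[ 4.4679, -3.5873,  0.1193],
                        [-1.2186,  2.3809, -0.1624],
                        [ 0.0497, -0.2439,  1.2045]])
    # Decorrelating log-LMS -> l-alpha-beta transform and its inverse.
    LMS2LAB = np.array([[1/np.sqrt(3),  1/np.sqrt(3),  1/np.sqrt(3)],
                        [1/np.sqrt(6),  1/np.sqrt(6), -2/np.sqrt(6)],
                        [1/np.sqrt(2), -1/np.sqrt(2),  0.0]])
    LAB2LMS = np.linalg.inv(LMS2LAB)

    def rgb_to_lab(rgb):
        lms = rgb @ RGB2LMS.T
        return np.log10(np.clip(lms, 1e-6, None)) @ LMS2LAB.T

    def lab_to_rgb(lab):
        lms = 10.0 ** (lab @ LAB2LMS.T)
        return np.clip(lms @ LMS2RGB.T, 0.0, 1.0)

    def color_transfer(target_rgb, reference_rgb):
        # Shift each l-alpha-beta channel of the target so its mean and
        # standard deviation match those of the reference image.
        t, r = rgb_to_lab(target_rgb), rgb_to_lab(reference_rgb)
        t_mu, t_sd = t.mean(axis=(0, 1)), t.std(axis=(0, 1))
        r_mu, r_sd = r.mean(axis=(0, 1)), r.std(axis=(0, 1))
        matched = (t - t_mu) * (r_sd / (t_sd + 1e-6)) + r_mu
        return lab_to_rgb(matched)

Because only per-channel means and standard deviations are matched, the transfer is global: every pixel is shifted by the same statistics regardless of what it depicts.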
The main drawback of this statistical approach is that a large object in the reference image can dominate the color mapping, and it addresses only the global color characteristics of the depicted scene. Hogervorst and Toet (2008) described an alternative Lookup Table (LUT) based method that alleviates these drawbacks. They derived a colormap from the combination of a nighttime false color fused image and a corresponding daylight color image. In addition, this approach is fast enough for real-time applications, as sketched below.
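The following sketch illustrates the LUT idea under stated assumptions: the two sensor bands are stacked as an (H, W, 2) array in [0, 1], each band is quantized to a fixed number of levels, and each bin's LUT entry is the average daytime color of the co-registered reference pixels falling in that bin. The bin count and function names are illustrative, not from the original implementation.

    import numpy as np

    def derive_lut(fused_bands, reference_rgb, levels=32):
        # fused_bands: (H, W, 2) two sensor bands; reference_rgb: (H, W, 3)
        # registered daytime image. One LUT bin per quantized band pair.
        idx = np.minimum((fused_bands * levels).astype(int), levels - 1)
        flat = idx[..., 0] * levels + idx[..., 1]
        lut = np.zeros((levels * levels, 3))
        counts = np.bincount(flat.ravel(), minlength=levels * levels)
        for c in range(3):  # average reference color per bin
            sums = np.bincount(flat.ravel(),
                               weights=reference_rgb[..., c].ravel(),
                               minlength=levels * levels)
            lut[:, c] = sums / np.maximum(counts, 1)
        return lut

    def apply_lut(fused_bands, lut, levels=32):
        idx = np.minimum((fused_bands * levels).astype(int), levels - 1)
        flat = idx[..., 0] * levels + idx[..., 1]
        return lut[flat]  # (H, W, 3) colorized result

Once the LUT has been derived offline, colorizing a new frame reduces to one table lookup per pixel, which is what makes this class of method fast enough for real-time use.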
The appearance of the final fused image is highly dependent on the colors in the reference image. Levin et al. (2004) reported a unique method for colorizing a grayscale image in which the image was annotated with color scribbles and those colors were automatically propagated in both space and time to produce a fully colorized result. Inspired by their work, we proposed an approach that enhances the method developed by Hogervorst and Toet (2008). We presented a simple color enhancement method for deriving a colormap and showed how the false color fused image itself can be used to derive a colormap when a daytime reference image is not available.

Color Enhancement Technique Using a Look-up Table
The process for deriving a colormap from a multiband fused image and its corresponding registered reference image has been described previously (Hogervorst and Toet, 2008). The colors of a fused image are heavily dependent on the colors in the reference image used. Figure 1a shows a false color fused image, the result of fusing the thermal and visible images, and Fig. 1b shows the corresponding reference scene. Figure 1c shows the colorized result, in which the trees appear darker than in the reference image. Because the cloud and sky regions of the fused and reference images differ, it is also difficult to reproduce their colors correctly in the colorized result.
A simple way to improve the color appearance of the final result is to "scribble" reasonable colors over regions of interest in the reference image, as shown in Fig. 2a. The scribbles do not become visible in the resulting image because the average color of a group of pixels is calculated when the colormap is derived. For example, the derived color of the trees in the colorized fused image in Fig. 2b comes from the average of the tree and scribble colors in the reference image. As can be seen, the color appearance of the final result shown in Fig. 2c is improved compared to the result shown in Fig. 1c. Figure 3 shows an additional example of the color enhancement method used here. Figure 3a shows the fused image directly from the sensors, and Fig. 3b and 3c show the reference image and the reference image with scribbles, respectively. As before, the colorized image derived from the scribbled reference (Fig. 3e) has a more natural appearance than the one derived from the unscribbled reference (Fig. 3d). In addition, the sky appears gray in Fig. 3d but blue in Fig. 3e, and the grass and trees appear as more distinct colors in Fig. 3e than in Fig. 3d.
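In terms of the LUT sketch given earlier, scribbling amounts to overwriting reference pixels before the per-bin averaging, so the scribble colors bias each bin's mean rather than appearing verbatim. The masks and color values below are hypothetical stand-ins for hand-drawn scribbles, not values from the paper.

    # tree_mask and sky_mask are assumed boolean (H, W) arrays marking
    # the scribbled regions; the colors are plausible choices.
    scribbled = reference_rgb.copy()
    scribbled[tree_mask] = np.array([0.13, 0.40, 0.13])  # foliage green
    scribbled[sky_mask] = np.array([0.45, 0.65, 0.90])   # sky blue
    lut = derive_lut(fused_bands, scribbled, levels=32)
    colorized = apply_lut(fused_bands, lut, levels=32)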

Colormap from a Synthetic Image
Often, a fused image cannot be registered well with a reference image. For example, if an image of a scene is taken by a camera at an angle different from that of the reference, a synthetic view of the scene can be generated and used to derive the colormap. One way to make a synthetic view of a scene is to manually colorize a copy of the multiband fused image. An example of this approach is shown in Fig. 4. Figure 4a shows the original fused image and Fig. 4b shows the original reference image, which is difficult to register properly and so will not produce a good result. Therefore, the fused image was manually colored to produce the reference image shown in Fig. 4c. As can be seen in the resulting colorized image in Fig. 4d, objects have a natural appearance that closely resembles the corresponding daytime image. Another example is shown in Fig. 5 with similar results.
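In the same sketch, the manually colored copy of the fused image (loaded here as a hypothetical synthetic_rgb array) simply replaces the daytime reference; registration is exact by construction, because the synthetic reference shares the pixel grid of the fused image itself.

    # synthetic_rgb: a manually colorized copy of the fused image (assumed).
    lut = derive_lut(fused_bands, synthetic_rgb, levels=32)
    colorized = apply_lut(fused_bands, lut, levels=32)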

Conclusion
We demonstrated that an improved colormap for night vision applications can be derived by manually annotating a reference image. When a reference image is not available, the method can be applied to the initial fused image to generate a synthetic reference image. In both cases, we found the resulting colorized night vision images to be superior to those produced without annotating the reference image. Such a method may prove useful for making night imagery appear more day-like.