Hindawi Publishing Corporation, Advances in Astronomy, Volume 2015, Article ID 203872, 8 pages; http://dx.doi.org/10.1155/2015/203872

Research Article

An Improved Infrared/Visible Fusion for Astronomical Images

Attiq Ahmad,1 Muhammad Mohsin Riaz,2 Abdul Ghafoor,1 and Tahir Zaidi3

1Military College of Signals, National University of Sciences and Technology (NUST), Islamabad 44000, Pakistan
2Center for Advanced Studies in Telecommunication, COMSATS, Islamabad 44000, Pakistan
3College of Electrical and Mechanical Engineering, NUST, Islamabad 44000, Pakistan

Correspondence should be addressed to Abdul Ghafoor; [email protected]

Received 30 April 2015; Revised 4 August 2015; Accepted 16 August 2015

Academic Editor: Dean Hines

Copyright © 2015 Attiq Ahmad et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

An undecimated dual tree complex wavelet transform (UDTCWT) based fusion scheme for astronomical visible/IR images is developed. The UDTCWT reduces noise effects and improves object classification due to its inherent shift invariance property. Local standard deviation and distance transforms are used to extract useful information (especially small objects). Simulation results, compared with state-of-the-art fusion techniques, illustrate the superiority of the proposed scheme in terms of accuracy in most of the cases.

1. Introduction

Visible light astronomy, through reflection, refraction, interference, and diffraction, enables scientists to unearth many of nature's secrets; however, the brightness of stars creates a haze in the sky. On the other hand, infrared (IR) astronomy enables us to peer through the veil of interstellar dust and see objects at extreme cosmological distances. IR images have good radiometric resolution, whereas visible images provide detailed information. In this regard, various image fusion techniques have been developed to combine the complementary information present in both images.
These techniques can be grouped into wavelet, statistical decomposition, and compressive sensing based schemes.

The wavelet transform based fusion schemes generally decompose the visible and IR images into different base and detail layers to combine the useful information. In [1], contourlet transform fusion is used to separate foreground and background information; however, the separation is not always accurate, which causes loss of target information. In [2], a fusion scheme based on the nonsubsampled contourlet transform, local energy, and fuzzy logic claims better subjective visual effects; however, the merging and description of the necessary components of the IR and visible images in the fusion model require improvement, especially for noisy images. In [3], a wavelet transform and fuzzy logic based scheme utilizes a dissimilarity measure to assign weights; however, some artifacts are also introduced in the fused image. Contrast enhancement (using the ratio of local and global divergence of the IR image) based fusion lacks color consistency [4]. In the adaptive intensity-hue-saturation method [5], the amount of spatial detail injected into each band of the multispectral image is determined by a weighting matrix defined on the basis of the edges present in the panchromatic and multispectral bands. The scheme preserves the spatial details; however, it is unable to sufficiently control the spectral distortion [6]. In [7], a gradient-domain approach based on contrast mapping projects the structure tensor matrix onto a low-dimensional gradient field; however, the scheme affects the natural output colours. In [8], a wavelet transform and segmentation based fusion scheme is developed to enhance targets in low contrast; however, the fusion performance depends on segmentation quality, and large segmentation errors can occur for cosmological images (especially when one feature is split into multiple regions).
Statistical fusion schemes split the images into multiple subspaces using different matrix decomposition techniques. A K-means and singular value decomposition based scheme suffers from computational complexity [9]. In [10], a spatial and spectral fusion model uses sparse matrix factorization



to fuse images with different spatial and spectral properties. The scheme combines the spectral information from sensors having low spatial but high spectral resolution with the spatial information from sensors having high spatial but low spectral resolution. Although the scheme produces better fused results with well preserved spectral and spatial properties, its issues include the spectral dictionary learning process and computational complexity. In [11], an internal generative mechanism based fusion algorithm first decomposes each source image into a coarse layer and a detail layer by simulating the mechanism of the human visual system for perceiving images. Then the detail layer is fused using a pulse coupled neural network, and the coarse layer is fused using the spectral residual based saliency method. The scheme is time inefficient and yields weak fusion performance. In [12], an independent component analysis based IR and visible image fusion scheme uses kurtosis information of the independent component analysis coefficients. However, further work is required for determining fusion rules for primary features.

Compressive sensing based fusion schemes exploit the sparsity of data using different dictionaries. An adjustable compressive measurement based fusion scheme suffers from empirical adjustment of different parameters [13]. In [14], a compressive sensing approach preserves data (such as edges, lines, and contours); however, the design of an appropriate sparse transform and an optimal deterministic measurement matrix is an issue. In [15], a compressive sensing based image fusion scheme (for infrared and visible images) first compresses the sensing data by random projection and then obtains sparse coefficients on the compressed samples by sparse representation. The fusion coefficients are finally combined with the fusion impact factor, and the fused image is reconstructed from the combined sparse coefficients. However, the scheme is inefficient and prone to noise effects. In [16], a nonnegative sparse representation based scheme is used to extract the features of the source images. Methods are developed to detect the salient features (which include the target and contours) in the IR image and the texture features in the visible image. Although the scheme performs better for noisy images, the sparseness of the image is controlled only implicitly.

In a nutshell, the above-mentioned state-of-the-art fusion techniques suffer from limited accuracy, high computational complexity, or nonrobustness. To overcome these issues, a UDTCWT based visible/IR image fusion scheme for astronomical images is developed. The UDTCWT reduces noise effects and improves object classification due to its inherent shift invariance property. Local standard deviation along with distance transforms is used to extract useful information (especially small objects). Simulation results illustrate the superiority of the proposed scheme in terms of accuracy in most of the cases.

2. Proposed Method

Let $I_k$ be the input registered IR ($k = 1$) and visible ($k = 2$) source images (with dimensions $M \times N$). The local standard deviation $\sigma_k$, used to estimate the local variations of $I_k$, is

$$\sigma_k(m,n) = \sqrt{\frac{\sum_{\tilde{m}=m-m_1}^{m+m_1}\sum_{\tilde{n}=n-n_1}^{n+n_1}\left(I_k(\tilde{m},\tilde{n}) - \bar{I}_k(m,n)\right)^2}{(2m_1+1)(2n_1+1)}},\tag{1}$$

where $\bar{I}_k$ is the local mean image computed as

$$\bar{I}_k(m,n) = \frac{1}{(2m_1+1)(2n_1+1)}\sum_{\tilde{m}=m-m_1}^{m+m_1}\sum_{\tilde{n}=n-n_1}^{n+n_1} I_k(\tilde{m},\tilde{n}).\tag{2}$$

The local standard deviation measures the randomness of pixels in a local area: high values indicate the presence of astrobodies, whereas low values correspond to smooth/blank space (without any object or astrobody).
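The local mean (2) and local standard deviation (1) can be computed with box filters. A minimal sketch, assuming a hypothetical window half-size of $m_1 = n_1 = 1$ and using `scipy.ndimage.uniform_filter` as the box filter (the paper does not specify the implementation):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_std(I, m1=1, n1=1):
    """Local standard deviation over a (2*m1+1) x (2*n1+1) window, as in (1)-(2)."""
    size = (2 * m1 + 1, 2 * n1 + 1)
    mean = uniform_filter(I.astype(float), size=size)         # local mean, (2)
    mean_sq = uniform_filter(I.astype(float) ** 2, size=size)  # local mean of squares
    var = np.maximum(mean_sq - mean ** 2, 0.0)                 # E[I^2] - (E[I])^2 >= 0
    return np.sqrt(var)
```

The identity $\mathrm{Var} = E[I^2] - (E[I])^2$ avoids an explicit double loop over window offsets; the `np.maximum` guard absorbs tiny negative values from floating-point cancellation.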

The image $L_k$ is obtained by thresholding $\sigma_k$ to remove the pixels containing large variations, that is,

$$L_k(m,n) = \begin{cases} \sigma_k(m,n), & \sigma_k(m,n) < \bar{\sigma}_k + \zeta_k \operatorname{Var}[\sigma_k], \\ 0, & \text{otherwise}, \end{cases}\tag{3}$$

where $\zeta_k \in [0.1, 1.2]$ is a controlling parameter and $\bar{\sigma}_k$ and $\operatorname{Var}[\sigma_k]$ are the mean and variance of $\sigma_k$, respectively. The gray distance image $D_k$ (used to classify points lying inside/outside any shape/object) is computed from $L_k$ and the mask $L_{\text{mask}}$ as

$$D_k \xleftarrow{\text{Distance Transform}} (L_k, L_{\text{mask}}).\tag{4}$$

The distance transform (used to eliminate oversegmentation and shortsightedness) measures the overall distance of a pixel from other bright pixels. For instance, a pixel closer to a cluster of stars (objects) tends to become part of the segmented mask, and vice versa.
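Steps (3)-(4) can be sketched with an exact Euclidean distance transform. The `zeta` default and the use of `scipy.ndimage.distance_transform_edt` (standing in for the paper's masked gray distance transform) are assumptions of this illustration:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def gray_distance(sigma, zeta=0.5):
    # (3): keep only local-std values below the mean-plus-scaled-variance threshold
    thr = sigma.mean() + zeta * sigma.var()
    L = np.where(sigma < thr, sigma, 0.0)
    # (4): distance of every pixel to the nearest retained (nonzero) pixel;
    # distance_transform_edt measures the distance to the nearest zero of its
    # input, so the background mask (L == 0) is passed in.
    return distance_transform_edt(L == 0)
```

The returned map is zero on retained pixels and grows with distance from them, which matches the proximity behaviour described above.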

Let $E_k$ be the binary image obtained from the distance image $D_k$, that is,

$$E_k(m,n) = \begin{cases} 1, & D_k(m,n) > \lambda_k \bar{D}_k, \\ 0, & \text{otherwise}, \end{cases}\tag{5}$$

where $\bar{D}_k$ denotes the mean of $D_k$ and $\lambda_k$ is a positive constant. The image $E_k$ segments the foreground from the background regions. The connected components image $C_k$ (which labels the different binary patterns) with structuring element $\phi$ (a $3 \times 3$ matrix of all ones) is

$$C_k \xleftarrow{\text{Connected Component Labeling}} (E_k, \phi).\tag{6}$$

Let $a_k^q(m,n)$ and $p_k^q(m,n)$ represent the area and perimeter of the $q$th connected component placed at the $(m,n)$th location, respectively; a binary segmented image $S_k$ is constructed as

$$S_k(m,n) = \begin{cases} 1, & a_k^q(m,n) > \beta, \ \dfrac{p_k^q(m,n)}{a_k^q(m,n)} < \gamma, \\ 0, & \text{otherwise}, \end{cases}\tag{7}$$
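Steps (5)-(7) amount to thresholding, 8-connected labeling, and an area/compactness filter. A sketch with illustrative thresholds (`lam`, `beta`, `gamma` here are hypothetical values, not the paper's) and a boundary-pixel count as the perimeter estimate:

```python
import numpy as np
from scipy.ndimage import label, binary_erosion

def segment(D, lam=1.5, beta=4, gamma=2.0):
    E = D > lam * D.mean()                          # (5): binarize against scaled mean
    C, num = label(E, structure=np.ones((3, 3)))    # (6): 8-connected components
    S = np.zeros_like(E, dtype=bool)
    for q in range(1, num + 1):
        comp = (C == q)
        area = comp.sum()
        perim = area - binary_erosion(comp).sum()   # boundary pixels of the component
        if area > beta and perim / area < gamma:    # (7): keep compact, large-enough blobs
            S |= comp
    return S
```

The perimeter/area ratio rejects thin, elongated artifacts while the area threshold rejects isolated noise pixels.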

Figure 1: Flow diagram of the proposed scheme.

where $\beta \geq 10$ and $\gamma \leq 15$ are thresholding parameters.

UDTCWT is applied on the source images $I_k$ to obtain coefficient matrices $I_k^U$ of dimensions $M \times N \times L$ (where $l = 1, 2, \ldots, L$ indexes the wavelet coefficients). The decomposition obtained using UDTCWT not only eliminates noise/unwanted artifacts but is also effective in preserving the useful information present in the input images (due to its undecimated property). A binary coefficient matrix $\tilde{I}_U$ is obtained by assigning nonzero values at pixel locations where the visible image provides more information than the IR image. This binary thresholding ensures that the fused image contains the significant/important information of both source images (as higher UDTCWT magnitudes correspond to the presence of significant/important information):

$$\tilde{I}_U(m,n,l) = \begin{cases} 1, & \left|I_2^U(m,n,l)\right| > \left|I_1^U(m,n,l)\right|, \\ 0, & \text{otherwise}. \end{cases}\tag{8}$$

A binary fuse map $\tilde{I}_F$ is computed as

$$\tilde{I}_F(m,n,l) = \tilde{I}_U(m,n,l) \oplus S_1(m,n) \oplus S_2(m,n),\tag{9}$$

where $\oplus$ represents the logical OR operation. The fused coefficients $I_F$ are then selected as

$$I_F(m,n,l) = \begin{cases} I_2^U(m,n,l), & \text{if } \tilde{I}_F(m,n,l) = 1, \\ I_1^U(m,n,l), & \text{otherwise}. \end{cases}\tag{10}$$

The final fused image $F$ is obtained by computing the inverse UDTCWT of the fused coefficients $I_F$. Figure 1 shows the flow diagram of the proposed technique.
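The coefficient-selection rules (8)-(10) reduce to elementwise logical operations on the two coefficient stacks. A sketch that assumes the UDTCWT analysis/synthesis is performed elsewhere and operates on generic $(M, N, L)$ coefficient arrays:

```python
import numpy as np

def fuse_coefficients(U1, U2, S1, S2):
    """U1/U2: IR/visible coefficient stacks of shape (M, N, L);
    S1/S2: boolean segmented maps of shape (M, N)."""
    IU = np.abs(U2) > np.abs(U1)               # (8): where the visible image dominates
    IF = IU | S1[..., None] | S2[..., None]    # (9): OR with both segmented maps
    return np.where(IF, U2, U1)                # (10): pick visible, else IR, coefficients
```

Broadcasting the 2-D segmented maps over the subband axis with `[..., None]` applies the same spatial mask to every coefficient level, as (9) implies.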

3. Results and Discussion

To verify the significance of the proposed technique, simulations are performed on various visible/IR datasets. Quantitative analysis is performed using $Q_o$ (luminance/contrast distortion), $Q_{\text{MI}}$ (mutual information), $Q_w$ (weighted quality index), $Q_e$ (edge dependent quality index), $Q_S$ (structural similarity index measure), $Q_B$ (human perception inspired metric), $Q_X$ (edge transform metric), and $Q_Z$ (image feature metric) [17-22].

Figure 2: Andromeda galaxy (M31): (a) visible image, (b) IR image, (c) local variance image, (d) distance image, (e) IR segmented image, and (f) visual segmented image.

The $Q_o$ metric [17, 18] is designed through modality image distortion as a combination of loss of correlation, luminance distortion, and contrast distortion. The $Q_{\text{MI}}$ metric [17] represents the orientation preservation and edge strength values. It models the perceptual loss of information in the fused results in terms of how well the strength and orientation values of pixels in the source images are represented in the fused image. It deals with the problem of objective evaluation of dynamic multisensor image fusion based on gradient information preservation between the inputs and the fused images, and it also takes into account additional scene and object motion information present in multisensor sequences. The $Q_w$ metric [17] is defined by assigning more weight to those windows where the saliency of the input image is high; these correspond to the areas that are likely to be perceptually important parts of the underlying scene. The $Q_e$ index [17] takes into account aspects of the human visual system, expressing the contribution of the edge information of the source images to the fused image. The $Q_S$ measure [19] is the similarity between two images and is designed to improve on the traditional measures of mean square error and peak signal to noise ratio, which are inconsistent with human eye perception. The $Q_B$ metric [20] evaluates the performance of image fusion for night vision applications using a perceptual quality evaluation method based on human visual system models; the image quality of the fused image is assessed by a contrast sensitivity function and a contrast preservation map. The $Q_X$ metric [21] assesses the pixel-level fusion performance and reflects the quality of visual information obtained from the fusion of the input images. The $Q_Z$ metric [22] evaluates the performance of combinative pixel-level image fusion based on an image feature measurement (i.e., phase congruency and its moments) and provides an absolute measurement of image features. By comparing the local cross-correlation of the corresponding feature maps of the input images and the fused output, the quality of the fused result is assessed without a reference image.

Figure 3: Andromeda galaxy (M31): (a) RP [25] fusion, (b) DTCWT [26] fusion, (c) NSCT [27] fusion, (d) MSVD [28] fusion, (e) Ellmauthaler et al. [8] fusion, and (f) proposed fusion.

These quality metrics [17-22] work well for noisy, blurred, and distorted images using multiscale transformation, arithmetic, statistical, and compressive sensing based schemes in multiexposure, multiresolution, and multimodal environments. They are also useful for remote and airborne sensing, military, and industrial engineering related applications. The normalized range of these measures is between 0 and 1, where higher values imply a better fusion result for each quality measure.
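As an example of how such metrics are computed, the universal image quality index of Wang and Bovik [18], the building block of $Q_o$, can be sketched in its global, single-window form (the published metric averages this quantity over local sliding windows, which is omitted here for brevity):

```python
import numpy as np

def uiqi(x, y):
    # Universal image quality index: the product of correlation loss,
    # luminance distortion, and contrast distortion terms, collapsed
    # into the single expression 4*cov*mx*my / ((vx+vy)*(mx^2+my^2)).
    x = x.astype(float).ravel()
    y = y.astype(float).ravel()
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return 4.0 * cov * mx * my / ((vx + vy) * (mx ** 2 + my ** 2))
```

The index equals 1 only when the two images are identical and decreases (possibly below 0, for anticorrelated content) as they diverge.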

Figure 2(a) shows the Andromeda galaxy (M31) JPEG IR image taken by the Spitzer space telescope [23], while Figure 2(b) shows the corresponding visible image taken using a 12.5″ Ritchey-Chretien Cassegrain (at F/6) and an ST10XME camera [24]. Figures 2(c)-2(f) are the outputs of the local variance, distance transform, and segmentation steps.

Figure 3 shows the fusion results obtained by the ratio pyramid (RP) [25], dual tree complex wavelet transform (DTCWT) [26], nonsubsampled contourlet transform (NSCT) [27], multiresolution singular value decomposition (MSVD) [28], Ellmauthaler et al. [8], and proposed schemes. By visual comparison, it can be noticed that the proposed scheme provides better fusion results, especially the preservation of background intensity values, as compared to the existing state-of-the-art schemes.


Figure 4: Jupiter's moon: (a) IR image, (b) visible image, (c) RP [25] fusion, (d) DTCWT [26] fusion, (e) NSCT [27] fusion, (f) MSVD [28] fusion, (g) Ellmauthaler et al. [8] fusion, and (h) proposed fusion.

Figure 5: Nebula (M16): (a) IR image, (b) visible image, (c) RP [25] fusion, (d) DTCWT [26] fusion, (e) NSCT [27] fusion, (f) MSVD [28] fusion, (g) Ellmauthaler et al. [8] fusion, and (h) proposed fusion.

Table 1: Quantitative comparison.

Andromeda galaxy (M31)
  Technique                  Q_o     Q_e     Q_w     Q_S     Q_MI    Q_B     Q_X     Q_Z
  Proposed                 0.8220  0.8319  0.7707  0.4804  0.6461  0.3487  0.6612  0.4615
  Ellmauthaler et al. [8]  0.7179  0.8444  0.7345  0.3647  0.6350  0.2229  0.5610  0.3387
  MSVD [28]                0.6452  0.6946  0.6243  0.4148  0.4535  0.4432  0.0045  0.2675
  NSCT [27]                0.6259  0.6003  0.5576  0.2641  0.4606  0.1777  0.3432  0.2689
  DTCWT [26]               0.7085  0.8134  0.6573  0.3113  0.5436  0.1682  0.2682  0.2290
  RP [25]                  0.4706  0.5416  0.5102  0.2514  0.3970  0.5282  0.2075  0.2026

Jupiter's moon
  Proposed                 0.7927  0.7255  0.7814  0.4622  0.7566  0.3433  0.6725  0.5617
  Ellmauthaler et al. [8]  0.2832  0.6477  0.6343  0.4230  0.7398  0.1768  0.5672  0.5614
  MSVD [28]                0.4780  0.4970  0.5217  0.5243  0.5292  0.4599  0.0065  0.4923
  NSCT [27]                0.4155  0.4083  0.4631  0.5279  0.5212  0.2672  0.5001  0.6139
  DTCWT [26]               0.4571  0.5932  0.5851  0.3476  0.5973  0.1805  0.4785  0.4844
  RP [25]                  0.3467  0.3989  0.4022  0.4749  0.3919  0.4825  0.0183  0.3634

Nebula (M16)
  Proposed                 0.7399  0.4896  0.8461  0.8587  0.8645  0.7646  0.7553  0.5652
  Ellmauthaler et al. [8]  0.7318  0.4230  0.8446  0.8494  0.8563  0.5736  0.5424  0.5494
  MSVD [28]                0.4918  0.4120  0.6466  0.6704  0.6539  0.5598  0.0051  0.4331
  NSCT [27]                0.5204  0.2980  0.6037  0.5899  0.6013  0.4916  0.3953  0.5058
  DTCWT [26]               0.6736  0.3608  0.8023  0.8225  0.8125  0.4477  0.4631  0.5079
  RP [25]                  0.5885  0.3099  0.6834  0.6818  0.6934  0.6597  0.1708  0.4639

Figures 4(a) and 4(b) show visible and IR Jupiter's moon JPEG images taken by the New Horizons spacecraft using its multispectral visible imaging camera and linear Etalon imaging spectral array [29]. The fusion results obtained by the RP [25], DTCWT [26], NSCT [27], MSVD [28], Ellmauthaler et al. [8], and proposed schemes are shown in Figures 4(c)-4(h), respectively. Note that only the proposed scheme is able to accurately preserve both the moon texture (from the IR image) and the other stars (from the visible image) in the fused image.

Figures 5(a) and 5(b) show visible and IR Nebula (M16) JPEG images taken by the Hubble space telescope [30]. The fusion results obtained by the RP [25], DTCWT [26], NSCT [27], MSVD [28], Ellmauthaler et al. [8], and proposed schemes are shown in Figures 5(c)-5(h), respectively. The fused image using the proposed scheme highlights the IR information more accurately as compared to the existing state-of-the-art schemes.

Table 1 shows the quantitative comparison of the existing and proposed schemes. It can be observed that the results obtained using the proposed scheme are significantly better in most of the cases and measures as compared to the existing state-of-the-art schemes.

4. Conclusion

A fusion scheme for astronomical visible/IR images based on UDTCWT, local standard deviation, and distance transform is proposed. The use of UDTCWT helps retain the useful details of the image. The local standard deviation measures the presence or absence of small objects. The distance transform incorporates the effects of proximity in the segmentation process and eliminates the effects of oversegmentation in addition to shortsightedness. The scheme reduces noise artifacts and efficiently extracts the useful information (especially small objects). Simulation results on different visible/IR images verify the effectiveness of the proposed scheme.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

References

[1] L. Kun, G. Lei, L. Huihui, and C. Jingsong, "Fusion of infrared and visible light images based on region segmentation," Chinese Journal of Aeronautics, vol. 22, no. 1, pp. 75-80, 2009.

[2] Z. Fu, X. Dai, Y. Li, H. Wu, and X. Wang, "An improved visible and infrared image fusion based on local energy and fuzzy logic," in Proceedings of the 12th IEEE International Conference on Signal Processing, pp. 861-865, October 2014.

[3] J. Saeedi and K. Faez, "Infrared and visible image fusion using fuzzy logic and population-based optimization," Applied Soft Computing, vol. 12, no. 3, pp. 1041-1054, 2012.

[4] S. Yin, L. Cao, Y. Ling, and G. Jin, "One color contrast enhanced infrared and visible image fusion method," Infrared Physics and Technology, vol. 53, no. 2, pp. 146-150, 2010.

[5] Y. Leung, J. Liu, and J. Zhang, "An improved adaptive intensity-hue-saturation method for the fusion of remote sensing images," IEEE Geoscience and Remote Sensing Letters, vol. 11, no. 5, pp. 985-989, 2014.

[6] B. Chen and B. Xu, "A unified spatial-spectral-temporal fusion model using Landsat and MODIS imagery," in Proceedings of the 3rd International Workshop on Earth Observation and Remote Sensing Applications (EORSA '14), pp. 256-260, IEEE, Changsha, China, June 2014.

[7] D. Connah, M. S. Drew, and G. D. Finlayson, Spectral Edge Image Fusion: Theory and Applications, Springer, 2014.

[8] A. Ellmauthaler, E. A. B. da Silva, C. L. Pagliari, and S. R. Neves, "Infrared-visible image fusion using the undecimated wavelet transform with spectral factorization and target extraction," in Proceedings of the 19th IEEE International Conference on Image Processing (ICIP '12), pp. 2661-2664, October 2012.

[9] M. Ding, L. Wei, and B. Wang, "Research on fusion method for infrared and visible images via compressive sensing," Infrared Physics and Technology, vol. 57, pp. 56-67, 2013.

[10] B. Huang, H. Song, H. Cui, J. Peng, and Z. Xu, "Spatial and spectral image fusion using sparse matrix factorization," IEEE Transactions on Geoscience and Remote Sensing, vol. 52, no. 3, pp. 1693-1704, 2014.

[11] X. Zhang, X. Li, Y. Feng, H. Zhao, and Z. Liu, "Image fusion with internal generative mechanism," Expert Systems with Applications, vol. 42, no. 5, pp. 2382-2391, 2015.

[12] Y. Lu, F. Wang, X. Luo, and F. Liu, "Novel infrared and visible image fusion method based on independent component analysis," Frontiers in Computer Science, vol. 8, no. 2, pp. 243-254, 2014.

[13] A. Jameel, A. Ghafoor, and M. M. Riaz, "Adaptive compressive fusion for visible/IR sensors," IEEE Sensors Journal, vol. 14, no. 7, pp. 2230-2231, 2014.

[14] Z. Liu, H. Yin, B. Fang, and Y. Chai, "A novel fusion scheme for visible and infrared images based on compressive sensing," Optics Communications, vol. 335, pp. 168-177, 2015.

[15] R. Wang and L. Du, "Infrared and visible image fusion based on random projection and sparse representation," International Journal of Remote Sensing, vol. 35, no. 5, pp. 1640-1652, 2014.

[16] J. Wang, J. Peng, X. Feng, G. He, and J. Fan, "Fusion method for infrared and visible images by using non-negative sparse representation," Infrared Physics and Technology, vol. 67, pp. 477-489, 2014.

[17] G. Piella and H. Heijmans, "A new quality metric for image fusion," in Proceedings of the IEEE International Conference on Image Processing, pp. 171-173, September 2003.

[18] Z. Wang and A. C. Bovik, "A universal image quality index," IEEE Signal Processing Letters, vol. 9, no. 3, pp. 81-84, 2002.

[19] Z. Wang, A. C. Bovik, H. R. Sheikh, and E. P. Simoncelli, "Image quality assessment: from error visibility to structural similarity," IEEE Transactions on Image Processing, vol. 13, no. 4, pp. 600-612, 2004.

[20] Y. Chen and R. S. Blum, "A new automated quality assessment algorithm for image fusion," Image and Vision Computing, vol. 27, no. 10, pp. 1421-1432, 2009.

[21] C. S. Xydeas and V. Petrovic, "Objective image fusion performance measure," Electronics Letters, vol. 36, no. 4, pp. 308-309, 2000.

[22] J. Zhao, R. Laganiere, and Z. Liu, "Performance assessment of combinative pixel-level image fusion based on an absolute feature measurement," International Journal of Innovative Computing, Information and Control, vol. 3, no. 6, pp. 1433-1447, 2007.

[23] Andromeda galaxy (M31) IR image, http://sci.esa.int/herschel/48182-multiwavelength-images-of-the-andromeda-galaxy-m31.

[24] Andromeda galaxy (M31) visible image, http://www.robgendlerastropics.com/M31Page.html.

[25] A. Toet, "Image fusion by a ratio of low-pass pyramid," Pattern Recognition Letters, vol. 9, no. 4, pp. 245-253, 1989.

[26] J. J. Lewis, R. J. O'Callaghan, S. G. Nikolov, D. R. Bull, and N. Canagarajah, "Pixel- and region-based image fusion with complex wavelets," Information Fusion, vol. 8, no. 2, pp. 119-130, 2007.

[27] Q. Zhang and B.-L. Guo, "Multifocus image fusion using the nonsubsampled contourlet transform," Signal Processing, vol. 89, no. 7, pp. 1334-1346, 2009.

[28] V. P. S. Naidu, "Image fusion technique using multi-resolution singular value decomposition," Defence Science Journal, vol. 61, no. 5, pp. 479-484, 2011.

[29] Jupiter's moon image, http://www.technology.org/2014/12/03/plutos-closeup-will-awesome-based-jupiter-pics-new-horizons-spacecraft.

[30] Nebula (M16) image, http://webbtelescope.org/webb_telescope/technology_at_the_extremes/keep_it_cold.php.


Page 2: Research Article An Improved Infrared/Visible Fusion for ...

Advances in Astronomy

to fuse images with different spatial and spectral properties. The scheme combines the spectral information from sensors having low spatial but high spectral resolution with the spatial information from sensors having high spatial but low spectral resolution. Although the scheme produces better fused results with well preserved spectral and spatial properties, its issues include the spectral dictionary learning process and computational complexity. In [11], an internal generative mechanism based fusion algorithm first decomposes a source image into a coarse layer and a detail layer by simulating the mechanism of the human visual system for perceiving images. Then the detail layer is fused using a pulse coupled neural network, and the coarse layer is fused by using the spectral residual based saliency method. The scheme is time inefficient and yields weak fusion performance. In [12], an independent component analysis based IR and visible image fusion scheme uses kurtosis information of the independent component analysis based coefficients. However, further work is required for determining fusion rules of primary features.

Compressive sensing based fusion schemes exploit the sparsity of data using different dictionaries. An adjustable compressive measurement based fusion scheme suffers from empirical adjustment of different parameters [13]. In [14], a compressive sensing approach preserves data (such as edges, lines, and contours); however, the design of an appropriate sparse transform and an optimal deterministic measurement matrix is an issue. In [15], a compressive sensing based image fusion scheme (for infrared and visible images) first compresses the sensing data by random projection and then obtains sparse coefficients on the compressed samples by sparse representation. The fusion coefficients are finally combined with the fusion impact factor, and the fused image is reconstructed from the combined sparse coefficients. However, the scheme is inefficient and prone to noise effects. In [16], a nonnegative sparse representation based scheme is used to extract the features of source images. Methods are developed to detect the salient features (which include the target and contours) in the IR image and texture features in the visible image. Although the scheme performs better for noisy images, the sparseness of the image is controlled implicitly.

In a nutshell, the above-mentioned state-of-the-art fusion techniques suffer from limited accuracy, high computational complexity, or nonrobustness. To overcome these issues, a UDTCWT based visible/IR image fusion scheme for astronomical images is developed. The UDTCWT reduces noise effects and improves object classification due to its inherited shift invariance property. Local standard deviation along with distance transforms is used to extract useful information (especially small objects). Simulation results illustrate the superiority of the proposed scheme in terms of accuracy for most of the cases.

2 Proposed Method

Let I_k be the input source IR (k = 1) and visible (k = 2) registered images (with dimensions M × N). The local standard deviation σ_k for estimating local variations of I_k is

\[
\sigma_k(m,n) = \sqrt{\frac{\sum_{\bar{m}=m-m_1}^{m+m_1}\sum_{\bar{n}=n-n_1}^{n+n_1}\left(I_k(\bar{m},\bar{n}) - \bar{I}_k(m,n)\right)^2}{(2m_1+1)(2n_1+1)}} \tag{1}
\]

where Ī_k is the local mean image computed as

\[
\bar{I}_k(m,n) = \frac{1}{(2m_1+1)(2n_1+1)}\sum_{\bar{m}=m-m_1}^{m+m_1}\sum_{\bar{n}=n-n_1}^{n+n_1} I_k(\bar{m},\bar{n}) \tag{2}
\]

The local standard deviation measures the randomness of pixels in a local area, where high values indicate the presence of astrobodies and low values correspond to smooth/blank space (without any object or astrobody).
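The local statistics of Eqs. (1)-(2) can be sketched directly with a sliding window. The window half-sizes m1 = n1 = 1 and the reflective border padding below are illustrative assumptions; the paper does not state its border handling:

```python
import numpy as np

def local_std(img, m1=1, n1=1):
    """Local standard deviation of Eqs. (1)-(2) over a (2*m1+1) x (2*n1+1) window."""
    img = np.asarray(img, dtype=np.float64)
    M, N = img.shape
    out = np.zeros_like(img)
    w = (2 * m1 + 1) * (2 * n1 + 1)  # window size (2m1+1)(2n1+1)
    # Pad by reflection so border windows stay full-sized (an assumption).
    pad = np.pad(img, ((m1, m1), (n1, n1)), mode="reflect")
    for m in range(M):
        for n in range(N):
            win = pad[m:m + 2 * m1 + 1, n:n + 2 * n1 + 1]
            mean = win.sum() / w                                # local mean, Eq. (2)
            out[m, n] = np.sqrt(((win - mean) ** 2).sum() / w)  # Eq. (1)
    return out
```

A flat image yields zero everywhere, while any local variation produces a positive response, which is exactly the astrobody-versus-blank-space behavior described above.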

The image L_k is obtained by thresholding σ_k to remove the pixels containing large variations, that is,

\[
L_k(m,n) = \begin{cases} \sigma_k(m,n), & \sigma_k(m,n) < \bar{\sigma}_k + \zeta_k \operatorname{Var}[\sigma_k] \\ 0, & \text{otherwise} \end{cases} \tag{3}
\]

where ζ_k ∈ [0.1, 1.2] is a controlling parameter, and σ̄_k and Var[σ_k] are the mean and variance of σ_k, respectively. The gray distance image D_k (to classify different points which are present inside/outside any shape/object) is computed using L_k and a mask L_mask as

\[
D_k \xleftarrow{\text{Distance Transform}} \left(L_k, L_{\text{mask}}\right) \tag{4}
\]

The distance transform (used to eliminate oversegmentation and short sightedness) measures the overall distance of a pixel from other bright pixels. For instance, a pixel closer to a cluster of stars (objects) tends to be part of the segmented mask, and vice versa.
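The paper does not pin down the exact masked gray-distance variant used in Eq. (4). The sketch below is one plausible reading: the Euclidean distance from each pixel to the nearest empty (zero) pixel of L_k, so that pixels deep inside bright clusters receive large values; the brute-force search is for clarity only:

```python
import numpy as np

def gray_distance(L):
    """Illustrative stand-in for Eq. (4): for each pixel, the Euclidean distance
    to the nearest empty (zero) pixel of L, so pixels deep inside bright
    clusters score high. Brute force, O(M*N*K) for K background pixels."""
    ys, xs = np.nonzero(L == 0)
    if ys.size == 0:  # no background at all
        return np.full(L.shape, np.inf)
    M, N = L.shape
    D = np.empty((M, N))
    for m in range(M):
        for n in range(N):
            D[m, n] = np.sqrt((ys - m) ** 2 + (xs - n) ** 2).min()
    return D
```

Under this convention, isolated background pixels score zero and cluster interiors score high, matching the proximity behavior described above.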

Let E_k be the binary image obtained from the distance image D_k, that is,

\[
E_k(m,n) = \begin{cases} 1, & D_k(m,n) > \lambda_k \bar{D}_k \\ 0, & \text{otherwise} \end{cases} \tag{5}
\]

where D̄_k denotes the mean image and λ_k > 300 is a positive constant. The image E_k segments the foreground from the background regions. The connected components image C_k (to segment different binary patterns) with structuring element φ (a 3 × 3 matrix of all ones) is

\[
C_k \xleftarrow{\text{Connected Component Labeling}} \left(E_k, \phi\right) \tag{6}
\]

Let a_qk(m, n) and p_qk(m, n) represent the area and perimeter of the q-th connected component placed at the (m, n)-th location, respectively. A binary segmented image S_k is constructed as

\[
S_k(m,n) = \begin{cases} 1, & a_{qk}(m,n) > \beta \ \text{and}\ \dfrac{p_{qk}(m,n)}{a_{qk}(m,n)} < \gamma \\ 0, & \text{otherwise} \end{cases} \tag{7}
\]
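The labeling and filtering of Eqs. (6)-(7) can be sketched as follows. The 8-connected breadth-first labeling stands in for the connected-component step with the 3 × 3 all-ones structuring element; the β and γ defaults and the boundary-pixel perimeter estimate are illustrative assumptions:

```python
import numpy as np
from collections import deque

def segment(E, beta=10, gamma=15):
    """Sketch of Eqs. (6)-(7): label 8-connected components of binary E, then
    keep components with area > beta and perimeter/area ratio < gamma."""
    M, N = E.shape
    labels = np.zeros((M, N), dtype=int)
    S = np.zeros_like(E)
    cur = 0
    for m in range(M):
        for n in range(N):
            if E[m, n] and not labels[m, n]:
                cur += 1
                comp = []
                q = deque([(m, n)])
                labels[m, n] = cur
                while q:                      # BFS over the 8-neighborhood
                    y, x = q.popleft()
                    comp.append((y, x))
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            yy, xx = y + dy, x + dx
                            if 0 <= yy < M and 0 <= xx < N and E[yy, xx] and not labels[yy, xx]:
                                labels[yy, xx] = cur
                                q.append((yy, xx))
                area = len(comp)              # a_qk
                # p_qk: component pixels touching the background (4-neighborhood)
                perim = sum(
                    1 for (y, x) in comp
                    if any(not (0 <= y + dy < M and 0 <= x + dx < N) or not E[y + dy, x + dx]
                           for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)))
                )
                if area > beta and perim / area < gamma:  # Eq. (7)
                    for (y, x) in comp:
                        S[y, x] = 1
    return S
```

Small speckles (area ≤ β) and thin, elongated artifacts (large perimeter-to-area ratio) are dropped, leaving compact astrobody regions in S_k.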


Figure 1: Flow diagram of the proposed scheme. Each source image I_k (k = 1, 2) passes through the local standard deviation (σ_k), thresholding (L_k), gray distance image (D_k), binarization (E_k), connected components (C_k), and area/perimeter computation (a_qk, p_qk) stages to produce a binary segmented image S_k. In parallel, the UDTCWT of both images yields the coefficient matrices I_1U and I_2U, from which the binary coefficient matrix I_U, the binary fused map I_F, and the final fused map are formed; the inverse UDTCWT (IUDTCWT) produces the fused image F.

where β ≥ 10 and γ ≤ 15 are thresholding parameters. The UDTCWT is applied on the source images I_k to obtain coefficient matrices I_kU of dimensions M × N × L (where l = 1, 2, ..., L indexes the wavelet subbands). The decomposition obtained using the UDTCWT not only eliminates noise/unwanted artifacts but also is effective in preserving the useful information present in the input images (due to its undecimated property). The binary coefficient matrix I_U is obtained by assigning nonzero values at pixel locations where the visible image provides more information than the IR image. This binary thresholding ensures that the fused image contains the significant/important information of both source images (as a higher UDTCWT coefficient magnitude corresponds to the presence of significant/important information):

\[
I_U(m,n,l) = \begin{cases} 1, & \left|I_{2U}(m,n,l)\right| > \left|I_{1U}(m,n,l)\right| \\ 0, & \text{otherwise} \end{cases} \tag{8}
\]

A binary fused map I_F is computed as

\[
I_F(m,n,l) = I_U(m,n,l) \oplus S_1(m,n) \oplus S_2(m,n) \tag{9}
\]

where ⊕ represents the logical OR operation. Let

\[
\hat{I}_F(m,n,l) = \begin{cases} I_{2U}(m,n,l), & \text{if } I_F(m,n,l) = 1 \\ I_{1U}(m,n,l), & \text{otherwise} \end{cases} \tag{10}
\]

The final fused image F is obtained by computing the inverse UDTCWT of the fused coefficients Î_F. Figure 1 shows the flow diagram of the proposed technique.
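The coefficient-selection rule of Eqs. (8)-(10) reduces to elementwise logic once the subband coefficients are available. The sketch below operates on generic M × N × L real coefficient stacks and deliberately leaves out the UDTCWT and its inverse, which would wrap this step:

```python
import numpy as np

def fuse_coefficients(I1U, I2U, S1, S2):
    """Sketch of Eqs. (8)-(10): take visible-image (I2U) coefficients wherever
    their magnitude dominates or either binary segmented map fires; otherwise
    keep the IR (I1U) coefficients. I1U/I2U are M x N x L stacks; S1/S2 are
    M x N binary maps broadcast across subbands."""
    IU = np.abs(I2U) > np.abs(I1U)                       # Eq. (8)
    IF = IU | S1[..., None].astype(bool) | S2[..., None].astype(bool)  # Eq. (9), OR
    return np.where(IF, I2U, I1U)                        # Eq. (10)
```

The inverse UDTCWT of the returned stack would then yield the fused image F.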

3 Results and Discussion

To verify the significance of the proposed technique, simulations are performed on various visible/IR datasets. Quantitative analysis is performed using Q_o (luminance/contrast distortion), Q_MI (mutual information), Q_w (weighted quality index), Q_e (edge dependent quality index), Q_S (structural similarity index measure), Q_B (human perception inspired metric), Q_X (edge transform metric), and Q_Z (image feature metric) [17-22].

Figure 2: Andromeda galaxy (M31): (a) visible image, (b) IR image, (c) local variance image, (d) distance image, (e) IR segmented image, and (f) visual segmented image.

The Q_o metric [17, 18] is designed through modality image distortion as a combination of loss of correlation, luminance distortion, and contrast distortion. The Q_MI metric [17] represents the orientation preservation and edge strength values. It models the perceptual loss of information in fused results in terms of how well the strength and orientation values of pixels in the source images are represented in the fused image. It deals with the problem of objective evaluation of dynamic multisensor image fusion based on gradient information preservation between the inputs and the fused images, and it also takes into account additional scene and object motion information present in multisensor sequences. The Q_w metric [17] is defined by assigning more weight to those windows where the saliency of the input image is high; these correspond to areas that are likely to be perceptually important parts of the underlying scene. The Q_e index [17] takes into account aspects of the human visual system, where it expresses the contribution of the edge information of the source images to the fused image. The Q_S measure [19] is the similarity between two images and is designed to improve on the traditional measures of mean square error and peak signal to noise ratio, which are inconsistent with human eye perception. The Q_B metric [20] evaluates the performance of image fusion for night vision applications using a perceptual quality evaluation method based on human visual system models; the image quality of the fused image is assessed by a contrast sensitivity function and a contrast preservation map. The Q_X metric [21] assesses the pixel-level fusion performance and reflects the quality of visual information obtained from the fusion of input images. The Q_Z metric [22] evaluates the performance of combinative pixel-level image fusion based on an image feature measurement (i.e., phase congruency and its moments) and provides an absolute measurement of image features; by comparing the local cross-correlation of corresponding feature maps of the input images and the fused output, the quality of the fused result is assessed without a reference image.

Figure 3: Andromeda galaxy (M31): (a) RP [25] fusion, (b) DTCWT [26] fusion, (c) NSCT [27] fusion, (d) MSVD [28] fusion, (e) Ellmauthaler et al. [8] fusion, and (f) proposed fusion.

These quality metrics [17-22] work well for noisy, blurred, and distorted images using multiscale transformation, arithmetic, statistical, and compressive sensing based schemes for multiexposure, multiresolution, and multimodal environments. They are also useful for remote and airborne sensing, military, and industrial engineering related applications. The normalized range of these measures is between 0 and 1, where a high value implies a better fusion result for each quality measure.
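As an illustration, Q_o builds on the Wang-Bovik universal image quality index [18]. A global (single-window) sketch of that index is given below; in practice the index is computed over sliding windows and averaged, which is omitted here for brevity:

```python
import numpy as np

def uiqi(x, y):
    """Universal image quality index [18] over whole images:
    Q = 4 * cov(x,y) * mean(x) * mean(y) / ((var(x)+var(y)) * (mean(x)^2+mean(y)^2)).
    Combines correlation loss, luminance distortion, and contrast distortion."""
    x = np.asarray(x, dtype=np.float64).ravel()
    y = np.asarray(y, dtype=np.float64).ravel()
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    denom = (vx + vy) * (mx ** 2 + my ** 2)
    return 4 * cov * mx * my / denom if denom else 1.0
```

An image compared against itself scores exactly 1, and any luminance or contrast distortion pulls the score below 1, which is why higher values in Table 1 indicate better fusion.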

Figure 2(a) shows the Andromeda galaxy (M31) JPEG IR image taken by the Spitzer space telescope [23], while Figure 2(b) shows the corresponding visible image taken using a 12.5′′ Ritchey-Chrétien Cassegrain (at F/6) and an ST10XME camera [24]. Figures 2(c)-2(f) are the outputs of the local variance, distance transform, and segmentation steps.

Figure 3 shows the fusion results obtained by the ratio pyramid (RP) [25], dual tree complex wavelet transform (DTCWT) [26], nonsubsampled contourlet transform (NSCT) [27], multiresolution singular value decomposition (MSVD) [28], Ellmauthaler et al. [8], and proposed schemes. By visual comparison, it can be noticed that the proposed scheme provides better fusion results, especially the preservation of the background intensity values, as compared to the existing state-of-the-art schemes.


Figure 4: Jupiter's moon: (a) IR image, (b) visible image, (c) RP [25] fusion, (d) DTCWT [26] fusion, (e) NSCT [27] fusion, (f) MSVD [28] fusion, (g) Ellmauthaler et al. [8] fusion, and (h) proposed fusion.

Figure 5: Nebula (M16): (a) IR image, (b) visible image, (c) RP [25] fusion, (d) DTCWT [26] fusion, (e) NSCT [27] fusion, (f) MSVD [28] fusion, (g) Ellmauthaler et al. [8] fusion, and (h) proposed fusion.


Table 1: Quantitative comparison (the best result per dataset and measure is typeset in bold in the original).

Andromeda galaxy (M31)
Technique               | Qo     | Qe     | Qw     | QS     | QMI    | QB     | QX     | QZ
Proposed                | 0.8220 | 0.8319 | 0.7707 | 0.4804 | 0.6461 | 0.3487 | 0.6612 | 0.4615
Ellmauthaler et al. [8] | 0.7179 | 0.8444 | 0.7345 | 0.3647 | 0.6350 | 0.2229 | 0.5610 | 0.3387
MSVD [28]               | 0.6452 | 0.6946 | 0.6243 | 0.4148 | 0.4535 | 0.4432 | 0.0045 | 0.2675
NSCT [27]               | 0.6259 | 0.6003 | 0.5576 | 0.2641 | 0.4606 | 0.1777 | 0.3432 | 0.2689
DTCWT [26]              | 0.7085 | 0.8134 | 0.6573 | 0.3113 | 0.5436 | 0.1682 | 0.2682 | 0.2290
RP [25]                 | 0.4706 | 0.5416 | 0.5102 | 0.2514 | 0.3970 | 0.5282 | 0.2075 | 0.2026

Jupiter's moon
Technique               | Qo     | Qe     | Qw     | QS     | QMI    | QB     | QX     | QZ
Proposed                | 0.7927 | 0.7255 | 0.7814 | 0.4622 | 0.7566 | 0.3433 | 0.6725 | 0.5617
Ellmauthaler et al. [8] | 0.2832 | 0.6477 | 0.6343 | 0.4230 | 0.7398 | 0.1768 | 0.5672 | 0.5614
MSVD [28]               | 0.4780 | 0.4970 | 0.5217 | 0.5243 | 0.5292 | 0.4599 | 0.0065 | 0.4923
NSCT [27]               | 0.4155 | 0.4083 | 0.4631 | 0.5279 | 0.5212 | 0.2672 | 0.5001 | 0.6139
DTCWT [26]              | 0.4571 | 0.5932 | 0.5851 | 0.3476 | 0.5973 | 0.1805 | 0.4785 | 0.4844
RP [25]                 | 0.3467 | 0.3989 | 0.4022 | 0.4749 | 0.3919 | 0.4825 | 0.0183 | 0.3634

Nebula (M16)
Technique               | Qo     | Qe     | Qw     | QS     | QMI    | QB     | QX     | QZ
Proposed                | 0.7399 | 0.4896 | 0.8461 | 0.8587 | 0.8645 | 0.7646 | 0.7553 | 0.5652
Ellmauthaler et al. [8] | 0.7318 | 0.4230 | 0.8446 | 0.8494 | 0.8563 | 0.5736 | 0.5424 | 0.5494
MSVD [28]               | 0.4918 | 0.4120 | 0.6466 | 0.6704 | 0.6539 | 0.5598 | 0.0051 | 0.4331
NSCT [27]               | 0.5204 | 0.2980 | 0.6037 | 0.5899 | 0.6013 | 0.4916 | 0.3953 | 0.5058
DTCWT [26]              | 0.6736 | 0.3608 | 0.8023 | 0.8225 | 0.8125 | 0.4477 | 0.4631 | 0.5079
RP [25]                 | 0.5885 | 0.3099 | 0.6834 | 0.6818 | 0.6934 | 0.6597 | 0.1708 | 0.4639

Figures 4(a) and 4(b) show the IR and visible Jupiter's moon JPEG images taken by the New Horizons spacecraft using the multispectral visible imaging camera and the linear Etalon imaging spectral array [29]. The fusion results obtained by the RP [25], DTCWT [26], NSCT [27], MSVD [28], Ellmauthaler et al. [8], and proposed schemes are shown in Figures 4(c)-4(h), respectively. Note that only the proposed scheme is able to accurately preserve both the moon texture (from the IR image) and the other stars (from the visible image) in the fused image.

Figures 5(a) and 5(b) show the IR and visible Nebula (M16) JPEG images taken by the Hubble space telescope [30]. The fusion results obtained by the RP [25], DTCWT [26], NSCT [27], MSVD [28], Ellmauthaler et al. [8], and proposed schemes are shown in Figures 5(c)-5(h), respectively. The fused image using the proposed scheme highlights the IR information more accurately as compared to the existing state-of-the-art schemes.

Table 1 shows the quantitative comparison of the existing and proposed schemes (where bold values indicate the best results). It can be observed that the results obtained using the proposed scheme are significantly better in most of the cases/measures as compared to the existing state-of-the-art schemes.

4 Conclusion

A fusion scheme for astronomical visible/IR images based on the UDTCWT, local standard deviation, and distance transform is proposed. The use of the UDTCWT helps retain the useful details of the image. The local standard deviation measures the presence or absence of small objects. The distance transform activates the effects of proximity in the segmentation process and eliminates the effects of oversegmentation in addition to short sightedness. The scheme reduces noise artifacts and efficiently extracts the useful information (especially small objects). Simulation results on different visible/IR images verify the effectiveness of the proposed scheme.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

References

[1] L. Kun, G. Lei, L. Huihui, and C. Jingsong, "Fusion of infrared and visible light images based on region segmentation," Chinese Journal of Aeronautics, vol. 22, no. 1, pp. 75-80, 2009.

[2] Z. Fu, X. Dai, Y. Li, H. Wu, and X. Wang, "An improved visible and infrared image fusion based on local energy and fuzzy logic," in Proceedings of the 12th IEEE International Conference on Signal Processing, pp. 861-865, October 2014.

[3] J. Saeedi and K. Faez, "Infrared and visible image fusion using fuzzy logic and population-based optimization," Applied Soft Computing, vol. 12, no. 3, pp. 1041-1054, 2012.

[4] S. Yin, L. Cao, Y. Ling, and G. Jin, "One color contrast enhanced infrared and visible image fusion method," Infrared Physics and Technology, vol. 53, no. 2, pp. 146-150, 2010.

[5] Y. Leung, J. Liu, and J. Zhang, "An improved adaptive intensity-hue-saturation method for the fusion of remote sensing images," IEEE Geoscience and Remote Sensing Letters, vol. 11, no. 5, pp. 985-989, 2014.

[6] B. Chen and B. Xu, "A unified spatial-spectral-temporal fusion model using Landsat and MODIS imagery," in Proceedings of the 3rd International Workshop on Earth Observation and Remote Sensing Applications (EORSA '14), pp. 256-260, IEEE, Changsha, China, June 2014.

[7] D. Connah, M. S. Drew, and G. D. Finlayson, Spectral Edge Image Fusion: Theory and Applications, Springer, 2014.

[8] A. Ellmauthaler, E. A. B. da Silva, C. L. Pagliari, and S. R. Neves, "Infrared-visible image fusion using the undecimated wavelet transform with spectral factorization and target extraction," in Proceedings of the 19th IEEE International Conference on Image Processing (ICIP '12), pp. 2661-2664, October 2012.


[9] M. Ding, L. Wei, and B. Wang, "Research on fusion method for infrared and visible images via compressive sensing," Infrared Physics and Technology, vol. 57, pp. 56-67, 2013.

[10] B. Huang, H. Song, H. Cui, J. Peng, and Z. Xu, "Spatial and spectral image fusion using sparse matrix factorization," IEEE Transactions on Geoscience and Remote Sensing, vol. 52, no. 3, pp. 1693-1704, 2014.

[11] X. Zhang, X. Li, Y. Feng, H. Zhao, and Z. Liu, "Image fusion with internal generative mechanism," Expert Systems with Applications, vol. 42, no. 5, pp. 2382-2391, 2015.

[12] Y. Lu, F. Wang, X. Luo, and F. Liu, "Novel infrared and visible image fusion method based on independent component analysis," Frontiers in Computer Science, vol. 8, no. 2, pp. 243-254, 2014.

[13] A. Jameel, A. Ghafoor, and M. M. Riaz, "Adaptive compressive fusion for visible/IR sensors," IEEE Sensors Journal, vol. 14, no. 7, pp. 2230-2231, 2014.

[14] Z. Liu, H. Yin, B. Fang, and Y. Chai, "A novel fusion scheme for visible and infrared images based on compressive sensing," Optics Communications, vol. 335, pp. 168-177, 2015.

[15] R. Wang and L. Du, "Infrared and visible image fusion based on random projection and sparse representation," International Journal of Remote Sensing, vol. 35, no. 5, pp. 1640-1652, 2014.

[16] J. Wang, J. Peng, X. Feng, G. He, and J. Fan, "Fusion method for infrared and visible images by using non-negative sparse representation," Infrared Physics and Technology, vol. 67, pp. 477-489, 2014.

[17] G. Piella and H. Heijmans, "A new quality metric for image fusion," in Proceedings of the IEEE International Conference on Image Processing, pp. 171-173, September 2003.

[18] Z. Wang and A. C. Bovik, "A universal image quality index," IEEE Signal Processing Letters, vol. 9, no. 3, pp. 81-84, 2002.

[19] Z. Wang, A. C. Bovik, H. R. Sheikh, and E. P. Simoncelli, "Image quality assessment: from error visibility to structural similarity," IEEE Transactions on Image Processing, vol. 13, no. 4, pp. 600-612, 2004.

[20] Y. Chen and R. S. Blum, "A new automated quality assessment algorithm for image fusion," Image and Vision Computing, vol. 27, no. 10, pp. 1421-1432, 2009.

[21] C. S. Xydeas and V. Petrovic, "Objective image fusion performance measure," IET Electronics Letters, vol. 36, no. 4, pp. 308-309, 2000.

[22] J. Zhao, R. Laganiere, and Z. Liu, "Performance assessment of combinative pixel-level image fusion based on an absolute feature measurement," International Journal of Innovative Computing, Information and Control, vol. 3, no. 6, pp. 1433-1447, 2007.

[23] Andromeda galaxy (M31), IR: http://sci.esa.int/herschel/48182-multiwavelength-images-of-the-andromeda-galaxy-m31

[24] Andromeda galaxy (M31), visible: http://www.robgendlerastropics.com/M31Page.html

[25] A. Toet, "Image fusion by a ratio of low-pass pyramid," Pattern Recognition Letters, vol. 9, no. 4, pp. 245-253, 1989.

[26] J. J. Lewis, R. J. O'Callaghan, S. G. Nikolov, D. R. Bull, and N. Canagarajah, "Pixel- and region-based image fusion with complex wavelets," Information Fusion, vol. 8, no. 2, pp. 119-130, 2007.

[27] Q. Zhang and B.-L. Guo, "Multifocus image fusion using the nonsubsampled contourlet transform," Signal Processing, vol. 89, no. 7, pp. 1334-1346, 2009.

[28] V. P. S. Naidu, "Image fusion technique using multi-resolution singular value decomposition," Defence Science Journal, vol. 61, no. 5, pp. 479-484, 2011.

[29] Jupiter's moon: http://www.technology.org/2014/12/03/plutos-closeup-will-awesome-based-jupiter-pics-new-horizons-space-craft

[30] Nebula (M16): http://webbtelescope.org/webb_telescope/technology_at_the_extremes/keep_it_cold.php


Page 3: Research Article An Improved Infrared/Visible Fusion for ...

Advances in Astronomy 3

Local standarddeviation

Thresholding

Gray distance image

Binarization

Connectedcomponents

Area and perimeterscomputation

Binary segmentedimage

Binary fused map

Final fused map

IUDTCWT

UDTCWT UDTCWT

Binary coefficientmatrix

Local standarddeviation

Thresholding

Gray distance image

Binarization

Connectedcomponents

Area and perimeterscomputation

Binary segmentedimage

I1

L1

L1

D1

E1

C1

aq1pq1

S1

1205771

Lmask

1205821

1205772

Lmask

1205822

I2

L2

L2

D2

E2

C2

aq2pq2

S2

IU

IF

IF

F

I1U I2U

Figure 1 Flow diagram

where 120573 ge 10 and 120574 le 15 are thresholding parametersUDTCWT is applied on the source images 119868119896 to obtaincoefficient matrix 119868119896

119880

of dimensions 119872 times 119873 times 119871 (where119897 = 1 2 119871 represents wavelet coefficients) The decom-position obtained using UDTCWT not only eliminatesnoiseunwanted artifacts but also is effective in preservingthe useful information present in the input images (due toits undecimated property) The binary coefficient matrix 119868119880is obtained by assigning nonzero values at pixel locationswhere visible image provides more information than theIR image This binary thresholding ensures that the fusedimage contains the significantimportant information of bothsource images (as the higher value of UDTCWT correspondsto presence of significantimportant information)

119868119880 (119898 119899 119897) =

1100381610038161003816100381610038161198682119880(119898 119899 119897)

10038161003816100381610038161003816gt100381610038161003816100381610038161198681119880(119898 119899 119897)

10038161003816100381610038161003816

0 otherwise(8)

A binary fuse map 119868119865 is computed as

119868119865 (119898 119899 119897) = 119868119880 (119898 119899 119897) oplus 1198781 (119898 119899) oplus 1198782 (119898 119899) (9)

where oplus represents 119874119877 operation Let

119868119865 (119898 119899 119897) =

1198682119880(119898 119899 119897) if 119868119865 (119898 119899 119897) = 1

1198681119880(119898 119899 119897) otherwise

(10)

The final fused image 119865 is obtained by computing the inverseUDTCWT of fused coefficients 119868119865 Figure 1 shows the flowdiagram of proposed technique

3 Results and Discussion

To verify the significance of the proposed technique simu-lations are performed on various visibleIR datasets Quan-titative analysis is performed using 119876119900 (luminancecontrast

4 Advances in Astronomy

(a) (b) (c)

(d) (e) (f)

Figure 2 Andromeda galaxy (M31) (a) visible image (b) IR image (c) local variance image (d) distance image (e) IR segmented imageand (f) visual segmented image

distortion) 119876MI (mutual information) 119876119908 (weighted qualityindex) 119876119890 (edge dependent quality index) 119876119878 (structuralsimilarity index measure) 119876119861 (human perception inspiredmetric) 119876119883 (edge transform metric) and 119876119885 (image featuremetric) [17ndash22]

The119876119900metric [17 18] is designed throughmodality imagedistortion as combination of loss of correlation luminancedistortion and contrast distortion The 119876

MI metric [17]represents the orientation preservation and edge strengthvalues It models the perceptual loss of information in fusedresults in terms of how well the strength and orientationvalues of pixels in source images are represented in fusedimage It deals with the problem of objective evaluationof dynamic multisensor image fusion based on gradientinformation preservation between the inputs and the fusedimages It also takes into account additional scene and object

motion information present in multisensor sequences The119876119908 metric [17] is defined by assigning more weight to those

windows where saliency of the input image is high Itcorresponds to the areas that are likely to be perceptuallyimportant parts of the underlying scene The 119876119890 index [17]takes into account aspects of the human visual system whereit expresses the contribution of the edge information of thesource images to the fused images The 119876

119878 measure [19]is the similarity between two images and is designed toimprove traditional measures of mean square error and peaksignal to noise ratio which are inconsistent with human eyeperception The 119876119861 metric [20] evaluates the performance ofimage fusion for night vision applications using a perceptualquality evaluation method based on human visual systemmodels Image quality of fused image is assessed by contrastsensitivity function and contrast preservation map The 119876119883

Advances in Astronomy 5

(a) (b) (c)

(d) (e) (f)

Figure 3Andromeda galaxy (M31) (a) RP [25] fusion (b)DTCWT[26] fusion (c)NSCT [27] fusion (d)MSVD[28] fusion (e) Ellmauthaleret al [8] fusion and (f) proposed fusion

metric [21] assesses the pixel-level fusion performance andreflects the quality of visual information obtained from thefusion of input images The119876119885metric [22] evaluates the per-formance of the combinative pixel-level image fusion basedon an image feature measurement (ie phase congruencyand its moments) and provides an absolute measurementof image features By comparing the local cross-correlationof corresponding feature maps of input images and fusedoutput the quality of the fused result is assessed without areference image

These qualitymetrics [17ndash22] workwell for noisy blurredand distorted images using multiscale transformation arith-metic statistical and compressive sensing based schemesformultiexposuremultiresolution andmultimodal environ-mentsThese are also useful for remote and airborne sensingmilitary and industrial engineering related applications Thenormalized range of these measures is between 0 and 1

where high values imply better fusion metric for each qualitymeasure

Figure 2(a) shows Andromeda galaxy (M31) JPEG IRimage taken by Spitzer space telescope [23] while Figure 2(b)shows the corresponding visible image taken using 12510158401015840Ritchey Chretien Cassegrain (at F6) and ST10XME [24]Figures 2(c)ndash2(f) are the outputs of local variance distancetransform and segmentation steps

Figure 3 shows the fusion results obtained by ratio pyra-mid (RP) [25] dual tree complex wavelet transform (DTCWT)[26] nonsubsampled contourlet transform (NSCT) [27]multiresolution singular value decomposition (MSVD) [28]Ellmauthaler et al [8] and proposed schemes By visual com-parison it can be noticed that the proposed scheme providesbetter fusion results especially the preservation of back-ground intensity value as compared to existing state-of-the-art schemes

6 Advances in Astronomy

(a) (b) (c) (d)

(e) (f) (g) (h)Figure 4 Jupiterrsquos moon (a) IR image (b) visible image (c) RP [25] fusion (d) DTCWT [26] fusion (e) NSCT [27] fusion (f) MSVD [28]fusion (g) Ellmauthaler et al [8] fusion and (h) proposed fusion

(a) (b) (c) (d)

(e) (f) (g) (h)Figure 5 Nabula (M16) (a) IR image (b) visible image (c) RP [25] fusion (d) DTCWT [26] fusion (e) NSCT [27] fusion (f) MSVD [28]fusion (g) Ellmauthaler et al [8] fusion and (h) proposed fusion

Advances in Astronomy 7

Table 1 Quantitative comparison

Dataset Technique 119876119900

119876119890

119876119908

119876119878

119876MI

119876119861

119876119883

119876119885

Andromeda galaxy (M31)

Proposed 08220 08319 07707 04804 06461 03487 06612 04615Ellmauthaler et al [8] 07179 08444 07345 03647 06350 02229 05610 03387

MSVD [28] 06452 06946 06243 04148 04535 04432 00045 02675NSCT [27] 06259 06003 05576 02641 04606 01777 03432 02689

DTCWT [26] 07085 08134 06573 03113 05436 01682 02682 02290RP [25] 04706 05416 05102 02514 03970 05282 02075 02026

Jupiterrsquos moon

Proposed 07927 07255 07814 04622 07566 03433 06725 05617Ellmauthaler et al [8] 02832 06477 06343 04230 07398 01768 05672 05614

MSVD [28] 04780 04970 05217 05243 05292 04599 00065 04923NSCT [27] 04155 04083 04631 05279 05212 02672 05001 06139

DTCWT [26] 04571 05932 05851 03476 05973 01805 04785 04844RP [25] 03467 03989 04022 04749 03919 04825 00183 03634

Nabula (M16)

Proposed 07399 04896 08461 08587 08645 07646 07553 05652Ellmauthaler et al [8] 07318 04230 08446 08494 08563 05736 05424 05494

MSVD [28] 04918 04120 06466 06704 06539 05598 00051 04331NSCT [27] 05204 02980 06037 05899 06013 04916 03953 05058

DTCWT [26] 06736 03608 08023 08225 08125 04477 04631 05079RP [25] 05885 03099 06834 06818 06934 06597 01708 04639

Figures 4(a) and 4(b) show visible and IR Jupiterrsquos moonJPEG images taken by new Horizons spacecraft using mul-tispectral visible imaging camera and linear Etalon imagingspectral array [29] The fusion results obtained by RP [25]DTCWT [26] NSCT [27] MSVD [28] Ellmauthaler et al[8] and proposed schemes are shown in Figures 4(c)ndash4(h)respectively Note that only the proposed scheme is able toaccurately preserve both the moon texture (from IR image)and other stars (from visible image) in the fused image

Figures 5(a) and 5(b) show the visible and IR Nebula (M16) JPEG images taken by the Hubble Space Telescope [30]. The fusion results obtained by the RP [25], DTCWT [26], NSCT [27], MSVD [28], Ellmauthaler et al. [8], and proposed schemes are shown in Figures 5(c)–5(h), respectively. The fused image produced by the proposed scheme highlights the IR information more accurately than the existing state-of-the-art schemes.

Table 1 shows the quantitative comparison of the existing and proposed schemes. It can be observed that the results obtained using the proposed scheme are significantly better for most of the quality measures as compared to the existing state-of-the-art schemes.
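That claim can be spot-checked against the table. A small sketch (values hand-transcribed from the Andromeda galaxy rows of Table 1; every metric is higher-is-better) that reports the winning technique per metric:

```python
# Andromeda galaxy (M31) rows of Table 1, in the column order
# Q_o, Q_e, Q_w, Q_S, Q_MI, Q_B, Q_X, Q_Z.
m31 = {
    "Proposed":              [0.8220, 0.8319, 0.7707, 0.4804, 0.6461, 0.3487, 0.6612, 0.4615],
    "Ellmauthaler et al [8]": [0.7179, 0.8444, 0.7345, 0.3647, 0.6350, 0.2229, 0.5610, 0.3387],
    "MSVD [28]":             [0.6452, 0.6946, 0.6243, 0.4148, 0.4535, 0.4432, 0.0045, 0.2675],
    "NSCT [27]":             [0.6259, 0.6003, 0.5576, 0.2641, 0.4606, 0.1777, 0.3432, 0.2689],
    "DTCWT [26]":            [0.7085, 0.8134, 0.6573, 0.3113, 0.5436, 0.1682, 0.2682, 0.2290],
    "RP [25]":               [0.4706, 0.5416, 0.5102, 0.2514, 0.3970, 0.5282, 0.2075, 0.2026],
}
metrics = ["Qo", "Qe", "Qw", "QS", "QMI", "QB", "QX", "QZ"]

# For each metric column, find the technique with the highest score.
winners = {m: max(m31, key=lambda t: m31[t][i]) for i, m in enumerate(metrics)}
print(winners)
```

For M31 the proposed scheme tops six of the eight measures; Q_e goes to Ellmauthaler et al. [8] and Q_B to RP [25], which is consistent with "better in most of the cases" rather than all.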

4. Conclusion

A fusion scheme for astronomical visible/IR images based on the UDTCWT, local standard deviation, and distance transform is proposed. The UDTCWT helps retain useful details of the image. The local standard deviation variation measures the presence or absence of small objects. The distance transform incorporates proximity into the segmentation process and eliminates the effects of oversegmentation as well as short-sightedness. The scheme reduces noise artifacts and efficiently extracts useful information (especially small objects). Simulation results on different visible/IR images verify the effectiveness of the proposed scheme.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

References

[1] L. Kun, G. Lei, L. Huihui, and C. Jingsong, "Fusion of infrared and visible light images based on region segmentation," Chinese Journal of Aeronautics, vol. 22, no. 1, pp. 75–80, 2009.

[2] Z. Fu, X. Dai, Y. Li, H. Wu, and X. Wang, "An improved visible and infrared image fusion based on local energy and fuzzy logic," in Proceedings of the 12th IEEE International Conference on Signal Processing, pp. 861–865, October 2014.

[3] J. Saeedi and K. Faez, "Infrared and visible image fusion using fuzzy logic and population-based optimization," Applied Soft Computing, vol. 12, no. 3, pp. 1041–1054, 2012.

[4] S. Yin, L. Cao, Y. Ling, and G. Jin, "One color contrast enhanced infrared and visible image fusion method," Infrared Physics and Technology, vol. 53, no. 2, pp. 146–150, 2010.

[5] Y. Leung, J. Liu, and J. Zhang, "An improved adaptive intensity-hue-saturation method for the fusion of remote sensing images," IEEE Geoscience and Remote Sensing Letters, vol. 11, no. 5, pp. 985–989, 2014.

[6] B. Chen and B. Xu, "A unified spatial-spectral-temporal fusion model using Landsat and MODIS imagery," in Proceedings of the 3rd International Workshop on Earth Observation and Remote Sensing Applications (EORSA '14), pp. 256–260, IEEE, Changsha, China, June 2014.

[7] D. Connah, M. S. Drew, and G. D. Finlayson, Spectral Edge Image Fusion: Theory and Applications, Springer, 2014.

[8] A. Ellmauthaler, E. A. B. da Silva, C. L. Pagliari, and S. R. Neves, "Infrared-visible image fusion using the undecimated wavelet transform with spectral factorization and target extraction," in Proceedings of the 19th IEEE International Conference on Image Processing (ICIP '12), pp. 2661–2664, October 2012.

[9] M. Ding, L. Wei, and B. Wang, "Research on fusion method for infrared and visible images via compressive sensing," Infrared Physics and Technology, vol. 57, pp. 56–67, 2013.

[10] B. Huang, H. Song, H. Cui, J. Peng, and Z. Xu, "Spatial and spectral image fusion using sparse matrix factorization," IEEE Transactions on Geoscience and Remote Sensing, vol. 52, no. 3, pp. 1693–1704, 2014.

[11] X. Zhang, X. Li, Y. Feng, H. Zhao, and Z. Liu, "Image fusion with internal generative mechanism," Expert Systems with Applications, vol. 42, no. 5, pp. 2382–2391, 2015.

[12] Y. Lu, F. Wang, X. Luo, and F. Liu, "Novel infrared and visible image fusion method based on independent component analysis," Frontiers in Computer Science, vol. 8, no. 2, pp. 243–254, 2014.

[13] A. Jameel, A. Ghafoor, and M. M. Riaz, "Adaptive compressive fusion for visible/IR sensors," IEEE Sensors Journal, vol. 14, no. 7, pp. 2230–2231, 2014.

[14] Z. Liu, H. Yin, B. Fang, and Y. Chai, "A novel fusion scheme for visible and infrared images based on compressive sensing," Optics Communications, vol. 335, pp. 168–177, 2015.

[15] R. Wang and L. Du, "Infrared and visible image fusion based on random projection and sparse representation," International Journal of Remote Sensing, vol. 35, no. 5, pp. 1640–1652, 2014.

[16] J. Wang, J. Peng, X. Feng, G. He, and J. Fan, "Fusion method for infrared and visible images by using non-negative sparse representation," Infrared Physics and Technology, vol. 67, pp. 477–489, 2014.

[17] G. Piella and H. Heijmans, "A new quality metric for image fusion," in Proceedings of the IEEE International Conference on Image Processing, pp. 171–173, September 2003.

[18] Z. Wang and A. C. Bovik, "A universal image quality index," IEEE Signal Processing Letters, vol. 9, no. 3, pp. 81–84, 2002.

[19] Z. Wang, A. C. Bovik, H. R. Sheikh, and E. P. Simoncelli, "Image quality assessment: from error visibility to structural similarity," IEEE Transactions on Image Processing, vol. 13, no. 4, pp. 600–612, 2004.

[20] Y. Chen and R. S. Blum, "A new automated quality assessment algorithm for image fusion," Image and Vision Computing, vol. 27, no. 10, pp. 1421–1432, 2009.

[21] C. S. Xydeas and V. Petrovic, "Objective image fusion performance measure," IET Electronics Letters, vol. 36, no. 4, pp. 308–309, 2000.

[22] J. Zhao, R. Laganiere, and Z. Liu, "Performance assessment of combinative pixel-level image fusion based on an absolute feature measurement," International Journal of Innovative Computing, Information and Control, vol. 3, no. 6, pp. 1433–1447, 2007.

[23] Andromeda galaxy (M31) IR, http://sci.esa.int/herschel/48182-multiwavelength-images-of-the-andromeda-galaxy-m31.

[24] Andromeda galaxy (M31) visible, http://www.robgendlerastropics.com/M31Page.html.

[25] A. Toet, "Image fusion by a ratio of low-pass pyramid," Pattern Recognition Letters, vol. 9, no. 4, pp. 245–253, 1989.

[26] J. J. Lewis, R. J. O'Callaghan, S. G. Nikolov, D. R. Bull, and N. Canagarajah, "Pixel- and region-based image fusion with complex wavelets," Information Fusion, vol. 8, no. 2, pp. 119–130, 2007.

[27] Q. Zhang and B.-L. Guo, "Multifocus image fusion using the nonsubsampled contourlet transform," Signal Processing, vol. 89, no. 7, pp. 1334–1346, 2009.

[28] V. P. S. Naidu, "Image fusion technique using multi-resolution singular value decomposition," Defence Science Journal, vol. 61, no. 5, pp. 479–484, 2011.

[29] Jupiter's moon, http://www.technology.org/2014/12/03/plutos-closeup-will-awesome-based-jupiter-pics-new-horizons-spacecraft.

[30] Nebula (M16), http://webbtelescope.org.



Figure 2: Andromeda galaxy (M31): (a) visible image, (b) IR image, (c) local variance image, (d) distance image, (e) IR segmented image, and (f) visual segmented image.

distortion), Q_MI (mutual information), Q_w (weighted quality index), Q_e (edge dependent quality index), Q_S (structural similarity index measure), Q_B (human perception inspired metric), Q_X (edge transform metric), and Q_Z (image feature metric) [17–22].

The Q_o metric [17, 18] is designed through modality image distortion as a combination of loss of correlation, luminance distortion, and contrast distortion. The Q_MI metric [17] represents the orientation preservation and edge strength values. It models the perceptual loss of information in the fused result in terms of how well the strength and orientation values of pixels in the source images are represented in the fused image. It addresses the objective evaluation of dynamic multisensor image fusion based on gradient information preservation between the input and fused images, and also takes into account additional scene and object motion information present in multisensor sequences. The Q_w metric [17] is defined by assigning more weight to those windows where the saliency of the input image is high; these correspond to the areas that are likely to be perceptually important parts of the underlying scene. The Q_e index [17] takes into account aspects of the human visual system, expressing the contribution of the edge information of the source images to the fused images. The Q_S measure [19] is the similarity between two images and is designed to improve on the traditional measures of mean square error and peak signal-to-noise ratio, which are inconsistent with human eye perception. The Q_B metric [20] evaluates the performance of image fusion for night vision applications using a perceptual quality evaluation method based on human visual system models; the image quality of the fused image is assessed by a contrast sensitivity function and a contrast preservation map. The Q_X metric [21] assesses the pixel-level fusion performance and reflects the quality of visual information obtained from the fusion of the input images. The Q_Z metric [22] evaluates the performance of combinative pixel-level image fusion based on an image feature measurement (i.e., phase congruency and its moments) and provides an absolute measurement of image features. By comparing the local cross-correlation of corresponding feature maps of the input images and the fused output, the quality of the fused result is assessed without a reference image.

Figure 3: Andromeda galaxy (M31): (a) RP [25] fusion, (b) DTCWT [26] fusion, (c) NSCT [27] fusion, (d) MSVD [28] fusion, (e) Ellmauthaler et al. [8] fusion, and (f) proposed fusion.
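For intuition, the Q_o family builds on the Wang-Bovik universal image quality index [18], which multiplies exactly those three factors (correlation, luminance, contrast). A minimal single-window sketch, not the authors' implementation (the published metric averages this over sliding windows, and this sketch assumes nonconstant windows with nonzero mean):

```python
import numpy as np

def uiqi(x, y):
    """Wang-Bovik universal image quality index for two same-size windows.

    Q = 4 * cov(x, y) * mean(x) * mean(y)
        / ((var(x) + var(y)) * (mean(x)^2 + mean(y)^2))

    Q lies in [-1, 1]; Q = 1 iff the windows are identical.
    """
    x = x.astype(float).ravel()
    y = y.astype(float).ravel()
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return 4 * cov * mx * my / ((vx + vy) * (mx ** 2 + my ** 2))

a = np.array([[1.0, 2.0], [3.0, 4.0]])
print(uiqi(a, a))      # identical windows -> 1.0
print(uiqi(a, 5 - a))  # perfectly anti-correlated, same mean -> -1.0
```

The three factors collapse into one expression here; fusion metrics such as Q_o, Q_w, and Q_e reweight this local score across the image.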

These quality metrics [17–22] work well for noisy, blurred, and distorted images using multiscale transformation, arithmetic, statistical, and compressive sensing based schemes in multiexposure, multiresolution, and multimodal environments. They are also useful for remote and airborne sensing, military, and industrial engineering related applications. The normalized range of these measures is between 0 and 1, where higher values imply a better result for each quality measure.

Figure 2(a) shows the Andromeda galaxy (M31) JPEG IR image taken by the Spitzer space telescope [23], while Figure 2(b) shows the corresponding visible image taken using a 12.5″ Ritchey-Chretien Cassegrain (at F6) and ST10XME [24]. Figures 2(c)–2(f) are the outputs of the local variance, distance transform, and segmentation steps.
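The local standard deviation and distance transform steps are standard operations. A minimal SciPy sketch, with toy data and a threshold chosen only for illustration (not the paper's parameters), showing how a small bright object produces a local-variation response and how a distance map to the detected region follows:

```python
import numpy as np
from scipy.ndimage import uniform_filter, distance_transform_edt

def local_std(img, size=5):
    """Local standard deviation map via sqrt(E[x^2] - E[x]^2) in a size x size window."""
    img = img.astype(float)
    mean = uniform_filter(img, size)
    mean_sq = uniform_filter(img * img, size)
    return np.sqrt(np.maximum(mean_sq - mean ** 2, 0.0))

# Toy 'IR' frame: one point source on a dark background.
img = np.zeros((32, 32))
img[16, 16] = 1.0

sd = local_std(img)                   # responds only near the point source
mask = sd > 0.5 * sd.max()            # crude detection of salient pixels
dist = distance_transform_edt(~mask)  # distance of each pixel to the detected region
```

`dist` is zero on the detected object and grows with separation from it, which is the proximity cue the segmentation step can exploit.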

Figure 3 shows the fusion results obtained by the ratio pyramid (RP) [25], dual tree complex wavelet transform (DTCWT) [26], nonsubsampled contourlet transform (NSCT) [27], multiresolution singular value decomposition (MSVD) [28], Ellmauthaler et al. [8], and proposed schemes. By visual comparison it can be noticed that the proposed scheme provides better fusion results, especially the preservation of the background intensity value, as compared to the existing state-of-the-art schemes.


Figure 4: Jupiter's moon: (a) IR image, (b) visible image, (c) RP [25] fusion, (d) DTCWT [26] fusion, (e) NSCT [27] fusion, (f) MSVD [28] fusion, (g) Ellmauthaler et al. [8] fusion, and (h) proposed fusion.


[28] V P S Naidu ldquoImage fusion technique using multi-resolutionsingular value decompositionrdquo Defence Science Journal vol 61no 5 pp 479ndash484 2011

[29] Jupiterrsquos moon httpwwwtechnologyorg20141203plutos-closeup-will-awesome-based-jupiter-pics-new-horizons-space-craft

[30] Nabula (M16) httpwebbtelescopeorgwebb telescopetech-nology at the extremeskeep it coldphp

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

High Energy PhysicsAdvances in

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

FluidsJournal of

Atomic and Molecular Physics

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Advances in Condensed Matter Physics

OpticsInternational Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

AstronomyAdvances in

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Superconductivity

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Statistical MechanicsInternational Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

GravityJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

AstrophysicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Physics Research International

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Solid State PhysicsJournal of

 Computational  Methods in Physics

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Soft MatterJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

AerodynamicsJournal of

Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

PhotonicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Biophysics

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ThermodynamicsJournal of

Page 6: Research Article An Improved Infrared/Visible Fusion for ...

6 Advances in Astronomy

Figure 4: Jupiter's moon. (a) IR image, (b) visible image, (c) RP [25] fusion, (d) DTCWT [26] fusion, (e) NSCT [27] fusion, (f) MSVD [28] fusion, (g) Ellmauthaler et al. [8] fusion, and (h) proposed fusion.

Figure 5: Nebula (M16). (a) IR image, (b) visible image, (c) RP [25] fusion, (d) DTCWT [26] fusion, (e) NSCT [27] fusion, (f) MSVD [28] fusion, (g) Ellmauthaler et al. [8] fusion, and (h) proposed fusion.

Table 1: Quantitative comparison.

Technique                  Q_o     Q_e     Q_w     Q_S     Q_MI    Q_B     Q_X     Q_Z

Andromeda galaxy (M31)
Proposed                   0.8220  0.8319  0.7707  0.4804  0.6461  0.3487  0.6612  0.4615
Ellmauthaler et al. [8]    0.7179  0.8444  0.7345  0.3647  0.6350  0.2229  0.5610  0.3387
MSVD [28]                  0.6452  0.6946  0.6243  0.4148  0.4535  0.4432  0.0045  0.2675
NSCT [27]                  0.6259  0.6003  0.5576  0.2641  0.4606  0.1777  0.3432  0.2689
DTCWT [26]                 0.7085  0.8134  0.6573  0.3113  0.5436  0.1682  0.2682  0.2290
RP [25]                    0.4706  0.5416  0.5102  0.2514  0.3970  0.5282  0.2075  0.2026

Jupiter's moon
Proposed                   0.7927  0.7255  0.7814  0.4622  0.7566  0.3433  0.6725  0.5617
Ellmauthaler et al. [8]    0.2832  0.6477  0.6343  0.4230  0.7398  0.1768  0.5672  0.5614
MSVD [28]                  0.4780  0.4970  0.5217  0.5243  0.5292  0.4599  0.0065  0.4923
NSCT [27]                  0.4155  0.4083  0.4631  0.5279  0.5212  0.2672  0.5001  0.6139
DTCWT [26]                 0.4571  0.5932  0.5851  0.3476  0.5973  0.1805  0.4785  0.4844
RP [25]                    0.3467  0.3989  0.4022  0.4749  0.3919  0.4825  0.0183  0.3634

Nebula (M16)
Proposed                   0.7399  0.4896  0.8461  0.8587  0.8645  0.7646  0.7553  0.5652
Ellmauthaler et al. [8]    0.7318  0.4230  0.8446  0.8494  0.8563  0.5736  0.5424  0.5494
MSVD [28]                  0.4918  0.4120  0.6466  0.6704  0.6539  0.5598  0.0051  0.4331
NSCT [27]                  0.5204  0.2980  0.6037  0.5899  0.6013  0.4916  0.3953  0.5058
DTCWT [26]                 0.6736  0.3608  0.8023  0.8225  0.8125  0.4477  0.4631  0.5079
RP [25]                    0.5885  0.3099  0.6834  0.6818  0.6934  0.6597  0.1708  0.4639

Figures 4(a) and 4(b) show the IR and visible Jupiter's moon JPEG images taken by the New Horizons spacecraft using its Multispectral Visible Imaging Camera and Linear Etalon Imaging Spectral Array [29]. The fusion results obtained by RP [25], DTCWT [26], NSCT [27], MSVD [28], Ellmauthaler et al. [8], and the proposed scheme are shown in Figures 4(c)-4(h), respectively. Note that only the proposed scheme accurately preserves both the moon's texture (from the IR image) and the surrounding stars (from the visible image) in the fused image.

Figures 5(a) and 5(b) show the IR and visible Nebula (M16) JPEG images taken by the Hubble Space Telescope [30]. The fusion results obtained by RP [25], DTCWT [26], NSCT [27], MSVD [28], Ellmauthaler et al. [8], and the proposed scheme are shown in Figures 5(c)-5(h), respectively. The fused image produced by the proposed scheme highlights the IR information more accurately than the existing state-of-the-art schemes.

Table 1 shows the quantitative comparison of the existing and proposed schemes. It can be observed that the proposed scheme achieves the best scores for most of the quality measures on all three datasets, as compared to the existing state-of-the-art schemes.
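The Q_o, Q_e, and Q_w measures in Table 1 build on the Wang-Bovik universal image quality index [18], which Piella and Heijmans [17] extend with local, saliency-weighted windows. As a rough illustration only, the Python sketch below computes the global index and a simple symmetric fusion score; the measures actually reported in Table 1 use the windowed, weighted variants, and the function names here are illustrative, not the authors' implementation.

```python
import numpy as np

def uiqi(x, y):
    """Wang-Bovik universal image quality index (global form), in [-1, 1]."""
    x = x.astype(np.float64).ravel()
    y = y.astype(np.float64).ravel()
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    # Combines loss of correlation, luminance distortion, and contrast distortion.
    return 4.0 * cov * mx * my / ((vx + vy) * (mx**2 + my**2))

def fusion_quality(ir, vis, fused):
    """Illustrative symmetric score: average index of fused vs. each source."""
    return 0.5 * (uiqi(ir, fused) + uiqi(vis, fused))
```

For constant images the denominator vanishes, so practical implementations add a small stabilizing constant, as SSIM [19] does.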

4. Conclusion

A fusion scheme for astronomical visible/IR images based on the UDTCWT, local standard deviation, and the distance transform is proposed. The UDTCWT helps retain useful details of the image. The local standard deviation variation measures the presence or absence of small objects. The distance transform incorporates proximity into the segmentation process and eliminates the effects of oversegmentation as well as short-sightedness. The scheme reduces noise artifacts and efficiently extracts the useful information (especially small objects). Simulation results on different visible/IR images verify the effectiveness of the proposed scheme.
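The local standard deviation map and the distance transform named above are standard operations; the sketch below shows these two building blocks in NumPy/SciPy under illustrative assumptions (window size, helper names), not the authors' exact fusion rule.

```python
import numpy as np
from scipy.ndimage import uniform_filter, distance_transform_edt

def local_std(img, size=5):
    """Local standard deviation map: sqrt(E[x^2] - E[x]^2) over a sliding window.

    A high value flags local structure, e.g. a small object against background.
    """
    img = img.astype(np.float64)
    mean = uniform_filter(img, size)
    mean_sq = uniform_filter(img * img, size)
    # Clamp tiny negative values caused by floating-point round-off.
    return np.sqrt(np.maximum(mean_sq - mean * mean, 0.0))

def distance_map(object_mask):
    """Euclidean distance of every pixel to the nearest object pixel."""
    return distance_transform_edt(~object_mask)
```

A small-object map could then be obtained by thresholding the local standard deviation and passing the resulting mask to the distance transform, so that proximity to detected objects can weight the segmentation.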

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

References

[1] L. Kun, G. Lei, L. Huihui, and C. Jingsong, "Fusion of infrared and visible light images based on region segmentation," Chinese Journal of Aeronautics, vol. 22, no. 1, pp. 75–80, 2009.

[2] Z. Fu, X. Dai, Y. Li, H. Wu, and X. Wang, "An improved visible and infrared image fusion based on local energy and fuzzy logic," in Proceedings of the 12th IEEE International Conference on Signal Processing, pp. 861–865, October 2014.

[3] J. Saeedi and K. Faez, "Infrared and visible image fusion using fuzzy logic and population-based optimization," Applied Soft Computing, vol. 12, no. 3, pp. 1041–1054, 2012.

[4] S. Yin, L. Cao, Y. Ling, and G. Jin, "One color contrast enhanced infrared and visible image fusion method," Infrared Physics and Technology, vol. 53, no. 2, pp. 146–150, 2010.

[5] Y. Leung, J. Liu, and J. Zhang, "An improved adaptive intensity-hue-saturation method for the fusion of remote sensing images," IEEE Geoscience and Remote Sensing Letters, vol. 11, no. 5, pp. 985–989, 2014.

[6] B. Chen and B. Xu, "A unified spatial-spectral-temporal fusion model using Landsat and MODIS imagery," in Proceedings of the 3rd International Workshop on Earth Observation and Remote Sensing Applications (EORSA '14), pp. 256–260, IEEE, Changsha, China, June 2014.

[7] D. Connah, M. S. Drew, and G. D. Finlayson, Spectral Edge Image Fusion: Theory and Applications, Springer, 2014.

[8] A. Ellmauthaler, E. A. B. da Silva, C. L. Pagliari, and S. R. Neves, "Infrared-visible image fusion using the undecimated wavelet transform with spectral factorization and target extraction," in Proceedings of the 19th IEEE International Conference on Image Processing (ICIP '12), pp. 2661–2664, October 2012.

[9] M. Ding, L. Wei, and B. Wang, "Research on fusion method for infrared and visible images via compressive sensing," Infrared Physics and Technology, vol. 57, pp. 56–67, 2013.

[10] B. Huang, H. Song, H. Cui, J. Peng, and Z. Xu, "Spatial and spectral image fusion using sparse matrix factorization," IEEE Transactions on Geoscience and Remote Sensing, vol. 52, no. 3, pp. 1693–1704, 2014.

[11] X. Zhang, X. Li, Y. Feng, H. Zhao, and Z. Liu, "Image fusion with internal generative mechanism," Expert Systems with Applications, vol. 42, no. 5, pp. 2382–2391, 2015.

[12] Y. Lu, F. Wang, X. Luo, and F. Liu, "Novel infrared and visible image fusion method based on independent component analysis," Frontiers in Computer Science, vol. 8, no. 2, pp. 243–254, 2014.

[13] A. Jameel, A. Ghafoor, and M. M. Riaz, "Adaptive compressive fusion for visible/IR sensors," IEEE Sensors Journal, vol. 14, no. 7, pp. 2230–2231, 2014.

[14] Z. Liu, H. Yin, B. Fang, and Y. Chai, "A novel fusion scheme for visible and infrared images based on compressive sensing," Optics Communications, vol. 335, pp. 168–177, 2015.

[15] R. Wang and L. Du, "Infrared and visible image fusion based on random projection and sparse representation," International Journal of Remote Sensing, vol. 35, no. 5, pp. 1640–1652, 2014.

[16] J. Wang, J. Peng, X. Feng, G. He, and J. Fan, "Fusion method for infrared and visible images by using non-negative sparse representation," Infrared Physics and Technology, vol. 67, pp. 477–489, 2014.

[17] G. Piella and H. Heijmans, "A new quality metric for image fusion," in Proceedings of the IEEE Conference on Image Processing, pp. 171–173, September 2003.

[18] Z. Wang and A. C. Bovik, "A universal image quality index," IEEE Signal Processing Letters, vol. 9, no. 3, pp. 81–84, 2002.

[19] Z. Wang, A. C. Bovik, H. R. Sheikh, and E. P. Simoncelli, "Image quality assessment: from error visibility to structural similarity," IEEE Transactions on Image Processing, vol. 13, no. 4, pp. 600–612, 2004.

[20] Y. Chen and R. S. Blum, "A new automated quality assessment algorithm for image fusion," Image and Vision Computing, vol. 27, no. 10, pp. 1421–1432, 2009.

[21] C. S. Xydeas and V. Petrovic, "Objective image fusion performance measure," IET Electronics Letters, vol. 36, no. 4, pp. 308–309, 2000.

[22] J. Zhao, R. Laganiere, and Z. Liu, "Performance assessment of combinative pixel-level image fusion based on an absolute feature measurement," International Journal of Innovative Computing, Information and Control, vol. 3, no. 6, pp. 1433–1447, 2007.

[23] Andromeda galaxy (M31) IR image, http://sci.esa.int/herschel/48182-multiwavelength-images-of-the-andromeda-galaxy-m31/.

[24] Andromeda galaxy (M31) visible image, http://www.robgendlerastropics.com/M31Page.html.

[25] A. Toet, "Image fusion by a ratio of low-pass pyramid," Pattern Recognition Letters, vol. 9, no. 4, pp. 245–253, 1989.

[26] J. J. Lewis, R. J. O'Callaghan, S. G. Nikolov, D. R. Bull, and N. Canagarajah, "Pixel- and region-based image fusion with complex wavelets," Information Fusion, vol. 8, no. 2, pp. 119–130, 2007.

[27] Q. Zhang and B.-L. Guo, "Multifocus image fusion using the nonsubsampled contourlet transform," Signal Processing, vol. 89, no. 7, pp. 1334–1346, 2009.

[28] V. P. S. Naidu, "Image fusion technique using multi-resolution singular value decomposition," Defence Science Journal, vol. 61, no. 5, pp. 479–484, 2011.

[29] Jupiter's moon image, http://www.technology.org/2014/12/03/plutos-closeup-will-awesome-based-jupiter-pics-new-horizons-spacecraft/.

[30] Nebula (M16) image, http://webbtelescope.org/webb_telescope/technology_at_the_extremes/keep_it_cold.php.
