Infrared and Visible Light Image Fusion Based on Mahalanobis Distance and Guided Filter Weighting
Abstract
To improve the clarity of fused images and obtain better target information when fusing infrared and visible light images, a method is proposed that exploits the characteristics of non-subsampled contourlet transform (NSCT) coefficients: a Mahalanobis-distance-weighted Laplacian energy rule is combined with guided filtering to improve the frequency-tuned (FT) saliency algorithm. First, the visible light image is preprocessed with contrast limited adaptive histogram equalization (CLAHE), and the infrared image and the CLAHE-processed visible light image are each decomposed by the multi-scale, multi-directional NSCT into a low-frequency approximation image and high-frequency detail images. Second, the FT algorithm improved by guided filtering is used to extract the saliency map of the infrared image; an adaptive weighted fusion rule based on this saliency map is applied to the low-frequency images, while a maximum rule based on Mahalanobis-distance-weighted Laplacian energy is applied to the high-frequency images. Finally, the fused image is obtained by applying the inverse NSCT to the fused low-frequency and high-frequency sub-images. Experimental results show that this fusion method outperforms traditional fusion methods in both subjective visual quality and objective metrics.
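The following is a minimal sketch of the low-frequency branch described above: CLAHE preprocessing of the visible image, an FT-style saliency map of the infrared image refined with a guided filter, and saliency-weighted fusion of the low-frequency bands. It is not the authors' implementation; the file names and parameter values are hypothetical, the guided filter comes from opencv-contrib (cv2.ximgproc), and because NSCT is not available in standard libraries a Gaussian low-pass is used here as a stand-in for the NSCT low-frequency sub-band.

```python
import cv2
import numpy as np

def ft_saliency_guided(ir_u8, radius=8, eps=1e-3, sigma=5):
    """FT-style saliency of a grayscale infrared image, refined with a
    guided filter so the saliency map follows the image's edges.
    (The original FT algorithm works in Lab space; for a single-channel
    infrared image it reduces to distance from the mean intensity.)"""
    ir = ir_u8.astype(np.float32) / 255.0
    blurred = cv2.GaussianBlur(ir, (0, 0), sigma)
    saliency = np.abs(blurred - blurred.mean())
    # Edge-aware refinement with the infrared image as the guide.
    saliency = cv2.ximgproc.guidedFilter(ir, saliency, radius, eps)
    # Normalize to [0, 1] so it can serve as an adaptive fusion weight.
    saliency -= saliency.min()
    return saliency / (saliency.max() + 1e-12)

# Hypothetical input files for illustration.
ir = cv2.imread("ir.png", cv2.IMREAD_GRAYSCALE)
vis = cv2.imread("vis.png", cv2.IMREAD_GRAYSCALE)

# CLAHE preprocessing of the visible image (assumed parameter values).
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
vis_eq = clahe.apply(vis)

# Stand-in low-frequency sub-bands (a real pipeline would use NSCT here).
low_ir = cv2.GaussianBlur(ir.astype(np.float32) / 255.0, (0, 0), 4)
low_vis = cv2.GaussianBlur(vis_eq.astype(np.float32) / 255.0, (0, 0), 4)

# Saliency-based adaptive weighting of the low-frequency images.
w = ft_saliency_guided(ir)
low_fused = w * low_ir + (1.0 - w) * low_vis
```

Under this sketch, regions the infrared saliency map marks as important draw mainly on the infrared low-frequency content, while the rest of the image retains the equalized visible background; the high-frequency bands would be fused separately with the Mahalanobis-distance-weighted Laplacian energy maximum rule.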