
SCIENTIA SINICA Informationis, Volume 48, Issue 8: 1000-1021 (2018) https://doi.org/10.1360/N112017-00085

Multi-focus image fusion method based on discrete Tchebichef transform and focus measure

  • Received: Aug 28, 2017
  • Accepted: Oct 18, 2017
  • Published: Feb 1, 2018

Abstract

Transform-based image fusion methods are widely used in multi-focus image fusion owing to their promising fusion performance and noise robustness. However, conventional transform-based fusion methods generally have high time complexity. In this paper, a multi-focus image fusion method based on the discrete Tchebichef transform (DTT) and a focus measure is proposed. Exploiting the relationship between the DTT and correlation analysis, the focus measure of each image block in the source images can be evaluated from a small number of low-order DTT coefficients, and the source images are then fused according to the maximum-focus-measure rule. Experimental results show that the proposed method reduces fusion time while maintaining fusion quality, and exhibits high noise robustness during image fusion.
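A minimal sketch of the block-wise scheme described above may help make it concrete. The code below is an illustration, not the authors' implementation: it builds an orthonormal discrete Tchebichef basis by QR factorization of a Vandermonde matrix (Gram-Schmidt on $1, x, x^2, \ldots$ over the uniform grid, which yields the discrete Tchebichef polynomials up to sign), takes the energy of a few low-order, non-DC DTT coefficients of each block as the focus measure, and keeps, block by block, the source block with the larger focus measure. The block size of 8 and the order limit p = 3 are illustrative assumptions, not the paper's exact parameters.

```python
# Illustrative sketch of DTT-based block-wise multi-focus fusion (assumptions:
# 8x8 blocks, order limit p = 3, QR-based construction of the DTT basis).
import numpy as np

def dtt_basis(n):
    """Rows are orthonormal discrete Tchebichef polynomials of orders 0..n-1,
    sampled at x = 0, 1, ..., n-1 (QR = Gram-Schmidt, unique up to sign)."""
    x = np.arange(n, dtype=float)
    vander = np.vander(x, n, increasing=True)   # columns: x^0, x^1, ..., x^(n-1)
    q, _ = np.linalg.qr(vander)                 # orthonormalize column by column
    return q.T

def focus_measure(block, basis, p=3):
    """Energy of the low-order DTT coefficients with 1 <= m + n <= p (DC excluded)."""
    coeffs = basis @ block @ basis.T            # 2-D DTT of the block
    m, n = np.indices(coeffs.shape)
    low_order = (m + n >= 1) & (m + n <= p)
    return float(np.sum(coeffs[low_order] ** 2))

def fuse(img_a, img_b, block=8, p=3):
    """Fuse two registered grayscale images by the maximum-focus-measure rule."""
    basis = dtt_basis(block)
    fused = img_a.astype(float)                 # borders without a full block keep source A
    h, w = img_a.shape
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            a = img_a[i:i + block, j:j + block].astype(float)
            b = img_b[i:i + block, j:j + block].astype(float)
            if focus_measure(b, basis, p) > focus_measure(a, basis, p):
                fused[i:i + block, j:j + block] = b
    return fused
```

For color images the same rule can be applied per channel or on a luminance channel, and a practical implementation would typically add a consistency check between neighboring blocks to suppress isolated wrong selections.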


Funded by

National Natural Science Foundation of China (61572092)

NSFC-Guangdong Joint Fund (U1401252)

National Key Research and Development Program of China (2016YFC1000307-3)



  • Figure 1

    (Color online) Plots of the normalized discrete Tchebichef polynomials of orders 0 to 5

  • Figure 2

    (Color online) Plots of the DTT kernel function $\Phi_{m,n}$. (a) $\Phi_{2,2}$; (b) $\Phi_{10,10}$

  • Figure 3

    Orientation analysis of the DTT kernel function $\Phi_{m,n}(x,y)$ with order $s=m+n=4$. Left: the kernel function shown as an image; right: its profile along the $\alpha$ direction. (a) $\Phi_{0,4}(x,y)$; (b) $\Phi_{1,3}(x,y)$; (c) $\Phi_{2,2}(x,y)$

  • Figure 4

    Test image

  • Figure 5

    (Color online) (a) Values of the various focus measures for images blurred with a Gaussian function of standard deviation $\sigma = 0.5, 1.0, \ldots, 5.0$; (b) values of the various focus measures for images blurred with averaging masks of size $W \times W$, $W = 2, 4, \ldots, 20$

  • Figure 6

    (Color online) (a) Values of the various focus measures for Gaussian-blurred test images corrupted by Gaussian noise with zero mean and 0.003 variance; the standard deviations of the Gaussian blurring are $\sigma = 0.5, 1.0, \ldots, 5.0$. (b) Values of the various focus measures for average-blurred test images corrupted by Gaussian noise with zero mean and 0.003 variance; the sizes of the averaging masks are $W \times W$, $W = 2, 4, \ldots, 20$

  • Figure 7

    The framework of multi-focus image fusion based on DTT and focus measure

  • Figure 8

    Fused images with different orders $p$. (a) Source A; (b) source B; (c) $p=2$; (d) $p=3$; (e) $p=4$; (f) $p=5$; (g) $p=6$; (h) the magnified region of $p=2$; (i) the magnified region of $p=3$; (j) the magnified region of $p=4$; (k) the magnified region of $p=5$; (l) the magnified region of $p=6$

  • Figure 9

    (Color online) The fusion effect of experiment 1. (a) Source A; (b) source B; (c) IM; (d) GF; (e) PCNN; (f) DSIFT; (g) DCT+C+V; (h) DTT; (i) the magnified region of IM; (j) the magnified region of GF; (k) the magnified region of PCNN; (l) the magnified region of DSIFT; (m) the magnified region of DCT+C+V; (n) the magnified region of DTT

  • Figure 10

    The fusion effect of experiment 2. (a) Source A; (b) source B; (c) IM; (d) GF; (e) PCNN; (f) DSIFT; (g) DCT+C+V; (h) DTT; (i) the magnified region of IM; (j) the magnified region of GF; (k) the magnified region of PCNN; (l) the magnified region of DSIFT; (m) the magnified region of DCT+C+V; (n) the magnified region of DTT

  • Figure 11

    (Color online) Eight pairs of multi-focus source images

  • Figure 12

    (Color online) Fused results of eight pairs of source images by different methods, from left to right: IM, GF, PCNN, DSIFT, DCT+C+V, and DTT

  • Figure 13

    (Color online) Experimental results on Gaussian noise robustness in multi-focus image fusion by different methods. (a) Source A; (b) source B; (c) IM; (d) GF; (e) PCNN; (f) DSIFT; (g) DCT+C+V; (h) DTT; (i) the magnified region of IM; (j) the magnified region of GF; (k) the magnified region of PCNN; (l) the magnified region of DSIFT; (m) the magnified region of DCT+C+V; (n) the magnified region of DTT

  • Figure 14

    (Color online) Experimental results on salt and pepper noise robustness in multi-focus image fusion by different methods. (a) Source A; (b) source B; (c) IM; (d) GF; (e) PCNN; (f) DSIFT; (g) DCT+C+V; (h) DTT; (i) the magnified region of IM; (j) the magnified region of GF; (k) the magnified region of PCNN; (l) the magnified region of DSIFT; (m) the magnified region of DCT+C+V; (n) the magnified region of DTT

  • Figure 15

    (Color online) Experimental results on multiplicative noise robustness in multi-focus image fusion by different methods. (a) Source A; (b) source B; (c) IM; (d) GF; (e) PCNN; (f) DSIFT; (g) DCT+C+V; (h) DTT; (i) the magnified region of IM; (j) the magnified region of GF; (k) the magnified region of PCNN; (l) the magnified region of DSIFT; (m) the magnified region of DCT+C+V; (n) the magnified region of DTT

  • Table 1   The objective assessments of fused images with different orders $p$ in DTT
    Order $p$ MI $Q_p$ $Q_w$ $Q_{\rm AF}$ $T$ (s)
    2 8.1446 0.7230 0.6305 0.7279 0.0938
    3 8.1474 0.7245 0.6319 0.7294 0.0945
    4 8.1382 0.7239 0.6309 0.7289 0.0958
    5 8.1430 0.7242 0.6313 0.7290 0.0963
    6 8.1351 0.7231 0.6299 0.7280 0.0979

    a) The best result for each objective assessment is shown in bold.

  • Table 2   The objective assessments of different methods for the fusion of “paper” color images
    Method MI $Q_p$ $Q_w$ $Q_{\rm AF}$ $T$ (s)
    IM 2.4011 0.5850 0.6620 0.6054 1.3715
    GF 2.3819 0.5847 0.6709 0.6045 0.1496
    PCNN 2.2548 0.3985 0.6119 0.4294 0.5244
    DSIFT 2.4536 0.5994 0.6599 0.6183 1.9066
    DCT+C+V 1.8498 0.3769 0.6713 0.4044 0.6422
    DTT 2.4653 0.6029 0.6581 0.6218 0.0660

    a) The best result for each objective assessment is shown in bold.

  • Table 3   The objective assessments of different methods for the fusion of “plane” gray images
    Method MI $Q_p$ $Q_w$ $Q_{\rm AF}$ $T$ (s)
    IM 6.6654 0.5770 0.7314 0.6182 1.8739
    GF 6.5647 0.5891 0.7710 0.6184 0.4931
    PCNN 6.5592 0.5354 0.7670 0.5788 0.3475
    DSIFT 6.6297 0.5912 0.7612 0.6225 0.8584
    DCT+C+V 5.9958 0.5444 0.7600 0.5908 0.1536
    DTT 6.6280 0.5921 0.7621 0.6235 0.0572

    a) The best result for each objective assessment is shown in bold.

  • Table 4   The objective assessments of different methods for the fusion of eight pairs of source images
    Image pair Method MI $Q_p$ $Q_w$ $Q_{\rm AF}$ $T$ (s)
    Fig. 12(1) IM 6.9061 0.6843 0.7852 0.6502 3.8532
    GF 6.7162 0.6552 0.7830 0.6510 0.0942
    PCNN 6.8655 0.5884 0.7492 0.5968 0.5053
    DSIFT 6.9143 0.6549 0.7873 0.6503 2.2323
    DCT+C+V 6.2716 0.6258 0.7904 0.6176 0.3788
    DTT 6.9587 0.6471 0.7753 0.6456 0.0594
    Fig. 12(2) IM 6.2353 0.6032 0.6458 0.6080 6.4123
    GF 6.1441 0.6015 0.6556 0.6066 1.1906
    PCNN 5.9408 0.5543 0.6004 0.5584 7.4884
    DSIFT 6.1712 0.6045 0.6523 0.6092 16.5168
    DCT+C+V 5.7139 0.4733 0.6556 0.4816 3.9274
    DTT 6.1730 0.6043 0.6558 0.6087 0.1513
    Fig. 12(3) IM 7.9653 0.7181 0.6290 0.7225 3.2197
    GF 7.6569 0.7188 0.6564 0.7230 0.4542
    PCNN 7.1168 0.6434 0.6110 0.6451 3.0212
    DSIFT 7.8584 0.7251 0.6410 0.7307 10.5313
    DCT+C+V 6.8334 0.5996 0.6865 0.6062 2.0254
    DTT 8.1474 0.7245 0.6319 0.7294 0.0945
    Fig. 12(4) IM 7.2438 0.6937 0.7148 0.7102 3.2925
    GF 6.7616 0.6957 0.7446 0.7128 0.4707
    PCNN 6.2910 0.6038 0.7054 0.6338 2.9876
    DSIFT 7.3479 0.7050 0.7156 0.7222 8.7483
    DCT+C+V 5.7380 0.5428 0.7857 0.5713 2.0133
    DTT 7.4715 0.7040 0.7145 0.7202 0.1128
    Fig. 12(5) IM 4.4531 0.6538 0.8371 0.6698 1.9305
    GF 4.4157 0.6514 0.8400 0.6664 0.2701
    PCNN 4.3118 0.5773 0.8263 0.6026 0.4131
    DSIFT 4.4352 0.6509 0.8324 0.6663 2.0003
    DCT+C+V 4.0359 0.5445 0.8431 0.5704 0.3681
    DTT 4.4772 0.6261 0.8019 0.6431 0.0575
    Fig. 12(6) IM 8.0181 0.7835 0.8785 0.7838 2.9846
    GF 8.0428 0.7850 0.8791 0.7849 0.2337
    PCNN 7.8610 0.7708 0.8763 0.7713 1.5461
    DSIFT 8.0551 0.7846 0.8787 0.7847 4.3149
    DCT+C+V 7.3326 0.7361 0.8800 0.7373 1.1386
    DTT 8.0601 0.7807 0.8759 0.7809 0.0743
    Fig. 12(7) IM 4.9859 0.5835 0.7233 0.6113 2.5830
    GF 4.9712 0.5827 0.7257 0.6099 0.4626
    PCNN 4.8865 0.5312 0.7092 0.5627 0.3858
    DSIFT 4.9847 0.5821 0.7229 0.6101 1.3997
    DCT+C+V 4.7789 0.5398 0.7329 0.5716 0.4923
    DTT 4.9864 0.5750 0.7211 0.6035 0.0622
    Fig. 12(8) IM 8.1684 0.7070 0.7590 0.7146 5.0206
    GF 7.8418 0.7092 0.7720 0.7173 0.8628
    PCNN 6.6963 0.6217 0.7367 0.6274 4.3179
    DSIFT 8.3385 0.7138 0.7672 0.7217 14.5844
    DCT+C+V 6.0322 0.6269 0.7896 0.6362 3.0078
    DTT 8.4151 0.7094 0.7588 0.7172 0.1569

    a) The best result for each objective assessment is shown in bold.

  • Table 5   The objective assessments of different methods in the Gaussian noise robustness experiments
    Method MI $Q_p$ $Q_w$ $Q_{\rm AF}$ $T$ (s)
    IM 1.7841 0.3346 0.4503 0.4428 1.3817
    GF 1.7637 0.3321 0.4654 0.4460 0.1767
    PCNN 1.7739 0.2570 0.4346 0.3396 0.5393
    DSIFT 1.9145 0.3652 0.4574 0.4779 1.9250
    DCT+C+V 1.6724 0.2791 0.5583 0.3517 0.6550
    DTT 1.9367 0.3690 0.4562 0.4816 0.0669

    a) The best result for each objective assessment is shown in bold.

  • Table 6   The objective assessments of different methods in the salt and pepper noise robustness experiments
    Method MI $Q_p$ $Q_w$ $Q_{\rm AF}$ $T$ (s)
    IM 2.3726 0.5718 0.6462 0.5996 1.1075
    GF 2.3615 0.5703 0.6572 0.5987 0.1514
    PCNN 2.2305 0.3905 0.6022 0.4255 0.5471
    DSIFT 2.4265 0.5861 0.6465 0.6136 1.9655
    DCT+C+V 1.8352 0.3687 0.6573 0.4002 0.6428
    DTT 2.4408 0.5899 0.6449 0.6171 0.0670

    a) The best result for each objective assessment is shown in bold.

  • Table 7   The objective assessments of different methods in the multiplicative noise robustness experiments
    Method MI $Q_p$ $Q_w$ $Q_{\rm AF}$ $T$ (s)
    IM 2.0993 0.5159 0.6186 0.5706 1.3097
    GF 2.1590 0.5295 0.6435 0.5885 0.1308
    PCNN 2.1000 0.3715 0.5953 0.4179 0.5646
    DSIFT 2.2293 0.5533 0.6363 0.6098 1.9166
    DCT+C+V 1.8106 0.3541 0.6419 0.3940 0.6412
    DTT 2.2429 0.5567 0.6342 0.6127 0.0670

    a) The best result for each objective assessment is shown in bold.
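The MI column in Tables 1-7 refers to a mutual-information fusion quality metric, which measures how much information the fused image shares with each source; a common definition is MI $= I(A;F) + I(B;F)$, with the mutual information estimated from gray-level histograms. The sketch below is a hedged illustration of that common definition; the function names and the 256-bin histogram are assumptions, not the paper's exact evaluation code.

```python
# Illustrative computation of the MI fusion metric: MI = I(A; F) + I(B; F),
# with mutual information estimated from 256-bin gray-level histograms.
import numpy as np

def mutual_information(x, y, bins=256):
    """Histogram-based mutual information (in bits) between two 8-bit images."""
    joint, _, _ = np.histogram2d(x.ravel(), y.ravel(),
                                 bins=bins, range=[[0, 256], [0, 256]])
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)         # marginal distribution of x
    py = pxy.sum(axis=0, keepdims=True)         # marginal distribution of y
    nonzero = pxy > 0                           # avoid log(0)
    return float(np.sum(pxy[nonzero] * np.log2(pxy[nonzero] / (px @ py)[nonzero])))

def fusion_mi(src_a, src_b, fused):
    """MI quality index: information the fused image carries about both sources."""
    return mutual_information(src_a, fused) + mutual_information(src_b, fused)
```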
