
SCIENCE CHINA Information Sciences, Volume 63, Issue 4: 140302 (2020) https://doi.org/10.1007/s11432-019-2805-y

A new sensor bias-driven spatio-temporal fusion model based on convolutional neural networks

  • Received: Nov 1, 2019
  • Accepted: Feb 19, 2020
  • Published: Mar 9, 2020

Abstract

Owing to the tradeoff between scanning swath and pixel size, no current satellite Earth-observation sensor can collect images with both high spatial and high temporal resolution. This limits the application of satellite images in many fields, including the characterization of crop yields and the detailed investigation of human-nature interactions. Spatio-temporal fusion (STF) is a widely used approach to address this problem. Traditional STF methods reconstruct fine-resolution images under the assumption that changes can be transferred directly from one sensor to another. However, this assumption may not hold in real scenarios, owing to the different capacities of the available sensors to characterize changes. In this paper, we model such differences as a bias and introduce a new sensor bias-driven STF model (called BiaSTF) to mitigate the spectral and spatial distortions exhibited by traditional methods. In addition, we propose a new learning method based on convolutional neural networks (CNNs) to efficiently obtain this bias. An experimental evaluation on two public datasets suggests that our newly developed method achieves excellent performance compared to other available approaches.
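To make the bias idea concrete, the following is a minimal sketch (not the authors' released code) under stated assumptions: F1 and F2 denote fine-resolution images at times 1 and 2, and C1 and C2 the corresponding coarse-resolution images resampled to the fine grid. Traditional STF transfers the coarse-sensor change directly, whereas BiaSTF adds a bias term that, in the paper, is predicted by a CNN.

```python
# Minimal sketch of the sensor-bias formulation (illustrative only; the
# variable names F1, F2, C1, C2 are assumptions, not the paper's notation).
import numpy as np

def sensor_bias(F1, F2, C1, C2):
    """Bias: change observed by the fine sensor minus change observed by the coarse one."""
    return (F2 - F1) - (C2 - C1)

def predict_fine(F1, C1, C2, bias=0.0):
    """Traditional STF implicitly assumes bias == 0; BiaSTF adds a (CNN-predicted) bias."""
    return F1 + (C2 - C1) + bias
```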


Acknowledgment

This work was supported in part by the National Natural Science Foundation of China (Grant Nos. 61771496, 61571195, 61901208), the National Key Research and Development Program of China (Grant No. 2017YFB0502900), the Guangdong Provincial Natural Science Foundation (Grant Nos. 2016A030313254, 2017A030313382), the Science and Technology Project of Jiangxi Provincial Department of Education (Grant No. GJJ180962), and the Natural Science Foundation of Jiangxi, China (Grant No. 20192BAB217003). The authors would like to thank the contributors for sharing the codes of the ESTARFM, FSDAF, and STFDCNN algorithms.


References

[1] Johnson M D, Hsieh W W, Cannon A J. Crop yield forecasting on the Canadian Prairies by remotely sensed vegetation indices and machine learning methods. Agric For Meteor, 2016, 218-219: 74-84

[2] Shen M, Tang Y, Chen J. Influences of temperature and precipitation before the growing season on spring phenology in grasslands of the central and eastern Qinghai-Tibetan Plateau. Agric For Meteor, 2011, 151: 1711-1722

[3] Li X, Zhou Y, Asrar G R. Response of vegetation phenology to urbanization in the conterminous United States. Glob Change Biol, 2017, 23: 2818-2830

[4] Zhu X, Helmer E H, Gao F. A flexible spatiotemporal method for fusing satellite images with different resolutions. Remote Sens Environ, 2016, 172: 165-177

[5] Zhu X, Cai F, Tian J, et al. Spatiotemporal fusion of multisource remote sensing data: literature survey, taxonomy, principles, applications, and future directions. Remote Sens, 2018, 10: 527

[6] Zhang H K, Huang B, Zhang M. A generalization of spatial and temporal fusion methods for remotely sensed surface parameters. Int J Remote Sens, 2015, 36: 4411-4445

[7] Gao F, Masek J, Schwaller M. On the blending of the Landsat and MODIS surface reflectance: predicting daily Landsat surface reflectance. IEEE Trans Geosci Remote Sens, 2006, 44: 2207-2218

[8] Hilker T, Wulder M A, Coops N C. A new data fusion model for high spatial- and temporal-resolution mapping of forest disturbance based on Landsat and MODIS. Remote Sens Environ, 2009, 113: 1613-1627

[9] Zhu X, Chen J, Gao F. An enhanced spatial and temporal adaptive reflectance fusion model for complex heterogeneous regions. Remote Sens Environ, 2010, 114: 2610-2623

[10] Weng Q, Fu P, Gao F. Generating daily land surface temperature at Landsat resolution by fusing Landsat and MODIS data. Remote Sens Environ, 2014, 145: 55-67

[11] Huang B, Wang J, Song H. Generating high spatiotemporal resolution land surface temperature for urban heat island monitoring. IEEE Geosci Remote Sens Lett, 2013, 10: 1011-1015

[12] Zhang W, Li A, Jin H. An enhanced spatial and temporal data fusion model for fusing Landsat and MODIS surface reflectance to generate high temporal Landsat-like data. Remote Sens, 2013, 5: 5346-5368

[13] Wu M, Niu Z, Wang C, et al. Use of MODIS and Landsat time series data to generate high-resolution temporal synthetic Landsat data using a spatial and temporal reflectance fusion model. J Appl Remote Sens, 2012, 6: 063507

[14] Wu M, Huang W, Niu Z. Generating daily synthetic Landsat imagery by combining Landsat and MODIS data. Sensors, 2015, 15: 24002-24025

[15] Gevaert C M, García-Haro F J. A comparison of STARFM and an unmixing-based algorithm for Landsat and MODIS data fusion. Remote Sens Environ, 2015, 156: 34-44

[16] Song H, Huang B. Spatiotemporal satellite image fusion through one-pair image learning. IEEE Trans Geosci Remote Sens, 2013, 51: 1883-1896

[17] Huang B, Song H. Spatiotemporal reflectance fusion via sparse representation. IEEE Trans Geosci Remote Sens, 2012, 50: 3707-3716

[18] Wu B, Huang B, Zhang L. An error-bound-regularized sparse coding for spatiotemporal reflectance fusion. IEEE Trans Geosci Remote Sens, 2015, 53: 6791-6803

[19] Li D, Li Y, Yang W. An enhanced single-pair learning-based reflectance fusion algorithm with spatiotemporally extended training samples. Remote Sens, 2018, 10: 1207

[20] Zhao C, Gao X, Emery W J. An integrated spatio-spectral-temporal sparse representation method for fusing remote-sensing images with different resolutions. IEEE Trans Geosci Remote Sens, 2018, 56: 3358-3370

[21] Jiang C, Zhang H, Shen H. Two-step sparse coding for the pan-sharpening of remote sensing images. IEEE J Sel Top Appl Earth Observations Remote Sens, 2014, 7: 1792-1805

[22] Boyte S P, Wylie B K, Rigge M B. Fusing MODIS with Landsat 8 data to downscale weekly normalized difference vegetation index estimates for central Great Basin rangelands, USA. GISci Remote Sens, 2018, 55: 376-399

[23] Ke Y, Im J, Park S. Downscaling of MODIS one kilometer evapotranspiration using Landsat-8 data and machine learning approaches. Remote Sens, 2016, 8: 215

[24] Liu X, Deng C, Wang S. Fast and accurate spatiotemporal fusion based upon extreme learning machine. IEEE Geosci Remote Sens Lett, 2016, 13: 2039-2043

[25] Dong C, Loy C C, He K. Image super-resolution using deep convolutional networks. IEEE Trans Pattern Anal Mach Intell, 2016, 38: 295-307

[26] Dong C, Loy C C, He K, et al. Learning a deep convolutional network for image super-resolution. In: Computer Vision -- ECCV 2014. Berlin: Springer, 2014

[27] Wei Y, Yuan Q, Shen H. Boosting the accuracy of multispectral image pansharpening by learning a deep residual network. IEEE Geosci Remote Sens Lett, 2017, 14: 1795-1799

[28] Yuan Q, Wei Y, Meng X. A multiscale and multidepth convolutional neural network for remote sensing imagery pan-sharpening. IEEE J Sel Top Appl Earth Observations Remote Sens, 2018, 11: 978-989

[29] Song H, Liu Q, Wang G. Spatiotemporal satellite image fusion using deep convolutional neural networks. IEEE J Sel Top Appl Earth Observations Remote Sens, 2018, 11: 821-829

[30] Nair V, Hinton G E. Rectified linear units improve restricted Boltzmann machines. In: Proceedings of the 27th International Conference on Machine Learning (ICML), 2010. 807-814

[31] Hu W, Huang Y Y, Li W, et al. Deep convolutional neural networks for hyperspectral image classification. J Sens, 2015, 2015: 1-12

[32] Chen Y, Jiang H, Li C. Deep feature extraction and classification of hyperspectral images based on convolutional neural networks. IEEE Trans Geosci Remote Sens, 2016, 54: 6232-6251

[33] Jia X Y, Xu X M, Cai C B, et al. Single image super-resolution using multi-scale convolutional neural network. In: Advances in Multimedia Information Processing -- PCM 2017. Berlin: Springer, 2017. 149-157

[34] Kingma D P, Ba J. Adam: a method for stochastic optimization. In: Proceedings of the 3rd International Conference on Learning Representations (ICLR), San Diego, 2015

[35] He K, Zhang X, Ren S, et al. Deep residual learning for image recognition. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016. 770-778

[36] He L, Rao Y, Li J. Pansharpening via detail injection based convolutional neural networks. IEEE J Sel Top Appl Earth Observations Remote Sens, 2019, 12: 1188-1204

[37] Emelyanova I V, McVicar T R, Van Niel T G. Assessing the accuracy of blending Landsat-MODIS surface reflectances in two landscapes with contrasting spatial and temporal dynamics: a framework for algorithm selection. Remote Sens Environ, 2013, 133: 193-209

[38] Renza D, Martinez E, Arquero A. A new approach to change detection in multispectral images by means of ERGAS index. IEEE Geosci Remote Sens Lett, 2013, 10: 76-80

[39] Wang Z, Bovik A C, Sheikh H R. Image quality assessment: from error visibility to structural similarity. IEEE Trans Image Process, 2004, 13: 600-612

  • Figure 1

    (Color online) Graphical illustration of the main goal of the spatio-temporal fusion (STF) task.

  • Figure 2

    (Color online) Toy example illustrating the impact of the bias. (a)–(c) The images collected at time 1; (d)–(f) the images collected at time 2; (g)–(i) show the changes; and (j)–(l) illustrate the bias maps, with (j) showing the bias between the ground-truth and sensor 1, (k) showing the bias between the ground-truth and sensor 2, and (l) showing the bias between sensors 1 and 2, respectively. It can be seen that the bias between sensors 1 and 2 (i.e., (l)) is significant, which is expected to play an essential role in the STF process.

  • Figure 3

    (Color online) Flowchart of the proposed BiaSTF method.

  • Figure 4

(Color online) Reconstruction residuals of the proposed BiaSTF and STFDCNN on the two datasets, i.e., the (a) CIA and (b) LGC datasets, which will be used for detailed evaluation in Section 3.

  • Figure 5

(Color online) Examples from the CIA dataset, in which significant phenological changes can be observed.

  • Figure 6

(Color online) Examples from the LGC dataset, in which significant land-cover type changes can be observed.

  • Figure 7

    (Color online) Prediction results obtained for the 10th pair of the CIA dataset.

  • Figure 8

    (Color online) Prediction results obtained for the 8th pair of the LGC dataset.

  • Figure 9

Change maps (first row), bias maps (second row), and bias squared-error maps (third row) obtained by different methods for the CIA dataset, using the 10th pair.

  • Figure 10

Change maps (first row), bias maps (second row), and bias squared-error maps (third row) obtained by different methods for the LGC dataset, using the 8th pair.

  • Table 1   Structure of the CNN architecture used by the proposed BiaSTF (a PyTorch-style sketch of this network is given after this list)
    Layer   Filter size   Stride   Activation function
    Conv1   7 × 7 × n_1   (1, 1)   ReLU
    Conv2   5 × 5 × n_2   (1, 1)   ReLU
    Conv3   3 × 3 × n_3   (1, 1)   ReLU
    Conv4   3 × 3 × 1     (1, 1)   –
  • Table 2   Quantitative assessment of the fusion results obtained for the two considered datasets (the best result in each row and dataset, bold in the original, is marked with *; code sketches of the metrics follow this list)

             CIA dataset                                    LGC dataset
             Pair   ESTARFM   FSDAF     STFDCNN   BiaSTF    Pair   ESTARFM   FSDAF     STFDCNN   BiaSTF
    RMSE     8th    0.0301    0.0331    0.0256    *0.0227   7th    0.0265    0.0247    0.0378    *0.0240
             9th    0.0249    0.0282    0.0263    *0.0235   8th    0.0386    0.0374    0.0346    *0.0334
             10th   0.0263    0.0243    0.0274    *0.0230   9th    0.0382    0.0383    0.0275    *0.0239
             11th   0.0265    0.0278    0.0285    *0.0244   10th   0.0227    0.0237    0.0252    *0.0211
             12th   0.0241    0.0269    0.0251    *0.0215   11th   0.0272    0.0283    0.0242    *0.0240
             13th   0.0213    0.0231    0.0226    *0.0208   12th   *0.0165   0.0230    0.0268    0.0167
             14th   0.0229    0.0251    0.0229    *0.0203   13th   0.0156    0.0255    0.0245    *0.0154
    CC       8th    0.8740    0.8514    0.9154    *0.9312   7th    0.7295    0.7530    0.6456    *0.7981
             9th    0.9113    0.8784    0.8990    *0.9186   8th    0.6871    0.7078    0.7166    *0.7454
             10th   0.9075    0.9206    0.8939    *0.9267   9th    0.7444    0.6938    0.8372    *0.8644
             11th   0.8806    0.8734    0.8621    *0.8993   10th   0.8975    0.8722    0.8879    *0.9048
             12th   0.8371    0.7936    0.8302    *0.8644   11th   0.8900    0.8809    0.9061    *0.9075
             13th   0.8643    0.8505    0.8462    *0.8670   12th   0.9356    0.8878    0.8476    *0.9391
             14th   0.8602    0.8223    0.8481    *0.8794   13th   *0.9326   0.8560    0.8313    0.9309
    ERGAS    8th    0.8382    0.9239    0.7531    *0.6445   7th    0.8418    0.8005    1.2632    *0.7925
             9th    0.8121    0.9214    0.8663    *0.7740   8th    2.3058    2.0301    2.0952    *2.0083
             10th   0.8838    0.8097    0.8981    *0.7720   9th    1.7146    1.7339    1.2056    *1.0770
             11th   0.8827    0.9474    0.9694    *0.8131   10th   0.8059    0.8612    0.8949    *0.7528
             12th   0.8843    0.9616    0.9172    *0.7901   11th   0.9556    0.9861    0.8486    *0.8433
             13th   *0.8601   0.9027    0.9579    0.8943    12th   0.5657    0.7771    0.9810    *0.5639
             14th   0.8503    0.9607    0.8462    *0.7540   13th   0.5015    0.7823    0.8909    *0.5000
    SSIM     8th    0.8921    0.8715    0.9288    *0.9412   7th    0.8244    0.8403    0.7250    *0.8658
             9th    0.9277    0.8958    0.9174    *0.9323   8th    0.7481    0.7523    0.7813    *0.8022
             10th   0.9246    0.9353    0.9164    *0.9407   9th    0.7969    0.7811    0.8853    *0.9074
             11th   0.9111    0.9049    0.8972    *0.9233   10th   0.9224    0.9078    0.9094    *0.9302
             12th   0.8977    0.8697    0.8923    *0.9174   11th   0.9081    0.8996    0.9233    *0.9250
             13th   0.9146    0.9048    0.9034    *0.9165   12th   0.9549    0.9189    0.8836    *0.9572
             14th   0.9080    0.8838    0.9034    *0.9224   13th   *0.9538   0.8861    0.8766    0.9532
    SAM      8th    0.0728    0.0834    0.0742    *0.0619   7th    0.0889    0.0799    0.1317    *0.0738
             9th    *0.0716   0.0874    0.0874    0.0730    8th    0.2781    0.2433    0.2213    *0.2156
             10th   0.0783    0.0740    0.0914    *0.0705   9th    0.1227    0.1785    0.1241    *0.1185
             11th   0.0823    0.0890    0.1004    *0.0794   10th   0.0743    0.0935    0.0749    *0.0651
             12th   0.0770    0.0914    0.0893    *0.0705   11th   0.0719    0.0805    0.0701    *0.0689
             13th   0.0737    0.0774    0.0824    *0.0680   12th   0.0534    0.0681    0.0877    *0.0504
             14th   0.0601    0.0723    0.0762    *0.0590   13th   0.0502    0.0801    0.0819    *0.0463
  • Table 3   Quantitative assessment of the fusion results obtained for the 10th pair of the CIA dataset (best result per row marked with *)

             ESTARFM                     FSDAF                       STFDCNN                     BiaSTF
             RMSE     CC      SSIM       RMSE     CC      SSIM       RMSE     CC      SSIM       RMSE      CC       SSIM
    Band1    0.0134   0.9002  0.9379     0.0115   0.9168  0.9509     0.0126   0.8726  0.9317     *0.0111   *0.9232  *0.9543
    Band2    0.0138   0.8984  0.9351     0.0130   0.9094  0.9419     0.0138   0.8864  0.9288     *0.0122   *0.9124  *0.9446
    Band3    0.0211   0.9111  0.9247     0.0199   0.9216  0.9339     0.0222   0.8976  0.9137     *0.0192   *0.9242  *0.9358
    Band4    0.0348   0.8929  0.9009     0.0307   0.9160  0.9197     0.0378   0.8789  0.8878     *0.0286   *0.9280  *0.9322
    Band5    0.0382   0.9211  0.9248     0.0357   0.9319  0.9352     0.0389   0.9188  0.9229     *0.0340   *0.9365  *0.9394
    Band6    0.0368   0.9214  0.9242     0.0353   0.9283  0.9303     0.0395   0.9091  0.9138     *0.0331   *0.9360  *0.9381
    ERGAS    0.8838                      0.8097                      0.8981                      *0.7720
    SAM      0.0783                      0.0740                      0.0914                      *0.0705
  • Table 4   Quantitative assessment of the fusion results obtained for the 8th pair of the LGC dataset (best result per row marked with *)

             ESTARFM                     FSDAF                       STFDCNN                     BiaSTF
             RMSE     CC      SSIM       RMSE     CC      SSIM       RMSE     CC      SSIM       RMSE      CC       SSIM
    Band1    0.0161   0.6905  0.8597     0.0160   0.6820  0.8577     0.0166   0.6697  0.8513     *0.0157   *0.7126  *0.8673
    Band2    0.0228   0.6928  0.8059     0.0226   0.6921  0.8063     0.0234   0.6934  0.8045     *0.0223   *0.7115  *0.8168
    Band3    0.0281   0.6955  0.7795     0.0277   0.6904  0.7775     0.0295   0.6952  0.7754     *0.0275   *0.7169  *0.7938
    Band4    0.0481   0.7201  0.7487     0.0428   0.7947  0.7923     0.0376   0.8273  0.8329     *0.0373   *0.8374  *0.8471
    Band5    0.0660   0.6725  0.6595     0.0647   0.7008  0.6603     0.0558   0.7491  0.7264     *0.0556   *0.7656  *0.7543
    Band6    0.0505   0.6515  0.6356     0.0506   0.6868  0.6200     0.0451   0.6654  0.6975     *0.0417   *0.7317  *0.7348
    ERGAS    2.3058                      2.0301                      2.0952                      *2.0083
    SAM      0.2781                      0.2433                      0.2213                      *0.2156
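As referenced in the Table 1 caption, the architecture can be written down compactly. The following is a hedged PyTorch-style sketch (the paper does not state a framework; the filter counts n_1-n_3 and the number of input bands in_ch are left symbolic in Table 1, so the defaults below are illustrative, and the same-size padding is an assumption):

```python
# Sketch of the four-layer CNN from Table 1 (assumed PyTorch; n1-n3 and in_ch
# are not fixed by the table, so the default values below are placeholders).
import torch.nn as nn

def biastf_cnn(in_ch=6, n1=64, n2=32, n3=16):
    return nn.Sequential(
        nn.Conv2d(in_ch, n1, kernel_size=7, stride=1, padding=3), nn.ReLU(),  # Conv1: 7x7xn1
        nn.Conv2d(n1, n2, kernel_size=5, stride=1, padding=2), nn.ReLU(),     # Conv2: 5x5xn2
        nn.Conv2d(n2, n3, kernel_size=3, stride=1, padding=1), nn.ReLU(),     # Conv3: 3x3xn3
        nn.Conv2d(n3, 1, kernel_size=3, stride=1, padding=1),                 # Conv4: 3x3x1, no activation
    )
```

With these placeholders, biastf_cnn(in_ch=6) maps a 6-band input patch to a single-band output, matching the 3 × 3 × 1 final layer of Table 1.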
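The quality indices reported in Tables 2-4 (RMSE, CC, ERGAS [38], SSIM [39], SAM) follow standard definitions. As a reference, here is a minimal sketch of their computation, not the authors' evaluation code, assuming ref and est are (H, W, B) reflectance arrays and scikit-image >= 0.19 (for the channel_axis argument):

```python
# Hedged sketch of the standard quality metrics used in Tables 2-4.
import numpy as np
from skimage.metrics import structural_similarity  # SSIM, per Wang et al. [39]

def rmse(ref, est):
    return float(np.sqrt(np.mean((ref - est) ** 2)))

def cc(ref, est):
    # Correlation coefficient over all pixels and bands.
    return float(np.corrcoef(ref.ravel(), est.ravel())[0, 1])

def ssim(ref, est):
    return float(structural_similarity(ref, est, channel_axis=-1,
                                       data_range=float(ref.max() - ref.min())))

def ergas(ref, est, ratio):
    # ratio: low-to-high resolution ratio of the fused product (an input the
    # tables do not state); standard ERGAS definition.
    rmse_b = np.sqrt(np.mean((ref - est) ** 2, axis=(0, 1)))
    mean_b = np.mean(ref, axis=(0, 1))
    return float(100.0 / ratio * np.sqrt(np.mean((rmse_b / mean_b) ** 2)))

def sam(ref, est, eps=1e-12):
    # Mean spectral angle (radians) between per-pixel spectra.
    dot = np.sum(ref * est, axis=-1)
    norm = np.linalg.norm(ref, axis=-1) * np.linalg.norm(est, axis=-1) + eps
    return float(np.mean(np.arccos(np.clip(dot / norm, -1.0, 1.0))))
```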
