
SCIENCE CHINA Information Sciences, Volume 63, Issue 4: 140304 (2020). https://doi.org/10.1007/s11432-019-2800-0

Deep-learning-based extraction of the animal migration patterns from weather radar images

  • Received: Oct 31, 2019
  • Accepted: Feb 16, 2020
  • Published: Mar 9, 2020

Abstract

The continental coverage and year-round operation of weather radar networks provide an unprecedented opportunity for studying large-scale airborne migration. The local- and broad-scale aerial information collected by this infrastructure can help answer many ecological questions. However, extracting and interpreting the biological information contained in such massive weather radar data remains a challenging problem. Recently, many big-data problems have been solved using deep learning. In this study, the biological information in weather radar data is identified using an advanced deep learning method. The proposed method consists of two main parts, i.e., a rendering and casting procedure and an image segmentation procedure based on a convolutional neural network. The biological data are automatically extracted by rendering and mapping, image segmentation, and result masking. By analyzing typical radar data from single and multiple stations, we partly reveal the intensity and speed of the migration patterns. We present the first feasibility study of extracting local and large-scale biological phenomena from Chinese weather radar network data.
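The extraction pipeline summarized above (rendering and mapping, CNN image segmentation, result masking) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the dBZ mapping bounds, the function names, and the placeholder `model` are assumptions.

```python
import numpy as np

def render_to_grayscale(reflectivity_dbz, dbz_min=-5.0, dbz_max=70.0):
    """Linearly map one PPI sweep of reflectivity factor (dBZ) to an 8-bit image.
    The paper only states that the mapping is linear; the dBZ bounds are assumed."""
    refl = np.nan_to_num(reflectivity_dbz, nan=dbz_min)   # no-echo gates become black
    scaled = (np.clip(refl, dbz_min, dbz_max) - dbz_min) / (dbz_max - dbz_min)
    return (scaled * 255.0).astype(np.uint8)

def mask_biology(reflectivity_dbz, segmentation):
    """Result masking: keep only the gates labeled as biological (class 1)."""
    masked = np.array(reflectivity_dbz, dtype=float)
    masked[segmentation != 1] = np.nan
    return masked

# Usage with a hypothetical trained segmentation network `model`:
# img = render_to_grayscale(sweep)                 # rendering and mapping
# seg = model.predict(img[None, ..., None])[0]     # CNN image segmentation
# bio = mask_biology(sweep, seg.argmax(axis=-1))   # result masking
```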


Acknowledgment

This work was supported by the National Natural Science Foundation of China (Grant No. 31727901). The authors thank Prof. Kongming WU, Dr. Qiulin WU, and Haowen ZHANG of the Institute of Plant Protection, Chinese Academy of Agricultural Sciences, for their kind discussions and useful suggestions. The authors also thank Dongli WU and Dasheng YANG of the Meteorological Observation Center, China Meteorological Administration, for providing the Chinese weather radar data.


References

[1] Van Doren B M, Horton K G. A continental system for forecasting bird migration. Science, 2018, 361: 1115-1118

[2] Kelly J F, Horton K G. Toward a predictive macrosystems framework for migration ecology. Glob Ecol Biogeogr, 2016, 25: 1159-1165

[3] Chilson P B, Frick W F, Stepanian P M. Ecosphere, 2012, 3: art72

[4] Rosenberg K V, Dokter A M, Blancher P J. Decline of the North American avifauna. Science, 2019, 366: 120-124

[5] Hu C, Cui K, Wang R. A retrieval method of vertical profiles of reflectivity for migratory animals using weather radar. IEEE Trans Geosci Remote Sens, 2020, 58: 1030-1040

[6] Stepanian P M, Horton K G. Extracting migrant flight orientation profiles using polarimetric radar. IEEE Trans Geosci Remote Sens, 2015, 53: 6518-6528

[7] Hu C, Kong S, Wang R. Identification of migratory insects from their physical features using a decision-tree support vector machine and its application to radar entomology. Sci Rep, 2018, 8: 5449

[8] Hu C, Li W, Wang R. Accurate insect orientation extraction based on polarization scattering matrix estimation. IEEE Geosci Remote Sens Lett, 2017, 14: 1755-1759

[9] Hu C, Li W, Wang R. Insect flight speed estimation analysis based on a full-polarization radar. Sci China Inf Sci, 2018, 61: 109306

[10] Hu C, Wang Y, Wang R. An improved radar detection and tracking method for small UAV under clutter environment. Sci China Inf Sci, 2019, 62: 29306

[11] Krizhevsky A, Sutskever I, Hinton G E. ImageNet classification with deep convolutional neural networks. In: Proceedings of the 25th International Conference on Neural Information Processing Systems, 2012. 1097-1105

[12] Lin T Y, Winner K, Bernstein G. MistNet: measuring historical bird migration in the US using archived weather radar data and convolutional neural networks. Methods Ecol Evol, 2019, 10: 1908-1922

[13] Hu C, Li S, Wang R. Extracting animal migration pattern from weather radar observation based on deep convolutional neural networks

[14] Chilson C, Avery K, McGovern A. Automated detection of bird roosts using NEXRAD radar data and convolutional neural networks. Remote Sens Ecol Conserv, 2019, 5: 20-32

[15] Xu X F. Construction, techniques and application of new generation Doppler weather radar network in China. Eng Sci, 2004, 1: 15-25

[16] Zhu X, Zhu J. New generation weather radar network in China (in Chinese). Meteorol Sci Technol, 2004, 32

[17] Lakshmanan V, Hondl K, Potvin C K. An improved method for estimating radar echo-top height. Wea Forecasting, 2013, 28: 481-488

[18] Bruderer B, Liechti F. Variation in density and height distribution of nocturnal migration in the south of Israel. Israel J Zool, 1995, 41: 477-487. doi: 10.1080/00212210.1995.10688815

[19] Rennie S J, Curtis M, Peter J. Bayesian echo classification for Australian single-polarization weather radar with application to assimilation of radial velocity observations. J Atmos Ocean Technol, 2015, 32: 1341-1355

[20] Zhang P, Liu S, Xu Q. Identifying Doppler velocity contamination caused by migrating birds. Part I: feature extraction and quantification. J Atmos Ocean Technol, 2005, 22: 1105-1113

[21] Lakshmanan V, Fritz A, Smith T. An automated technique to quality control radar reflectivity data. J Appl Meteor Climatol, 2007, 46: 288-305

[22] Hu G, Lim K S, Horvitz N. Mass seasonal bioflows of high-flying insect migrants. Science, 2016, 354: 1584-1587

[23] Chen L C, Zhu Y, Papandreou G, et al. Encoder-decoder with atrous separable convolution for semantic image segmentation. In: Proceedings of the European Conference on Computer Vision (ECCV), 2018. 801-818

[24] Everingham M, Eslami S M A, Van Gool L. The Pascal Visual Object Classes challenge: a retrospective. Int J Comput Vis, 2015, 111: 98-136

[25] Abadi M, Agarwal A, et al. TensorFlow: large-scale machine learning on heterogeneous distributed systems. ArXiv preprint, 2016

[26] Browning K A, Wexler R. The determination of kinematic properties of a wind field using Doppler radar. J Appl Meteor, 1968, 7: 105-113

  • Figure 1

(Color online) Geometry and volume coverage pattern of weather radar scanning. (a) Scanning strategy and beam geometry of a typical weather radar; five successive elevations are shown. (b) Volume coverage patterns at operational elevations of $0.5^{\circ}$, $1.45^{\circ}$, $2.4^{\circ}$, $3.35^{\circ}$, and $4.3^{\circ}$. When two adjacent elevation beams overlap, the atmosphere is scanned with no gap between those elevations.

  • Figure 2

(Color online) (a) and (b) Vertical slices of precipitation and biological echoes. (c) and (d) Typical reflectivity factor products of precipitation and biology from the Xuzhou weather radar station: (c) 12:13 UTC on September 27 and (d) 11:41 UTC on August 31, 2017. Echo types are confirmed by checking the historical weather conditions. From left to right, the subgraphs are scanned at elevations of $0.5^{\circ}$, $1.45^{\circ}$, $2.4^{\circ}$, $3.35^{\circ}$, and $4.3^{\circ}$, with plotting radii of 227, 127, 81, 58, and 45 km, respectively. These radii correspond to the ranges at which the bottom limit of the antenna beam reaches a height of 3000 m.
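For reference, the plotting radii quoted above can be roughly reproduced with the standard 4/3 effective-earth-radius beam propagation model. This is a minimal sketch under stated assumptions, not the authors' computation: a ground-level antenna and a half-power beamwidth of $1^{\circ}$ are assumed here.

```python
import numpy as np

A_E = 4.0 / 3.0 * 6371.0e3     # effective earth radius (m), standard 4/3 model
BEAMWIDTH = 1.0                # assumed half-power beamwidth of the radar (deg)

def beam_bottom_height(r, elev_deg):
    """Height (m) of the lower half-power edge of the beam at slant range r (m)."""
    theta = np.radians(elev_deg - BEAMWIDTH / 2.0)
    return r * np.sin(theta) + r ** 2 / (2.0 * A_E)

def range_at_height(elev_deg, height=3000.0):
    """Slant range (m) at which the beam bottom reaches `height`, by bisection."""
    lo, hi = 0.0, 500.0e3
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if beam_bottom_height(mid, elev_deg) < height:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

for elev in (0.5, 1.45, 2.4, 3.35, 4.3):
    print(f"{elev:4.2f} deg -> {range_at_height(elev) / 1e3:5.0f} km")
# Prints roughly 226, 125, 79, 57 and 44 km, close to the radii quoted in the caption.
```

The same model is consistent with the gap-free coverage noted in Figure 1: the elevation spacing of roughly $0.95^{\circ}$ is smaller than the assumed beamwidth, so adjacent beams overlap.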

  • Figure 3

    (Color online) Biological echo extraction process. The weather radar data are mapped to grayscale images by a linear mapping method. The biological dataset contains 1500 images for training the convolutional network at each elevation.
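A training-data pipeline in this spirit could look like the following TensorFlow sketch. The file layout, batch size, and resizing choices are assumptions for illustration; only the grayscale rendering, the binary (biological/non-biological) labels, and the $321\times321$ input size come from the paper.

```python
import glob
import tensorflow as tf

IMG_SIZE = 321   # network input resolution (see Figure 4)

def load_pair(image_path, label_path):
    """Load one rendered grayscale radar image and its manually labeled mask."""
    img = tf.io.decode_png(tf.io.read_file(image_path), channels=1)
    lbl = tf.io.decode_png(tf.io.read_file(label_path), channels=1)
    img = tf.image.resize(img, [IMG_SIZE, IMG_SIZE]) / 255.0
    lbl = tf.image.resize(lbl, [IMG_SIZE, IMG_SIZE], method="nearest")
    return img, tf.cast(lbl > 0, tf.int32)   # 0 = non-biological, 1 = biological

# Hypothetical directory layout; the paper's dataset has 1500 images per elevation.
image_files = sorted(glob.glob("dataset/images/*.png"))
label_files = sorted(glob.glob("dataset/labels/*.png"))
train_ds = (tf.data.Dataset.from_tensor_slices(
                (tf.constant(image_files, dtype=tf.string),
                 tf.constant(label_files, dtype=tf.string)))
            .map(load_pair, num_parallel_calls=tf.data.AUTOTUNE)
            .shuffle(256)
            .batch(8)
            .prefetch(tf.data.AUTOTUNE))
```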

  • Figure 4

(Color online) DeepLabv3+-based network for extracting the biological information from weather radar data. The input resolution is 321 $\times$ 321, and the atrous rates in the ASPP module are correspondingly reduced to 6, 8, and 12. The decoder output stride is reduced to 1 to densify the output feature map.
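As a concrete illustration of the ASPP configuration mentioned in the caption, the following Keras sketch builds an atrous spatial pyramid pooling block with rates 6, 8, and 12. It is a generic DeepLabv3+-style block under assumed filter counts and a hypothetical backbone feature size, not the authors' exact implementation.

```python
import tensorflow as tf
from tensorflow.keras import layers

def aspp(x, filters=256, rates=(6, 8, 12)):
    """Atrous spatial pyramid pooling with the reduced rates given in the caption."""
    branches = [layers.Conv2D(filters, 1, padding="same", activation="relu")(x)]
    for r in rates:
        branches.append(layers.Conv2D(filters, 3, padding="same",
                                      dilation_rate=r, activation="relu")(x))
    # Image-level pooling branch, upsampled back to the (static) feature-map size.
    pooled = layers.GlobalAveragePooling2D(keepdims=True)(x)
    pooled = layers.Conv2D(filters, 1, activation="relu")(pooled)
    pooled = layers.UpSampling2D(size=(x.shape[1], x.shape[2]),
                                 interpolation="bilinear")(pooled)
    branches.append(pooled)
    return layers.Conv2D(filters, 1, activation="relu")(layers.Concatenate()(branches))

# Hypothetical 21 x 21 backbone feature map for a 321 x 321 input at output stride 16.
features = tf.keras.Input(shape=(21, 21, 2048))
aspp_out = aspp(features)   # (None, 21, 21, 256), passed on to the decoder
```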

  • Figure 5

(Color online) Training losses and typical segmentation results of the training dataset at each elevation. The training losses are recorded as functions of the number of training steps. The input images are rendered and cast from the weather radar data, and their brightness is linearly related to the radar reflectivity factor. The label images are manually constructed using the MATLAB image segmentation app. The output images are the segmentation results of the trained models. In the label and output images, the black and gray pixels represent the non-biological and biological areas, respectively.
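To make the loss curves concrete, here is a minimal Keras training sketch. The stand-in two-layer model, optimizer, learning rate, and epoch count are assumptions; only the 321 $\times$ 321 single-channel input and the two-class pixel labels come from the paper.

```python
import tensorflow as tf

# Hypothetical stand-in network (NOT the paper's DeepLabv3+); it only shows how the
# per-step training losses plotted in Figure 5 are obtained.
inputs = tf.keras.Input(shape=(321, 321, 1))
x = tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu")(inputs)
logits = tf.keras.layers.Conv2D(2, 1)(x)   # two classes: non-biological, biological
model = tf.keras.Model(inputs, logits)

model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))

# history = model.fit(train_ds, epochs=50)   # train_ds as sketched after Figure 3
# history.history["loss"] then gives a loss curve of the kind shown in Figure 5.
```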

  • Figure 6

    (Color online) Typical extraction results on different dates. The images in the first row are acquired at an elevation angle of $2.4^{\circ}$, and the brightness is linearly related to the radar reflectivity factor. The images in the second row are the segmentation results of the trained model at the third elevation angle. Biological echoes are found in the red areas.
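The red overlay shown in the figure can be produced with a simple masking step such as the sketch below; the blending weight is an arbitrary choice for illustration.

```python
import numpy as np

def overlay_biology(gray_img, seg_mask, alpha=0.5):
    """Overlay pixels classified as biological (seg_mask == 1) in red on the
    grayscale radar image, similar to the visualization in the second row."""
    rgb = np.stack([gray_img, gray_img, gray_img], axis=-1).astype(np.float32)
    red = np.array([255.0, 0.0, 0.0], dtype=np.float32)
    rgb[seg_mask == 1] = (1.0 - alpha) * rgb[seg_mask == 1] + alpha * red
    return rgb.astype(np.uint8)
```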

  • Figure 7

(Color online) Large-scale migration pattern in China. (a) Spring; (b) autumn. The red color bar represents the relative migration intensity calculated from the segmentation results (a high red concentration denotes intense migration). The sizes and directions of the arrows represent the average airspeeds and directions, respectively, of the targets passing each radar station. All data are collected at 21:00 local time.
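The caption states that the relative intensity is calculated from the segmentation results but does not give the formula. Purely as an illustrative proxy (an assumption, not the authors' definition), one could use the fraction of range gates classified as biological:

```python
import numpy as np

def relative_migration_intensity(seg_masks):
    """Illustrative proxy only: fraction of range gates labeled biological (class 1),
    averaged over the sweeps collected at one station."""
    return float(np.mean([np.mean(np.asarray(m) == 1) for m in seg_masks]))
```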

  • Table 1   Segmentation performance at each elevation (see the metric sketch after Table 2)
    Elevation         Pixel accuracy (%)          mIOU (%)
                      Original      Ours          Original      Ours
    $0.5^{\circ}$     94.80         95.30         89.85         90.81
    $1.45^{\circ}$    95.59         95.53         91.28         91.94
    $2.4^{\circ}$     92.63         93.04         86.19         86.91
    $3.35^{\circ}$    92.02         92.37         85.19         85.78
    $4.3^{\circ}$     91.76         92.29         84.59         85.51
  • Table 2   Performance comparison between our method and a previous work
    Method      Precision (%)   Recall (%)   F-score (%)
    Ours        92.6            92.4         92.5
    MistNet     72.6            96.1         82.7
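For reference, the metrics reported in Tables 1 and 2 follow their standard definitions for binary segmentation masks (with 1 = biological assumed as the positive class). A minimal sketch:

```python
import numpy as np

def segmentation_metrics(pred, label, positive=1):
    """Pixel accuracy, mean IoU, precision, recall and F-score for binary masks."""
    pred, label = np.asarray(pred), np.asarray(label)
    pixel_acc = np.mean(pred == label)

    ious = []
    for c in (0, 1):
        inter = np.sum((pred == c) & (label == c))
        union = np.sum((pred == c) | (label == c))
        ious.append(inter / union if union else np.nan)
    miou = np.nanmean(ious)

    tp = np.sum((pred == positive) & (label == positive))
    fp = np.sum((pred == positive) & (label != positive))
    fn = np.sum((pred != positive) & (label == positive))
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f_score = 2 * precision * recall / (precision + recall)
    return dict(pixel_accuracy=pixel_acc, miou=miou,
                precision=precision, recall=recall, f_score=f_score)
```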
