
SCIENCE CHINA Information Sciences, Volume 62, Issue 1: 010203 (2019). https://doi.org/10.1007/s11432-018-9578-9

Docking navigation method for UAV autonomous aerial refueling

  • Received: Jul 2, 2018
  • Accepted: Jul 19, 2018
  • Published: Dec 21, 2018

Abstract

In this paper, a docking navigation method based on a binocular vision system (BVS) is proposed for the autonomous aerial refueling (AAR) of unmanned aerial vehicles (UAVs), and a BVS simulation platform is built for simulation research. First, unnecessary scene information in the image is filtered out using green light-emitting diodes (LEDs) and optical filters. The image is then processed via graying, binarization, and median filtering to highlight the connected areas of the LEDs in the image. Subsequently, the center of mass of each connected area is selected as a feature point (FP), and the FPs are described using an improved Haar wavelet transform; the resulting multidimensional description vectors are then matched. Finally, the position and pose of the refueling cone sleeve are estimated. Simulation results demonstrate the effectiveness of the proposed AAR navigation method.
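The preprocessing chain described above (graying, binarization, median filtering) follows a standard pattern; a minimal sketch in Python with OpenCV is given below. The threshold value and filter kernel size here are illustrative assumptions, not the settings used in the paper.

    import cv2

    def preprocess(color_image):
        """Filter a three-channel color image down to a clean binary image."""
        gray = cv2.cvtColor(color_image, cv2.COLOR_BGR2GRAY)          # graying
        # Threshold (200) and kernel size (5) are illustrative, not the paper's values.
        _, binary = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)  # binarization
        filtered = cv2.medianBlur(binary, 5)                          # median filtering
        return filtered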


Acknowledgment

This work was supported by the National Natural Science Foundation of China (Grant No. 61673327) and the Aeronautical Science Foundation of China (Grant No. 20160168001).



  • Figure 3

    (Color online) UAV autonomous refueling platform. (a) BVS; (b) simulated RCS; (c) CCP.

  • Figure 4

(Color online) CCP images. (a) CCP images captured by the left camera; (b) CCP images captured by the right camera; (c) sample drawing of the fixed point of the CCP image.

  • Figure 5

(Color online) Image preprocessing. (a) Three-channel color image; (b) graying of the color image; (c) binarization of the grayscale image; (d) median filtering of the image.

  • Figure 6

    (Color online) Connected area labeling effect diagram.

  • Figure 7

(Color online) Effects of different algorithms. (a) Effect diagram of the centroid algorithm; (b) effect of the centroid algorithm on the original image; (c) extraction result of the Harris algorithm; (d) extraction result of the SURF algorithm.

  • Figure 10

    (Color online) Demonstrations of matching effect. (a) Effect of stereo matching of FPs; (b) effect of stereo matching using the ELR algorithm; (c) effect of stereo matching using the SURF algorithm.

  • Figure 13

    (Color online) Structure diagram of tracking control system for AAR and docking of UAV.

  • Figure 14

    (Color online) Actual relative position and expected relative position of UAV. (a) In the $X$-axis; (b) in the $Y$-axis; (c) in the $Z$-axis.

  • Figure 16

(Color online) Control values of the controller output. (a) $\Delta\delta_{t}$; (b) $\Delta\delta_{a}$; (c) $\Delta\delta_{e}$; (d) $\Delta\delta_{T}$.

  • Table 1   Internal parameters of the BVC
    Internal parameter Left camera Right camera
    $f_{x}$ 2394.3558 2325.5075
    $f_{y}$ 2393.9677 2325.2000
    $u_{0}$ 692.7227 628.8684
    $v_{0}$ 590.8281 555.3041
    $k_{1}$ 0.0708 0.0661
    $k_{2}$ $-$0.1293 0.0147
    $\gamma$ $-$4.5025 $-$0.4948
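    For orientation, below is a minimal sketch of how the Table 1 entries assemble into a pinhole intrinsic matrix. Placing $\gamma$ as the skew term follows the standard calibration model and is an assumption about the paper's convention, not something stated in the table.

        import numpy as np

        # Left camera intrinsic matrix from Table 1:
        # [[f_x, gamma, u_0], [0, f_y, v_0], [0, 0, 1]].
        A_l = np.array([[2394.3558, -4.5025,    692.7227],
                        [0.0,        2393.9677, 590.8281],
                        [0.0,        0.0,       1.0]])

        # Radial distortion coefficients (k_1, k_2) from the same table.
        dist_l = np.array([0.0708, -0.1293])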
  • Algorithm 1   O3C algorithm

    Step 1. Calibration of the BVS: $\boldsymbol{A}_{l}, \boldsymbol{A}_{r}$ and $\boldsymbol{M}_{l}, \boldsymbol{M}_{r}$ are obtained for the left and right cameras of the BVS, respectively.

    Step 2. Image acquisition: the left and right cameras of the BVS capture the same object simultaneously, and the captured images are stored as digital images in the computer.

    Step 3. Image preprocessing: graying, binarization, and median filtering are applied to the images.

    Step 4. Extraction of image FPs: pixels containing important features of the image are extracted as FPs.

    Step 5. FP matching: the correspondence between the left and right image FPs is determined.

    Step 6. 3D coordinate calculation: based on the binocular vision principle, the 3D coordinates of the matched FPs are computed.
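    As a sketch of step 6, standard linear triangulation can be written with OpenCV as below. The projection matrices are built from the intrinsic matrices ($\boldsymbol{A}_{l}$, $\boldsymbol{A}_{r}$, Table 1) and the stereo extrinsics ($\boldsymbol{R}$, $\boldsymbol{T}$, see Table 2); this illustrates the binocular vision principle rather than the authors' implementation.

        import cv2
        import numpy as np

        def triangulate(A_l, A_r, R, T, pts_l, pts_r):
            """Recover 3D coordinates of matched FPs (Algorithm 1, step 6).

            pts_l, pts_r: 2xN float arrays of matched pixel coordinates.
            """
            P_l = A_l @ np.hstack([np.eye(3), np.zeros((3, 1))])    # left projection matrix
            P_r = A_r @ np.hstack([R, T.reshape(3, 1)])             # right projection matrix
            pts_4d = cv2.triangulatePoints(P_l, P_r, pts_l, pts_r)  # homogeneous, 4xN
            return (pts_4d[:3] / pts_4d[3]).T                       # Nx3 3D points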

  • Table 2   External parameters of the BVC
    External parameter Calibration matrix of BVC
    Rotation matrix $\boldsymbol{R}$: $\begin{bmatrix} 0.9986 & 0.0006 & 0.0525 \\ -0.0006 & 1.0000 & -0.0002 \\ -0.0525 & 0.0002 & 0.9986 \end{bmatrix}$
    Translation matrix $\boldsymbol{T}$: $\begin{bmatrix} -111.08 & 2.9456 & -25.069 \end{bmatrix}^{\rm T}$
  • Algorithm 2   OSE algorithm

    Require: Figure 5(d), initialized markup matrix.

    Output: the marked image matrix and the number of connected areas.

    for $i = 1, \ldots, y_{\rm max}$ do
        for $j = 1, \ldots, x_{\rm max}$ do
            Find the pixels that have not yet been marked and mark them in the tag matrix.
        end for
    end for
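    A compact stand-in for the scan above, using OpenCV's built-in connected-component labeling rather than the authors' OSE implementation: it returns the marked matrix, the number of connected areas, and the centroids that serve as FPs.

        import cv2

        def label_and_centroids(binary_img):
            """Label connected areas of a binary image and return their centroids."""
            num, labels, stats, centroids = cv2.connectedComponentsWithStats(binary_img)
            # Label 0 is the background, so the number of LED areas is num - 1.
            return labels, num - 1, centroids[1:]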

  • Algorithm 3   DTC algorithm

    Require: $y$, $y_{d}$, and $y_{r}$.

    Output: errors of the relative position and pose between the UAV and the RCS.

    for $i = 1, \ldots, T$ do (time discretization)
        Find $y_{r}$, $y^{*}$, $\hat{x}$, and $\hat{u}$.
    end for

    Note: $T$ is the duration of close-range docking and refueling.
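    The inner estimator/controller updates ($y_{r}$, $y^{*}$, $\hat{x}$, $\hat{u}$) are not detailed on this page, so the sketch below only fixes the outer structure of the loop: at each discretized step, a relative position/pose error between the UAV and the RCS is formed from the measured output $y$ and the desired docking output $y_{d}$. The function and variable names are illustrative.

        import numpy as np

        def dtc_errors(y, y_d, T):
            """Skeleton of the DTC loop: accumulate tracking errors over T steps."""
            errors = []
            for i in range(T):
                # Relative position/pose error of the UAV with respect to the RCS.
                errors.append(np.asarray(y_d[i]) - np.asarray(y[i]))
            return np.array(errors)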

  • Table 3   Experimental data of FP extraction
    Feature extraction algorithm Total number of FPs Number of effective FPs Effective ratio (%)
    OSE 1000 1000 100
    Harris 15283 14796 96.81
    SURF 18165 16914 93.11
  • Table 4   Experimental data of FP matching
    Matching algorithm Total number of pairs Correct number of pairs Matching accuracy (%)
    IHWT 50 50 100
    ELR 50 43 86
    SURF 300 135 45
  • Table 5   Pixel coordinates of FPs
    Pixel coordinates in the left image
    Label of FP $u$-axis coordinate $v$-axis coordinate
    1 766.3222 294.0458
    2 841.5005 296.4669
    3 1065.564 452.7329
    4 834.3635 596.7836
    5 610.3694 437.7926
    Pixel coordinates in the right image
    Label of FP $u$-axis coordinate $v$-axis coordinate
    1 550.1918 322.0488
    2 625.2786 324.3251
    3 847.9412 481.3774
    4 612.4110 625.5150
    5 392.3321 465.6742
  • Table 6   Coordinates of the FPs in the world CS
    Coordinates of the FPs obtained using the BVS
    Label $X$-axis (mm) $Y$-axis (mm) $Z$-axis (mm)
    1 30.0643 55.2186 957.0100
    2 60.1948 55.9709 956.9757
    3 147.2115 $-$9.3191 943.2886
    4 55.9019 $-$64.4386 939.0642
    5 $-$32.2210 $-$1.2189 947.7082
    Coordinates of the FPs obtained via measurement
    Label $X$-axis (mm) $Y$-axis (mm) $Z$-axis (mm)
    1 29.4972 55.6821 955.7350
    2 59.5163 54.7009 955.6184
    3 147.5596 $-$7.6893 946.1418
    4 55.4418 $-$64.1295 938.7077
    5 $-$32.5715 $-$1.6992 946.9239
  • Table 7   Related initial parameters of the UAV and RCS
    Parameter variable Parameter value
    Position of UAV & RCS (m) (0, 0, 0) & (52.2303, 7.2426, 948.8093)
    Pose of UAV & RCS ($^{\circ}$) (0, 0, 0) & ($-$0.083, $-$0.1468, 0.9891)
    Speed of UAV & RCS (m/s) 20
    Acceleration of UAV & RCS (m/s$^{2}$) 20
