
SCIENCE CHINA Information Sciences, Volume 62, Issue 1: 012202(2019) https://doi.org/10.1007/s11432-017-9306-x

FVO: floor vision aided odometry

More info
  • Received: Jul 25, 2017
  • Accepted: Nov 6, 2017
  • Published: Aug 15, 2018

Abstract

In many indoor scenarios, such as restaurants, laboratories, and supermarkets, the planar floors are covered with rectangular tiles. We observe that the abundant parallel lines and crossing points formed by tile joints can serve as natural features to assist indoor localization, and we therefore propose a novel indoor localization method for mobile robots that fuses odometry and monocular vision. The method comprises three steps. First, the heading and location of the mobile robot are approximately estimated by odometry based on incremental encoders. Second, with the aid of a camera whose lens points vertically toward the floor, the odometric heading estimate is corrected by detecting the relative angle between the robot's heading and the tile joints. Third, the odometric location estimate is corrected by detecting the perpendicular distance between the image center and the tile joints. Compared with existing indoor localization methods, the proposed method, called floor vision aided odometry (FVO), offers relatively low economic cost and computational complexity together with relatively high accuracy and robustness. Its effectiveness is verified in a real-world experiment on a differential-drive wheeled mobile robot.


Acknowledgment

This work was supported in part by the National Natural Science Foundation of China (Grant Nos. 61725304, 61673361). The authors also gratefully acknowledge support from the Youth Top-notch Talent Support Program, the 1000-talent Youth Program, and the Youth Yangtze River Scholar Program.


Supplement

Appendix

Interval arithmetic

This section presents some fundamentals of interval arithmetic that are used in this paper.

An interval $\mathcal{A}=[a_1,a_2]$ is a set of real numbers defined by \begin{align}&[a_1,a_2]=\{x\in\mathbb{R}:a_1\leq x\leq a_2\}, \tag{31} \end{align} where $a_1=-\infty$ and $a_2=+\infty$ are allowed, with $\mathrm{mid}(\mathcal{A})=\frac{a_1+a_2}{2}$ and $\mathrm{rad}(\mathcal{A})=\frac{a_2-a_1}{2}$ denoting its midpoint and radius, respectively.
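As a quick illustration (a minimal Python sketch of our own; the tuple representation and the helper names `mid` and `rad` are not from the paper), the midpoint and radius of an interval can be computed as:

```python
def mid(a):
    """Midpoint of an interval a = (a1, a2), i.e., (a1 + a2) / 2."""
    return (a[0] + a[1]) / 2

def rad(a):
    """Radius (half-width) of an interval a = (a1, a2), i.e., (a2 - a1) / 2."""
    return (a[1] - a[0]) / 2

# Example: the interval [1, 3] has midpoint 2 and radius 1.
A = (1.0, 3.0)
print(mid(A), rad(A))  # 2.0 1.0
```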

For two intervals $\mathcal{A}=[a_1,a_2]$ and $\mathcal{B}=[b_1,b_2]$, the four basic arithmetic operations are as follows.

Addition. $\mathcal{A}+\mathcal{B}=[a_1+b_1,a_2+b_2]$;

Subtraction. $\mathcal{A}-\mathcal{B}=[a_1-b_2,a_2-b_1]$;

Multiplication. $\mathcal{A}\cdot\mathcal{B}=[\min\{a_1 b_1, a_1 b_2, a_2 b_1, a_2 b_2\},\, \max\{a_1 b_1, a_1 b_2, a_2 b_1, a_2 b_2\}]$;

Division. $\mathcal{A}/\mathcal{B}=[a_1,a_2]\cdot[{1}/{b_2},{1}/{b_1}]$ if $0\notin[b_1,b_2]$.
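The four operations above can be sketched in a few lines of Python (our own illustration; intervals are represented as pairs $(a_1, a_2)$ and the function names are hypothetical):

```python
def iadd(a, b):
    # [a1, a2] + [b1, b2] = [a1 + b1, a2 + b2]
    return (a[0] + b[0], a[1] + b[1])

def isub(a, b):
    # [a1, a2] - [b1, b2] = [a1 - b2, a2 - b1]
    return (a[0] - b[1], a[1] - b[0])

def imul(a, b):
    # Endpoints of the product are the min and max of all four cross products.
    products = [a[0]*b[0], a[0]*b[1], a[1]*b[0], a[1]*b[1]]
    return (min(products), max(products))

def idiv(a, b):
    # Defined only when 0 is not contained in the divisor interval.
    if b[0] <= 0 <= b[1]:
        raise ZeroDivisionError("0 in divisor interval")
    return imul(a, (1 / b[1], 1 / b[0]))

# Examples: [1,2] - [0,1] = [0,2]; [-1,2] * [3,4] = [-4,8]
print(isub((1, 2), (0, 1)), imul((-1, 2), (3, 4)))
```

Note that interval subtraction is not the inverse of addition ($\mathcal{A}-\mathcal{A}$ is generally not $[0,0]$); the operations only guarantee enclosure of all pointwise results.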

If $b_2<a_1$ or $a_2<b_1$, the intersection of two intervals $\mathcal{A}=[a_1,a_2]$ and $\mathcal{B}=[b_1,b_2]$ is empty, that is, \begin{align}&\mathcal{A}\cap\mathcal{B}=\emptyset. \tag{32} \end{align} Otherwise, we have \begin{align}&\mathcal{A}\cap\mathcal{B}=\left\{x:x\in\mathcal{A} \textrm{ and }x\in\mathcal{B}\right\}=[\max\left\{a_1,b_1\right\},\min\left\{a_2,b_2\right\}]. \tag{33} \end{align} The image of an interval $\mathcal{A}=\left[a_1,a_2\right]$ under a function $f$ is defined by \begin{align}&f(\mathcal{A})=\left\{f(x):x\in\mathcal{A}\right\}, \tag{34} \end{align} which is still an interval when $f$ is continuous. For monotonic functions, such as exponential and logarithmic functions, we have \begin{align}&f(\mathcal{A})=[\min\left\{f(a_1),f(a_2)\right\},\max\left\{f(a_1),f(a_2)\right\}]. \tag{35} \end{align} For the sine function, a piecewise monotonic function with critical points at $n\pi+\frac{\pi}{2}$, where $n\in\mathbb{Z}$, we have \begin{align}&\sin(\mathcal{A})=\left\{ \begin{array}{ll} [\min\left\{\sin(a_1),\sin(a_2)\right\},\max\left\{\sin(a_1),\sin(a_2)\right\}], & \textrm{if }n_2-n_1=0, \\ \left[\min\left\{\sin(a_1),\sin(a_2)\right\},+1\right], & \textrm{if }n_2-n_1=1 \textrm{ and }n_1\textrm{ is even}, \\ \left[-1,\max\left\{\sin(a_1),\sin(a_2)\right\}\right], & \textrm{if }n_2-n_1=1 \textrm{ and }n_1\textrm{ is odd}, \\ \left[-1,+1\right], & \textrm{if }n_2-n_1\geq2, \end{array} \right. \tag{36} \end{align} where $n_1=\lfloor\frac{a_1+\frac{\pi}{2}}{\pi}\rfloor$ and $n_2=\lfloor\frac{a_2+\frac{\pi}{2}}{\pi}\rfloor$ are two integers.
Similarly, we have \begin{align}&\cos(\mathcal{A})=\left\{ \begin{array}{ll} \left[\min\left\{\cos(a_1),\cos(a_2)\right\},\max\left\{\cos(a_1),\cos(a_2)\right\}\right], & \textrm{if }n_2-n_1=0, \\ \left[\min\left\{\cos(a_1),\cos(a_2)\right\},+1\right], & \textrm{if }n_2-n_1=1 \textrm{ and }n_1\textrm{ is odd}, \\ \left[-1,\max\left\{\cos(a_1),\cos(a_2)\right\}\right], & \textrm{if }n_2-n_1=1 \textrm{ and }n_1\textrm{ is even}, \\ \left[-1,+1\right], & \textrm{if }n_2-n_1\geq2, \end{array} \right. \tag{37} \end{align} where $n_1=\lfloor\frac{a_1}{\pi}\rfloor$ and $n_2=\lfloor\frac{a_2}{\pi}\rfloor$ are also two integers.
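The intersection rule of Eqs. (32) and (33) and the case analysis of Eqs. (36) and (37) can be sketched directly in Python (our own illustration; the function names are hypothetical, and `None` stands in for the empty intersection):

```python
import math

def intersect(a, b):
    """Intersection of intervals per Eqs. (32)-(33); None if empty."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    return None if lo > hi else (lo, hi)

def isin(a):
    """Interval sine per Eq. (36): n1, n2 count critical points n*pi + pi/2."""
    n1 = math.floor((a[0] + math.pi / 2) / math.pi)
    n2 = math.floor((a[1] + math.pi / 2) / math.pi)
    s1, s2 = math.sin(a[0]), math.sin(a[1])
    if n2 - n1 == 0:                 # no interior critical point: monotone
        return (min(s1, s2), max(s1, s2))
    if n2 - n1 == 1:                 # exactly one maximum or minimum inside
        return (min(s1, s2), 1.0) if n1 % 2 == 0 else (-1.0, max(s1, s2))
    return (-1.0, 1.0)               # at least one max and one min inside

def icos(a):
    """Interval cosine per Eq. (37): critical points at n*pi."""
    n1, n2 = math.floor(a[0] / math.pi), math.floor(a[1] / math.pi)
    c1, c2 = math.cos(a[0]), math.cos(a[1])
    if n2 - n1 == 0:
        return (min(c1, c2), max(c1, c2))
    if n2 - n1 == 1:
        return (min(c1, c2), 1.0) if n1 % 2 == 1 else (-1.0, max(c1, c2))
    return (-1.0, 1.0)

# Example: sin([0, pi/2]) = [0, 1], one monotone piece with the maximum at pi/2.
print(isin((0.0, math.pi / 2)))
```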



  • Figure 1

(Color online) Top-view schematic of a mobile robot on a planar floor. The meshed yellow rectangles represent the robot's wheels, and the blue polygon represents its body. The dashed red arc represents the trajectory.

  • Figure 2

(Color online) (a) Relation between floor vision and the robot's heading and location. The purple grid represents tile joints. The black dashed rectangle represents the camera's field of view. The four robot headings fall into four different quadrants, respectively, yet they produce the same floor vision. (b) Schematic of Case 1.

  • Figure 3

(Color online) Feature extraction process of floor vision. The green orthogonal lines in the sixth picture are the detected lines, which almost coincide with the tile joints.

  • Figure 4

(Color online) Test results of the heading estimation of FVO. The curve "GT" represents the ground truth of the robot heading obtained from the off-board camera. The curve "Odo" represents the odometric heading estimate. The curves "FVO", "FVO$-$", and "FVO$+$" represent the midpoint, lower bound, and upper bound of the FVO heading estimate, respectively.

  • Figure 5

(Color online) Test results of the location estimation of FVO. The curve "GT" represents the ground truth of the robot's location obtained from the off-board camera. The curve "Odo" represents the odometric location estimate. The green rectangular boxes represent the FVO location estimate, and the curve "FVO" represents its midpoint.

  • Table 1   Transformation rules from $\theta$, $\vartheta$, and $r$ to $x$ and $y$
    Case 1: $\theta\in\left(0,\frac{\pi}{2}\right]$
      $x\in\{E_xi+d|_{\vartheta\geq0}+L\cos\theta:i=0,1,2,\ldots\}$
      $y\in\{E_yi-d|_{\vartheta<0}+L\sin\theta:i=0,1,2,\ldots\}$
    Case 2: $\theta\in\left(\frac{\pi}{2},\pi\right]$
      $x\in\{E_xi+d|_{\vartheta<0}+L\cos\theta:i=0,1,2,\ldots\}$
      $y\in\{E_yi+d|_{\vartheta\geq0}+L\sin\theta:i=0,1,2,\ldots\}$
    Case 3: $\theta\in\left(\pi,\frac{3\pi}{2}\right]$
      $x\in\{E_xi-d|_{\vartheta\geq0}+L\cos\theta:i=0,1,2,\ldots\}$
      $y\in\{E_yi+d|_{\vartheta<0}+L\sin\theta:i=0,1,2,\ldots\}$
    Case 4: $\theta\in\left(\frac{3\pi}{2},2\pi\right]$
      $x\in\{E_xi-d|_{\vartheta<0}+L\cos\theta:i=0,1,2,\ldots\}$
      $y\in\{E_yi-d|_{\vartheta\geq0}+L\sin\theta:i=0,1,2,\ldots\}$
  • Table 2   Calculation of the interval of the floor visual location measurement
    Case 1: $\widehat{\varTheta}_t\subseteq\left(0,\frac{\pi}{2}\right]$
      $\mathcal{X}_{t}^{(c)}=\left\{E_xi+\bigcap_{j\in\mathcal{I}_{y,t}}\mathcal{D}_{t,j}^{(c)}+L\cos\widehat{\varTheta}_t:i=0,1,2,\ldots\right\}$, where $\mathcal{I}_{y,t}=\left\{i:\min\left\{\varPhi_{t,i}^{(c)}\right\}\geq0,i\in\mathcal{I}_{\theta,t}\right\}$
      $\mathcal{Y}_{t}^{(c)}=\left\{E_yi-\bigcap_{j\in\mathcal{I}_{x,t}}\mathcal{D}_{t,j}^{(c)}+L\sin\widehat{\varTheta}_t:i=0,1,2,\ldots\right\}$, where $\mathcal{I}_{x,t}=\left\{i:\max\left\{\varPhi_{t,i}^{(c)}\right\}<0,i\in\mathcal{I}_{\theta,t}\right\}$
    Case 2: $\widehat{\varTheta}_t\subseteq\left(\frac{\pi}{2},\pi\right]$
      $\mathcal{X}_{t}^{(c)}=\left\{E_xi+\bigcap_{j\in\mathcal{I}_{y,t}}\mathcal{D}_{t,j}^{(c)}+L\cos\widehat{\varTheta}_t:i=0,1,2,\ldots\right\}$, where $\mathcal{I}_{y,t}=\left\{i:\max\left\{\varPhi_{t,i}^{(c)}\right\}<0,i\in\mathcal{I}_{\theta,t}\right\}$
      $\mathcal{Y}_{t}^{(c)}=\left\{E_yi+\bigcap_{j\in\mathcal{I}_{x,t}}\mathcal{D}_{t,j}^{(c)}+L\sin\widehat{\varTheta}_t:i=0,1,2,\ldots\right\}$, where $\mathcal{I}_{x,t}=\left\{i:\min\left\{\varPhi_{t,i}^{(c)}\right\}\geq0,i\in\mathcal{I}_{\theta,t}\right\}$
    Case 3: $\widehat{\varTheta}_t\subseteq\left(\pi,\frac{3\pi}{2}\right]$
      $\mathcal{X}_{t}^{(c)}=\left\{E_xi-\bigcap_{j\in\mathcal{I}_{y,t}}\mathcal{D}_{t,j}^{(c)}+L\cos\widehat{\varTheta}_t:i=0,1,2,\ldots\right\}$, where $\mathcal{I}_{y,t}=\left\{i:\min\left\{\varPhi_{t,i}^{(c)}\right\}\geq0,i\in\mathcal{I}_{\theta,t}\right\}$
      $\mathcal{Y}_{t}^{(c)}=\left\{E_yi+\bigcap_{j\in\mathcal{I}_{x,t}}\mathcal{D}_{t,j}^{(c)}+L\sin\widehat{\varTheta}_t:i=0,1,2,\ldots\right\}$, where $\mathcal{I}_{x,t}=\left\{i:\max\left\{\varPhi_{t,i}^{(c)}\right\}<0,i\in\mathcal{I}_{\theta,t}\right\}$
    Case 4: $\widehat{\varTheta}_t\subseteq\left(\frac{3\pi}{2},2\pi\right]$
      $\mathcal{X}_{t}^{(c)}=\left\{E_xi-\bigcap_{j\in\mathcal{I}_{y,t}}\mathcal{D}_{t,j}^{(c)}+L\cos\widehat{\varTheta}_t:i=0,1,2,\ldots\right\}$, where $\mathcal{I}_{y,t}=\left\{i:\max\left\{\varPhi_{t,i}^{(c)}\right\}<0,i\in\mathcal{I}_{\theta,t}\right\}$
      $\mathcal{Y}_{t}^{(c)}=\left\{E_yi-\bigcap_{j\in\mathcal{I}_{x,t}}\mathcal{D}_{t,j}^{(c)}+L\sin\widehat{\varTheta}_t:i=0,1,2,\ldots\right\}$, where $\mathcal{I}_{x,t}=\left\{i:\min\left\{\varPhi_{t,i}^{(c)}\right\}\geq0,i\in\mathcal{I}_{\theta,t}\right\}$
  • Table 3   Error statistics of FVO
    Variable        Method    1st period  2nd period  3rd period
    $\theta$ (deg)  Odometry  1.93        7.16        19.65
                    FVO       1.88        2.24        2.83
    $x$ (mm)        Odometry  34.25       74.28       259.56
                    FVO       6.06        5.43        5.22
    $y$ (mm)        Odometry  47.80       66.22       373.17
                    FVO       5.34        6.34        6.61

Copyright 2020 Science China Press Co., Ltd.