Optical Flow Algorithm for Velocity Estimation of Ground Vehicles: A Feasibility Study


International Journal on Smart Sensing and Intelligent Systems

Professor Subhas Chandra Mukhopadhyay

Exeley Inc. (New York)

Subject: Computational Science & Engineering, Engineering, Electrical & Electronic


eISSN: 1178-5608


VOLUME 1, ISSUE 1 (March 2008)


Correspondence
Savan Chhaniyara * (savan.chhaniyara@kcl.ac.uk)
Pished Bunnun * (pished.bunnun@nectec.or.th)
Lakmal D. Seneviratne * (lakmal.seneviratne@kcl.ac.uk)
Kaspar Althoefer * (k.althoefer@kcl.ac.uk)

Keywords: optical flow, velocity estimation, visual odometry, optimization.

Citation Information: International Journal on Smart Sensing and Intelligent Systems, Volume 1, Issue 1, Pages 0-0, DOI: https://doi.org/10.21307/ijssis-2017-289

License : (CC BY-NC-ND 4.0)

Published Online: 13-December-2017

ARTICLE

ABSTRACT

This paper presents a novel velocity estimation method for all-terrain ground vehicles. The technique is based on a camera that scans the ground and estimates velocity using an optical flow algorithm. The method is tested and validated on different terrain types, including fine sand, coarse sand, gravel, and a mixture of coarse sand and gravel. Velocities measured by precise encoders are compared with the velocities predicted by the optical flow algorithm, showing promising potential for implementing the suggested approach in ground vehicles. Investigations were carried out to determine the optimal feature window size and the influence of camera height on optical flow velocity estimates. Detailed laboratory experiments validate the velocity estimation technique, and the results indicate the usefulness of the proposed method for velocity estimation of ground vehicles.
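The pipeline the abstract describes (track ground texture between consecutive frames, then convert the pixel flow to a metric velocity using the camera height and frame rate) can be sketched in a few lines. The following is an illustrative single-window Lucas-Kanade estimator on a smooth synthetic texture, not the authors' implementation; the camera height, focal length, and frame rate are hypothetical values chosen for the example, and a flat ground with a nadir-pointing camera is assumed.

```python
import math

def make_frame(w, h, dx=0.0, dy=0.0):
    """Smooth synthetic 'ground texture', shifted by (dx, dy) pixels."""
    return [[math.sin(0.3 * (x - dx)) + math.cos(0.2 * (y - dy))
             for x in range(w)] for y in range(h)]

def lucas_kanade(f1, f2):
    """Least-squares optical flow (u, v) over one window (here, the whole frame)."""
    h, w = len(f1), len(f1[0])
    sxx = sxy = syy = sxt = syt = 0.0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            ix = (f1[y][x + 1] - f1[y][x - 1]) / 2.0  # spatial gradient (central diff.)
            iy = (f1[y + 1][x] - f1[y - 1][x]) / 2.0
            it = f2[y][x] - f1[y][x]                  # temporal gradient
            sxx += ix * ix; sxy += ix * iy; syy += iy * iy
            sxt += ix * it; syt += iy * it
    det = sxx * syy - sxy * sxy  # invert the 2x2 structure tensor
    u = (sxy * syt - syy * sxt) / det
    v = (sxy * sxt - sxx * syt) / det
    return u, v

# Hypothetical rig: nadir camera 0.5 m above flat ground, 500 px focal length, 30 fps.
H_CAM, FOCAL, FPS = 0.5, 500.0, 30.0
f1 = make_frame(64, 64)
f2 = make_frame(64, 64, dx=1.0, dy=0.5)  # true motion: 1.0 px right, 0.5 px down
u, v = lucas_kanade(f1, f2)
speed = math.hypot(u, v) * (H_CAM / FOCAL) * FPS  # pixel/frame -> m/s
```

The ground sample distance H/f links pixel displacement to metres travelled, which is why camera height directly affects the velocity estimate; in practice the flow would be computed over many small feature windows (the window-size trade-off the abstract investigates) rather than one whole-frame window.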



