Real-Time Correction Based on Wheel Odometry to Improve Pedestrian Tracking Performance in Small Mobile Robot

Authors

Jaehun Park (1), Min Sung Ahn (2), Jeakweon Han (3)
(1) Department of Physics, Hanyang University, Seoul, Republic of Korea
(2) Department of Mechanical and Aerospace Engineering, UCLA, Los Angeles, California, United States of America
(3) Department of Robotics, Hanyang University, Seoul, Republic of Korea

Keywords:

Multiple object tracking, Mobile robot, Real-time

Abstract

As mobile robots grow more intelligent, interaction with humans is becoming a central issue, and pedestrian tracking, in which a robot follows a designated person, is a common form of such interaction. Among existing multi-object tracking techniques for pedestrian tracking, Simple Online and Realtime Tracking (SORT) is well suited to small mobile robots that require real-time processing with limited computational resources. However, SORT does not account for changes in object detection values caused by the robot's own motion, which degrades tracking performance. To address this degradation, this paper proposes a more stable pedestrian tracking algorithm that corrects object tracking errors caused by robot movement in real time using the mobile robot's wheel odometry, and that dynamically manages the survival period of the tracker assigned to each object. Experimental results on data collected from an actual mobile robot show that the proposed method maintains real-time operation while improving tracking accuracy and robustness to the robot's motion.
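The core idea described above, canceling the apparent image motion induced by the robot's ego-motion before the tracker's data-association step, can be illustrated with a minimal sketch. All names, the pinhole-camera approximation, and the default parameters below are illustrative assumptions, not the paper's actual implementation:

```python
import math

def yaw_to_pixel_shift(d_yaw_rad, image_width_px, hfov_rad):
    """Approximate the horizontal pixel shift produced by a robot yaw of
    d_yaw_rad between two frames, assuming a pinhole camera:
    focal length f = W / (2 * tan(HFOV / 2)), shift = f * tan(d_yaw)."""
    focal_px = image_width_px / (2.0 * math.tan(hfov_rad / 2.0))
    return focal_px * math.tan(d_yaw_rad)

def correct_track(bbox, d_yaw_rad, image_width_px=640, hfov_rad=math.radians(60)):
    """Shift a tracker's predicted box [x1, y1, x2, y2] to compensate for the
    robot's rotation (reported by wheel odometry) before matching it against
    new detections; a left turn moves scene content to the right in the image."""
    shift = yaw_to_pixel_shift(d_yaw_rad, image_width_px, hfov_rad)
    x1, y1, x2, y2 = bbox
    return [x1 + shift, y1, x2 + shift, y2]
```

In a SORT-style pipeline, such a correction would be applied to every tracker's Kalman-predicted box in the interval between frames, so that the IoU-based association compares boxes in a frame of reference already adjusted for the robot's movement.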

References

Bogue R, 2016, Growth in E-Commerce Boosts Innovation in the Warehouse Robot Market. Industrial Robot, 43(6): 583–587. https://doi.org/10.1108/IR-07-2016-0194

Hitti N, 2020. Ballie the Rolling Robot is Samsung’s Near-Future Vision of Personal Care. Dezeen. https://www.dezeen.com/2020/01/08/samsung-ballie-robot-ces-2020/

Leal-Taixé L, Milan A, Reid I, et al., 2015, MOTChallenge 2015: Towards a Benchmark for Multi-Target Tracking. arXiv preprint, arXiv: 1504.01942. https://doi.org/10.48550/arXiv.1504.01942

Dendorfer P, Rezatofighi H, Milan A, et al., 2020, MOT20: A Benchmark for Multi-Object Tracking in Crowded Scenes. arXiv preprint, arXiv: 2003.09003. https://doi.org/10.48550/arXiv.2003.09003

Xu Y, Ban Y, Delorme G, et al., 2021, TransCenter: Transformers with Dense Representations for Multiple-Object Tracking. arXiv preprint, arXiv: 2103.15145. https://doi.org/10.48550/arXiv.2103.15145

Stadler D, Beyerer J. 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), June 20–25, 2021: Improving Multiple Pedestrian Tracking by Track Management and Occlusion Handling. 2021, Nashville, 10953–10962. https://doi.org/10.1109/cvpr46437.2021.01081

Bewley A, Ge Z, Ott L, et al. 2016 IEEE International Conference on Image Processing (ICIP), September 25–28, 2016: Simple Online and Realtime Tracking. 2016, Phoenix, 3464–3468. https://doi.org/10.1109/icip.2016.7533003

Kapania S, Saini D, Goyal S, et al., 2020, Multi Object Tracking with UAVs Using Deep SORT and YOLOv3 RetinaNet Detection Framework. Proceedings of the 1st ACM Workshop on Autonomous and Intelligent Mobile Systems, 2020: 1–6. https://doi.org/10.1145/3377283.3377284

Bochinski E, Eiselein V, Sikora T. 2017 14th IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS), August 29–September 1, 2017: High-Speed Tracking-by-Detection Without Using Image Information. 2017, Lecce, 1–6. https://doi.org/10.1109/avss.2017.8078516

Pereira R, Carvalho G, Garrote L, et al., 2022, SORT and Deep-SORT Based Multi-Object Tracking for Mobile Robotics: Evaluation with New Data Association Metrics. Applied Sciences, 12(3): 1319. https://doi.org/10.3390/app12031319

Horiuchi T, Thompson S, Kagami S, et al. 2007 IEEE International Conference on Systems, Man, and Cybernetics, October 7–10, 2007: Pedestrian Tracking from a Mobile Robot Using a Laser Range Finder. 2007, Montreal, 931–936. https://doi.org/10.1109/icsmc.2007.4413964

Reid D, 1979, An Algorithm for Tracking Multiple Targets. IEEE Transactions on Automatic Control, 24(6): 843–854. https://doi.org/10.1109/tac.1979.1102177

Tsokas NA, Kyriakopoulos KJ, 2011, Multi-Robot Multiple Hypothesis Tracking for Pedestrian Tracking. Autonomous Robots, 32: 63–79. https://doi.org/10.1007/s10514-011-9259-7

Nam B, Kang S-I, Hong H. 2011 17th Korea-Japan Joint Workshop on Frontiers of Computer Vision (FCV), February 9–11, 2011: Pedestrian Detection System Based on Stereo Vision for Mobile Robot. 2011, Ulsan, 1–7. https://doi.org/10.1109/fcv.2011.5739758

Basso F, Munaro M, Michieletto S, et al. Fast and Robust Multi-People Tracking from RGB-D Data for a Mobile Robot. In Intelligent Autonomous System 12. 2013, Springer, Berlin, Heidelberg, 265–276. https://doi.org/10.1007/978-3-642-33926-4_25

Zhang H, Reardon C, Parker LE, 2013, Real-Time Multiple Human Perception with Color-Depth Cameras on a Mobile Robot. IEEE Transactions on Cybernetics, 43(5): 1429–1441. https://doi.org/10.1109/tcyb.2013.2275291

Sun Z, Chen J, Chao L, et al., 2021, A Survey of Multiple Pedestrian Tracking Based on Tracking-by-Detection Framework. IEEE Transactions on Circuits and Systems for Video Technology, 31(5): 1819–1833. https://doi.org/10.1109/tcsvt.2020.3009717

Hossain S, Lee D-J, 2019, Deep Learning-Based Real-Time Multiple-Object Detection and Tracking from Aerial Imagery via a Flying Robot with GPU-Based Embedded Devices. Sensors, 19(15): 3371. https://doi.org/10.3390/s19153371

Thang DN, Nguyen LA, Dung PT, et al. 2018 5th NAFOSTED Conference on Information and Computer Science (NICS), November 23–24, 2018: Deep Learning-Based Multiple Objects Detection and Tracking System for Socially Aware Mobile Robot Navigation Framework. 2018, Ho Chi Minh City, 436–441. https://doi.org/10.1109/nics.2018.8606878

Carvalho G de S. Kalman Filter-Based Object Tracking Techniques for Indoor Robotic Applications. 2021, Universidade de Coimbra. https://estudogeral.sib.uc.pt/handle/10316/98163

Koch G. Siamese Neural Networks for One-Shot Image Recognition. 2015, University of Toronto. http://www.cs.toronto.edu/~gkoch/files/msc-thesis.pdf

Bolme DS, Beveridge JR, Draper BA, et al. 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, June 13–18, 2010: Visual Object Tracking Using Adaptive Correlation Filters. 2010, San Francisco, 2544–2550. https://doi.org/10.1109/cvpr.2010.5539960

Wojke N, Bewley A, Paulus D. 2017 IEEE International Conference on Image Processing (ICIP), September 17–20, 2017: Simple Online and Realtime Tracking with a Deep Association Metric. 2017, Beijing, 3645–3649. https://doi.org/10.1109/icip.2017.8296962

Scaramuzza D, Fraundorfer F, 2011, Visual Odometry [Tutorial]. IEEE Robotics & Automation Magazine, 18(4): 80–92. https://doi.org/10.1109/mra.2011.943233

Nejad ZZ, Ahmadabadian AH, 2019, ARM-VO: An Efficient Monocular Visual Odometry from Ground Vehicles on ARM CPUs. Machine Vision and Applications, 30: 1061–1070. https://doi.org/10.1007/s00138-019-01037-5

Ristani E, Solera F, Zou R, et al. Performance Measures and a Data Set for Multi-Target, Multi-Camera Tracking. In Computer Vision – ECCV 2016. 2016, Springer, Cham. https://doi.org/10.1007/978-3-319-48881-3_2

Redmon J, Divvala S, Girshick R, et al. 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), June 27–30, 2016: You Only Look Once: Unified, Real-Time Object Detection. 2016, Las Vegas, 779–788. https://doi.org/10.1109/cvpr.2016.91

Liu W, Anguelov D, Erhan D, et al. SSD: Single Shot MultiBox Detector. In Computer Vision – ECCV 2016. 2016, Springer, Cham. https://doi.org/10.1007/978-3-319-46448-0_2

Bochkovskiy A, Wang C-Y, Liao H-YM, 2020, YOLOv4: Optimal Speed and Accuracy of Object Detection. arXiv preprint, arXiv: 2004.10934. https://doi.org/10.48550/arXiv.2004.10934

Wang C-Y, Bochkovskiy A, Liao H-YM. 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), June 20–25, 2021: Scaled-YOLOv4: Scaling Cross Stage Partial Network. 2021, Nashville, 13024–13033. https://doi.org/10.1109/cvpr46437.2021.01283

Sandler M, Howard A, Zhu M, et al. 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, June 18–23, 2018: MobileNetV2: Inverted Residuals and Linear Bottlenecks. 2018, Salt Lake City, 4510–4520. https://doi.org/10.1109/cvpr.2018.00474

Milan A, Leal-Taixé L, Reid I, et al., 2016, MOT16: A Benchmark for Multi-Object Tracking. arXiv preprint, arXiv: 1603.00831. https://doi.org/10.48550/arXiv.1603.00831

Published

2021-12-31