
Author: Sun, Yu-Hsiang (孫煜翔)
Title: Visual Odometry for a Humanoid Robot Riding an E-Scooter
Advisor: Baltes, Jacky (包傑奇)
Committee: Tu, Kuo-Yang (杜國洋); Wang, Wei-Yen (王偉彥); Baltes, Jacky (包傑奇)
Oral Defense Date: 2023/03/31
Degree: Master
Department: Department of Electrical Engineering
Publication Year: 2023
Graduation Academic Year: 111
Language: English
Pages: 42
Keywords (Chinese): humanoid robot, two-wheeled vehicle, deep learning
Keywords (English): Humanoid Robots, Two-wheeled Vehicles, Deep Learning, ORB SLAM3
Research Methods: experimental design, comparative study
DOI URL: http://doi.org/10.6345/NTNU202300396
Thesis Type: Academic thesis
Usage: 329 views, 10 downloads
Abstract: In recent years, deep learning has been used to develop autonomous driving for four-wheeled vehicles. Our laboratory is dedicated to developing intelligent humanoid robots and is taking on a new research field: autonomous driving on two wheels with an unmodified humanoid robot. Taiwan has a well-established scooter industry, but few researchers have studied autonomous scooter driving. Our plan is to use a large humanoid robot, Thormang3, to develop an autonomous scooter system and attempt to pass the driving test for the Taiwanese scooter license. To achieve self-balancing in a real environment, detecting and controlling the scooter's current speed is a crucial issue. The main contribution of this thesis is a speed controller for a two-wheeled electric scooter that enables a large humanoid robot to drive at a constant speed in a real-world environment. We currently use three main methods to obtain the scooter's current speed: Yolo dashboard speed detection, ORB SLAM3, and a hybrid method.

We evaluate the accuracy of these methods in an outdoor environment and discuss their advantages and limitations. Exploiting the linearity of the speedometer, we obtain a rough velocity estimate for the robot using Yolov4 object detection. During the robot's navigation, this rough estimate provides a reasonably accurate measure of the real-world scale factor that ORB SLAM3 requires, which helps overcome the inherent scale ambiguity of monocular cameras and improves real-time velocity extraction.
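The pipeline described in the abstract can be sketched in a few lines: a linear speedometer model converts a detected dashboard reading into a rough speed, that speed recovers the metric scale of the monocular SLAM trajectory, and a weighted blend combines the two estimates. All function names, the speedometer constants, and the fusion weight below are illustrative assumptions, not values from the thesis:

```python
def dashboard_speed(needle_pos: float, pos_at_zero: float = 0.0,
                    kmh_per_unit: float = 1.0) -> float:
    """Linear speedometer model: the reading grows linearly with the
    detected needle position (e.g. from a Yolov4 bounding box)."""
    return (needle_pos - pos_at_zero) * kmh_per_unit

def slam_scale(metric_speed_kmh: float, slam_speed_units: float) -> float:
    """Scale factor (metres per SLAM unit) recovered by comparing the
    rough metric speed against the speed measured in the SLAM frame."""
    metric_speed_mps = metric_speed_kmh / 3.6  # km/h -> m/s
    return metric_speed_mps / slam_speed_units

def fuse(yolo_speed_kmh: float, slam_speed_kmh: float,
         w_yolo: float = 0.3) -> float:
    """Simple weighted blend of the two velocity estimates; a stand-in
    for the thesis's hybrid method, whose details are not given here."""
    return w_yolo * yolo_speed_kmh + (1.0 - w_yolo) * slam_speed_kmh
```

For example, a scaled SLAM velocity is obtained by multiplying the per-frame displacement by `slam_scale(...)` and the camera frame rate; the blend then smooths out the coarse, quantised dashboard readings with the higher-rate SLAM estimate.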

Table of Contents:
Acknowledgments i
Abstract ii
Table of Contents iii
List of Tables v
List of Figures vi
List of Symbols viii
Chapter 1 Introduction 1
  1.1 Background 1
  1.2 Motivation 2
  1.3 Research Aim 4
  1.4 Objectives 4
  1.5 Structure of the Thesis 4
Chapter 2 Literature Review 6
  2.1 Yolo-v4 6
  2.2 Visual Odometry 6
  2.3 ORB-SLAM3 9
  2.4 Sensor Fusion 10
    2.4.1 Extended Kalman Filter 11
Chapter 3 Robot-Scooter System 14
  3.1 The Robot - THORMANG3 14
  3.2 The Two-wheeled Vehicle - Gogoro Scooter 16
  3.3 Robot-Scooter System V2 16
Chapter 4 Methodology 18
  4.1 PID Controller 18
    4.1.1 Balance Control 18
    4.1.2 Direction Control 19
    4.1.3 Throttle Control 20
  4.2 Android Application Controller 20
    4.2.1 Self-Balance Function 21
    4.2.2 Complete Control Function 21
  4.3 Speed Detection 22
    4.3.1 Yolov4 Training Data 22
    4.3.2 Visual Odometry in ORB SLAM3 24
    4.3.3 Fusion Approach 24
  4.4 Implementation on ROS 25
Chapter 5 Results and Discussion 27
  5.1 Experiment Setup 27
  5.2 Experimental Result for Yolo Speed Detection 29
  5.3 Experimental Result for ORB SLAM3 30
  5.4 Experimental Result for Fusion Method 32
Chapter 6 Conclusion and Future Work 36
  6.1 Conclusion 36
  6.2 Future Work 37
References 38

