
Graduate Student: Hsieh, Pei-Che (謝佩哲)
Thesis Title: Implementation of Collaborative Tasks for a Tracked Robot with a Mechanical Arm (具機械手臂之履帶式機器人協作任務之實現)
Advisor: Wang, Wei-Yen (王偉彥)
Oral Examination Committee: Wang, Wei-Yen (王偉彥); Lu, Cheng-Kai (呂成凱); Lu, Ming-Chih (盧明智); Li, I-Hsum (李宜勳)
Oral Examination Date: 2023/01/11
Degree: Master
Department: Department of Electrical Engineering (電機工程學系)
Year of Publication: 2023
Academic Year of Graduation: 111 (2022-2023)
Language: Chinese
Pages: 91
Keywords: tracked robots, automatic stair climbing, cross-floor navigation, robotic arms, object recognition, hand gesture recognition, cross-platform communication, human-robot collaboration, human-robot interaction
DOI URL: http://doi.org/10.6345/NTNU202300389
Thesis Type: Academic thesis
Usage Statistics: 230 views, 30 downloads

    The technologies behind tracked robots and robotic arms have become increasingly mature, yet most research still studies the two separately and rarely discusses strategies for integrating them. This thesis combines a tracked robot's mobile navigation with a robotic arm's object-grasping capabilities, aiming at cross-floor object retrieval, communication integration across multiple platforms, and collaborative tasks in a shared workspace. Algorithms and a system architecture are proposed to achieve these goals.
    The robot platform used in this thesis is a self-developed tracked robot equipped with a five-axis robotic arm, which climbs stairs automatically with the aid of laser rangefinders, ultrasonic sensors, and a wall-calibration algorithm. For close-range localization, the robot uses ArUco markers to move more accurately to the target location; it then uses a TensorFlow Lite object detection model to identify objects in the scene and build a 3D virtual environment. Based on this scene model, a path for the robotic arm is planned to grasp the object.
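The thesis plans arm motions from a scene model. While the actual platform uses a five-axis arm, the core kinematics can be illustrated on a planar two-link analogue; the sketch below (link lengths and target coordinates are illustrative assumptions, not the thesis's parameters) computes a closed-form inverse-kinematics solution and checks it with forward kinematics:

```python
import math

def ik_2link(x, y, l1=0.2, l2=0.15):
    """Inverse kinematics of a planar 2-link arm (one elbow branch).

    Returns joint angles (q1, q2) in radians for target (x, y) in metres.
    """
    d2 = x * x + y * y
    c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)  # law of cosines
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    q2 = math.acos(c2)
    q1 = math.atan2(y, x) - math.atan2(l2 * math.sin(q2),
                                       l1 + l2 * math.cos(q2))
    return q1, q2

def fk_2link(q1, q2, l1=0.2, l2=0.15):
    """Forward kinematics: end-effector position from joint angles."""
    x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
    return x, y

# Round trip: the FK of the IK solution should reproduce the target.
q1, q2 = ik_2link(0.25, 0.1)
px, py = fk_2link(q1, q2)
```

A real five-axis solver adds three more joints and typically works numerically (e.g. Jacobian iteration), but the reach check and the law-of-cosines step carry over directly.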
    In addition, this study develops a simple socket-based communication method that allows the robot to communicate with robots not built on the Robot Operating System (ROS) framework. This lets the tracked robot receive object-retrieval requests from another robot across platforms and carry out cross-floor retrieval tasks. A MediaPipe gesture recognition model further lets human users communicate simple task commands to the robot through hand gestures, enabling collaborative tasks in a shared workspace.
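A cross-platform socket exchange of the kind described can be sketched with Python's standard library alone. The JSON request/acknowledgement format below is an assumption for illustration, not the thesis's actual protocol:

```python
import json
import socket
import threading

def serve_once(server_sock):
    """Accept one connection, read a JSON request, send an acknowledgement."""
    conn, _ = server_sock.accept()
    with conn:
        # One recv suffices for these short loopback messages; a real
        # protocol would frame messages (e.g. length prefix) instead.
        request = json.loads(conn.recv(4096).decode())
        reply = {"status": "accepted", "item": request.get("item")}
        conn.sendall(json.dumps(reply).encode())

def send_request(host, port, request):
    """Connect, send a JSON request, and return the parsed reply."""
    with socket.create_connection((host, port)) as client:
        client.sendall(json.dumps(request).encode())
        return json.loads(client.recv(4096).decode())

# The "tracked robot" side listens; port 0 lets the OS pick a free port.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
t = threading.Thread(target=serve_once, args=(server,))
t.start()

# The requesting robot (e.g. a non-ROS platform) asks for a retrieval task.
reply = send_request("127.0.0.1", port, {"task": "fetch",
                                         "item": "bottle", "floor": 2})
t.join()
server.close()
print(reply)
```

Because only plain TCP and JSON are involved, either endpoint can run on any OS or framework, which is the point of bypassing ROS-specific transport.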

Chapter 1 Introduction
  1.1 Research Background
  1.2 Research Motivation and Purpose
  1.3 Research Methods
  1.4 Literature Review
    1.4.1 Human-Robot Collaboration
    1.4.2 Human-Robot Interaction
    1.4.3 Automatic Stair Climbing
    1.4.4 Indoor Localization
    1.4.5 Assisted Localization
    1.4.6 Arm Path Planning
    1.4.7 Lightweight AI Computing
  1.5 Thesis Organization
Chapter 2 Hardware and Sensors of the Tracked Robot with an Arm
  2.1 Robot Mechanism
    2.1.1 Tracked Robot
    2.1.2 Five-Axis Robotic Arm
  2.2 Control Core
  2.3 Power Supply
  2.4 Router
  2.5 Sensors
    2.5.1 Laser Rangefinder
    2.5.2 Ultrasonic Sensor
    2.5.3 Depth Camera
  2.6 Interconnection Architecture
  2.7 UR3 Dual-Arm Robot
Chapter 3 Human-Robot Collaboration Task Design
  3.1 Design Concept
  3.2 Task Flow
    3.2.1 Mobile Navigation
    3.2.2 Automatic Stair Climbing
    3.2.3 Object Grasping
    3.2.4 Human-Robot Interaction
  3.3 Task System Architecture
Chapter 4 Cross-Floor Navigation Control
  4.1 Single-Neuron Adaptive PID Controller
  4.2 AMCL Fusion Algorithm
  4.3 Wall-Following Pose Calibration
  4.4 ArUco-Assisted Localization Algorithm
Chapter 5 Robotic Arm Control
  5.1 Arm Kinematics
    5.1.1 Forward Kinematics
    5.1.2 Inverse Kinematics
  5.2 Joint Path Planning
  5.3 Virtual-Physical Integration
Chapter 6 Lightweight AI Computing
  6.1 Object Recognition
  6.2 Gesture Recognition
Chapter 7 Field Validation and Experimental Results
  7.1 Test Field
  7.2 Single-Neuron Adaptive PID Controller
  7.3 Mobile Navigation
  7.4 Wall-Following Pose Calibration
  7.5 Arm Object Retrieval and Virtual-Physical Integration
  7.6 Human-Robot Collaboration Tasks
Chapter 8 Conclusions and Future Work
  8.1 Conclusions
  8.2 Future Work
References
Autobiography
Academic Achievements
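Chapter 4 lists a single-neuron adaptive PID controller for navigation control. A common textbook formulation of such a controller (the thesis's exact design may differ) feeds the error's first difference, value, and second difference into one neuron whose weights adapt online by a supervised Hebbian rule:

```python
class SingleNeuronPID:
    """Incremental PID whose three gains are neuron weights adapted online.

    A standard single-neuron adaptive PID sketch, not the thesis's
    specific controller; gains and learning rates here are illustrative.
    """

    def __init__(self, gain=0.2, lr=(0.01, 0.01, 0.01)):
        self.K = gain              # overall neuron gain
        self.lr = lr               # learning rates for the three weights
        self.w = [0.3, 0.3, 0.3]   # initial (P, I, D)-like weights
        self.e1 = 0.0              # e(k-1)
        self.e2 = 0.0              # e(k-2)
        self.u = 0.0               # control output

    def step(self, e):
        # PID-like neuron inputs built from the error history:
        # first difference, error itself, second difference.
        x = (e - self.e1, e, e - 2 * self.e1 + self.e2)
        s = sum(abs(wi) for wi in self.w) or 1.0
        wn = [wi / s for wi in self.w]                 # normalized weights
        self.u += self.K * sum(wi * xi for wi, xi in zip(wn, x))
        # Supervised Hebbian update: weights move with e(k) * u(k) * x_i(k).
        self.w = [wi + li * e * self.u * xi
                  for wi, li, xi in zip(self.w, self.lr, x)]
        self.e2, self.e1 = self.e1, e
        return self.u

# Toy first-order plant as a stand-in for the robot's heading dynamics.
ctrl = SingleNeuronPID()
y, r = 0.0, 1.0                    # output and setpoint
for _ in range(400):
    u = ctrl.step(r - y)
    y = 0.9 * y + 0.1 * u
```

Run against this toy plant, the controller drives the tracking error toward zero while its effective PID gains adapt, which is the appeal of the scheme for a platform whose dynamics change between flat ground and stairs.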

