| Field | Value |
|---|---|
| Graduate student | 謝翔宇 Hsieh, Hsiang-Yu |
| Thesis title | 整合機器視覺與機器手臂快換裝置之虛實整合技術應用於彈性組裝任務學習 (A Cyber-Physical System Approach with the Integration of Machine Vision and Robotic Tool Changing for Flexible Assembly Task Learning) |
| Advisors | 陳瑄易 Chen, Syuan-Yi; 蔣欣翰 Chiang, Hsin-Han |
| Oral defense committee | 林政宏 Lin, Cheng-Hung; 蔣欣翰 Chiang, Hsin-Han; 陳瑄易 Chen, Syuan-Yi |
| Defense date | 2022/08/03 |
| Degree | 碩士 Master |
| Department | 電機工程學系 Department of Electrical Engineering |
| Publication year | 2022 |
| Graduation academic year | 110 (2021–2022) |
| Language | Chinese |
| Pages | 55 |
| Keywords (Chinese) | 虛實整合, 機器人視覺, 任務學習, 快換系統, 彈性製造 |
| Keywords (English) | Cyber-physical systems, machine vision, skill learning, tool changers, flexible manufacturing |
| Research method | Experimental design |
| DOI URL | http://doi.org/10.6345/NTNU202201776 |
| Document type | Academic thesis |
Abstract (Chinese, translated): To cope with today's highly customized and flexible manufacturing processes, factory production lines must frequently switch between tasks, and the conventional approach of tuning robotic arms on the physical machine, at great cost in labor and time, cannot meet high-efficiency production targets. This thesis therefore proposes a cyber-physical integration technique for robotic-arm task learning, in which a high-level task decision-making model for the robotic arm is learned through software-based teaching in a virtual environment. The learning algorithm centers on a task-tree decision model for the automatic planning of complex tasks: the model first learns to complete complex assembly tasks from the robotic arm's motions in the virtual environment, and then outputs the corresponding action commands to control the physical arm. The model can make decisions on the arm side in real time, and when a new assembly task arises, the new operation steps can be quickly added to the existing model. The cyber-physical integration technique developed in this thesis has three features: step-rationality analysis, rapid addition of new tasks, and object-property analysis. A support vector machine (SVM) algorithm is introduced to recognize an object's placement state and the type of object to be grasped, so that object properties are taken into account by the decision model; the model can then choose a suitable gripper for the object and swap fixtures rapidly through a quick tool-changing device. For experimental validation, a robotic arm equipped with a tool changer performs machining and automatic lamp-assembly tasks, demonstrating that the proposed cyber-physical integration technique offers a feasible solution for flexible manufacturing.
Abstract (English): Due to the increasing demand for customized and flexible manufacturing, a factory production line must handle a variety of products and switch frequently between tasks. The traditional approach, which relies on extensive human labor and time-consuming on-machine tuning of robotic arms, can no longer meet efficiency targets. To this end, this thesis follows the concept of cyber-physical systems (CPS) to develop a task-learning approach that enables a robotic arm to make high-level decisions by learning from demonstration in a virtual environment. A task-tree algorithm is employed for the automatic planning of complex tasks, in which a decision-making model learns to compose complex task sets from a large number of primitive actions. The model can then export tasks and control the robotic arm to perform the correct operation in real time; when the arm is assigned a different job, a new task or new steps can simply be added to the existing model. The proposed decision-making model can analyze the rationality of task steps, quickly incorporate new tasks, and perform object analysis. A support vector machine (SVM) algorithm is used to identify the state of an object and a suitable gripper for it; by considering the characteristics of the object, the decision-making model selects the appropriate gripper and switches between grippers quickly through a tool changer. Because the task-tree approach scales to complex tasks, it can replace expert systems that require manual adjustment. Experiments on a machining task and a lamp-assembly task demonstrate that the proposed system is a feasible solution for flexible manufacturing.
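The SVM-based object-analysis step described in the abstracts — classifying an object from its visual features and mapping it to a gripper for the tool changer — can be illustrated with a minimal sketch. This is not the thesis's implementation: the feature set (object width and a flatness score), the gripper classes, and the training data below are all hypothetical, and a from-scratch linear SVM (hinge loss with sub-gradient descent) stands in for whatever SVM library the author used.

```python
# Hypothetical sketch of SVM-based gripper selection: classify an object's
# feature vector into the gripper class the tool changer should mount.
# Features, labels, and data are illustrative, not from the thesis.
import random

def train_linear_svm(samples, labels, lr=0.01, lam=0.01, epochs=200):
    """Train a linear SVM by sub-gradient descent on the hinge loss.
    `labels` must be +1 / -1."""
    dim = len(samples[0])
    w = [0.0] * dim
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
            if margin < 1:  # sample inside margin: push toward correct side
                w = [wi + lr * (y * xi - lam * wi) for wi, xi in zip(w, x)]
                b += lr * y
            else:           # sample outside margin: only apply regularization
                w = [wi - lr * lam * wi for wi in w]
    return w, b

def predict(w, b, x):
    """Return the gripper class (+1 or -1) for feature vector x."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1

# Synthetic training data: [object width (cm), flatness score].
# +1 -> parallel-jaw gripper, -1 -> suction gripper (hypothetical mapping).
random.seed(0)
train_x = ([[3 + random.random(), 0.2 + 0.1 * random.random()] for _ in range(20)]
           + [[8 + random.random(), 0.8 + 0.1 * random.random()] for _ in range(20)])
train_y = [1] * 20 + [-1] * 20

w, b = train_linear_svm(train_x, train_y)
small_obj = predict(w, b, [3.5, 0.25])  # small, rounded object
large_obj = predict(w, b, [8.5, 0.85])  # large, flat object
```

In the thesis's pipeline, the predicted class would drive the quick tool-changing device: the decision model looks up the gripper for the predicted class and issues the swap before executing the grasp.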