
Author: Wu, You-Ming (吳友名)
Title: Embedded System Implementation of a Screen-Based Eye Tracker (以嵌入式系統實作螢幕眼動儀)
Advisor: Huang, Chi-Wu (黃奇武)
Degree: Master
Department: Department of Industrial Education (工業教育學系)
Year of publication: 2016
Graduating academic year: 104 (2015–2016)
Language: Chinese
Pages: 88
Keywords: ZedBoard, screen-based eye tracker, embedded system
DOI URL: https://doi.org/10.6345/NTNU202204465
Document type: Academic thesis
Hits: 192; Downloads: 16
  • In this study, a screen-based eye tracker was implemented on the ZedBoard embedded development board. The hardware comprises the ZedBoard, USB OTG, a USB mouse, a USB video camera, and an HDMI display. The embedded platform runs the Linaro Ubuntu 12.04 operating system, onto which OpenCV and Qt Creator were ported to provide the libraries and human-machine interface environment the eye tracker requires. The system detects the pupil in the eye image, locates the pupil center, and, after the tracker is calibrated with a second-order polynomial, shows on the HDMI screen where the user is gazing. The purpose is to record the user's gaze trajectory and compute the position of the gaze point. In the embedded environment, the dark-pupil method is used to obtain a darker, higher-contrast pupil image; eyebrows, eyelids, and other features are filtered out of the camera image, and a labeling algorithm then computes the pupil center more accurately. During the nine-point calibration, the operator keeps the head still to avoid the systematic error that head movement would introduce. The results show a gaze error of 0.69 degrees at a user-to-screen distance of 40 cm, and 0.44 degrees when the distance is changed to 80 cm. Compared with other embedded screen-based eye trackers, the gaze error of the tracker built in this study is better, completing the ZedBoard implementation of a screen-based eye tracker.
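The pupil-center step described above can be sketched with a connected-component labeling pass: label the foreground blobs of a thresholded eye image, discard small artifacts, and take the centroid of the largest blob. This is a minimal toy illustration, not the thesis's OpenCV implementation; the function name and the example image are made up for the sketch.

```python
from collections import deque

def largest_component_centroid(binary):
    """Label 4-connected foreground regions of a binary image and
    return the centroid (row, col) of the largest one (the pupil)."""
    rows, cols = len(binary), len(binary[0])
    seen = [[False] * cols for _ in range(rows)]
    best = []  # pixels of the largest component found so far
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not seen[r][c]:
                # Flood-fill one component with BFS.
                comp, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    comp.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and binary[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if len(comp) > len(best):
                    best = comp
    cy = sum(p[0] for p in best) / len(best)
    cx = sum(p[1] for p in best) / len(best)
    return cy, cx

# Toy frame: a 3x3 "pupil" blob plus a one-pixel "eyelash" artifact
# that the size comparison filters out.
frame = [
    [0, 0, 0, 0, 0, 1],
    [0, 1, 1, 1, 0, 0],
    [0, 1, 1, 1, 0, 0],
    [0, 1, 1, 1, 0, 0],
    [0, 0, 0, 0, 0, 0],
]
print(largest_component_centroid(frame))  # -> (2.0, 2.0)
```

Keeping only the largest component is what lets the tracker ignore leftover eyebrow or eyelid pixels after filtering.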

    In this study, the ZedBoard was used to implement a screen-based eye tracker as an embedded system. The hardware consists of the ZedBoard, USB OTG, a USB mouse, a USB video camera, and an HDMI display. The embedded platform runs the Linaro Ubuntu 12.04 operating system, to which OpenCV and Qt Creator were ported. The tracker extracts pupil features and maps the human eye to the computer screen using second-order polynomial calibration, so that the coordinates of the user's gaze can be shown on the HDMI display. The dark-pupil method yields a darker pupil image in the embedded environment; eyebrows, eyelids, and other features are filtered out of the camera image, and a labeling algorithm computes the pupil center more accurately. At a user-to-screen distance of 40 cm the gaze error is 0.69 degrees, and when the distance is changed to 80 cm the gaze error is 0.44 degrees. Compared with other embedded screen-based eye trackers, the gaze error of the system built in this study is better.
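The second-order polynomial mapping and the angular-error figures can be sketched as follows. The polynomial terms, the nine-point grid, and the 0.48 cm miss distance are hypothetical values chosen only to illustrate the arithmetic; the thesis's actual calibration data are not reproduced here (a miss of about 0.48 cm at 40 cm corresponds to the reported 0.69 degrees).

```python
import math
import numpy as np

def fit_second_order(pupil_xy, screen_xy):
    """Least-squares fit of a second-order polynomial mapping pupil
    coordinates (x, y) to screen coordinates (sx, sy), one model per
    screen axis: s = a0 + a1*x + a2*y + a3*x*y + a4*x^2 + a5*y^2."""
    A = np.array([[1, x, y, x * y, x * x, y * y] for x, y in pupil_xy])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(screen_xy), rcond=None)
    return coeffs  # shape (6, 2): column 0 -> sx, column 1 -> sy

def map_gaze(coeffs, x, y):
    """Apply the fitted polynomial to one pupil position."""
    feat = np.array([1, x, y, x * y, x * x, y * y])
    return feat @ coeffs

# Hypothetical nine-point calibration: pupil positions (pixels) and
# the screen targets the user fixated during calibration.
pupil = [(px, py) for py in (10, 20, 30) for px in (10, 20, 30)]
screen = [(64 * px, 36 * py) for px, py in pupil]  # simple synthetic truth
coeffs = fit_second_order(pupil, screen)
print(map_gaze(coeffs, 20, 20))  # close to (1280, 720)

def gaze_error_deg(offset_cm, distance_cm):
    """Convert an on-screen miss distance to visual angle in degrees."""
    return math.degrees(math.atan2(offset_cm, distance_cm))

print(round(gaze_error_deg(0.48, 40), 2))  # -> 0.69
```

The angle conversion also shows why the same on-screen miss distance produces a smaller error in degrees at 80 cm than at 40 cm.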

    Abstract (Chinese)
    Abstract (English)
    Table of Contents
    List of Tables
    List of Figures
    Chapter 1  Introduction
      1.1 Preface
      1.2 Research Motivation
      1.3 Research Objectives
      1.4 Research Procedure
      1.5 Thesis Organization
      1.6 Literature Review
    Chapter 2  Embedded System Development Environment
      2.1 ZedBoard Development Environment
      2.2 ZedBoard Boot and Configuration
      2.3 Linaro Ubuntu 12.04 Operating System Setup
      2.4 Porting OpenCV
      2.5 Running Qt Programs on the ZedBoard
    Chapter 3  Embedded Screen-Based Eye Tracker System
      3.1 System Functions
      3.2 System Hardware Architecture
      3.3 System Software Architecture
      3.4 Pupil Image Localization
      3.5 System Calibration
      3.6 Operating Procedure
    Chapter 4  Experimental Results and Discussion
      4.1 Experimental Equipment
      4.2 Accuracy Experiments
      4.3 Gaze Error Comparison with Related Work
      4.4 Experimental Results
    Chapter 5  Conclusions and Future Work
      5.1 Conclusions
      5.2 Future Work
    References

    [1]Andrew T. Duchowski, “A Breadth First Survey of Eye Tracking Applications,” Behavior Research Methods, Instruments, & Computers, vol. 34, no. 4, pp. 455-470, Nov. 2002.
    [2]Karel Fliegel, “Eyetracking based approach to objective image quality assessment,” International Carnahan Conference on Security Technology, Prague, Oct. 2008, pp.371-376.
    [3]Areej Al-Wabil, Ebtisam Alabdulqader, Latifa Al-Abdulkarim, and Nora Al-Twairesh, “Measuring the User Experience of Digital Books with Children: An Eyetracking Study of Interaction with Digital Libraries,” Internet Technology and Secured Transactions, London, Nov. 2010, pp.1-7.
    [4]W. Lemahieu and B. Wyns, “Low cost eye tracking for human machine interfacing,” Journal of Eyetracking, Visual Cognition and Emotion, vol. 1, no. 1, pp. 1-12, Oct. 2010.
    [5]Kuryati Kipli, Theo Arvanitis, Neil Cooke, and Lisa Harris, “An eyetracking study of estimation accuracy: Examining cerebellar tumours from Magnetic resonance spectroscopy graphs,” International Symposium on Information Technology, Kuala Lumpur, Aug. 2008, pp.1-5.
    [6]William Garrard, “An examination of eyetracking artifacts related to website design for individuals with cognitive disability,” International Conference on Systems, Man, and Cybernetics (SMC), San Diego, Oct. 2014, pp.3197-3202.
    [7]Anirban Chowdhury, Sougata Karmakar, Swathi Matta Reddy, Sanjog J., Subrata Ghosh, and Debkumar Chakrabarti, “Visual Attention Analysis on Mutated Brand Name using Eye-Tracking: A Case Study,” World Academy of Science, Engineering and Technology, India, 2012, pp.1132-1135.
    [8]Mihajlov Martin, Trpkova Marija, and Arsenovski Sime, “Eye tracking recognition-based graphical authentication,” Application of Information and Communication Technologies, Baku, Oct. 2013, pp.1-5.
    [9]江宗憲, “Construction of a Low-Cost, High-Speed Eye Tracker,” master's thesis, Graduate Institute of Applied Electronics, National Taiwan Normal University, 2013.
    [10]曾士誠, “A Study of Head-Movement Compensation for a Head-Mounted Eye Tracker,” master's thesis, Department of Electrical Engineering, National Taiwan Normal University, 2015.
    [11]Shahram Eivazi, Roman Bednarik, Ville Leinonen, Mikael von und zu Fraunberg, and Juha E. Jaaskelainen, “Embedding an Eye Tracker Into a Surgical Microscope: Requirements, Design, and Implementation,” IEEE Sensors Journal, vol. 16, no. 7, pp. 2070-2078, Nov. 2015.
    [12]Jason S. Babcock and Jeff B. Pelz, “Building a lightweight eye tracking headgear,” Eye Tracking Research & Application, Texas, 2004, pp.109-113.
    [13]Michał Kowalik, “How to build low cost eye tracking glasses for head mounted system,” Faculty of Computer Science and Information Technology, Szczecin Poland, Sep. 2010, pp.1-7.
    [14]Zhor Ramdane-Cherif and Amine Naït-Ali, “An Adaptive Algorithm for Eye Gaze Tracking Device Calibration,” IEEE Transactions on Instrumentation and Measurement, vol. 57, no. 4, pp. 716-723, Apr. 2008.
    [15]Pengyi Zhang, Zhiliang Wang, Siyi Zheng, and Xuejing Guo, “A Design and Research of Eye Gaze Tracking System Based on Stereovision,” Proceedings of the 5th international conference on Emerging intelligent computing technology and applications, Berlin, Sep. 2009, pp.278-286.
    [16]Krzysztof Murawski, “Method for Determining the Position of the Pupil for Eye Tracking Applications,” Methods and Models in Automation and Robotics, Miedzyzdroje, Aug. 2010, pp.356-362.
    [17]Rafael Santos, Nuno Santos, Pedro M. Jorge, and Arnaldo Abrantes, “Eye Gaze as a Human computer Interface,” Conference on Electronics Telecommunications and Computers, Portugal, Nov. 2014, pp.376-383.
    [18]Pavol Fabo and Roman Durikovic, “Automated Usability Measurement of Arbitrary Desktop Application with Eyetracking,” International Conference on Information Visualisation, Montpellier, Jul. 2012, pp.625-629.
    [19]Huabiao Qin, Jun Liu, and Tianyi Hong, “An eye state identification method based on the Embedded Hidden Markov Model,” Vehicular Electronics and Safety (ICVES), Istanbul, Jul. 2012, pp.255-260.
    [20]Bumhwi Kim, Yonghwa Choi, and Minho Lee, “Embedded face recognition system considers human eye gaze using glass-type platform,” International Conference on Consumer Electronics, Las Vegas, Jan. 2014, pp.357-358.
    [21]Luo Chao and Su Jian-Bo, “Eye localization for embedded system,” 31st Chinese Control Conference, Hefei, Jul. 2012, pp.3828-3833.
    [22]Matej Cerny and Martin Dobrovolny, “Eye tracking system on embedded platform,” International Conference Applied Electronics, Pilsen, Sep. 2012, pp.51-54.
    [23]Ruian Liu, Shengtao Ma, Mimi Zhang, and Lei Wang, “Implementation and Optimization of the Eye Gaze Tracking System Based on DM642,” Intelligent Networks and Intelligent Systems, Shenyang, Nov. 2010, pp.48-51.
    [24]Tomoaki Ando, Vasily G. Moshnyaga, and Koji Hashimoto, “A low-power FPGA implementation of eye tracking,” International Conference on Acoustics, Speech and Signal Processing, Kyoto, Mar. 2012, pp.1573-1576.
    [25]Rafael Santos, Nuno Santos, Pedro M. Jorge, and Arnaldo Abrantes, “Eye Gaze as a Human computer Interface,” Conference on Electronics Telecommunications and Computers, Portugal, Nov. 2014, pp.376-383.
    [26]Louise H. Crockett, Ross A. Elliot, Martin A. Enderwitz, and Robert W. Stewart, Embedded Processing with the ARM® Cortex®-A9 on the Xilinx® Zynq®-7000 All Programmable SoC, Scotland, 2014.
    [27]Javier Cerezuela-Mora, Elisa Calvo-Gallego, and Santiago Sanchez-Solano, “Hardware/Software co-design of video processing applications on a reconfigurable platform,” International Conference on Industrial Technology, Seville, Mar. 2015, pp.1694-1699.
    [28]M. Ali Altuncu, Taner Guven, Yasar Becerikli, and Suhap Sahin, “Real-Time System Implementation for Image Processing with Hardware/Software Co-design on the Xilinx Zynq Platform,” International Journal of Information and Electronics Engineering, vol. 5, no. 6, pp. 473-477, Nov. 2015.
    [29]Marcos Nieto, Oihana Otaegui, Gorka Velez, Juan Diego Ortega, and Andoni Cortes, “On creating vision based advanced driver assistance systems,” The Institution of Engineering and Technology, vol. 9, no. 1, pp.59-66, Jan. 2015.
    [30]Matthew Russell and Scott Fischaber, “OpenCV Based Road Sign Recognition on Zynq,” International Conference on Industrial Informatics, Bochum, Jul. 2013, pp.596-601.
    [31]Mauro Turturici, Sergio Saponara, Luca Fanucci, and Emilio Franchi, “Low-power embedded system for real-time correction of fish-eye automotive cameras,” Design, Automation & Test in Europe Conference & Exhibition, Dresden, Mar. 2012, pp.340-341.
    [32]Michael Hubner, “Design of an Attention Detection System on the Zynq-7000 SoC,” International Conference on ReConFigurable Computing and FPGAs, Cancun, Dec. 2014, pp.1-6.
    [33]Roland Dobai and Lukas Sekanina, “Image Filter Evolution on the Xilinx Zynq Platform,” Conference on Adaptive Hardware and Systems, Torino, Jun. 2013, pp.164-171.
    [34]Muhammed Al Kadi, Patrick Rudolph, Diana Gohringer, and Michael Hubner, “Dynamic and partial reconfiguration of Zynq 7000 under Linux,” International Conference on Reconfigurable Computing and FPGAs, Cancun, Dec. 2013, pp.1-5.
    [35]TaiLin Han, Hua Cai, GuangWen Liu, and Bo Wang, “The Face Detection and Location System Based on Zynq,” International Conference on Fuzzy Systems and Knowledge Discovery, Xiamen, Aug. 2014, pp.835-839.
    [36]Yang Yuan, Yuehui Tan, and Huixian Sun, “Design of the software framework of reconfigurable communication interface test system based on Zedboard,” Software Engineering and Service Science, Beijing, Sep. 2015, pp.754-757.
    [37]黃奇武, 吳友名, and 葉勇志, “Implementing a Screen-Based Eye Tracker on the ZedBoard,” 2015 Conference on Consumer Electronics (民生電子研討會), Changhua, Taiwan, Nov. 2015, p. 139.
    [38]Yiu-ming Cheung and Qinmu Peng, “Eye Gaze Tracking With a Web Camera in a Desktop Environment,” IEEE Transactions on Human Machine Systems, vol. 45, no. 4, pp. 419-430, Aug. 2015.
