| Field | Value |
|---|---|
| Author | 蔡瑞哲 (Tsai, Jui-Che) |
| Title | 具頭動補償之高速可見光眼動儀系統平行架構設計 (Parallel Computing Architecture of High Speed Visible Light Gaze Tracking System with Head Motion Compensation) |
| Advisor | 高文忠 (Kao, Wen-Chung) |
| Degree | Master |
| Department | 電機工程學系 (Department of Electrical Engineering) |
| Year of publication | 2017 |
| Graduation academic year | 105 (ROC calendar) |
| Language | Chinese |
| Pages | 74 |
| Chinese keywords | 眼動儀 (eye tracker), 可見光 (visible light), 平行化計算 (parallel computing), 頭動補償 (head motion compensation) |
| English keywords | parallel architecture, multithreading, head motion compensation |
| DOI URL | https://doi.org/10.6345/NTNU202202830 |
| Document type | Academic thesis |
Eye trackers are used in learning and cognitive psychology, commercial advertising research, neuroscience, and other fields; statistical analysis of where the eyes look makes it possible to study differences in human viewing behavior. Most commercial eye trackers today rely on infrared technology, whose drawback is that ambient infrared light degrades system accuracy. Many companies and academic groups have therefore invested in the development of visible light eye trackers, but no high-accuracy visible light eye tracker has yet reached the market.
Building on an existing eyeball model, this thesis improves the iris capture method for a typical office lighting environment, using a high-speed camera that records eye images at 480 frames per second. In addition, the temporal continuity of the high-speed image stream is exploited to track fixed eye features for head motion compensation. Combined with a parallel architecture design based on multithreading, the proposed eye tracker achieves a processing speed of 480 frames per second while meeting the system goals of high accuracy and precision.
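The head motion compensation idea described above, tracking a fixed eye feature across consecutive high-speed frames and using its displacement to shift the iris search position, can be illustrated with a minimal C++ sketch. The names used here (`compensateHeadMotion`, the eye-corner feature, the coordinates) are hypothetical placeholders for illustration only, not code or data from the thesis.

```cpp
// Minimal sketch (assumed, not taken from the thesis) of head motion compensation:
// a fixed eye feature such as the eye corner is tracked across consecutive
// high-speed frames, and its displacement shifts the iris search position
// before matching starts in the new frame.
#include <cstdio>

struct Point { double x, y; };

// Shift the previous iris estimate by the eye-corner displacement between
// two consecutive frames, giving a compensated starting point for matching.
Point compensateHeadMotion(Point prevIris, Point prevCorner, Point currCorner) {
    double dx = currCorner.x - prevCorner.x;   // head/eye translation since last frame
    double dy = currCorner.y - prevCorner.y;
    return {prevIris.x + dx, prevIris.y + dy};
}

int main() {
    Point prevIris   {320.0, 240.0};   // iris centre found in frame t-1
    Point prevCorner { 80.0, 230.0};   // eye corner detected in frame t-1
    Point currCorner { 83.0, 228.0};   // eye corner in frame t (slight head motion)

    Point guess = compensateHeadMotion(prevIris, prevCorner, currCorner);
    std::printf("compensated iris guess: (%.1f, %.1f)\n", guess.x, guess.y);
    return 0;
}
```

Because consecutive frames at 480 frames/s differ only slightly, such a compensated starting point keeps the iris search window small, which is what makes the high frame rate tractable.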
Nowadays, eye tracking systems are applied in studies of learning and cognitive psychology, advertising design, neuroscience, and other fields: statistical analysis of gaze data makes it possible to study human viewing behavior. Currently, most gaze tracking systems are equipped with an infrared (IR) light source, but the accuracy of an IR-based gaze tracking system can be affected by the illumination in the environment. Therefore, more and more researchers have devoted themselves to the development of visible light eye tracking systems. Still, few systems have reached a satisfactory level of accuracy and speed.
The proposed system improves both accuracy and speed by modifying the iris matching algorithm. The improvements come from refined image preprocessing, head movement compensation, optimal iris matching, and a novel software architecture based on a parallel computing scheme. Experimental results show that the proposed system reaches a processing speed of 480 frames/s with promising accuracy and precision.
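As an illustration of the kind of parallel computing scheme the abstract refers to, the following C++ sketch shows one plausible producer-consumer frame pipeline: a capture thread pushes frames into a thread-safe queue while a pool of worker threads runs the per-frame processing (preprocessing and iris matching) concurrently. `Frame`, `preprocess`, and `matchIris` are stand-in names introduced here; the thesis' actual software architecture may be organized differently.

```cpp
// Minimal sketch of a multithreaded frame-processing pipeline (not the thesis' code):
// one capture thread produces frames, several worker threads consume them and run
// preprocessing + iris matching in parallel.
#include <algorithm>
#include <condition_variable>
#include <cstdio>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

struct Frame { int id = 0; /* pixel buffer omitted in this sketch */ };

class FrameQueue {                 // thread-safe FIFO shared by producer and workers
public:
    void push(Frame f) {
        std::lock_guard<std::mutex> lock(m_);
        q_.push(std::move(f));
        cv_.notify_one();
    }
    bool pop(Frame& out) {         // returns false once the producer is done and empty
        std::unique_lock<std::mutex> lock(m_);
        cv_.wait(lock, [&] { return !q_.empty() || done_; });
        if (q_.empty()) return false;
        out = std::move(q_.front());
        q_.pop();
        return true;
    }
    void shutdown() {
        std::lock_guard<std::mutex> lock(m_);
        done_ = true;
        cv_.notify_all();
    }
private:
    std::queue<Frame> q_;
    std::mutex m_;
    std::condition_variable cv_;
    bool done_ = false;
};

// Placeholder processing stages (hypothetical names, not from the thesis).
void preprocess(Frame&) { /* e.g. denoise, binarize the eye image */ }
void matchIris(const Frame& f) { std::printf("frame %d matched\n", f.id); }

int main() {
    FrameQueue queue;
    const unsigned nWorkers = std::max(1u, std::thread::hardware_concurrency());

    std::vector<std::thread> workers;
    for (unsigned i = 0; i < nWorkers; ++i)
        workers.emplace_back([&queue] {
            Frame f;
            while (queue.pop(f)) {  // each worker handles whole frames independently
                preprocess(f);
                matchIris(f);
            }
        });

    // Capture thread: here we simply simulate one second of 480 fps input.
    std::thread capture([&queue] {
        for (int i = 0; i < 480; ++i) queue.push(Frame{i});
        queue.shutdown();
    });

    capture.join();
    for (auto& w : workers) w.join();
    return 0;
}
```

Distributing whole frames across workers keeps the per-frame logic sequential and simple; sustaining 480 frames/s then depends on having enough worker threads to cover the per-frame matching cost.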