| Field | Value |
|---|---|
| Graduate student | 李家宜 (Lee, Chia-Yi) |
| Thesis title | 粒子群移動演算法實現高速眼動儀系統 (Gaze Tracking with Particle Swarm Optimization) |
| Advisor | 高文忠 (Kao, Wen-Chung) |
| Degree | Master |
| Department | Department of Electrical Engineering (電機工程學系) |
| Year of publication | 2016 |
| Academic year of graduation | 104 |
| Language | Chinese |
| Pages | 85 |
| Chinese keywords | 眼動儀, 可見光, 粒子群移動演算法 |
| English keywords | Gaze tracking, visible light, particle swarm optimization |
| DOI | https://doi.org/10.6345/NTNU202203836 |
| Thesis type | Academic thesis |
| Usage | Views: 177, Downloads: 21 |
An eye tracker records eye movements and converts them into a gaze trajectory, which can be used to study human visual attention in fields such as neuroscience, cognitive psychology, education, and marketing/advertising analysis; it has become a practical product on today's market. Most commercial eye trackers rely on infrared (IR) illumination, whose drawback is that ambient IR sources degrade system accuracy. Recent academic research has therefore turned to visible-light eye tracking, but no highly accurate, practical product has yet reached the market.

The proposed system builds on an existing eyeball model, improving both the computation of the model parameters and the matching procedure so that it can handle the complex images recorded by a 480 fps industrial camera under ordinary office lighting. Particle swarm optimization (PSO) is incorporated into the matching step to greatly accelerate the computation, and the quadratic mapping curve used in conventional nine-point calibration is refined so that the iris center is mapped to the gaze point more accurately. The resulting system runs in real time at over 30 fps with an average accuracy of 1.12 degrees.
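The PSO-based matching step can be sketched as follows. This is a minimal, generic particle swarm optimizer, not the thesis's actual implementation: the toy cost function standing in for the eyeball-model matching score, the swarm size, and the inertia/acceleration constants (`w`, `c1`, `c2`) are all illustrative assumptions.

```python
import numpy as np

def pso_minimize(cost, bounds, n_particles=30, n_iters=50,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize `cost` over a rectangular search box via particle swarm optimization."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    dim = lo.shape[0]
    x = rng.uniform(lo, hi, size=(n_particles, dim))   # particle positions
    v = np.zeros_like(x)                               # particle velocities
    pbest = x.copy()                                   # per-particle best positions
    pbest_val = np.array([cost(p) for p in x])
    g = pbest[pbest_val.argmin()].copy()               # global best position
    for _ in range(n_iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # Velocity update: inertia + pull toward personal best + pull toward global best.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([cost(p) for p in x])
        improved = vals < pbest_val
        pbest[improved] = x[improved]
        pbest_val[improved] = vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

# Toy matching cost: squared pixel distance to a "true" iris center at (120, 80),
# searched over a 640x480 image; in the real system the cost would be the
# eyeball-model matching score evaluated on the edge image.
center, err = pso_minimize(
    lambda p: (p[0] - 120.0) ** 2 + (p[1] - 80.0) ** 2,
    (np.array([0.0, 0.0]), np.array([640.0, 480.0])),
)
```

Because each particle evaluates the cost only at its current position, the swarm explores the candidate space far more cheaply than an exhaustive template search, which is the efficiency gain the abstract attributes to PSO.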
The eye tracking system is a human-machine interface device that analyzes the gaze path by tracking eyeball movement. It has become increasingly popular in the consumer market because the recorded gaze tracking results can be applied to the study of human attention, cognitive psychology, neuroscience, education, and consumer product analysis. However, most gaze tracking systems rely on infrared (IR) light to enhance the image quality of the eyeball, which limits the environments and scenarios in which they can be deployed. As a result, more and more researchers have devoted themselves to developing visible-light eye tracking systems, but very few reliable systems achieve high accuracy.
The proposed system aims to improve accuracy by modifying the eyeball matching algorithm, even when the images are taken in poor lighting conditions. The improvements come from modifications to the image preprocessing, a search algorithm based on particle swarm optimization (PSO), and a new calibration method. The experimental results indicate an average system error of 1.12 degrees, and the entire system reaches a processing speed of 30 frames/s.
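The calibration step maps detected iris-center coordinates to on-screen gaze points via a second-order polynomial fitted to nine calibration points. The sketch below assumes the standard six-term quadratic form fitted by least squares; the thesis's refined mapping may differ in its exact terms, and the 3x3 grid and coefficient values here are synthetic.

```python
import numpy as np

def fit_quadratic_map(iris_pts, screen_pts):
    """Least-squares fit of a six-term quadratic polynomial mapping
    iris-center coordinates (x, y) to screen (gaze-point) coordinates."""
    x, y = iris_pts[:, 0], iris_pts[:, 1]
    # One design-matrix row per calibration point: [1, x, y, x*y, x^2, y^2].
    A = np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])
    coeffs, *_ = np.linalg.lstsq(A, screen_pts, rcond=None)
    return coeffs  # shape (6, 2): one coefficient column per screen axis

def apply_map(coeffs, iris_pt):
    """Map a single iris-center position to a screen coordinate."""
    x, y = iris_pt
    return np.array([1.0, x, y, x * y, x ** 2, y ** 2]) @ coeffs

# Synthetic 3x3 calibration grid (nine points) generated by a known quadratic map.
gx, gy = np.meshgrid([0.0, 1.0, 2.0], [0.0, 1.0, 2.0])
iris = np.column_stack([gx.ravel(), gy.ravel()])
truth = lambda p: np.array([10 + 100*p[0] + 5*p[1] + 2*p[0]*p[1] + 3*p[0]**2,
                            20 + 4*p[0] + 90*p[1] + p[1]**2])
screen = np.array([truth(p) for p in iris])

coeffs = fit_quadratic_map(iris, screen)
pred = apply_map(coeffs, (0.3, 1.7))  # gaze estimate for an unseen iris position
```

With nine calibration points and six unknowns per axis, the least-squares fit is overdetermined, which is what lets the quadratic terms absorb the nonlinearity between iris displacement and gaze angle.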