
Graduate Student: Cheng, Pei-Ting (鄭培廷)
Thesis Title: Toward In-Depth Motion for Visible-Spectrum Gaze Tracking System (具深度移動補償之可見光眼動儀)
Advisor: Kao, Wen-Chung (高文忠)
Degree: Master
Department: Department of Electrical Engineering
Publication Year: 2021
Academic Year of Graduation: 109
Language: Chinese
Pages: 64
Chinese Keywords: gaze tracker, visible light, model scaling, head movement compensation
English Keywords: gaze tracker, visible light, eye model, head movement compensation
DOI URL: http://doi.org/10.6345/NTNU202100210
Thesis Type: Academic thesis
Abstract (Chinese): In human-computer interface systems, the gaze tracker is a very promising technology, because more than half of the information people receive comes through vision. Compared with traditional infrared-based gaze tracker systems, the visible-spectrum gaze tracker offers a more comfortable user experience, making this technology a vital human-computer interaction interface. However, because eye images lack stable reference points, head movement compensation becomes very difficult; if the user's head is not fixed with a chin rest, the performance of the gaze tracker system degrades. This thesis proposes a new head movement compensation mechanism that uses a canthus detection algorithm to allow the user's head to move back and forth. Experimental results show that even when the user's head moves, the proposed method keeps the visible-spectrum gaze tracker at a consistent level of accuracy and precision.

Abstract (English): Among human-computer interfaces, the gaze tracker is a crucial and promising technology, since more than half of the information humans receive comes through the eyes. The visible-spectrum gaze tracker (VSGT) provides an excellent user experience compared with the traditional infrared (IR) based one, turning it into a vital human-computer interface. However, head motion compensation becomes extremely difficult due to the lack of stable reference feature points in the eye images, and system performance degrades significantly if the user's head is not fixed on a chin rest. This thesis presents a new head motion compensation mechanism for a visible-spectrum gaze tracker. The proposed approach allows the user to move their head back and forth, based on a canthus detection algorithm. The experimental results show that both accuracy and precision remain effective even when the user moves their head.
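
The abstract describes compensating back-and-forth (in-depth) head movement by detecting the eye corners (canthi). As a rough illustration of the geometry involved, the Python sketch below shows how the inter-canthus pixel distance could be converted into a depth scale factor under a pinhole camera model; the function name, parameters, and numbers are hypothetical assumptions for illustration, not taken from the thesis.

    # Illustrative sketch only -- not the thesis implementation. Assumes a
    # pinhole camera model, under which the pixel distance between the two
    # detected eye corners (canthi) is inversely proportional to the user's
    # distance from the camera, so it can serve as a reference scale for
    # in-depth head movement.
    import math

    def depth_scale(canthus_a, canthus_b, calib_distance_px):
        """Hypothetical helper: ratio of the current head depth to the
        depth at calibration, estimated from the pixel distance between
        the two detected eye corners."""
        current_px = math.dist(canthus_a, canthus_b)
        # Pinhole model: apparent size is inversely proportional to depth,
        # so depth_now / depth_calib = calib_distance_px / current_px.
        return calib_distance_px / current_px

    # Example: canthi were 88 px apart at calibration; measuring 110 px now
    # gives a scale of 0.8, i.e. the head has moved 20% closer to the camera.
    print(depth_scale((100.0, 200.0), (210.0, 200.0), calib_distance_px=88.0))

Such a scale factor could then be used to rescale a calibrated eye model; the thesis itself treats this under eyeball model scaling (Section 3.4 in the table of contents below).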

Table of Contents

Acknowledgments i
Abstract (Chinese) iii
Abstract (English) iv
Table of Contents v
List of Figures vii
List of Tables ix
Chapter 1 Introduction 1
  1.1 Research Background and Motivation 1
  1.2 Research Problems 3
  1.3 Thesis Organization 5
Chapter 2 Literature Review 6
  2.1 Eye Movement Detection Methods 6
  2.2 Head Movement Compensation Methods 10
    2.2.1 Cross-Ratio-Based Methods 10
    2.2.2 Appearance-Based Methods 11
    2.2.3 3D Model-Based Methods 12
  2.3 Summary of the Literature 16
Chapter 3 Methodology 17
  3.1 Visible-Spectrum Gaze Tracker System Architecture 17
  3.2 Experimental Procedure 19
  3.3 System Region-of-Interest Information 21
  3.4 Eyeball Model Scaling 23
    3.4.1 Scaling Algorithm 24
    3.4.2 Interpolated Model Method 27
  3.5 System Calibration and Gaze Point Estimation 28
    3.5.1 Eyeball Position 29
    3.5.2 Head Movement Compensation 32
    3.5.3 Gaze Point Estimation 36
Chapter 4 Experimental Results and Discussion 40
  4.1 Experimental Environment and Equipment 40
  4.2 Model Scaling Tests 42
    4.2.1 Model Scaling 43
    4.2.2 Scoring Method for Gaze Point Estimation 44
    4.2.3 Gaze Point Estimation Results 46
  4.3 Discussion 50
  4.4 System Specifications 56
Chapter 5 Conclusions and Future Work 57
  5.1 Conclusions 57
  5.2 Future Work 57
References 58
Autobiography 63
Academic Achievements 64

