
Student: Li, Jyun-Yi (李俊毅)
Thesis title: High Precision Canthus Alignment for Visible-Spectrum Gaze Tracking System with Head Movement Compensation (高精確度眼角偵測用於可見光眼動儀之頭動補償)
Advisor: Kao, Wen-Chung (高文忠)
Degree: Master
Department: Department of Electrical Engineering
Year of publication: 2019
Graduation academic year: 107 (ROC era, 2018-2019)
Language: Chinese
Number of pages: 87
Chinese keywords: 眼動儀、可見光、頭動補償
English keywords: gaze tracker, visible light, head movement compensation
DOI URL: http://doi.org/10.6345/NTNU201900728
Document type: Academic thesis
    Abstract: More than 80% of the information humans receive comes through vision. A gaze tracker obtains the point of gaze by tracking eye movements, and the resulting visual-behavior data can be applied in cognitive psychology, business, medicine, and human-machine interfaces. To collect eye-movement information more conveniently and effectively, eye-tracking technology continues to develop and innovate. However, visible-spectrum gaze trackers still face many difficulties and challenges in supporting head movement, such as how to track the eyes stably in the image, how to establish the spatial relationship among the eyes, the camera, and the screen, and the fact that each person's dominant eye differs. This thesis proposes a high-precision canthus (eye corner) detection method used to locate the position of the eyeball relative to the eye corners; in head-movement compensation, the detected canthus coordinates are also used to estimate the head's relative motion in the image. A calibration procedure establishes the spatial relationship among the eyes, camera, and screen, and detects the dominant eye. Through canthus detection and head-movement compensation, the accuracy of gaze-point estimation is further improved.
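    The compensation idea summarized above can be sketched in a few lines: if the two canthus coordinates are tracked, their midpoint shift approximates in-plane head translation and the change in inter-canthus distance approximates a depth change. The following is a minimal illustrative example under that simple translation-plus-scale model, not the thesis's actual algorithm; all names are hypothetical.

    ```python
    import numpy as np

    def canthus_compensation(calib_canthi, cur_canthi, pupil_xy):
        """Estimate in-image head motion from the two eye-corner (canthus)
        coordinates and map the pupil back into calibration-frame coordinates.

        calib_canthi, cur_canthi: 2x2 arrays [[x_inner, y_inner], [x_outer, y_outer]]
        pupil_xy: pupil-centre coordinates in the current frame.
        """
        calib = np.asarray(calib_canthi, dtype=float)
        cur = np.asarray(cur_canthi, dtype=float)
        # Translation: shift of the canthus midpoint since calibration.
        calib_mid = calib.mean(axis=0)
        cur_mid = cur.mean(axis=0)
        # Scale: change of the inter-canthus distance approximates a depth change.
        scale = np.linalg.norm(calib[0] - calib[1]) / np.linalg.norm(cur[0] - cur[1])
        # Undo scale and translation so the original gaze mapping still applies.
        return (np.asarray(pupil_xy, dtype=float) - cur_mid) * scale + calib_mid
    ```

    The corrected pupil position can then be fed to the gaze mapping learned during calibration; a fuller treatment would also model head rotation, which this sketch ignores.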

    Table of contents:
    Abstract i
    Contents iii
    List of Figures vi
    List of Tables ix
    Chapter 1 Introduction 1
    1.1 Research Background 1
    1.2 Applications of Gaze Trackers 2
    1.3 Research Problems 3
    1.4 Thesis Organization 7
    Chapter 2 Literature Review 8
    2.1 Canthus Detection Methods 9
    2.2 Head Movement Compensation Methods 14
    Chapter 3 Software System Architecture 26
    3.1 System Architecture 26
    3.2 Canthus Detection 28
    3.3 Gaze Point Calibration 51
    3.4 Head Movement Correction 56
    3.5 Gaze Point Estimation 60
    Chapter 4 Experimental Results 65
    4.1 Experimental Environment and Equipment 65
    4.2 Canthus Detection Results 66
    4.3 Head Movement Compensation Results 75
    Chapter 5 Conclusion and Future Work 83
    5.1 Conclusion 83
    5.2 Future Work 83
    References 84

