| Field | Value |
|---|---|
| Graduate Student | 鄭培廷 (Cheng, Pei-Ting) |
| Thesis Title | 具深度移動補償之可見光眼動儀 (Toward In-Depth Motion for Visible-Spectrum Gaze Tracking System) |
| Advisor | 高文忠 (Kao, Wen-Chung) |
| Degree | Master |
| Department | Department of Electrical Engineering |
| Year of Publication | 2021 |
| Graduation Academic Year | 109 (ROC calendar, 2020-2021) |
| Language | Chinese |
| Number of Pages | 64 |
| Keywords (Chinese) | 眼動儀、可見光、模型縮放、頭動補償 (gaze tracker, visible light, model scaling, head movement compensation) |
| Keywords (English) | gaze tracker, visible light, eye model, head movement compensation |
| DOI | http://doi.org/10.6345/NTNU202100210 |
| Thesis Type | Academic thesis |
Abstract:

Among human-computer interfaces, the gaze tracker is a promising technology, since more than half of the information humans receive comes through vision. Compared with traditional infrared (IR) based systems, the visible-spectrum gaze tracker (VSGT) offers a more comfortable user experience, making it a vital human-computer interface. However, head motion compensation is extremely difficult for a VSGT because the eye images lack stable reference feature points, and system performance degrades considerably if the user's head is not fixed on a chin rest. This thesis presents a new head motion compensation mechanism based on a canthus detection algorithm, which allows the user's head to move back and forth relative to the camera. Experimental results show that the proposed method maintains the accuracy and precision of the VSGT even when the user's head moves.
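The record stops at the abstract, so the exact compensation math is not given here. Below is a minimal sketch of how a canthus-based depth compensation could work, assuming a pinhole camera and a polynomial gaze mapping; all function names and coordinates are hypothetical illustrations, not the thesis's actual code. The idea: the pixel distance between the two detected eye corners scales inversely with the eye-to-camera depth, so its ratio against a calibration frame yields a scale factor that re-normalizes the pupil offset before the gaze mapping is applied.

```python
import numpy as np

def depth_scale(canthus_left, canthus_right, calib_canthus_dist):
    """Scale factor relative to the calibration depth (hypothetical helper).

    Under a pinhole model the projected inter-canthus distance is inversely
    proportional to depth, so
        calib_dist / current_dist ~= current_depth / calib_depth.
    """
    current_dist = np.linalg.norm(
        np.asarray(canthus_right, dtype=float) - np.asarray(canthus_left, dtype=float)
    )
    return calib_canthus_dist / current_dist

def compensated_gaze(pupil, anchor, poly_map, scale):
    """Map a pupil position to screen coordinates after depth re-normalization.

    pupil, anchor : (x, y) pixel coordinates; anchor is a canthus point.
    poly_map      : any callable fitted at the calibration depth, e.g. a
                    least-squares 2-D polynomial from offsets to screen points.
    """
    offset = (np.asarray(pupil, dtype=float) - np.asarray(anchor, dtype=float)) * scale
    return poly_map(offset)

# Hypothetical usage with made-up pixel coordinates and a toy linear mapping:
scale = depth_scale((212.0, 304.0), (288.0, 301.0), calib_canthus_dist=80.0)
gaze_xy = compensated_gaze((251.0, 310.0), (250.0, 302.0),
                           poly_map=lambda o: 12.0 * o + 640.0, scale=scale)
```

The design choice in this sketch is to correct in image space rather than refit the mapping: one scalar per frame undoes the apparent shrinkage or growth of the eye region as the head moves along the camera axis, so the gaze mapping fitted at the calibration depth can be reused unchanged.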