| Author | 曾士誠 Shih-Chen Tseng |
|---|---|
| Title | 頭戴式眼動儀之頭動補償探討 (An Approach of Head Movement Compensation for Wearable Eye Tracker) |
| Advisor | 黃奇武 Huang, Chi-Wu |
| Degree | Master |
| Department | Department of Electrical Engineering |
| Year of publication | 2015 |
| Graduating academic year | 103 |
| Language | Chinese |
| Pages | 112 |
| Keywords (Chinese) | 頭戴式眼動儀、角膜亮點、光軸、視軸、2-D Mapping、3-D Modeling |
| Keywords (English) | Wearable eye tracker, Glint, Optic axis, Visual axis, 2-D mapping, 3-D modeling |
| DOI | https://doi.org/10.6345/NTNU202205238 |
| Document type | Academic thesis |
| Usage | Views: 265, Downloads: 4 |
This study proposes a rotation-based compensation for the head-movement error of 2-D mapping, restoring the estimated point of gaze (POG) to its post-calibration accuracy and sparing the user from having to keep the head fixed in a chin rest.
The literature indicates that both 2-D polynomial mapping and 3-D eye-model estimation typically rely on an infrared light source and the corneal reflection (glint) features it produces. 2-D mapping estimates the POG with polynomial functions, whereas 3-D eye modeling determines the visual axis of the eye; the intersection of the visual axis with the screen is the POG.
According to the literature, before POG estimation 2-D mapping needs only a simple calibration: the user gazes at predefined known points, and the collected data are used to compute the coefficients of the polynomial functions. A 3-D eye model, by contrast, requires an expensive stereo camera and the associated camera calibration parameters, or solving for the eye-model vectors; in the camera-system setup in particular, some systems use an additional auxiliary wide-angle stereo camera together with a 3-D digitizer for calibration. Compared with the calibration steps of 2-D mapping, the procedure is much more complicated for the user.
This study uses two PS3 cameras to build a wearable eye tracker costing less than USD 100, with free open-source software. Users can accurately perform eye-controlled typing on a target on-screen keyboard; its most widespread application is communication for paralyzed people. Compared with expensive commercial eye trackers (costing more than USD 10,000), the accuracy of our tracker meets the needs of experiments and applications, and its advantage in hardware cost is obvious.
Our team has used the homemade wearable eye tracker in 2-D-mapping-based psychology experiments, for example gaze hot-zones (Hot-zone), regions of interest (Region of Interest), and gaze trajectories (Scan-path), and has applied it to target on-screen keyboard typing. Future work will study 3-D-modeling-based POG estimation for effective use in real-world environments.
This thesis proposes an approach that uses a 3-D rotation matrix to compensate the errors that head movement introduces into 2-D mapping, which maps the glint-pupil difference vector obtained from the eye image onto a screen to estimate the Point of Gaze (POG). With this compensation the estimated POG stays within a predefined accuracy even when the head moves away from the original calibration position, freeing the tracker user from uncomfortably confining the head in a chin rest during eye tracking.
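The abstract does not give the exact compensation formula, so the following is only a minimal sketch under assumed conventions: the head pose is available as yaw/pitch/roll angles relative to the calibration pose, and the compensation undoes that rotation on a measured gaze-related vector. All function names and angle conventions here are illustrative, not the thesis's.

```python
import numpy as np

def head_rotation(yaw, pitch, roll):
    """3-D rotation matrix from head-pose angles in radians.
    Convention (an assumption): intrinsic z-y-x rotations, R = Rz @ Ry @ Rx."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
    return Rz @ Ry @ Rx

def compensate(gaze_dir, R_head):
    """Map a gaze direction measured at the current head pose back to the
    calibration pose. R_head is orthonormal, so its inverse is its transpose."""
    return R_head.T @ gaze_dir
```

Because the rotation is orthonormal, the compensation is exact for pure head rotation; translation of the head would need a separate correction.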
An analysis of recent eye-tracking techniques shows that both 2-D polynomial mapping and 3-D modeling track the glint, a bright reflection of the light source on the eye surface, together with the rapidly moving pupil, to find the POG. 2-D mapping uses selected polynomial functions to compute the POG on screen, as mentioned above, while 3-D modeling measures and computes the 3-D positions of the pupil center and the glint so that the visual axis of the eye can be reconstructed; the POG is then found where the visual axis intersects the screen or any other object in the real world.
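For the 3-D-modeling branch, once the visual axis has been reconstructed, finding the POG is a standard ray-plane intersection. A sketch, where the screen geometry and all variable names are assumptions rather than the thesis's notation:

```python
import numpy as np

def pog_on_plane(origin, direction, plane_point, plane_normal):
    """Intersect the visual axis, a ray from the eye's nodal point `origin`
    along `direction`, with a plane (e.g., the screen) given by a point on it
    and its normal. Returns the 3-D POG, or None if the axis is parallel to
    the plane or points away from it."""
    origin = np.asarray(origin, dtype=float)
    direction = np.asarray(direction, dtype=float)
    plane_point = np.asarray(plane_point, dtype=float)
    plane_normal = np.asarray(plane_normal, dtype=float)
    denom = float(np.dot(plane_normal, direction))
    if abs(denom) < 1e-12:
        return None          # visual axis parallel to the screen plane
    t = float(np.dot(plane_normal, plane_point - origin)) / denom
    if t < 0:
        return None          # screen lies behind the eye
    return origin + t * direction
```

For example, with the eye at the origin and a screen plane 0.6 m in front of it, a visual axis pointing slightly off-center lands at the corresponding off-center POG on that plane.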
Before eye tracking starts, 2-D mapping performs a simple calibration procedure, using several predefined points on screen to estimate the coefficients of the polynomial functions used during tracking. In 3-D models, calibration is more complicated and depends on the system configuration, such as mono-camera or stereo-vision measurements. These systems are also expensive, because some models need an additional auxiliary wide-angle stereo camera and a 3-D digitizer for system calibration.
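The 2-D mapping calibration can be sketched as an ordinary least-squares fit from glint-pupil difference vectors to screen coordinates. The six-term quadratic basis below is a common choice in the 2-D mapping literature; the exact polynomial used in the thesis may differ.

```python
import numpy as np

def _basis(dx, dy):
    # Quadratic basis [1, x, y, xy, x^2, y^2] evaluated at each sample.
    dx, dy = np.asarray(dx, float), np.asarray(dy, float)
    return np.column_stack([np.ones_like(dx), dx, dy, dx * dy, dx**2, dy**2])

def calibrate(dx, dy, sx, sy):
    """Fit polynomial coefficients mapping glint-pupil difference vectors
    (dx, dy), gathered while the user fixates known screen points, to the
    screen coordinates (sx, sy) of those points. Needs at least 6 points."""
    A = _basis(dx, dy)
    cx, *_ = np.linalg.lstsq(A, np.asarray(sx, float), rcond=None)
    cy, *_ = np.linalg.lstsq(A, np.asarray(sy, float), rcond=None)
    return cx, cy

def estimate_pog(cx, cy, dx, dy):
    """Evaluate the fitted mapping to estimate the on-screen POG."""
    A = _basis(np.atleast_1d(dx), np.atleast_1d(dy))
    return A @ cx, A @ cy
```

A 3x3 grid of calibration targets, as in many 2-D mapping setups, gives nine equations for the six unknowns per axis, so the least-squares solution is overdetermined and robust to small measurement noise.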
This approach uses two PS3 cameras, one for the eye and one for the scene, together with open-source software, to construct a low-cost (under $100) wearable eye tracker capable of eye-controlled typing with quite satisfactory accuracy. Eye-controlled typing is an important Human Computer Interface (HCI) application, especially for disabled people. Commercial wearable eye trackers are currently priced at $10,000 and above.
The homemade eye tracker in our laboratory is mainly based on 2-D tracking, with self-developed application software such as Scan-path Trace, Hot-zone Display, Interest-region Search, and Eye-controlled Typing. In addition to modifying 2-D mapping with the rotation matrix, 3-D-based tracking is planned and will hopefully work in real-world environments rather than on a screen only, enabling wider applications.