| Graduate Student | 蘇慧生 Su, Hui-Sheng |
|---|---|
| Thesis Title | 虛擬實境眼動儀系統研發 (Research and Development of Eye Tracker for Virtual Reality System) |
| Advisors | 林政宏 Lin, Cheng-Hung; 何宏發 Ho, Hong-Fa |
| Degree | Master |
| Department | Department of Electrical Engineering |
| Year of Publication | 2017 |
| Academic Year of Graduation | 105 |
| Language | Chinese |
| Pages | 132 |
| Keywords (Chinese) | 虛擬實境眼動儀 (VR eye tracker), 瞳孔追蹤方法 (pupil-tracking method), 眼動資料分析 (eye-movement data analysis), 眼動校正 (eye-movement calibration), 自動校正 (automatic calibration) |
| Keywords (English) | Eye tracker for virtual reality, pupil-tracking methods, auto calibration |
| DOI | https://doi.org/10.6345/NTNU202202131 |
| Document Type | Academic thesis |
| Access Counts | Views: 172; Downloads: 0 |
This study set out to develop a virtual reality (VR) eye-tracker system with high accuracy, low cost, and reduced calibration and operation times. Commercially available VR eye trackers are all quite expensive (Table 2-12), and only nine such products are currently on the market. The goal was therefore a lower-cost system whose performance matches or exceeds that of comparable products, with the laboratory's existing eye-movement data-analysis software integrated into it.

Tester feedback indicated that the eye tracker previously developed in our laboratory left room for improvement: its pupil mapping was insufficiently accurate, and both calibration and operation took too long. These are the problems this study aimed to solve.

This thesis details how the VR eye tracker was implemented and how the problems above were addressed. First, a new pupil-tracking method is proposed: before calibration, the eye region is selected with a bounding box so that unnecessary parts of the image are discarded, which improves pupil-detection accuracy and reduces the average detection time. Second, the quadrant-mapping judgment is improved: by computing the quadrant angle of the pupil coordinate, the position the pupil is actually gazing at can be calculated accurately. Third, a new automatic pupil-capture method is proposed: the pupil coordinate is checked against a central region, and capture is triggered automatically when it falls inside, which lowers the average overall operation time. Finally, the laboratory's existing eye-movement data-analysis software is integrated so that data collected by the VR eye tracker can be analyzed.
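The quadrant-angle idea described here can be sketched roughly as follows. This is a minimal illustration, not the thesis's actual implementation: the function name, the quadrant numbering, and the calibrated-center inputs are all assumptions.

```python
import math

def gaze_quadrant(pupil_x, pupil_y, center_x, center_y):
    """Classify the gaze direction by the quadrant angle of the pupil
    offset relative to the calibrated center point (hypothetical sketch)."""
    dx = pupil_x - center_x
    dy = pupil_y - center_y
    # atan2 is quadrant-aware; normalize the angle into [0, 360) degrees
    angle = math.degrees(math.atan2(dy, dx)) % 360.0
    if angle < 90.0:
        return 1   # quadrant for angles in [0, 90)
    elif angle < 180.0:
        return 2   # [90, 180)
    elif angle < 270.0:
        return 3   # [180, 270)
    else:
        return 4   # [270, 360)
```

Using `atan2` rather than a plain sign test on `dx`/`dy` gives a single continuous angle, so the gaze position can later be interpolated within a quadrant instead of only being binned into one.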
Experiments were conducted to measure the performance of each new method and feature. After analyzing the resulting data, the following conclusions were reached:

1. A VR eye-tracker system was successfully developed.
2. The accuracy of calculating the position the pupil actually gazes at reached 95.98%.
3. With the eye-region selection method, average calibration time fell by 43.08 seconds compared with no selection, a 73.57% speed-up of the calibration process.
4. Handheld devices with outer dimensions of 132.4–152.5 mm (length) by 65.5–77.2 mm (width) fit inside the equipment, and devices with screens of 103.2–120.5 mm (length) by 59.0–67.9 mm (width) are fully visible to the subject; any handheld device within these ranges can therefore be used with the system.
5. With the automated operation feature, overall operation and calibration time fell by 3.3 seconds, a 52.74% speed-up of the operation process.
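The eye-region selection behind conclusion 3 amounts to cropping the framed box out of each camera frame before pupil detection runs. A minimal sketch, assuming NumPy-style image arrays; the function name and the `(x, y, w, h)` box format are hypothetical:

```python
import numpy as np

def crop_eye_region(frame, box):
    """Crop the selected eye bounding box from a camera frame, discarding
    the rest of the image before pupil detection (illustrative only)."""
    x, y, w, h = box
    # NumPy images are indexed row-first: frame[rows, columns]
    return frame[y:y + h, x:x + w]
```

Restricting detection to this region is what removes background noise and shortens the average pupil-detection time reported above.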
The purpose of this study was to develop an eye tracker for a virtual reality system. Only nine mainstream products are available on the market, and they are expensive. This study therefore aimed to build a product that is low-cost and highly accurate, and that performs well by reducing calibration and operation time. In addition, the system comes with a software tool for data analysis.

User feedback and suggestions were taken into consideration in improving the eye-tracking system's functionality. The issues were addressed as follows. First, a new pupil-detection method was used: noise was reduced by a feature that selects the eye area in the image, which improved accuracy and reduced calibration time. Second, quadrant determination was improved by using the quadrant angle to calculate the coordinate of the pupil. Third, a new automatic pupil-detection method was used: the system checks whether the detected pupil coordinate lies in the center region, which resulted in a faster operating time. Fourth, a feature was added to the system that makes data analysis much easier.
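The center-region check in the third step can be sketched as follows. This is a rough illustration under assumed names; the 15% tolerance is a hypothetical value, and the thesis's actual threshold may differ.

```python
def in_center_region(pupil, frame_size, tolerance=0.15):
    """Return True when the detected pupil center lies inside a central
    band of the eye image (tolerance is a fraction of each dimension)."""
    px, py = pupil
    w, h = frame_size
    cx, cy = w / 2.0, h / 2.0
    return abs(px - cx) <= tolerance * w and abs(py - cy) <= tolerance * h

def auto_capture(samples):
    """Scan a stream of (pupil_xy, frame_size) samples and return the index
    of the first frame whose pupil sits in the center region, else None."""
    for i, (pupil, size) in enumerate(samples):
        if in_center_region(pupil, size):
            return i  # this frame would be captured automatically
    return None
```

Triggering the capture automatically from this predicate is what removes the manual capture step and shortens the overall operating time.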
The significant findings and conclusions of this study are as follows:
1. We successfully built an eye tracker for a virtual reality system.
2. Accuracy reached 95.98% when using the quadrant angle to calculate the coordinate of the pupil.
3. Selecting the eye area in the image yielded better results than having no selection option: it cut calibration time by 43.08 seconds, a 73.57% speed-up.
4. Devices with outer dimensions of 132.4–152.5 mm in length and 65.5–77.2 mm in width can be placed in the system, and screens of 103.2–120.5 mm by 59.0–67.9 mm remain fully visible to users.
5. The automated function yielded better results: it reduced operating time by 3.3 seconds, a 52.74% speed-up.
[81] Eye-tracking enabled wearable devices.Available: https://www.google.com.tw/patents/US20170090563?dq=US20170090563+A1&hl=zh-TW&sa=X&ved=0ahUKEwiyvu6K5NfUAhULHJQKHUoKAe4Q6AEIIzAA, accessed 2017/6/15