
Author: Wang, Zih-Yun (王姿云)
Title: Forecasting Emerging Technologies of Augmented Reality: Using Patent Analysis (擴增實境之新興科技預測以專利分析法探討)
Advisor: Su, Yu-Shan (蘇友珊)
Degree: Master
Department: Department of Industrial Education
Year of publication: 2020
Academic year of graduation: 108
Language: Chinese
Pages: 115
Keywords (Chinese): 擴增實境、專利分析、技術生命週期理論、羅吉斯模型
Keywords (English): Augmented Reality, Patent Analysis, Life Cycle Theory, Logistic Model
DOI URL: http://doi.org/10.6345/NTNU202001164
Document type: Academic thesis
Hits: 295; Downloads: 0
Abstract: This study retrieves augmented reality (AR) technology patents from 1999 to 2019 and examines AR from the perspective of the technology life cycle, covering the four major AR technology categories and their 17 sub-technologies. Keywords combined with the issue dates of granted patents serve as the patent search strategy. Patent-count analysis reveals the overall development trend of AR and of each technology; International Patent Classification (IPC) analysis identifies the classes in which AR and each technology are concentrated; and assignee analysis identifies the companies investing in AR and in each technical field. Taking the cumulative number of patents retrieved for each technology over the years as its performance measure, the study fits a Logistic Growth Model to determine each technology's life-cycle stage. Two sub-technologies are found to be in the maturity stage: head/eye tracking and voice interaction. Technologies expected to enter the maturity stage by 2021 are emotion recognition, sensory technology, scene/object recognition, holography, and human-machine interface technology.
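The life-cycle estimation described in the abstract can be sketched as fitting a three-parameter logistic growth curve to cumulative patent counts and reading off the conventional milestones (growth begins near 10% of saturation, maturity near 90%). The sketch below uses synthetic data, not the thesis's actual patent counts; the parameter values and thresholds are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, tm):
    """Three-parameter logistic curve: K = saturation level,
    r = growth rate, tm = inflection (midpoint) year."""
    return K / (1.0 + np.exp(-r * (t - tm)))

# Illustrative cumulative patent counts (synthetic, not the thesis data)
years = np.arange(1999, 2020)
rng = np.random.default_rng(0)
counts = logistic(years, 800.0, 0.45, 2012.0) + rng.normal(0, 5, years.size)

# Fit the curve; initial guesses are taken from the data itself
p0 = [counts.max() * 1.2, 0.3, years[np.argmax(np.diff(counts))]]
(K, r, tm), _ = curve_fit(logistic, years, counts, p0=p0, maxfev=10000)

# Conventional life-cycle milestones on a logistic curve:
# the curve reaches 10% of K at tm - ln(9)/r and 90% of K at tm + ln(9)/r.
t10 = tm - np.log(9) / r   # approximate start of the growth stage
t90 = tm + np.log(9) / r   # approximate entry into the maturity stage
print(f"saturation K = {K:.0f}; maturity (90% of K) around {t90:.0f}")
```

A technology whose fitted `t90` falls in the past would be classified as mature, while one whose `t90` lies a year or two ahead is about to enter maturity, mirroring the classification reported in the abstract.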

Table of Contents:
Contents iii
List of Tables v
List of Figures vi
Chapter 1 Introduction 1
  1.1 Research background and motivation 1
  1.2 Research objectives 5
Chapter 2 Augmented Reality 7
  2.1 Definition of augmented reality 7
  2.2 Development from virtual reality to augmented reality 9
  2.3 Literature on the classification of AR technologies 11
  2.4 AR technologies 17
Chapter 3 Literature Review 27
  3.1 Patent analysis 27
  3.2 Patent indicators 28
  3.3 Patent families 29
  3.4 Technology life cycle theory 29
  3.5 Loglet Lab software 32
Chapter 4 Research Methods 33
  4.1 Research process 33
  4.2 Operational definitions 34
  4.3 Patent search 35
Chapter 5 Results 41
  5.1 Patent trend analysis by year 41
  5.2 IPC analysis 48
  5.3 Company analysis 58
  5.4 Growth curve analysis 69
Chapter 6 Discussion and Conclusions 91
  6.1 Research findings 91
  6.2 Research contributions 93
  6.3 Research limitations 95
  6.4 Future research directions 95
References 97
Appendices 110
  Appendix 1 110
  Appendix 2 113
  Appendix 3 114

Chinese-language references

李文傑 (2004). Using patent analysis to evaluate the technology positioning and development of semiconductor firms: The case of flash memory MLC technology (Master's thesis). National Chiao Tung University, Hsinchu.

阮明淑、梁峻齊 (2009). A study on the development of patent indicators. Journal of Library and Information Science, 35(2).

周鴻揚 (2011). Using patent analysis and growth curves to evaluate the development of semiconductor nanoscale processes: The cases of HKMG, strained silicon, nanolithography, and TSV (Master's thesis). National Chiao Tung University, Hsinchu.

陳哲宏、陳逸南、謝銘洋、徐宏昇 (1996). Interpreting patent law. Taipei: 月旦.

陳達仁、黃慕萱 (2009). Patent information retrieval, analysis, and strategy. Taipei: 華泰.

劉尚志 (2000). Patent protection, development, analysis, innovation, and strategy for e-commerce and computer software. Taipei: 翰蘆.

賴奎魁、歐陽光、郭宗賢 (2011). Integrating patent families and patent citations in new product design. Journal of Management and Systems, 18(1), 199-229.

賴奎魁、張善斌 (2004). Constructing a technology diffusion model for business methods: Integrating patent citations and a Bayesian model. Journal of Technology Management, 9(1), 1-34.

張志立 (2004). A comparison of technology life cycle models for technology forecasting (Master's thesis). Chung Yuan Christian University, Taoyuan.

English-language references
    Achilladelis, B., Schwarzkopf, A., & Cines, M. (1990). The dynamics of technological innovation: the case of the chemical industry. Research Policy, 19(1), 1-34.

Alan, P. & Daim, T. (2018). Innovation discovery: Network analysis of research and invention activity for technology management (Vol. 30). World Scientific.

    Alsberg, B. K. (2012). Is sensing spatially distributed chemical information using sensory substitution with hyperspectral imaging possible?. Chemometrics and Intelligent Laboratory Systems, 114, 24-29.

    Andersen, B. (1999). The hunt for S-shaped growth paths in technological innovation: a patent study. Journal of evolutionary economics, 9(4), 487-526.

    Azuma, R. T. (1997). A survey of augmented reality. Presence: Teleoperators & Virtual Environments, 6(4), 355-385.

    Azuma, R., Lee, J. W., Jiang, B., Park, J., You, S., & Neumann, U. (1999). Tracking in unprepared environments for augmented reality systems. Computers & Graphics, 23(6), 787-793.

    Bárd, I. (2020). Tailoring reality—The ethics of DIY and consumer sensory enhancement. Ethical Dimensions of Commercial and DIY Neurotechnologies, 3, 93.

    Betz, H. G. (1994). Radical right-wing populism in Western Europe. Springer.

    Beyts, C., Chaya, C., Dehrmann, F., James, S., Smart, K., & Hort, J. (2017). A comparison of self-reported emotional and implicit responses to aromas in beer. Food Quality and Preference, 59, 68-80

    Butera, G., Sturla, F., Pluchinotta, F. R., Caimi, A., & Carminati, M. (2019). Holographic Augmented Reality and 3D Printing for Advanced Planning of Sinus Venosus ASD/Partial Anomalous Pulmonary Venous Return Percutaneous Management. JACC: Cardiovascular Interventions, 12(14), 1389-1391.

    Caldwell, T. (2011). Syntec Optics and Eye-Com Combine Eye Tracking with Head Movement Technology. Biometric Technology Today, 2.

    Castle, R. O., Klein, G., & Murray, D. W. (2010). Combining monoSLAM with object recognition for scene augmentation using a wearable camera. Image and Vision Computing, 28(11), 1548-1556.

    Chadalavada, R. T., Andreasson, H., Schindler, M., Palm, R., & Lilienthal, A. J. (2020). Bi-directional navigation intent communication using spatial augmented reality and eye-tracking glasses for improved safety in human–robot interaction. Robotics and Computer-Integrated Manufacturing, 61, 101830.

Tode, C. (2017). Walmart taps augmented reality to reward customers. Retrieved from https://www.retaildive.com/ex/mobilecommercedaily/walmart-rewards-customers-with-augmented-reality-scavenger-hunt

Chatzopoulos, D., Bermejo, C., Huang, Z., & Hui, P. (2017). Mobile augmented reality survey: From where we are to where we go. IEEE Access, 5, 6917-6950.

    Chen, Y., Wang, Q., Chen, H., Song, X., Tang, H., & Tian, M. (2019, June). An overview of augmented reality technology. In Journal of Physics: Conference Series (Vol. 1237, No. 2, p. 022082). IOP Publishing.

Chen, Z., Lin, Q., Li, J., Yu, X., Gao, X., Yan, B., ... & Xie, S. (2017). A see-through holographic head-mounted display with the large viewing angle. Optics Communications, 384, 125-129.

Cho, Y., & Daim, T. (2016). OLED TV technology forecasting using technology mining and the Fisher-Pry diffusion model. Foresight.

    Cho, Y., Daim, T. U., & Sklar, P. (2015, August). Forecasting OLED TV technology using bibliometrics and Fisher-Pry diffusion model. In 2015 Portland International Conference on Management of Engineering and Technology (PICMET) (pp. 2167-2176). IEEE.

    Choi, S., & Park, H. (2016). Investigation of strategic changes using patent co-inventor network analysis: The case of samsung electronics. Sustainability, 8(12), 1315.

    Cholewa, N., Wołk, K., & Wołk, R. (2018). Precise eye-tracking technology in medical communicator prototype. Procedia computer science, 138, 264-271.

    Cipresso, P., Giglioli, I. A. C., Raya, M. A., & Riva, G. (2018). The past, present, and future of virtual and augmented reality research: a network and cluster analysis of the literature. Frontiers in psychology, 9, 2086.

    Crofton, E. C., Botinestean, C., Fenelon, M., & Gallagher, E. (2019). Potential applications for virtual and augmented reality technologies in sensory science. Innovative Food Science & Emerging Technologies, 102178.

    Czerniawski, T., & Leite, F. (2020). Automated digital modeling of existing buildings: A review of visual object recognition methods. Automation in Construction, 113, 103131.

O’Riordan, A. D., Toal, D., Newe, T., & Dooly, G. (2019). Object recognition within smart manufacturing. Procedia Manufacturing, 38, 408-414.

    Daim, T. U., Rueda, G., Martin, H., & Gerdsri, P. (2006). Forecasting emerging technologies: Use of bibliometrics and patent analysis. Technological Forecasting and Social Change, 73(8), 981-1012.

    de Wijk, R. A., Kooijman, V., Verhoeven, R. H., Holthuysen, N. T., & de Graaf, C. (2012). Autonomic nervous system responses on and facial expressions to the sight, smell, and taste of liked and disliked foods. Food quality and preference, 26(2), 196-203.

ECORYS. (2017). Virtual reality and its potential for Europe. Retrieved from https://ec.europa.eu/futurium/en/system/files/ged/vr_ecosystem_eu_report_0.pdf

    Ernst, H. (1997). The use of patent data for technological forecasting: the diffusion of CNC-technology in the machine tool industry. Small business economics, 9(4), 361-381.

    Evangelista, A., Ardito, L., Boccaccio, A., Fiorentino, M., Petruzzelli, A. M., & Uva, A. E. (2020). Unveiling the technological trends of augmented reality: A patent analysis. Computers in Industry, 118, 103221.

    Fang, W., Zheng, L., & Wu, X. (2017). Multi-sensor based real-time 6-DoF pose tracking for wearable augmented reality. Computers in Industry, 92, 91-103.

    Fujii, H., Yoshida, K., & Sugimura, K. (2016). Research and development strategy in biological technologies: A patent data analysis of Japanese manufacturing firms. Sustainability, 8(4), 351.

    Fung, H. N., & Wong, C. Y. (2017). Accelerating the technological life cycle through convergence: trends from herbal medicine patents and insights from case studies in Asia Pacific. International Journal of Technological Learning, Innovation and Development, 9(4), 353-378.

    Gomez-Jauregui, V., Manchado, C., Jesús, D. E. L., & Otero, C. (2019). Quantitative evaluation of overlaying discrepancies in mobile augmented reality applications for AEC/FM. Advances in Engineering Software, 127, 124-140.

Griliches, Z. (1990). Patent statistics as economic indicators: A survey, part 1-2 (No. 3301). National Bureau of Economic Research.

He, W., Boesveldt, S., de Graaf, C., & de Wijk, R. A. (2016). The relation between continuous and discrete emotional responses to food odors with facial expressions and non-verbal reports. Food Quality and Preference, 48, 130-137.

    Han, P., & Zhao, G. (2015). CAD-based 3D objects recognition in monocular images for mobile augmented reality. Computers & Graphics, 50, 36-46.

    He, Z., Chang, T., Lu, S., Ai, H., Wang, D., & Zhou, Q. (2017). Research on human-computer interaction technology of wearable devices such as augmented reality supporting grid work. Procedia Computer Science, 107, 170-175.

    Heilig, M. L. (1962). U.S. Patent No. 3,050,870. Washington, DC: U.S. Patent and Trademark Office.

    Heller, J., Chylinski, M., de Ruyter, K., Mahr, D., & Keeling, D. I. (2019). Touching the untouchable: exploring multi-sensory augmented reality in the context of online retailing. Journal of Retailing, 95(4), 219-234.

    Hol, J. D., Schon, T. B., Gustafsson, F., & Slycke, P. J. (2006, July). Sensor fusion for augmented reality. In 2006 9th International Conference on Information Fusion (pp. 1-6). IEEE.

    Holmqvist, K., Nyström, M., Andersson, R., Dewhurst, R., Jarodzka, H., & Van de Weijer, J. (2011). Eye tracking: A comprehensive guide to methods and measures. OUP Oxford.

IDC. (2017). Segment share of shipments, worldwide AR/VR headsets. Retrieved from https://www.idc.com/getdoc.jsp?containerId=prUS46143720

    Jeong, B., & Yoon, J. (2017). Competitive intelligence analysis of augmented reality technology using patent information. Sustainability, 9(4), 497.

Marous, J. (2017). Will augmented and virtual reality replace the bank? Retrieved from https://thefinancialbrand.com/65828/ar-vr-voice-chatbot-bank-branch-replacement-trends/

    Kalantarian, H., Jedoui, K., Washington, P., Tariq, Q., Dunlap, K., Schwartz, J., & Wall, D. P. (2019). Labeling images with facial emotion and the potential for pediatric healthcare. Artificial intelligence in medicine, 98, 77-86.

    Karaman, A., Erisik, D., Incel, O. D., & Alptekin, G. I. (2016). Resource usage analysis of a sensor-based mobile augmented reality application. Procedia Computer Science, 83, 300-304.

    Karambakhsh, A., Kamel, A., Sheng, B., Li, P., Yang, P., & Feng, D. D. (2019). Deep gesture interaction for augmented anatomy learning. International Journal of Information Management, 45, 328-336.

    Kim, H. C., Jin, S., Jo, S., & Lee, J. H. (2020). A naturalistic viewing paradigm using 360° panoramic video clips and real-time field-of-view changes with eye-gaze tracking. NeuroImage, 116617.

    Krichenbauer, M., Yamamoto, G., Taketomi, T., Sandor, C., Kato, H., & Feiner, S. (2017). Evaluating the effect of positional head-tracking on task performance in 3D modeling user interfaces. Computers & Graphics, 65, 22-30.

Kronemeyer, L. L., Eilers, K., Wustmans, M., & Moehrle, M. G. (2020). Monitoring competitors’ innovation activities: Analyzing the competitive patent landscape based on semantic anchor points. IEEE Transactions on Engineering Management, 1-16.

Krueger, M. W., Gionfriddo, T., & Hinrichsen, K. (1985, April). VIDEOPLACE—an artificial reality. In ACM SIGCHI Bulletin (Vol. 16, No. 4, pp. 35-40). ACM.

Kuo, C., Jeng, T., & Yang, I. (2013). An invisible head marker tracking system for indoor mobile augmented reality. Automation in Construction, 33, 104-115.

    Lebowitz, E. R., & François, B. (2018). Using Motion Tracking to Measure Avoidance in Children and Adults: Psychometric Properties, Associations With Clinical Characteristics, and Treatment-Related Change. Behavior therapy, 49(6), 853-865.

    Lee, C., Kim, J., Kwon, O., & Woo, H. G. (2016). Stochastic technology life cycle analysis using multiple patent indicators. Technological Forecasting and Social Change, 106, 53-64.

    Lee, M., Kwahk, J., Han, S. H., & Lee, H. (2020). Relative Pointing Interface: A gesture interaction method based on the ability to divide space. International Journal of Industrial Ergonomics, 75, 102878.

    Lee, Y., & Lee, C. H. (2018). Augmented reality for personalized nanomedicines. Biotechnology advances, 36(1), 335-343.

    Li, K., & Sun, W. (2020). Presentation and interaction of Internet of Things data based on augmented reality. Computer Communications.

    Limbu, B. H., Jarodzka, H., Klemke, R., & Specht, M. (2018). Using sensors and augmented reality to train apprentices using recorded expert performance: A systematic literature review. Educational Research Review, 25, 1-22.

Little, A. D. (1981). The strategic management of technology. Arthur D. Little.

Liu, C., Plopski, A., & Orlosky, J. (2020). OrthoGaze: Gaze-based three-dimensional object manipulation using orthogonal planes. Computers & Graphics.

    Long, F., & Bartlett, M. S. (2016). Video-based facial expression recognition using learned spatiotemporal pyramid sparse coding features. Neurocomputing, 173, 2049-2054.

    Machado, R. L., & Vilela, C. (2020). Conceptual framework for integrating BIM and augmented reality in construction management. Journal of Civil Engineering and Management, 26(1), 83-94.

    McCarthy, I. P. (2003). Technology management–a complex adaptive systems approach.International Journal of Technology Management, 25(8), 728-745.

    Meyer, P. S., Yung, J. W., & Ausubel, J. H. (1999). A primer on logistic growth and substitution: the mathematics of the Loglet Lab software. Technological forecasting and social change, 61(3), 247-271.

    Milgram, P., & Kishino, F. (1994). A taxonomy of mixed reality visual displays. IEICE TRANSACTIONS on Information and Systems, 77(12), 1321-1329.

    Mitra, S., & Acharya, T. (2007). Gesture recognition: A survey. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 37(3), 311-324.

    Moehrle, M. G., & Caferoglu, H. (2019). Technological speciation as a source for emerging technologies. Using semantic patent analysis for the case of camera technology. Technological Forecasting and Social Change, 146, 776-784.

    National Research Foundation (2018). Immersive Media and Advanced Interfaces: Identifying Opportunities Through Patent Analytics. (Produced by the Intellectual Property Office and supported by the Info-communications Media Development Authority). INFOCOMM MEDIA DEVELOPMENT AUTHORITY. Retrieved from https://www.imda.gov.sg/-/media/Imda/Files/Industry-Development/Infrastructure/Technology/Technology-Roadmap/Patent-Analytics-for-Immersive-Media-and-Advanced-Interfaces.pdf

    Noor, A. K., & Aras, R. (2015). Potential of multimodal and multiuser interaction with virtual holography. Advances in Engineering Software, 81, 1-6.

    Pavitt, K. (1988). Uses and abuses of patent statistics. In Handbook of quantitative studies of science and technology (pp. 509-536). Elsevier.

    Petersen, N., & Stricker, D. (2015). Cognitive augmented reality. Computers & Graphics, 53, 82-91.

    Pham, T. T. D., Kim, S., Lu, Y., Jung, S. W., & Won, C. S. (2019). Facial action units-based image retrieval for facial expression recognition. IEEE Access, 7, 5200-5207.

    Pigou, L., Dieleman, S., Kindermans, P. J., & Schrauwen, B. (2014, September). Sign language recognition using convolutional neural networks. In European Conference on Computer Vision (pp. 572-578). Springer, Cham.

    Popper, E. T., & Buskirk, B. D. (1992). Technology life cycles in industrial markets. Industrial Marketing Management, 21(1), 23-31.

    Qian, K., Niu, J., & Yang, H. (2013). Developing a gesture based remote human-robot interaction system using kinect. International Journal of Smart Home, 7(4), 203-208.

    Raajan, N. R., Suganya, S., Priya, M. V., Ramanan, S. V., Janani, S., Nandini, N. S., ... & Gayathri, S. (2012). Augmented reality based virtual reality. Procedia engineering, 38, 1559-1565.

    Rautaray, S. S., & Agrawal, A. (2015). Vision based hand gesture recognition for human computer interaction: a survey. Artificial intelligence review, 43(1), 1-54.

    Relecura. (2017). Augmented Reality & Virtual Reality in Healthcare. Relecura Technologies. Retrieved from http://relecura.com/2017/11/10/augmented-virtual-reality-healthcare/

    Seo, W., Yoon, J., Park, H., Coh, B. Y., Lee, J. M., & Kwon, O. J. (2016). Product opportunity identification based on internal capabilities using text mining and association rule mining. Technological Forecasting and Social Change, 105, 94-104.

Shin, J., Coh, B. Y., & Lee, C. (2013). Robust future-oriented technology portfolios: Black–Litterman approach. R&D Management, 43(5), 409-419.

    Siegrist, M., Leins-Hess, R., & Keller, C. (2015). Which front-of-pack nutrition label is the most efficient one? The results of an eye-tracker study. Food Quality and Preference, 39, 183-190.

    Singh, J., & Modi, N. (2019). Use of information modelling techniques to understand research trends in eye gaze estimation methods: An automated review. Heliyon, 5(12), e03033.

    Sutherland, I. E. (1965). The ultimate display. Multimedia: From Wagner to virtual reality, 506-508.

    Van Krevelen, D. W. F., & Poelman, R. (2010). A survey of augmented reality technologies, applications and limitations. International journal of virtual reality, 9(2), 1-20.

    Verhulst, P. F. (1838). Notice sur la loi que la population suit dans son accroissement. Corresp. Math. Phys., 10, 113-126.

    Wang, J., Suenaga, H., Liao, H., Hoshi, K., Yang, L., Kobayashi, E., & Sakuma, I. (2015). Real-time computer-generated integral imaging and 3D image calibration for augmented reality surgical navigation. Computerized Medical Imaging and Graphics, 40, 147-159.

    White, S. W., Abbott, L., Wieckowski, A. T., Capriola-Hall, N. N., Aly, S., & Youssef, A. (2018). Feasibility of automated training for facial emotion expression and recognition in autism. Behavior therapy, 49(6), 881-888.

    Xie, Z., & Miyazaki, K. (2013). Evaluating the effectiveness of keyword search strategy for patent identification. World Patent Information, 35(1), 20-30.

    Yan, J., Zheng, W., Cui, Z., Tang, C., Zhang, T., & Zong, Y. (2018). Multi-cue fusion for emotion recognition in the wild. Neurocomputing, 309, 27-35.

    Yoon, J., Park, Y., Kim, M., Lee, J., & Lee, D. (2014). Tracing evolving trends in printed electronics using patent information. Journal of nanoparticle research, 16(7), 2471.

Yan, Y., Yi, X., Yu, C., & Shi, Y. (2019). Gesture-based target acquisition in virtual and augmented reality. Virtual Reality & Intelligent Hardware, 1(3), 276-289.

Full text not authorized for public access.