
Author: 簡佑達 (Yu-Ta Chien)
Thesis Title: 運用電腦動畫增進問卷設計、技能學習與教師培育:三個在科學教育情境下的研究
(Leveraging on Animations to Improve Questionnaire Design, Skill Learning, and Teacher Preparation: Three Studies in Science Educational Settings)
Advisor: 張俊彥 (Chang, Chun-Yen)
Degree: Master
Department: Department of Earth Sciences
Year of Publication: 2011
Graduation Academic Year: 99 (ROC calendar; 2010–2011)
Language: English
Number of Pages: 78
Chinese Keywords: 科學教育、多媒體、動畫、問卷、電腦輔助教學、教師培育
English Keywords: Science Education, Multimedia, Animation, Questionnaire, Computer-Assisted Instruction, Teacher Preparation
Thesis Type: Academic thesis
    This thesis explored the educational uses of computerized animations in three science educational settings: science educational questionnaire design, science process skill learning, and science teacher preparation. In Chapter II, drawing on dual coding theory, the feasibility of using an animation-based questionnaire to survey college students’ perceptions of a future science learning environment was examined. The findings revealed that using animations to visualize the key concepts of survey questions has great potential to constrain the visual images that students form from the question descriptions, and can therefore reduce the probability that students misinterpret the questions. In Chapter III, from the perspective of cognitive load theory, the comparative instructional efficiency of one graphic-based and two animation-based tutorials for helping high school students learn a topographic measuring skill was investigated. The results indicated that the degree of user control in animations influences students’ cognitive load and achievement in multimedia learning environments; additional supporting strategies for improving educational animation design were discussed. In Chapter IV, an instructional design framework anchored in the cognitive apprenticeship model was proposed to help pre-service science teachers produce animation-based courseware. The framework was used to reform a science teacher education course and was evaluated with both quantitative and qualitative approaches. The results indicated that the framework significantly promoted the pre-service teachers’ technology competencies, enhanced their confidence in implementing animation-based science instruction, and honed their reasoning about the interplay of technology, pedagogy, and content. Recommendations for incorporating the framework into science teacher education courses were offered. The preliminary findings reported in this thesis may contribute to a deeper and broader understanding of how and why computerized animations can benefit practice in science education.
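    For context on Chapter III's efficiency comparison, the bibliography cites Paas and van Merriënboer (1993), whose combined measure of mental effort and performance is the conventional way such comparisons are computed. A minimal sketch of that standard formula is given below as an illustration; it is not taken from the thesis itself, and the symbols z_P and z_M are notational assumptions:

    E = \frac{z_P - z_M}{\sqrt{2}}

    where z_P is the standardized (z-scored) performance score, z_M is the standardized subjective mental effort rating, and larger E indicates higher performance attained with lower reported effort. Whether Chapter III applies exactly this two-dimensional variant or the multidimensional extension discussed by Tuovinen and Paas (2004), which the bibliography also cites, is not specified in this record.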

    Chapter I. Overview 1
    Chapter II. Exploring the Impact of Animation-Based Questionnaire on Conducting a Web-Based Educational Survey and its Association with Vividness of Respondents’ Visual Images 6
        II.1. Introduction 6
        II.2. Methods 10
            II.2.1. Participants 10
            II.2.2. Measurements 10
                II.2.2.1. TBQ 10
                II.2.2.2. ABQ 11
                II.2.2.3. Vividness of visual imagery scale 12
                II.2.2.4. Attitude toward animation questionnaire inventory 13
            II.2.3. Procedure and data analysis 14
        II.3. Results 16
            II.3.1. Difference in students’ responses to TBQ and ABQ 16
            II.3.2. Vividness of visual imagery in determining the response change between TBQ and ABQ 17
            II.3.3. Students’ perceived effectiveness of ABQ 18
        II.4. Discussion and Implication 19
    Chapter III. Comparison of Instructional Efficiency of Different Multimedia Forms for Improving Students’ Topographic Measuring Skill Learning 22
        III.1. Introduction 22
            III.1.1. Cognitive architecture and cognitive load 23
            III.1.2. Impediment to animation-based learning 24
            III.1.3. Potential aid in animation-based learning 25
        III.2. Purpose of the Study 27
        III.3. Methods 28
            III.3.1. Learning subject 28
            III.3.2. Instructional conditions 28
            III.3.3. Participants and research design 30
            III.3.4. Measuring instruments 31
                III.3.4.1. Subjective mental effort scale 31
                III.3.4.2. Practical performance test 32
                III.3.4.3. Instructional time-span 32
                III.3.4.4. Instructional efficiency 33
            III.3.5. Data analysis 34
        III.4. Results 35
            III.4.1. Difference in subjective mental effort ratings 36
            III.4.2. Difference in practical performance scores 36
            III.4.3. Difference in instructional efficiency 36
            III.4.4. No difference in instructional time-spans 37
        III.5. Discussion and Implication 39
    Chapter IV. Engaging Pre-Service Science Teachers to Act as Active Designers of Online Animation-Based Coursewares: A MAGDAIRE Framework 42
        IV.1. Introduction 42
            IV.1.1. Framework for innovating science teacher education courses 45
        IV.2. Purpose of the Study 50
        IV.3. Methods 51
            IV.3.1. The use of multimedia and information technologies 51
            IV.3.2. Context of the study 51
            IV.3.3. Data collection and analysis 54
                IV.3.3.1. Quantitative approach 54
                IV.3.3.2. Qualitative approach 55
        IV.4. Findings and Discussion 56
            IV.4.1. Advancing technology competence for science teaching 56
            IV.4.2. Reconsidering interplays between technology, pedagogy, and content 57
                IV.4.2.1. Select appropriate components to be transformed with technology 58
                IV.4.2.2. Use technology beyond the fun factor 59
                IV.4.2.3. Present information as a web of interconnections 59
                IV.4.2.4. Provide activities for students to interact with computers 62
                IV.4.2.5. Negotiate technology-integrated pedagogy with actual classroom settings 63
        IV.5. Conclusion and Recommendations 65
    Chapter V. A Final Word 68
    Bibliography 71

    Chapter I
    Ainsworth, S., & VanLabeke, N. (2004). Multiple forms of dynamic representation. Learning and Instruction, 14(3), 241-255.
    Bétrancourt, M. (2005). The animation and interactivity principles in multimedia learning. In R. E. Mayer (Ed.), The Cambridge handbook of multimedia learning (pp. 287-296). New York, NY: Cambridge University Press.
    Baek, Y. K., & Layne, B. H. (1988). Color, graphics, and animation in a computer-assisted learning tutorial lesson. Journal of Computer-Based Instruction, 15, 131-135.
    Chandler, P. (2004). The crucial role of cognitive processes in the design of dynamic visualizations. Learning and Instruction, 14(3), 353-357.
    Chandler, P. (2009). Dynamic visualizations and hypermedia: Beyond the "Wow" factor. Computers in Human Behavior, 25(2), 389-392.
    Collins, A. (1988). Cognitive apprenticeship and instructional technology: Technical report. Cambridge, MA: Bolt Beranek and Newman.
    Cook, M. P. (2006). Visual representations in science education: The influence of prior knowledge and cognitive load theory on instructional design principles. Science Education, 90(6), 1073-1091.
    Höffler, T. N., & Leutner, D. (2007). Instructional animation versus static pictures: A meta-analysis. Learning and Instruction, 17(6), 722-738.
    Kim, S., Yoon, M., Whang, S. M., Tversky, B., & Morrison, J. B. (2007). The effect of animation on comprehension and interest. Journal of Computer Assisted Learning, 23(3), 260-270.
    Lowe, R. (2004). Interrogation of a dynamic visualization during learning. Learning and Instruction, 14(3), 257-274.
    Mayer, R. E. (2003). The promise of multimedia learning: Using the same instructional design methods across different media. Learning and Instruction, 13(2), 125-139.
    Mayer, R. E. (2005). Introduction to multimedia learning. In R. E. Mayer (Ed.), The Cambridge handbook of multimedia learning (pp. 1-18). New York, NY: Cambridge University Press.
    Mayer, R. E. (2008). Applying the science of learning: Evidence-based principles for the design of multimedia instruction. American Psychologist, 63(8), 760-769.
    Mayer, R. E., Hegarty, M., Mayer, S., & Campbell, J. (2005). When static media promote active learning: Annotated illustrations versus narrated animations in multimedia instruction. Journal of Experimental Psychology: Applied, 11(4), 256-265.
    Mayer, R. E., & Moreno, R. (2002). Aids to computer-based multimedia learning. Learning and Instruction, 12(1), 107-119.
    Moreno, R. (2006). Learning in high-tech and multimedia environments. Current Directions in Psychological Science, 15(2), 63-67.
    National Center for Education Statistics. (2010). Teachers' use of educational technology in U.S. public schools: 2009. Washington, DC: U.S. Department of Education.
    Paivio, A. (1986). Mental representations: A dual coding approach. New York, NY: Oxford University Press.
    Ploetzner, R., & Lowe, R. (2004). Dynamic visualizations and learning: Introduction to the special issue. Learning and Instruction, 14(3), 235-240.
    Schnotz, W., & Lowe, R. (2003). External and internal representations in multimedia learning: Introduction. Learning and Instruction, 13(2), 117-123.
    Sweller, J., & Chandler, P. (1994). Why some material is difficult to learn. Cognition and Instruction, 12(3), 185-233.
    Tversky, B., Morrison, J. B., & Bétrancourt, M. (2002). Animation: Can it facilitate? International Journal of Human-Computer Studies, 57(4), 247-262.
    von Wodtke, M. (1993). Mind over media: Creative thinking skills for electronic media. New York, NY: McGraw-Hill.
    Wu, H. C., Chang, C. Y., Chen, C. L., Yeh, T. K., & Liu, C. C. (2010). Comparison of Earth Science achievement between animation-based and graphic-based testing designs. Research in Science Education, 40(5), 639-673.
    Chapter II
    Chang, C. Y., & Lee, G. (2010). A major e-learning project to renovate science learning environment in Taiwan. Turkish Online Journal of Educational Technology, 9(1), 7-12.
    Chien, Y. T., & Chang, C. Y. (2010). Exploring the feasibility of an online contextualized animation-based questionnaire for educational survey. British Journal of Educational Technology, 41(5), E104-E109.
    Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.
    Conrad, F. G., Couper, M. P., Tourangeau, R., & Peytchev, A. (2006). Use and non-use of clarification features in web surveys. Journal of Official Statistics, 22(2), 245-269.
    Conrad, F. G., Schober, M. F., & Coiner, T. (2007). Bringing features of human dialogue to web surveys. Applied Cognitive Psychology, 21(2), 165-187.
    Cui, X., Jeter, C. B., Yang, D. N., Montague, P. R., & Eagleman, D. M. (2007). Vividness of mental imagery: Individual variability can be measured objectively. Vision Research, 47(4), 474-478.
    Dancy, M. H., & Beichner, R. (2006). Impact of animation on assessment of conceptual understanding in physics. Physical Review Special Topics-Physics Education Research, 2(1), 7.
    Graesser, A. C., Cai, Z. Q., Louwerse, M. M., & Daniel, F. (2006). Question Understanding Aid (QUAID) - A Web facility that tests question comprehensibility. Public Opinion Quarterly, 70(1), 3-22.
    Hardre, P. L., Crowson, H. M., Xie, K., & Ly, C. (2007). Testing differential effects of computer-based, web-based and paper-based administration of questionnaire research instruments. British Journal of Educational Technology, 38(1), 5-22.
    Lind, L. H., Schober, M. F., & Conrad, F. G. (2001). Clarifying question meaning in a web-based survey. In Proceedings of the American Statistical Association, Section on Survey Research Methods. Alexandria, VA: American Statistical Association.
    Marks, D. F. (1995). New directions for mental imagery research. Journal of Mental Imagery, 19, 153-167.
    Mayer, R. E., & Moreno, R. (1998). Split-attention effect in multimedia learning: Evidence for dual processing systems in working memory. Journal of Educational Psychology, 90(2), 312-320.
    Paivio, A. (1986). Mental representations: A dual coding approach. New York, NY: Oxford University Press.
    Presser, S., Couper, M. P., Lessler, J. T., Martin, E., Martin, J., Rothgeb, J. M., et al. (2004). Methods for testing and evaluating survey questions. Public Opinion Quarterly, 68(1), 109-130.
    Riva, G., Teruzzi, T., & Anolli, L. (2003). The use of the Internet in psychological research: Comparison of online and offline questionnaires. Cyberpsychology & Behavior, 6(1), 73-80.
    Sadoski, M., Goetz, E. T., & Fritz, J. B. (1993). Impact of concreteness on comprehensibility, interest, and memory for text: Implications for dual coding theory and text design. Journal of Educational Psychology, 85, 291-304.
    Sadoski, M., Goetz, E. T., & Rodriguez, M. (2000). Engaging texts: Effects of concreteness on comprehensibility, interest, and recall in four text types. Journal of Educational Psychology, 92(1), 85-95.
    Sadoski, M., & Paivio, A. (2007). Toward a unified theory of reading. Scientific Studies of Reading, 11(4), 337-356.
    U.S. Census Bureau for the Bureau of Labor Statistics. (2008). Current Population Survey, May 2006, August 2006, and January 2007: Tobacco Use Supplement. Washington, DC: U.S. Census Bureau.
    Suessbrick, A., Schober, M. F., & Conrad, F. G. (2000). Different respondents interpret ordinary questions quite differently. In Proceedings of the American Statistical Association, Section on Survey Research Methods (pp. 907-912). Alexandria, VA: American Statistical Association.
    Sweller, J., & Chandler, P. (1994). Why some material is difficult to learn. Cognition and Instruction, 12(3), 185-233.
    Sweller, J., van Merriënboer, J. J. G., & Paas, F. (1998). Cognitive architecture and instructional design. Educational Psychology Review, 10(3), 251-296.
    Tourangeau, R., Rips, L. J., & Rasinski, K. (2000). The psychology of survey response. Cambridge: Cambridge University Press.
    Willis, G. B., Royston, P., & Bercini, D. (1991). The use of verbal report methods in the development and testing of survey questionnaires. Applied Cognitive Psychology, 5(3), 251-267.
    Wu, H. C., Chang, C. Y., Chen, C. L., Yeh, T. K., & Liu, C. C. (2010). Comparison of Earth Science achievement between animation-based and graphic-based testing designs. Research in Science Education, 40(5), 639-673.
    Wu, H. C., Yeh, T. K., & Chang, C. Y. (2010). The design of an animation-based test system in the area of Earth sciences. British Journal of Educational Technology, 41(3), E53-E57.
    Yu, S. C., & Yu, M. N. (2007). Comparison of Internet-based and paper-based questionnaires in Taiwan using multisample invariance approach. Cyberpsychology & Behavior, 10(4), 501-507.
    Chapter III
    Ainsworth, S., & VanLabeke, N. (2004). Multiple forms of dynamic representation. Learning and Instruction, 14(3), 241-255.
    Campbell, D., & Stanley, J. (1966). Experimental and quasi-experimental designs for research. Chicago, IL: Rand McNally.
    Chandler, P. (2004). The crucial role of cognitive processes in the design of dynamic visualizations. Learning and Instruction, 14(3), 353-357.
    Chandler, P. (2009). Dynamic visualizations and hypermedia: Beyond the "Wow" factor. Computers in Human Behavior, 25(2), 389-392.
    Clark, R. E. (2001). Learning from media. Greenwich, CT: Information Age Publishing.
    Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.
    Evans, C., & Gibbons, N. J. (2007). The interactivity effect in multimedia learning. Computers & Education, 49(4), 1147-1160.
    Höffler, T. N., & Leutner, D. (2007). Instructional animation versus static pictures: A meta-analysis. Learning and Instruction, 17(6), 722-738.
    Lowe, R. (2004). Interrogation of a dynamic visualization during learning. Learning and Instruction, 14(3), 257-274.
    Mayer, R. E. (2003). Elements of a science of e-learning. Journal of Educational Computing Research, 29(3), 297-313.
    Mayer, R. E., & Anderson, R. B. (1992). The instructive animation: Helping students build connections between words and pictures in multimedia learning. Journal of Educational Psychology, 84(4), 444-452.
    Mayer, R. E., & Chandler, P. (2001). When learning is just a click away: Does simple user interaction foster deeper understanding of multimedia messages? Journal of Educational Psychology, 93(2), 390-397.
    Mayer, R. E., Hegarty, M., Mayer, S., & Campbell, J. (2005). When static media promote active learning: Annotated illustrations versus narrated animations in multimedia instruction. Journal of Experimental Psychology: Applied, 11(4), 256-265.
    Mayer, R. E., & Moreno, R. (2002). Aids to computer-based multimedia learning. Learning and Instruction, 12(1), 107-119.
    Mayer, R. E., & Moreno, R. (2003). Nine ways to reduce cognitive load in multimedia learning. Educational Psychologist, 38(1), 43-52.
    Moreno, R. (2006). Learning in high-tech and multimedia environments. Current Directions in Psychological Science, 15(2), 63-67.
    Paas, F. (1992). Training strategies for attaining transfer of problem-solving skill in statistics: A cognitive-load approach. Journal of Educational Psychology, 84(4), 429-434.
    Paas, F., & van Merriënboer, J. J. G. (1993). The efficiency of instructional conditions: An approach to combine mental effort and performance-measures. Human Factors, 35(4), 737-743.
    Ploetzner, R., & Lowe, R. (2004). Dynamic visualizations and learning: Introduction to the special issue. Learning and Instruction, 14(3), 235-240.
    Sweller, J., & Chandler, P. (1994). Why some material is difficult to learn. Cognition and Instruction, 12(3), 185-233.
    Sweller, J., van Merriënboer, J. J. G., & Paas, F. (1998). Cognitive architecture and instructional design. Educational Psychology Review, 10(3), 251-296.
    Tuovinen, J. E., & Paas, F. (2004). Exploring multidimensional approaches to the efficiency of instructional conditions. Instructional Science, 32(1-2), 133-152.
    Tversky, B., Morrison, J. B., & Bétrancourt, M. (2002). Animation: Can it facilitate? International Journal of Human-Computer Studies, 57(4), 247-262.
    van Merriënboer, J. J. G., & Sweller, J. (2005). Cognitive load theory and complex learning: Recent developments and future directions. Educational Psychology Review, 17(2), 147-177.
    Chapter IV
    American Association for the Advancement of Science (AAAS). (1993). Benchmarks for science literacy. New York, NY: Oxford University Press.
    Adobe. (2010a). Flash concepts: Design and animation. Retrieved November 20, 2010, from http://www.adobe.com/support/flash/design_animation.html.
    Adobe. (2010b). Flash player statistics. Retrieved November 20, 2010, from http://www.adobe.com/products/player_census/flashplayer/index.html.
    Angeli, C. (2005). Transforming a teacher education method course through technology: effects on preservice teachers' technology competency. Computers & Education, 45(4), 383-398.
    Angeli, C., & Valanides, N. (2009). Epistemological and methodological issues for the conceptualization, development, and assessment of ICT–TPCK: Advances in technological pedagogical content knowledge (TPCK). Computers & Education, 52(1), 154-168.
    Bauer, J., & Kenton, J. (2005). Toward technology integration in the schools: Why it isn't happening. Journal of Technology and Teacher Education, 13(4), 519-546.
    Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32-42.
    Chang, C. Y., Hsiao, C. H., & Barufaldi, J. P. (2006). Preferred-actual learning environment "spaces" and earth science outcomes in Taiwan. Science Education, 90(3), 420-433.
    Chang, C. Y., Yeh, T. K., Lin, C. Y., Chang, Y. H., & Chen, C. L. D. (2010). The impact of congruency between preferred and actual learning environments on tenth graders' science literacy in Taiwan. Journal of Science Education and Technology, 19(4), 332-340.
    Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.
    Collins, A. (1988). Cognitive apprenticeship and instructional technology: Technical report. Cambridge, MA: Bolt Beranek and Newman.
    Collins, A., Brown, J. S., & Newman, S. E. (1989). Cognitive apprenticeship: Teaching the craft of reading, writing, and mathematics. In L. B. Resnick (Ed.), Knowing, learning, and instruction: Essays in honor of Robert Glaser (pp. 453-494). Hillsdale, NJ: Lawrence Erlbaum Associates.
    Derry, S., & Lesgold, A. (1996). Toward a situated social practice model for instructional design. In D. C. Berliner & R. C. Calfe (Eds.), Handbook of educational psychology (pp. 787-806). New York, NY: Macmillan.
    Dick, W., & Carey, L. (1996). The systematic design of instruction (4th ed.). New York, NY: HarperCollins College Publishers.
    Greeno, J. G. (1998). The situativity of knowing, learning, and research. American Psychologist, 53(1), 5-26.
    Heinich, R., Molenda, M., Russell, J. D., & Smaldino, S. E. (2001). Instructional media and technologies for learning (7th ed.). Englewood Cliffs, NJ: Prentice Hall.
    International Society for Technology in Education. (2008). National educational technology standards for teachers. Eugene, OR: Author.
    Jang, S.-J. (2008). The effects of integrating technology, observation and writing into a teacher education method course. Computers & Education, 50(3), 853-865.
    Jang, S.-J., & Chen, K.-C. (2010). From PCK to TPACK: Developing a transformative model for pre-service science teachers. Journal of Science Education and Technology, 19(6), 553-564.
    Jonassen, D. H., & Reeves, T. C. (1996). Learning with technology: Using computers as cognitive tools. In D. H. Jonassen (Ed.), Handbook of research for educational communications and technology (pp. 693-719). New York, NY: Macmillan.
    Keengwe, J., & Anyanwu, L. (2007). Computer technology-infused learning enhancement. Journal of Science Education and Technology, 16(5), 387-393.
    Keengwe, J., Onchwari, G., & Wachira, P. (2008). Computer technology integration and student learning: Barriers and promise. Journal of Science Education and Technology, 17(6), 560-565.
    Koehler, M. J., Mishra, P., & Yahya, K. (2007). Tracing the development of teacher knowledge in a design seminar: Integrating content, pedagogy and technology. Computers & Education, 49(3), 740-762.
    Lee, M. H., Chang, C. Y., & Tsai, C. C. (2009). Exploring Taiwanese high school students’ perceptions of and preferences for teacher authority in the earth science classroom with relation to their attitudes and achievement. International Journal of Science Education, 31(13), 1811-1830.
    Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108(6), 1017-1054.
    Ministry of Education (MOE). (2001). The 1-9 grades science and life technology curriculum standards. Taipei, Taiwan: Author.
    National Center for Education Statistics. (2010). Teachers' use of educational technology in U.S. public schools: 2009. Washington, DC: U.S. Department of Education.
    National Research Council (NRC). (2001). Educating teachers of science, mathematics, and technology: New practices for the new millennium. Washington, DC: National Academy Press.
    Niess, M. L. (2005). Preparing teachers to teach science and mathematics with technology: Developing a technology pedagogical content knowledge. Teaching and Teacher Education, 21(5), 509-523.
    Patton, M. Q. (1990). Qualitative evaluation and research methods (2nd ed.). Newbury Park, CA: Sage.
    Reiser, R. A. (2001). A history of instructional design and technology: Part II: A history of instructional design. Educational Technology Research and Development, 49(2), 57-67.
    Salomon, G., Perkins, D. N., & Globerson, T. (1991). Partners in cognition: Extending human intelligence with intelligent technologies. Educational Researcher, 20(3), 2-9.
    Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.
    Wachira, P., & Keengwe, J. (2010). Technology integration barriers: Urban school mathematics teachers’ perspectives. Journal of Science Education and Technology, doi: 10.1007/s10956-010-9230-y.
    Wilson, E. K. (2003). Preservice secondary social studies teachers and technology integration: What do they think and do in their field experiences. Journal of Computing in Teacher Education, 20(1), 29-39.
    Chapter V
    Daniel, L. G. (1998). Statistical significance testing: A historical overview of misuse and misinterpretation with implications for the editorial policies of educational journals. Research in the Schools, 5, 23-32.
    McLean, J. E., & Ernest, J. M. (1998). The role of statistical significance testing in educational research. Research in the Schools, 5, 15-22.
    Rennie, L. J. (1998). Improving the interpretation and reporting of quantitative research. Journal of Research in Science Teaching, 35(3), 237-248.
    Thompson, B. (1996). AERA editorial policies regarding statistical significance testing: Three suggested reforms. Educational Researcher, 25(2), 26-30.
