The API (Agent Persona Instrument) for assessing pedagogical agent persona
Presenter: Wan-Ning Chen
Professor: Ming-Puu Chen
Date: May 18, 2009

Baylor, A. & Ryu, J. (2003). The API (Agent Persona Instrument) for assessing pedagogical agent persona. In D. Lassner & C. McNaught (Eds.), Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications 2003 (pp ). Chesapeake, VA: AACE.

Introduction

Several studies have used scales based on user ratings to measure learners' perceptions of pedagogical agent persona features. van Eck and Adcock (2003) raised the issue of establishing a reliable and valid scale to measure agent persona effects. They conducted a factor analysis and constructed the Attitude Toward Agent Scale (ATAS):
- Sub-scales: pedagogical efficacy of the agent and the agent's attitude toward teaching.
- To establish the scale, they adapted questions from a rating scale that measured how effectively human teachers teach students.
- Potential problem: a pedagogical agent is not a human teacher but a machine-generated teacher.
This paper describes the API (Agent Persona Instrument), which measures perceived pedagogical agent persona while taking into account features unique to the computer-based aspects of the agent.

Methods - Participants and Instrument Development

The first sample
- 80 undergraduate students (55% male, 45% female).
- An exploratory analysis to identify a factor model.
The second sample
- 133 undergraduate students (30% male, 70% female).
- A confirmatory factor analysis to validate the identified factor model.
Instrument development
- To develop an item pool for the instrument, the authors collected the instruments used in other studies that investigated pedagogical agent persona, together with the ATAS.
- After collating all 66 items from these studies, appropriate items were selected from the initial item pool and revised; duplicate items and items that did not specifically measure agent persona were deleted.
- The final initial instrument comprised 38 items.

Results - Exploratory Factor Analysis: Determination of dimensionality

Principal component analysis and maximum likelihood analysis were performed on the raw scores of the first sample.
Explained variance by model
- Three-factor model: 60.86%; four-factor model: 65.10%; five-factor model: 68.76%.
Reasons for selecting five factors
- The 5-factor model explained 68.76% of the total variance of the 38 question items.
- The communalities of most of the items ranged from 0.51 to
The criterion for the strength of association between items and subscales was set at .70, and 8 items were deleted. Another factor analysis was then conducted to evaluate the revised set of items with 5 factors.
- The explained variance of the revised factor model improved to 72.66%.
- The five factors: Credible, Facilitating Learning, Mentor-like, Engaging, and Human-like.
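
The exploratory step can be reproduced in outline with the Python factor_analyzer package. This is a minimal sketch rather than the authors' analysis script: the file name, the item column layout, and the varimax rotation are assumptions; only the 5-factor maximum likelihood solution and the .70 retention criterion follow the slide above.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

items = pd.read_csv("api_sample1.csv")   # hypothetical file: 80 respondents x 38 items

# 5-factor maximum likelihood solution (the rotation choice is an assumption)
fa = FactorAnalyzer(n_factors=5, rotation="varimax", method="ml")
fa.fit(items)

# Variance explained and communalities for the 5-factor solution
_, _, cumulative = fa.get_factor_variance()
print("Cumulative variance explained:", cumulative[-1])   # slide reports 68.76% (0.6876)
print("Communalities:", fa.get_communalities())

# Keep only items whose strongest loading reaches the .70 criterion, then refit
loadings = pd.DataFrame(fa.loadings_, index=items.columns)
keep = loadings.abs().max(axis=1) >= 0.70
fa_revised = FactorAnalyzer(n_factors=5, rotation="varimax", method="ml")
fa_revised.fit(items.loc[:, keep])
print("Revised cumulative variance:", fa_revised.get_factor_variance()[2][-1])  # ~72.66%
```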

Results - Confirmatory Factor Analysis: Model estimation and evaluation

The maximum likelihood method was used. The five factors were reviewed to build an initial model with two latent variables.
- Content-oriented variable (Credible, Facilitating Learning, and Mentor-like): addressed students' perception of instructional help from the agent.
- Affective features of the agent (Engaging and Human-like): measured the agent's human-like behavior, including emotional expression.
However, a test of the initially hypothesized model revealed that it did not fit the data (χ²(4) = 27.02, p < 0.001).
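
A corresponding confirmatory check can be sketched with the Python semopy package, using lavaan-style model syntax. This is a hedged illustration, not the authors' procedure: the subscale column names and input file are hypothetical; only the two-latent-variable structure and maximum likelihood estimation follow the slide above.

```python
import pandas as pd
from semopy import Model, calc_stats

scores = pd.read_csv("api_sample2_subscales.csv")  # hypothetical: 133 rows x 5 subscale scores

initial_spec = """
ContentOriented =~ Credible + FacilitatingLearning + MentorLike
Affective       =~ Engaging + HumanLike
"""

initial = Model(initial_spec)
initial.fit(scores)                 # semopy's default objective is a maximum likelihood (Wishart) loss

print(calc_stats(initial).T)        # fit indices; the slide reports chi-square(4) = 27.02, p < .001
print(initial.inspect())            # loadings and factor covariance, used in the model revision step
```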

Results - Model Revision

a) Reviewing the correlation coefficients from the initial model:
- Mentor-like showed the lowest correlation coefficients.
b) Analyzing the contextual meaning of the items:
- Half of the Mentor-like items were originally adapted from a scale used for a human tutor and focused on individualized feedback to students.
- These items are appropriate for a human teacher or a more adaptive agent system rather than for general pedagogical agents.
Based on both of these results, Mentor-like was dropped from the initial model.

Results - Latent variables

The latent variables were labeled as follows:
- Informational Usefulness (Facilitating Learning and Credible): the agent's instructional advice and information, with Facilitating Learning having a higher correlation at 1.04 (versus Credible at .58).
- Emotive Interaction (Human-like and Engaging): the motivational and entertaining features of the agent, which contribute to student motivation and the perceived friendliness of the agent.
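
Continuing the semopy sketch above, the revised model drops Mentor-like and relabels the latent variables as described on this slide; the variable names remain illustrative, and the fit comparison is only indicative.

```python
# Refit without Mentor-like, using the relabeled latent variables (reuses Model, calc_stats,
# and the hypothetical `scores` DataFrame from the earlier sketch).
revised_spec = """
InformationalUsefulness =~ FacilitatingLearning + Credible
EmotiveInteraction      =~ HumanLike + Engaging
"""

revised = Model(revised_spec)
revised.fit(scores)

print(calc_stats(revised).T)   # compare fit indices against the initial two-latent-variable model
print(revised.inspect())       # estimates for the four retained subscales
```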

Discussion