C R E S S T / U C L A — Evaluating the Impact of the Interactive Multimedia Exercises (IMMEX) Program: Measuring the Impact of Problem-Solving Assessment Software


1. Evaluating the Impact of the Interactive Multimedia Exercises (IMMEX) Program: Measuring the Impact of Problem-Solving Assessment Software

Gregory K.W.K. Chung, UCLA / CRESST
Davina C.D. Klein, UCLA / CRESST
Tina C. Christie, UCLA / CRESST
Roy S. Zimmermann, UCLA / CRESST
Ronald H. Stevens, UCLA School of Medicine

UCLA Graduate School of Education & Information Studies
Center for the Study of Evaluation
National Center for Research on Evaluation, Standards, and Student Testing

Annual Meeting of the American Educational Research Association, April 24, 2000

2. Overview
- IMMEX overview
- Evaluation questions, design, and findings
- Focus on barriers to adoption
- Implications for the future

3. Implementation Context
- Los Angeles Unified School District
  - 697,000 students, 41,000 teachers, 790 schools (1998)
  - Average class size: 27 (1998-99)
  - Limited English Proficiency (LEP): 46% of students (1998-99)
  - 2,600 classrooms with Internet access (1998-99)

4. IMMEX Program Goal
- Improve student learning through routine use of IMMEX assessment technology in the classroom
  - Explicitly link assessment technology with classroom practice, theories of learning, and science content
  - Provide intensive professional development and ongoing IMMEX and technology support

5. IMMEX Problem-Solving Assessment Software
- Problem-solving architecture:
  - Students are presented with a problem scenario and given information that is both relevant and irrelevant to solving the problem
  - Problem-solving demands are embedded in the design of the information space and in multiple problem sets (e.g., medical diagnosis)
  - Performance measures: number of problems completed, percent solved
  - Process measures: the pattern of information access yields evidence of a particular problem-solving strategy (e.g., elimination, evidence vs. conjecture, cause-effect)
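The process measure described above can be sketched in code. The following is a toy classifier, not IMMEX's actual analysis; the item names, thresholds, and decision rules are invented purely to illustrate the idea of mapping a student's ordered sequence of information accesses to a coarse strategy label:

```python
# Hypothetical sketch: infer a coarse problem-solving strategy from the
# pattern of information a student opened. Rules and thresholds are
# illustrative only; IMMEX's real scoring model differs.

def classify_strategy(accesses, relevant_items):
    """accesses: ordered list of item names the student opened.
    relevant_items: set of items actually needed to solve the problem."""
    if not accesses:
        return "no search"
    opened = set(accesses)
    relevant_hits = opened & relevant_items      # useful items opened
    irrelevant_hits = opened - relevant_items    # distractor items opened
    # Opened at least as many distractors as there are relevant items:
    # suggests an exhaustive, elimination-style search.
    if len(irrelevant_hits) >= len(relevant_items):
        return "elimination"
    # Mostly relevant items in few steps: evidence-driven search.
    if len(relevant_hits) / len(accesses) > 0.7:
        return "evidence-driven"
    return "mixed / conjecture"

print(classify_strategy(
    ["history", "blood test", "culture", "x-ray"],
    {"history", "blood test", "culture"}))   # prints "evidence-driven"
```

A real system would also weight the order and timing of accesses, not just which items were opened; this sketch only shows how an access log can ground a strategy inference.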

6. IMMEX Program: Theory of Action
[Diagram; its elements:]
- Quality teacher training
- Individual teacher differences
- Use of IMMEX to assess students
- Use of IMMEX to instruct students
- Greater teacher understanding of students
- Greater teacher facility with technology
- Deeper teacher understanding of science content
- Better classroom teaching
- Increased student outcomes

7. Evaluation Questions
- Implementation: Is the IMMEX software being implemented as intended?
- Impact: How is IMMEX affecting classrooms, teachers, and students?
- Integration: How can IMMEX best be integrated into the regular infrastructure of schooling?

8. Evaluation Methodology
- Pre-post design
  - Years 1-2: focus on teachers and classroom impact
  - Years 3-4: focus on student impact
- Examine impact over time

9. Evaluation Methodology
- Instruments
  - Teacher surveys: demographics, teaching practices, attitudes, usage, perceived impact
  - Teacher interviews: barriers, integration, teacher characteristics
  - Student surveys: demographics, perceived impact, attitudes, strategy use

10. Evaluation Methodology
- Data collection:
  - Year 1: Spring 99
  - Year 2: Fall 99 / Spring 00
  - Years 3-4: Fall 00 / Spring 01, Fall 01 / Spring 02
- Teacher sample
  - Y1: all IMMEX-trained teachers (~240); 45 responded to the survey, 9 interviewed
  - Y2 (Fall 99): confirmed 1999 IMMEX users (38); 18 responded to the survey, 8 interviewed

11. Evaluation Methodology
- Timeline: Year 1 (Spr 99), Year 2 (Fall 99, Spr 00), Year 3 (Fall 00, Spr 01), Year 4 (Fall 01, Spr 02)
- Teacher sample
  - Y1: sampled all teachers trained on IMMEX (~240); 45 responded to the survey, 9 interviewed
  - Y2 (Fall): sampled all confirmed 1999 users (38); 18 responded to the survey, 8 interviewed

12. Results
- Teacher surveys:
  - High satisfaction with participation in the IMMEX program
  - Usage was modest: once a month was considered high; most teachers used IMMEX only a few times (fewer than 7) per school year
  - Implementation: assessing students' problem solving; giving students practice integrating their knowledge
  - Perceived impact: use of technology, exchange of ideas with colleagues, teaching effectiveness

13. Results
- Teacher interviews:
  - In general, IMMEX teachers have a very strong commitment to teaching and student learning
    - Passionate about their work, committed to students and the profession, engaged in a variety of school and professional activities, open to new teaching methods
    - Strong belief in the pedagogical value of IMMEX

14. Results
- Teacher interviews:
  - In general, IMMEX teachers are willing to commit the time and effort required to implement IMMEX
    - Able to deal with the complexity of implementation logistics
    - Highly motivated, organized self-starters

15. Results
- Teacher interviews: general barriers
  - Lack of computer skills
  - Lack of computers
  - Classroom challenges

16. Results
- Teacher interviews: IMMEX-specific barriers
  - User interface
  - Lack of problem sets / weak link to curriculum
  - Amount of time to implement IMMEX in the classroom
  - Amount of time to author IMMEX problem sets

17. Addressing Barriers

Barriers | How Addressed
Problem sets | >100 problem sets, authoring capability, ongoing problem set development
Computer related | Basic computer skills instruction, rolling labs, on-demand technical support, Web version
Implementation | Full-service model
Authoring | Finely-tuned development workshops, stipend, documentation, curriculum guides
Curriculum | Experienced, dedicated, focused staff with teaching and research experience

18. Implications
- Short-term
  - No widespread adoption by teachers
    - Too many barriers for too many teachers
    - Only highly motivated teachers are likely to adopt
    - The need for a full-service model is itself evidence of how difficult adoption is
  - Learn from the "A-team": high-usage teachers represent best practices
  - Establish a deployment infrastructure

19. Implications
- Long-term
  - Problem-solving instruction and assessment will remain relevant
  - Computer barriers: lowered (computer access, skills)
  - Time-to-implement barriers: lowered (problem set expansion, Web access, automated scoring and reporting)
  - Time-to-author barriers: uncertain (the mechanics of authoring can be reduced and problem sets expanded, but the conceptual development of problem sets remains a constant cost)

20. Contact Information
For more information about the evaluation:
Greg Chung (greg@ucla.edu), www.cse.ucla.edu
For more information about IMMEX:
Ron Stevens (immex_ron@hotmail.com), www.immex.ucla.edu

21. IMMEX Program Background
- First used for medical school examinations in 1987
- First K-12 deployment (content development, teacher training, high school use) in 1990-92

22. IMMEX Software: Search path maps (figure)

