Evaluating Professional Development Debbie Junk, Coordinator for Mathematics Initiatives Mathematics Project Directors’ Meeting Tuesday, October 9th Austin, Texas



What counts as professional development? “True professional development should be a learning experience for all who are involved… professional development is a purposeful and intentional process designed to enhance the knowledge and skills of educators so that they might, in turn, improve the learning of all students.” (Guskey, 2001, p. 121)

Folding Activity

Why do we evaluate?  The effectiveness of professional development in mathematics needs to be evaluated to assist in:  Planning  Formative assessment  Summative assessment  To provide evidence of effectiveness so that good programs get better, great programs spread, and ineffective programs get eliminated or significantly reworked.

Whose Fault is it?  Evidence versus proof  The complex nature of teaching and learning

Effectiveness indicators show:  Increased student achievement  Increased teacher content knowledge  Evidence that shows the two are linked in some way.

What Counts as Mathematics Knowledge for Teaching (MKT)?  Mathematics teachers need to know the math they teach in a very specialized way:  Children’s thinking  Task posing  Assessment  Expectations  Decision-making

Research  Teachers who score high on knowledge assessments designed to test this specialized knowledge have students who learn more in mathematics in their classrooms. Ball, et al, 2004  Students who are in classrooms with teachers who understand and can predict their thinking in mathematics score higher on achievement tests. Carpenter, et al 1998  Teachers who score high on knowledge assessments designed to test this specialized knowledge have students who learn more in mathematics in their classrooms. Ball, et al, 2004  Students who are in classrooms with teachers who understand and can predict their thinking in mathematics score higher on achievement tests. Carpenter, et al 1998

What kinds of evaluation should be given?  Levels of evaluation  Participants’ reactions  Participants’ learning  Organization support and change  Participants’ use of new knowledge and skills  Student learning outcomes From Guskey, 2001, pp. 79-81

You get what you ask for:  After a two-day workshop on cooperative learning, one participant responded, “The ideas were fine, but they had us working in groups too much.”  One teacher responded to a workshop titled Tactics for Thinking: “This is all very interesting, but I feel it requires students to think too much!”  From Guskey, 2001

Designing and/or choosing the instrument  Understand the purpose of your assessment  Understand the intervention you are assessing  The questions should vary in scope so that you gain understanding in all areas of Mathematics Knowledge for Teaching.

How much is enough?  Practical considerations  Length of the evaluation tool  Time, when and where  Pre and post data  Participant numbers  Workshop numbers

What do you do with ALL THAT DATA?  Ethical considerations  Permission to use/collect data  Anonymous vs. named  Paper assessments  Electronic assessments  Summarizing and analyzing assessment data

Qualitative or Quantitative… Qualitative evaluations of program effectiveness can occur at any level in many formats.  interviews  open-ended responses  observations  journals Summarizing these data usually takes more time, and consistently reporting the results can be difficult. Information gained from qualitative data can yield a more personal and deeper understanding of the participants’ experiences.

Quantitative Data  Easy to collect  Easier to interpret without bias*  Limited in scope  Likert scales  Multiple choice *bias is still present, especially within the assessment design
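As a small illustration of the kind of quantitative summary described above, Likert-scale responses can be tallied into a frequency table and a mean. This is a minimal sketch with made-up ratings on a 1-5 scale; the data are hypothetical, not from this project:

```python
from collections import Counter

# Hypothetical responses to one Likert item
# (1 = strongly disagree ... 5 = strongly agree)
responses = [4, 5, 3, 4, 4, 2, 5, 5, 3, 4]

counts = Counter(responses)                    # frequency of each rating
mean_score = sum(responses) / len(responses)   # overall average rating

print(dict(sorted(counts.items())))  # {2: 1, 3: 2, 4: 4, 5: 3}
print(round(mean_score, 2))          # 3.9
```

Reporting the full frequency table alongside the mean is usually safer, since a mean alone can mask a polarized distribution of responses.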

Both?  Typical professional development evaluation has both elements.  e.g. “Were your expectations met today?” (a numeric score is assigned)  Tell what you liked best about this workshop--(open-ended)  Typical professional development evaluation has both elements.  e.g. “Were your expectations met today?” (a numeric score is assigned)  Tell what you liked best about this workshop--(open-ended)

A Whole Different Animal!  Assessment of learning requires documentation of change over time.  For students, we can compare year-to-year test data.  Assessing teachers’ knowledge is more complex.
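Documenting change over time with pre- and post-assessments typically means pairing each participant’s two scores and examining the gain. A minimal sketch, using hypothetical paired scores (illustrative only):

```python
# Hypothetical paired pre/post assessment scores for five participants
pre = [45, 52, 61, 38, 70]
post = [58, 60, 72, 50, 74]

# Gain for each participant, keeping the pairing intact
gains = [after - before for before, after in zip(pre, post)]
mean_gain = sum(gains) / len(gains)

print(gains)      # [13, 8, 11, 12, 4]
print(mean_gain)  # 9.6
```

A real evaluation would also report the spread of gains and, with enough participants, apply a paired significance test rather than relying on the mean gain alone.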

Mathematics and Science Partnership Grants (MSP)  Expectations

Texas Regional Collaboratives  Plan of Action:  provide instruments for use in pre/post format (e.g. Geometry, K-8 geosciences)  form an advisory committee to explore evaluation and assessment

HOMEWORK Evaluating Professional Development by Thomas Guskey, Corwin Press, 2001