Identifying Measures for Evaluating Changes in Teacher Practice, Teacher and Student Attitudes and Beliefs, and Sustainability Dave Weaver REL Central.


Identifying Measures for Evaluating Changes in Teacher Practice, Teacher and Student Attitudes and Beliefs, and Sustainability Dave Weaver REL Central Online Seminar #3 Centennial, CO September 20, 2013

The work of TEAMS is supported with funding provided by the National Science Foundation, Award Number DRL. Any opinions, suggestions, and conclusions or recommendations expressed in this presentation are those of the presenter and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

Strengthen the quality of MSP project evaluations and build the capacity of evaluators by developing their skills related to evaluation design, methodology, analysis, and reporting.

Today’s Objectives
Identify and define a variety of outcomes, and methods to measure those outcomes, that projects may wish to use to demonstrate the impacts of the MSP project.

Things to Consider
– Exploratory vs. explicit nature of the project
– Measures of the logical connections between activities and impacts (proximal vs. distal relationships)
– Importance of fidelity measures
– Use existing vs. developing or adapting measures (document validity and reliability)
– Importance of documenting measures and instruments

Clarification of Terms
Measure—What is to be quantified (e.g., teacher beliefs)
Indicator—A concise description of how a value will be calculated for a measure. Examples:
– Percent of Grade 8 students who met the science standard on the state assessment
– The teachers’ score on the professional culture subscale of the annual teacher survey
Instrument—The tool used to collect the data from which the measures are calculated
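To make the measure/indicator/instrument distinction concrete, here is a minimal sketch of computing the first example indicator from instrument data. The record layout (`student_id`, `met_standard`) is an illustrative assumption, not something specified in the presentation.

```python
# Measure: student achievement in science.
# Instrument: the state assessment (here, one record per student).
# Indicator: percent of assessed Grade 8 students who met the standard.

def percent_meeting_standard(records):
    """Percent of assessed students who met the standard.

    Students with met_standard == None were not assessed and are
    excluded from the denominator.
    """
    assessed = [r for r in records if r["met_standard"] is not None]
    if not assessed:
        return 0.0
    met = sum(1 for r in assessed if r["met_standard"])
    return 100.0 * met / len(assessed)

students = [
    {"student_id": 1, "met_standard": True},
    {"student_id": 2, "met_standard": False},
    {"student_id": 3, "met_standard": True},
    {"student_id": 4, "met_standard": None},  # not assessed; excluded
]
print(round(percent_meeting_standard(students), 1))  # → 66.7
```

The point of the sketch is that the indicator is a documented calculation rule, separate from both the construct being measured and the tool that produced the data.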

Chicken or the Egg Problem The Evaluator’s Version: Which came first? The Instrument OR The Measure

Exploratory vs. Explicit Nature of the Project Does the project have a theory of action? If so, how explicit is it?

What Is a Theory of Action?
A collective belief about causal relationships between actions and desired impacts
– Simple: If... Then...
– Complex: If... and... and... and... Then...
A collaborative interpretation of the literature

Characteristics of an Explicit Theory of Action
– Can be stated as a testable hypothesis
– Useful for defining fidelity of project implementation
– Describes project impact as close to the primary target audience as possible (a description of what students do to learn)
– Recognizable when it is going on—observable
– Believable

Science Example
Students learn science when they:
– Articulate their initial ideas,
– Are intellectually engaged with important science content,
– Confront their ideas with evidence,
– Formulate new ideas based on that evidence, and
– Reflect upon how their ideas have evolved.

Math Example (Common Core State Standards)
If teachers use developmentally appropriate yet challenging tasks and activities that engage students in:
– Justifying—Explaining and justifying their reasoning mathematically
– Generalizing—Identifying and verifying conjectures or predictions about the general case
– Representing—Using representations (symbolic notation, graphs, charts, tables, and diagrams) to communicate and explore mathematical ideas
– Applying—Applying mathematical skills and concepts to real-world applications
Then student achievement and interest in mathematics will increase.

Implications for Choosing Measures: A Continuum of Project Types
Exploratory projects that seek to identify relationships between interventions and impacts:
– Tend not to have an explicit theory of action
– Require a broad range of measures
– Measures must be general in nature
Projects with explicit theories of action:
– Require fewer measures
– Measures can be more focused

Measures of the Logical Connections Between Activities and Impacts AKA: What is in the Black Box?

Attainment of Goals and Objectives
A primary role of evaluation: To what extent has the project met its stated goals and objectives?
Goals tend to be expected impacts:
– Student achievement goals
– Increased enrollment in STEM disciplines
Objectives tend to be activities or strategies:
– Conduct teacher professional development
– Provide specific experiences for students

What Is Between Activities and Impacts?
Example of a proximal relationship:
Activity: Provide training to educators on specific subject matter content
→ Impact: Educators demonstrate increased subject matter content knowledge

Example of a Distal Relationship
Activity: Provide training to educators on specific subject matter content
→ Impact: Students increase achievement in the subject matter content
For almost all projects in education, the expected impacts are very distal with respect to the planned activities!

What We Really Want!
Activity: Provide training to educators on specific subject matter content
→ (A Miracle!) →
Impact: Students increase achievement in the subject matter content

In Reality, A Lot Must Happen
Activity: Provide training to educators on specific subject matter content
→ Educators demonstrate increased subject matter content knowledge
→ Educators apply new knowledge to enhance lesson development
→ Classroom instruction improves
→ Students are actively engaged in rich learning experiences
→ Impact: Students increase achievement in the subject matter content

What Is Between Activities and Impacts?
The Well-Planned Activities → The BLACK BOX of Implementation → The Desired Impact

Implications for Evaluation
– Evaluation is more complex
– Evaluation is more expensive
– It is more difficult to demonstrate success
– It is more difficult to attribute any success you may find to the intervention

Implications for Selecting Measures
– The logical connections between activities and impacts should be defined, or at least anticipated
– Measures are identified at each step along the way

What Is a Logic Model?
A diagram that shows the logical connections between project resources, activities, outcomes, and expected impacts
– Incorporates a primary theory of action
– Can be viewed as a collection of theories of action
– Answers the question: Why would the planned activities be expected to have the desired impacts?

Basic Logic Model (Black Box Issues)
Inputs → Activities → Outputs → Outcomes → Impacts
Resources → Activities → Proximal Impacts → Distal Impacts

MSP Example: Oregon Mathematics Leadership Institute (OMLI)
– 5-year project funded by NSF
– 10 school districts and 86 schools, 2 lead IHEs, Teachers Development Group
– Logic model approach used during the development of the conceptual framework

OMLI Example (Initial Logic Model)
Inputs: Partner Organizations, Project Leadership
Activities: Summer Institutes, New Content Courses at IHEs, Leadership Summit, School Leadership Teams, Follow-up and Technical Support, Leadership Symposium
Outcomes: HUGE BLACK BOX
Impacts: Increased Student Achievement in Math, Close the Achievement Gap in Math

A Unifying Theme Emerged If students were more actively engaged in discourse that involves mathematical justification and generalization, then they would develop a deeper understanding of mathematics that would be evident by increased student achievement. This became the Theory of Action for the OMLI Project that enabled the leadership to fill in the black box.

OMLI Example (Completed Logic Model)
Inputs: Partner Organizations, Project Leadership
Activities: Summer Institutes, New Content Courses at IHEs, Leadership Summit, School Leadership Teams, Follow-up and Technical Support, Leadership Symposium
Outcomes: Professional Learning Communities → Increased Mathematical Discourse Through Explanation, Justification, and Generalization → Improved Conceptual Understanding
Impacts: Increased Student Achievement in Math, Close the Achievement Gap in Math

Using the Logic Model to Identify Measures
– Identify an evaluation question for each major component of the logic model
– Identify measures for each component of the logic model
– Develop indicators
– Identify methodology (multiple informants, multiple data collection methods)
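The steps above can be sketched as a simple data structure: each logic model component is paired with an evaluation question, a measure, and an indicator, and the evaluator can then check mechanically that nothing in the chain from activities to impacts is left unmeasured. The component names and indicators below are illustrative assumptions, not taken from the OMLI logic model.

```python
# Hypothetical mapping from logic model components to evaluation
# questions, measures, and indicators.

logic_model = {
    "activity": {
        "component": "Summer content institutes",
        "question": "Was the PD delivered as planned?",
        "measure": "PD participation",
        "indicator": "Percent of enrolled teachers attending all sessions",
    },
    "proximal_outcome": {
        "component": "Teacher content knowledge",
        "question": "Did teacher content knowledge increase?",
        "measure": "Content knowledge",
        "indicator": "Mean pre/post gain on a content assessment",
    },
    "distal_impact": {
        "component": "Student achievement",
        "question": "Did student achievement increase?",
        "measure": "Student achievement",
        "indicator": "Percent of students meeting standard on the state assessment",
    },
}

def unmeasured_components(model):
    """Return the components that still lack an indicator."""
    return [key for key, entry in model.items() if not entry.get("indicator")]

print(unmeasured_components(logic_model))  # → []
```

A gap in this table is exactly the "black box" problem: a step in the causal chain for which no evidence will be collected.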

Questions!

Some Common Measures
– Teacher Practice
– Student Engagement
– Teacher and Student Attitudes and Beliefs
– Sustainability
– Teacher Preparedness
– Instructional Leadership
– Administrative Leadership
– Partnership
– Professional Climate
– Fidelity

Teaching Practice
TEACHERS—Actions a teacher takes to support student learning
– Method: Observation — Existing instruments: Inside the Classroom Observation and Analytic Protocol; Reformed Teaching Observation Protocol (RTOP)
– Method: Survey — Existing instruments: Arizona Mathematics Partnership (AMP) Teacher Survey; Surveys of Enacted Curriculum; Inside the Classroom Teacher Questionnaire (Math or Science)
STUDENTS—Evidence that students are intellectually engaged—focus on what instruction elicits in students
– Method: Observation — Existing instruments: LASER Science Classroom Observation Protocol (Science); OMLI Classroom Observation Protocol (Math)

Attitudes and Beliefs
TEACHER—Underlying philosophies that influence teacher practice and the day-to-day instructional decisions that teachers make
– Method: Survey — Existing instruments: Inside the Classroom Teacher Questionnaire; Surveys of Enacted Curriculum; TIMSS-R Teacher Questionnaire (Math and Science); Principles of Scientific Inquiry Teacher Survey
– Method: Interview — Existing instrument: Teacher Beliefs Inventory
STUDENT—Underlying philosophies that influence student engagement in learning and future aspirations
– Method: Survey — Existing instruments: Attitudes Toward Mathematics Inventory; Attitudes Toward Science Inventory (Revised)

Sustainability
Institutionalization of factors that will sustain the work of the project beyond current funding:
– Policy changes
– Administrative support
– Time and structure for school-based PD
– Establishment of instructional leadership positions (coaches, TOSAs, mentors)

Sustainability Examples
– Policies—Evidence of changes in policies and procedures intended to support continuous improvement
– Access to Instructional Leadership—The number of schools where instructional leadership is available to all teachers (established position)
– Administrator Participation—The attendance rate at meetings and PD intended for administrators
– Administrator Beliefs—Administrators’ scores on the instructional beliefs scale, compared for change over time and compared with those of teachers

Other Common Measures
Teacher Preparedness—The degree to which teachers feel prepared to engage students in learning activities that align with the project theory of action
– Instrument: LSC Through Teacher Enhancement Questionnaires
Instructional Leadership—The degree to which teacher leaders feel prepared to fulfill their role as instructional leaders responsible for influencing colleagues
– Instrument: Examining Mathematical Coaching Teacher Survey

Other Common Measures (continued)
Administrative Leadership—Factors such as administrator awareness and support of the project, engagement in professional development, and interactions with teachers
Partnership—Factors that allow partners to move beyond their own individual institutional needs and engage in the work of the partnership to better meet project goals
– Instrument: Education R&D Partnership Tool
Professional Climate—Factors that foster a constructive and supportive professional environment, such as trust, collegiality, and collaboration
– Instrument: Arizona Mathematics Partnership Teacher Survey

Questions!

Importance of Fidelity Measures

Fidelity of Implementation
Requires a clear theory of action
Fidelity of implementation is:
– The degree to which the initiative is carried out the way it was intended
– The degree to which the spirit of the theory of action is enacted
– Recognizing that not everyone will implement the way you planned

Identifying Essential Elements
– What are the essential elements of implementing the project/initiative that make the logic model valid?
– Identifies what can and cannot be adapted
– Takes into account human nature and the desire to personalize
– Well-defined essential elements can identify fidelity measures

Process Example
A project has a PD model that includes five phases in a cyclic process that PD facilitators are expected to follow.
Evaluation Question: To what extent are the facilitators implementing the proposed PD model with fidelity?
PD Model Fidelity—The number of teacher participants who report that they experienced, and can comment on the value of, each of the five phases
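A minimal sketch of the fidelity indicator above, assuming each participant's survey response records whether they experienced each phase. The phase names and response layout are invented for illustration.

```python
# Hypothetical PD-model fidelity indicator: count participants who
# report experiencing all five phases of the PD cycle.

PHASES = ["phase_1", "phase_2", "phase_3", "phase_4", "phase_5"]

def full_fidelity_count(responses):
    """Number of participants who reported experiencing every phase."""
    return sum(1 for r in responses if all(r.get(p) for p in PHASES))

responses = [
    {p: True for p in PHASES},                        # experienced all 5
    {**{p: True for p in PHASES}, "phase_3": False},  # missed one phase
]
print(full_fidelity_count(responses))  # → 1
```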

Outcome and Fidelity Example
Theory of Action includes: Student discourse that involves justifying mathematical reasoning
Evaluation Question: To what extent has participation in the project increased student discourse that involves justification of mathematical reasoning?
Classroom Discourse—The percent of classroom observations indicating that the majority of students were engaged in discourse that required them to justify their reasoning mathematically during at least part of the lesson observed
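The classroom discourse indicator reduces to a simple proportion over observation records. The record layout below (one row per observed lesson, with a boolean rating) is an assumption for illustration, not the OMLI protocol's actual coding scheme.

```python
# Sketch of the classroom discourse indicator: percent of observed
# lessons in which a majority of students engaged in discourse
# requiring mathematical justification.

def discourse_indicator(observations):
    """Percent of observations where majority_justifying is True."""
    if not observations:
        return 0.0
    hits = sum(1 for o in observations if o["majority_justifying"])
    return 100.0 * hits / len(observations)

obs = [
    {"school": "A", "majority_justifying": True},
    {"school": "A", "majority_justifying": False},
    {"school": "B", "majority_justifying": True},
    {"school": "B", "majority_justifying": True},
]
print(discourse_indicator(obs))  # → 75.0
```

Comparing this value across project years is what lets the same indicator serve both as an outcome measure and as evidence of fidelity to the theory of action.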

Advantages of Assessing Fidelity
Formative Evaluation
– Provides information to improve the project during implementation
– Which elements are implemented well?
– Why are some essential elements difficult to implement?
Summative Evaluation
– Did those who implemented with fidelity have better results than those who did not?
Ensuring Attribution
– How do you know the things you are measuring are a result of the intervention?
– Clearly shows a relationship between important elements of the project and the expected outcome

Use Existing vs. Developing or Adapting
When should I...
– Use existing measures and instruments?
– Adapt existing measures and instruments?
– Develop new measures and instruments?

Using Existing Instruments
Advantages
– Can take advantage of reliability and validity information provided by others
– No pilot testing—ready for use almost immediately
– Can reveal unanticipated outcomes
Disadvantages
– May not align well with the project focus
– Increased burden
Uses
– Exploratory projects
Don’t forget: get appropriate permissions and give credit.

Adapting Existing Instruments
Advantages
– Better alignment to project focus
– Requires less development time than starting from scratch
– Can be shorter and less burdensome
Disadvantages
– May require pilot testing
– Must test reliability and provide evidence of validity
– Less likely to reveal unintended outcomes
Uses
– Projects with a more explicit theory of action

Developing New Instruments
Advantages
– Better alignment to project focus
– Can be shorter and less burdensome
Disadvantages
– Requires development time
– Requires pilot testing
– Must test reliability and provide evidence of validity
– Less likely to reveal unintended outcomes
Uses
– Projects with a more explicit theory of action

Importance of Documenting Measures and Instrument Reliability and Validity

Sharing to Improve Evaluation Rigor
Wherever possible, provide the following information about the measures used:
– Complete description of what is measured
– Target informants
– Methods for data collection
– Pilot test procedures
– Evidence of validity
– Results of reliability tests
– Scale and subscale calculations
– Threshold for missing data
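Two of the items above (reliability tests and a missing-data threshold) can be illustrated with a short sketch: Cronbach's alpha computed from item-level scores, and a subscale score that is withheld when too many items are unanswered. The 50% threshold is an example choice to be documented, not a rule from the presentation.

```python
# Illustrative sketch: documenting subscale reliability and a
# missing-data rule for a survey instrument.

def cronbach_alpha(items):
    """Cronbach's alpha for a list of respondent rows (item scores)."""
    k = len(items[0])   # number of items in the subscale
    def var(xs):        # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([row[j] for row in items]) for j in range(k)]
    total_var = var([sum(row) for row in items])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

def subscale_score(row, threshold=0.5):
    """Mean of answered items; None if too few items are answered."""
    answered = [x for x in row if x is not None]
    if len(answered) / len(row) < threshold:
        return None  # below the documented missing-data threshold
    return sum(answered) / len(answered)

rows = [[4, 5, 4], [2, 2, 3], [5, 4, 5], [1, 2, 1]]
print(round(cronbach_alpha(rows), 2))   # → 0.95
print(subscale_score([4, None, None]))  # → None (under 50% answered)
```

Reporting the alpha alongside the scoring and missing-data rules is what makes the measure reusable and the results comparable across projects.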

Review
– Exploratory vs. explicit nature of the project
– Measures of the logical connections between activities and impacts (proximal vs. distal relationships)
– Importance of fidelity measures
– Use existing vs. developing or adapting measures (document validity and reliability)
– Importance of documenting measures and instruments

Resources For a copy of this presentation and a document containing the references used to support this presentation, go to: teams.mspnet.org

Dave Weaver
Director of Educational Evaluation
RMC Research Corporation—Portland
111 SW Columbia Street, Suite 1200
Portland, OR
Phone: Toll Free: Fax: