Enhancing Evaluation Stakeholder Responsiveness Through Collaborative Development of Data Collection Instruments Karen Kortecamp, PhD The George Washington University

Collaborating With Stakeholders
Should we or shouldn't we?
Utilization-focused evaluators
◦ Accommodate stakeholder concerns
◦ Increase evaluation utilization
Theory-focused evaluators
◦ Minimize the effects of stakeholder influence
◦ Apply preset standards
Social justice perspective
◦ Evaluation should be responsive to the views of the weakest stakeholders

Factors to Consider
◦ Type of program being evaluated
◦ Size of program
◦ Duration of program
◦ Number of stakeholders involved
◦ Impetus for the evaluation
◦ Internal or external evaluation
◦ Implications of the evaluation
◦ Point at which the evaluation of a program is designed and implemented
◦ Funding available to conduct the evaluation

Evaluation Context
◦ Initial grants funded in 2001
◦ Teacher professional development
◦ Ultimate aim is increased student achievement in American history
Grant that is the subject of this presentation:
◦ Large school system
◦ Voluntary participation of 8th- and 9th-grade teachers
◦ Three cohort groups serving up to 75 teachers
The U.S. Department of Education required:
◦ External evaluation
◦ Measure of teachers' knowledge
◦ Quasi-experiment

Project Objectives
1) Increase teachers' knowledge of traditional American history
2) Increase teachers' use of primary sources in teaching traditional American history
3) Increase student knowledge of and interest in traditional American history

The Collaborative Process
Decision-making stakeholders
◦ Project manager, project coordinator, lead historians, and faculty of the partner university
Theory of change
◦ An amalgamation of assumptions, experiences, and intuition
The evaluator's role in articulating a theory?
◦ Asking good questions

Pricking the Balloon!
◦ What specific needs will the project address? How?
◦ Why were these particular interventions chosen?
◦ What specific change(s) are desired?
◦ How can the change(s) be measured?
◦ Are there barriers to overcome?
◦ What do you most want to know?
◦ What outcomes do you most want to occur?

Collaborating in Developing Evaluation Instruments
Development of five primary instruments:
◦ Assessment of teachers' knowledge
◦ Survey of teachers' perceptions
◦ Observation of teachers' practice
◦ Assessment of students' knowledge
◦ Survey of students' attitudes
Additional data collected through focus group interviews, event surveys, and analysis of teacher reflections and products

Guiding Development of the Teacher Assessments
◦ What knowledge is of most value?
◦ What knowledge will the professional development address?
◦ What constitutes evidence that knowledge has been acquired and applied?
Stakeholders developed content
Evaluator focused on developing valid and reliable measures

An Early Draft of the 8th-Grade Teacher Assessment
◦ Included multiple-choice questions that had nothing to do with the professional development content
◦ Asked teachers to "briefly explain why middle school students should study American history"
◦ Asked teachers to examine a Civil War period political cartoon and develop two or three questions they would ask their students about this primary source

Some of the Problems
◦ Gathering data on content not related to the professional development would yield meaningless results
◦ Scoring teachers' responses as to why middle school students should learn American history would be highly subjective (arguably impossible) and not relevant
◦ The direction to develop two or three questions was vague; scoring would be a nightmare
◦ What would any of these items tell stakeholders about the influence of the professional development on teachers' knowledge?
◦ What would the assessment tell the U.S. Department of Education about the effectiveness of the project?

Final Iteration
◦ Represented content addressed in the professional development
◦ Integrated assessment of teachers' historical thinking
◦ Measured teachers' ability to apply historical thinking to guide students' analysis and interpretation of primary sources

Assessment Items
Content pairs
◦ Benjamin Franklin – Slavery
Primary and secondary sources
◦ Define and provide examples of both; identify steps historians take to analyze and interpret primary sources
Analysis of a primary source
◦ Political cartoon from a historical period
Develop student questions
◦ Guide students in analyzing and interpreting meaning

Conclusions
The collaborative process:
◦ supported stakeholders in developing a shared vision
◦ sharpened stakeholders' focus on outcomes
◦ deepened stakeholders' understanding of the central purpose of evaluation
◦ led stakeholders to use evaluation findings to strengthen the professional development programs
◦ enhanced the evaluator's understanding of the assumptions and beliefs that guided decision-making
◦ promoted mutual trust and respect