Reflect and Revise: Evaluative Thinking for Program Success Tom DeCaigny and Leah Goldstein Moses.


Topics to guide our discussion
–Benefits of evaluation and assessment to staff, program participants, and the organization as a whole
–Building on existing knowledge and resources for evaluation
–Launching evaluation and assessment efforts
–Choosing the right methods of assessment and evaluation
–Learning from other organizations to integrate evaluative thinking and use evaluation to support programs

Let’s get to know each other
Your presenters: Leah Goldstein Moses and Tom DeCaigny
–How comfortable are you with evaluation?
–What comes to mind when someone asks you to evaluate and/or assess your work?
–What are your challenges with evaluation?

First, some definitions
Assessment: The act of determining the standing of an object on some variable of interest; for example, testing students and reporting scores.
Evaluation: Systematic investigation of the worth or merit of an object, e.g., a program, project, or instructional material.
–Source: Joint Committee on Standards for Educational Evaluation. (1994). The Program Evaluation Standards, 2nd ed. Thousand Oaks, CA: Sage. (Used with permission of publisher.)

Benefits of evaluation and assessment
–Be accountable to important stakeholders
–Professional and organizational development: learn how you are doing
–Program management: see where your programs need continued support or improvement
–Investigation and learning
–Feeds curiosity and fosters innovation

Stories of evaluation benefits
–Using evaluation results to identify a need and develop a new project (special needs – ARISE case study)
–Using evaluation results to improve program quality and design (teaching artist training case study)
–Evaluation and assessment as part of reflective artistic practice (A Cycle of Artistic Inquiry case study)

A Cycle of Artistic Inquiry Performing Arts Workshop and Dr. Richard Siegesmund (2000)

Ensuring evaluation benefits are shared
Evaluation, at its best, is “engaged in,” not “done to.”
When developing or improving evaluation systems, think about who will be doing the work for the evaluation (distributing surveys, gathering information, analyzing the data):
–Is there a way to decrease the burden?
–Is there a way to provide benefits?
Examples of evaluation efficiency and incentives

Build on existing knowledge and resources
Internal insights can really support a new evaluation effort. Determine:
–What do we collect already?
–What does the information we already have tell us?
–What can we report on just from our own internal record keeping or observations?
–What can we gather in the course of our work – during existing programs, contacts, etc.?
–Example in ARISE: student achievement – test scores.

Build on existing knowledge and resources
Use external information, such as reports done by organizations you admire:
–What did they study?
–How did they gather information?
–Can you apply any of their tools or methods?
–Can you infer or generalize anything from their findings so you don’t have to replicate their effort?

Getting started in evaluation and assessment
Logic models are incredibly useful. They help you:
–Determine how your efforts are related to your expected impact
–Map out what you want to measure, and why
During the logic model process, you can determine what data you already have and what you are lacking.

Getting started in evaluation and assessment
After you have created a logic model and/or identified data gaps, you can determine what you are going to collect, when, and in what format.
Surveys are useful, but in the arts you might want to use an artistic process or another valid alternative assessment:
–Illustrative rubrics
–Observation
–Portfolios

Learning from others’ experiences
Notes from our discussion:
–What evaluation approaches have worked well for you?
Electronic portfolios in the classroom: they reflect project-based learning, show progression over the course of a year, and can be seen by parents and administrators. Time to reflect can be challenging but is important.
“Level Best” is a good resource.
Tried to find things that already existed and could be used in the organization.
Festin “Theater Communications Group” is a good resource.
Anecdotal information and journals can have a bigger impact on boards and other audiences that don’t care much for quantitative data.
Site visits for board members are required as part of their responsibilities.
Having tools at your fingertips – did anything good happen today? Did any challenges happen? – right at the time they are needed.
Using incentives – Crayola pencils were good for parents.
–Where have you struggled?
Finding time for reflection.
Having the right tool for evaluation.
Logic models can be cumbersome or difficult to use.
Having a way to capture, understand, and communicate unexpected outcomes.
Avoiding bias introduced through body language and tone – need to make sure to encourage honesty.
The board can ignore quantitative data.
Finding sophistication and depth in questions has been hard when they are in a survey.

Our contact information
Tom DeCaigny, Executive Director, Performing Arts Workshop
T: (415) x207 / F: (415) E:
Leah Goldstein Moses, President, The Improve Group
T: (877) x11 / F: (612)