PROGRAM EVALUATION Carol Davis

Purpose of Evaluation The purpose of evaluation is to understand whether or not the program is achieving the intended outcomes. It helps identify what works and what does not.

Today’s Evaluation Discussion 1. PEEC Outcomes (Indigenous Evaluation) 2. Evaluation Plan 3. Formative Evaluation 4. Progress Evaluation 5. Summative Evaluation 6. Write Report

PEEC: Indigenous Evaluation Framework

Indigenous Evaluation Consultants American Indian spiritual leaders, cultural leaders, and educators assisted in the preparation of this guide. Its main function is the evaluation of Indian education programs.

Four Core Values Emerged 1. Tribes are a people of place 2. Tribes recognize their gifts 3. Tribes honor family 4. Tribes respect sovereignty

Indigenous Evaluation Framing Principles Tribal Colleges reflect tribal core values. When we create programs, we are creating a set of understandings. These understandings are our hypothesis of what we believe will happen. Our evaluations need to tell our story.

Guiding Principles  An Indigenous framework can incorporate broadly held values while also remaining flexible and responsive to local traditions and cultures.  Responsive evaluation uses practices and methods from the field of evaluation that fit our needs and conditions.

Culture “[It] is culture that provides the tools for organizing and understanding our worlds in communicable ways.” - Jerome Bruner, The Culture of Education

Traditional Knowledge “Every act, element, plant, animal, and natural process is considered to have a moving spirit with which humans continually communicate.” - Dr. Gregory Cajete, Native Science: Natural Laws of Interdependence (2000)

Evaluation Plan The purpose of evaluation planning is to assess the understanding by all stakeholders of…  Project Goals  Objectives  Strategies, and  Timelines

Evaluation Planning Guides Formative and Summative Evaluation When you “plan to tell your story,” your formative and summative evaluation becomes more than numbers. You collect information, conduct surveys, track changes over time, and analyze the results.
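The collect-and-analyze cycle described above can be sketched in a few lines. This is a hypothetical illustration, not part of the original presentation: the semesters, scores, and the 1-to-5 satisfaction scale are invented examples.

```python
# Hypothetical sketch: tracking one survey question across rounds for
# formative evaluation. Semesters and scores are invented examples.
from statistics import mean

# Average satisfaction score (1-5) from each round of the student survey.
survey_rounds = {
    "Fall 2013": [4, 3, 5, 4, 4],
    "Spring 2014": [4, 4, 5, 5, 4],
}

# Compare the mean over time to see whether the program is improving.
for semester, scores in survey_rounds.items():
    print(f"{semester}: mean satisfaction = {mean(scores):.2f}")
```

Asking the same question in every round is what turns isolated numbers into a formative trend you can act on.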

Evaluator Role  To assure that the program evaluation will meet the needs of the organization, reservation, and/or community, require that the evaluator submit an evaluation proposal prior to beginning the project.

Intended Outcomes Outcomes describe the desired change or impact we seek through this program. Stakeholders should brainstorm the intended outcomes and post them. That will assure that everyone is working toward the same outcomes.

Examining the Story  What do we want to know as the story unfolds?  Drafting questions/statements guides the process of tracking the story.

Data

Objectives Objectives are measurable.

Outputs Your outputs result from your activities. Staff can assist in determining the outputs for each activity. A process for “documenting” is essential when writing reports.

A Program Is a Story PEEC may want more than numbers. What is PEEC's legacy? You may want to find out how students rate their participation in your program or how faculty view your services.

Survey Monkey SurveyMonkey's online survey tool is easy and quick. Create many types of surveys using their methodologist-certified survey templates.

Report Program Impact? VALUE ADDED: e.g., Do you have a student who is the first in the family to attend college? Do you have a student from a foster home who is going to become an engineer? Is this the first tribal member who aspires to become an engineer?

Quantitative/Qualitative  Both methods can contribute to telling the story. You decide which is appropriate.  Quantitative methods build breadth (numbers, percentages, etc.).  Qualitative methods build depth (personal stories, focus groups, etc.).
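The breadth/depth pairing can be shown side by side in a small sketch. This is a hypothetical example, not from the original presentation: the survey responses and the coded interview themes are invented.

```python
# Hypothetical sketch: quantitative breadth (a percentage) alongside
# qualitative depth (themes coded from open-ended interviews).

# Breadth: yes/no answers to a survey question, e.g. "Was the program helpful?"
responses = ["yes", "yes", "no", "yes", "no"]
pct_yes = 100 * responses.count("yes") / len(responses)

# Depth: tally the themes coded from interview and focus-group transcripts.
coded_themes = ["mentorship", "family support", "mentorship", "place"]
theme_counts = {t: coded_themes.count(t) for t in set(coded_themes)}

print(f"{pct_yes:.0f}% answered yes")   # the numbers
print(theme_counts)                     # where the stories cluster
```

The percentage tells you how widespread an experience is; the theme counts point you toward the stories worth telling in depth.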

Qualitative Design  It is important for American Indian tribes to include qualitative reporting methods.  Truth emerges from consensus among informed people.  A value framework gives meaning to the numbers.

Formative Evaluation  Formative evaluation is the assessment of ongoing projects and activities.  Evaluation should be conducted for action-related reasons, and the information should be used to decide a course of action.

Progress Evaluation  Determine whether or not the project is being conducted as planned within the established timelines.
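Checking conduct against established timelines can be sketched as a simple comparison of planned versus completed activities. This is a hypothetical illustration: the activity names and dates are invented examples.

```python
# Hypothetical sketch: progress evaluation as planned-vs-completed dates.
# Activities and dates are invented examples.
from datetime import date

planned = {
    "Baseline survey": date(2014, 9, 15),
    "Mid-year focus groups": date(2015, 1, 30),
}
completed = {
    "Baseline survey": date(2014, 9, 20),
}

today = date(2015, 2, 1)
statuses = {}
for activity, due in planned.items():
    if activity in completed:
        statuses[activity] = "done"
    elif due < today:
        statuses[activity] = "behind schedule"
    else:
        statuses[activity] = "upcoming"

for activity, status in statuses.items():
    print(f"{activity}: {status}")
```

Running a check like this at regular intervals flags slipping activities while there is still time to adjust the plan.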

Summative Evaluation  Summative evaluation is conducted at or toward the end of the program. The purpose is to assess the project's outcomes to determine its success.

Use of Results of Summative Evaluation  Sustainability  Disseminate your model  Continue funding  Continue on probation, and/or  Discontinue

Writing the Report Gather and organize the documents, data, Survey Monkey results, and other information. Review the report format, instructions, and due date. Create a checklist to make sure you include all of the information. Answer the basic questions: who, what, when, where, why, how, and how many.

Today’s Evaluation Discussion 1. PEEC Outcomes (Indigenous Evaluation) 2. Evaluation Plan 3. Formative Evaluation 4. Progress Evaluation 5. Summative Evaluation 6. Write Report

References  American Indian Higher Education Consortium Indigenous Evaluation Framework.  Boyer, Paul Should Expediency Always Trump Tradition? AIHEC/NSF Project develops indigenous evaluation methods. Tribal College Journal. Vol. 18 No. 2. Pp  LaFrance, J., & Nichols, R. (2010). Reframing Evaluation:Defining Indigenous Eval. Framwork. Canadian Journal of Eval. 23(2),  NSF.Gov/Evaluation. March 3,  March 3,

MIIGWECH (Thank You)