Aaron J. Scott, Research Associate, October 13th, 2009

- To briefly revisit the culture of evidence definition and model and its associated stages.
- To reconsider Stage 1 and expand our notion of planning.
- To discuss specific items to consider during the planning process, i.e., going beyond the objective statements.
- To provide examples from our own department of efforts that are in, or have recently gone through, this stage.

A culture of evidence is defined by Lakos & Phipps (2004) as: "An organizational environment in which decisions are based on facts, research, and analysis, and where services are planned and delivered in ways that maximize positive outcomes and impacts for customers and stakeholders" (p. 352).

- Transparent Decision-Making
  - Charting progress toward goals
  - Documenting implementation and improvement
  - Data-informed decisions through assessment
- Organizational/Departmental Effectiveness
  - Integrating goals and working together
  - Reporting outcomes in a consistent format
- Accountability
  - To the Department
  - To the University
  - To External Stakeholders
    - The State
    - Parents
    - Funding Sources

[Diagram: the culture-of-evidence model, Stage 1 through Stage 4, framed in terms of outcomes or processes]

3 Steps in Affirming and Building Upon Our Purpose(s)
- Through development/review of Mission
- Through development of Goals
- Through development of Objectives
(Mission → Goals → Objectives)

- Action statements
- Clear, specific, and concrete
- Measurable
- Time-limited
- Realistic
- Hierarchical (ultimate, intermediate, immediate)
- Build on strengths and reduce need
- Focus on outcome or process

- Securing the planning and budgeting at the beginning sets the foundation for a clear course of implementation and assessment.
- This first stage has only begun, however. You must still determine how you will implement the program or services, how you will define and measure their success, who will be involved, what resources the work toward your objectives will require, and how long the process will take, to name just a few. This often requires much more work and time!

- The quality of a program or service, and of the decisions made on its behalf, depends in part on the quality and utility of the data informing it.
- The quality of the data, and therefore of the evidence, is proportional to the planning and rigor invested in the clarity of purpose, the design, and the implementation of the program or service and its associated assessment.

- Example: Living on campus during a student's first year is known from a plethora of previous research to have a positive impact on retention and overall academic success, but less certainty exists about how this takes place (Braxton, 2007).
- To what degree is this true on our campus, and how do we know? Is it true for all sub-groups? Are there ways we can maximize our impact? What data are there to support what we know? What data do we still need?
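To make the sub-group question concrete, here is a purely illustrative sketch in Python (the records, group labels, and numbers are hypothetical, not data from the presentation): it tabulates fall-to-fall retention by residence status within each sub-group.

```python
# Hypothetical illustration: fall-to-fall retention by sub-group and residence.
from collections import defaultdict

students = [  # in practice, one record per first-year student
    {"subgroup": "first-gen", "on_campus": True, "retained": True},
    {"subgroup": "first-gen", "on_campus": False, "retained": False},
    {"subgroup": "non-first-gen", "on_campus": True, "retained": True},
    {"subgroup": "non-first-gen", "on_campus": False, "retained": True},
]

groups = defaultdict(list)
for s in students:
    groups[(s["subgroup"], s["on_campus"])].append(s["retained"])

for (subgroup, on_campus), outcomes in sorted(groups.items()):
    rate = sum(outcomes) / len(outcomes)  # True counts as 1
    where = "on campus" if on_campus else "off campus"
    print(f"{subgroup}, {where}: {rate:.0%} retained (n={len(outcomes)})")
```

Even a tabulation this simple shows where the on-campus effect does and does not hold, and therefore what evidence is still missing.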

Current or new program/service? For a current program/service:
1. Define the purpose of the program/service
2. Define the problem or need
3. Define success or progress
4. Identify the type of evidence needed
5. List and consider the methods available to collect evidence
6. Determine the resources/instruments available to collect evidence
7. Create a timeline to collect, analyze, and report the evidence
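One way to keep the outputs of this flow explicit and reviewable is to record them as a single structured plan. The sketch below is a minimal illustration; the `AssessmentPlan` fields and the sample values are hypothetical, chosen only to mirror the steps above.

```python
# A minimal sketch: recording the outputs of the planning flow as one record.
# All field names and sample values are hypothetical.
from dataclasses import dataclass, field

@dataclass
class AssessmentPlan:
    program: str
    purpose: str             # step 1: purpose of the program/service
    need: str                # step 2: problem or need
    success_criteria: str    # step 3: definition of success or progress
    evidence_type: str       # step 4: type of evidence needed
    methods: list[str]       # step 5: methods available to collect evidence
    resources: list[str]     # step 6: resources/instruments available
    timeline: dict[str, str] = field(default_factory=dict)  # step 7: phase -> term

plan = AssessmentPlan(
    program="First-Year Residential Experience",
    purpose="Support first-year students' transition and persistence",
    need="Lower fall-to-fall retention among first-year commuters",
    success_criteria="The retention gap narrows year over year",
    evidence_type="Retention rates plus student survey responses",
    methods=["brief questionnaire", "focus groups"],
    resources=["institutional data warehouse", "web survey tool"],
    timeline={"collect": "Fall", "analyze": "Winter", "report": "Spring"},
)
print(plan.program, "->", plan.success_criteria)
```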

- Carefully choose your evidence and consider its appropriateness (and timeliness) for the program or service.
- Choose how to collect your evidence (interviews, focus groups, brief questionnaires, in-depth surveys; paper-based, web-based, or email-based).
- Choose how to frame your evidence by providing a context for the results.
- Determine to whom you will be providing this evidence.

- What is my potential available budget?
- What is my timeline? How realistic is it?
- What are my analysis capabilities?
- Who needs to see these data? How long will it take to get the data into a presentable format?
- How easily can I fit this method into my annual responsibilities?
- Who needs to make decisions with these data?
- Will this kind of evidence help me make the decisions I need to make? How?
- How will I document the evidence and the decisions made from that evidence?

- The purpose of the program/service and its assessment is stated clearly, concisely, and completely in terms of outcomes or processes; oftentimes this is stated in terms of a problem or need.
- Members involved have been consulted early on and have agreed to participate.
- How the purpose will be carried out (i.e., implementation) is spelled out clearly.
- Expectations are clearly stated.
- A timeline has been through at least two drafts and has been peer reviewed.
- A method for measuring outcomes has been secured (pre-fab or home-made) and either has well-established reliability and validity or is planned to be piloted.
- The IRB process has been completed, if required.
- Anticipated costs have been outlined.

- Is it measurable?
- Is it meaningful?
- Is it manageable?
- Who is the target audience of my outcome?
- How will I know if it has been met?
- Will it provide me with evidence that will lead me to make a decision for continuous improvement?
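One way to operationalize "how will I know if it has been met?" is to write the outcome's decision rule down explicitly. A minimal sketch, with hypothetical baseline, observed, and target values:

```python
# Minimal sketch of an explicit decision rule for a measurable outcome.
def outcome_met(baseline: float, observed: float, target_gain: float) -> bool:
    """True if the observed rate beats the baseline by at least the target gain."""
    return observed - baseline >= target_gain

# Hypothetical: retention rose from 78% to 82% against a 3-point target.
print(outcome_met(baseline=0.78, observed=0.82, target_gain=0.03))  # True
```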

A culture of evidence is a great motto, but what kind of evidence? How good is the evidence, and how do you know? Only through rigorous planning and implementation can we begin to answer these and other important assessment questions.