Aggregating Outcomes for Effort and Effect: What NTAC Learned from its Site Review Ella L. Taylor, Ph.D. NTAC Teaching Research Institute Western Oregon University

Agenda
- Site review response to evaluation
- Revisions to evaluation plan
- Collecting effort and effect data
- Aggregating data
- Questions

Site Review Concerns about Evaluation
- Very complex
- Less likely to succeed than a simpler plan
- Needs to be simplified
- Needs to be made more realistic and appropriate

Constraints of Evaluation
- Everyone wants evaluation to be seamless and transparent
- Reluctance to see the direct benefit for improving the project
- Evaluation is seen as cumbersome and confusing

Primary purpose of evaluation
1. Did we do what we said we were going to do? (effort)
2. What was the impact of what we did? (effect)

Effort vs. Effect
- Effort: actions carried out by the project
  - Satisfaction data
  - Numbers of participants, events, etc.
- Effect: impact of the actions. What outcome resulted from the activity?
  - Change of awareness
  - Change of knowledge
  - Change in skill/implementation (service provider, family, systems)
  - Change in child (child change data)

Alignment
- What needs to be done? (Needs Assessment)
- How do we meet the need? (Activity) (Effort data)
- Did we meet the need? (Evaluation) (Effort & Effect data)

Did we do what we said we were going to do? (EFFORT)
- Grant objectives
  - Met = explain how we met them
  - Not met = explain why not
- Data
  - Number of events
  - Number of participants
  - Satisfaction with effort

What was the impact of what we did? (EFFECT)
- What to measure?
- How to measure?
- How to report succinctly?
- How to aggregate?

What was the impact of what we did? (EFFECT)
Outcome and Performance Indicators (NTAC’s OPIs)
- Outcome: a statement of a measurable condition or an expected result or change (e.g., increase, improvement, progress toward).
- Performance Indicator: a statement that helps quantify the outcome and indicates whether the outcome has been achieved. Often, multiple indicators may provide better evidence of the achievement of an outcome.

Kudos to John Killoran, Kathy McNulty, and Paddi Davies for many, many hours of development and refinement of NTAC’s OPIs.

OPIs
- Comprehensive outcomes for children (15), families (9), service providers (19), and systems (6)
- Embedded in all aspects of planning, delivery, and evaluation
- Align needs assessment, project activities, and measurement of impact
- Available on the web

How we are using OPIs
- Planning
  - Identify the stakeholder group (service provider, child, family, and/or systems)
  - Identify the outcomes you will target
  - Identify the performance indicators that will help you determine attainment of the targeted outcome(s)
- Delivery of service
  - Implement TA that is targeted to the outcomes selected
- Evaluation
  - Tailor assessment/evaluation measures to the targeted outcomes and performance indicators (see the sketch below)
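To make the planning-to-evaluation alignment concrete, here is a minimal Python sketch of how a targeted outcome, its performance indicators, and the planned activity and measures might be recorded together. The `TAPlan` structure and its field names are hypothetical illustrations, not part of NTAC's actual tooling.

```python
from dataclasses import dataclass, field

@dataclass
class TAPlan:
    """Hypothetical record tying one TA activity to the OPIs it targets."""
    stakeholder_group: str   # service provider, child, family, or systems
    outcome: str             # targeted outcome, e.g. Systems 4
    performance_indicators: list[str] = field(default_factory=list)
    activity: str = ""       # the TA delivered for this outcome
    evaluation_measures: list[str] = field(default_factory=list)  # measures tailored to the PIs

# Example mirroring the webinar example on the next slides:
webinar_plan = TAPlan(
    stakeholder_group="systems",
    outcome="S4: increased use of formative and summative evaluation",
    performance_indicators=["S4g: uses outcomes measures"],
    activity="one-time national webinar",
    evaluation_measures=["change-of-awareness survey item keyed to S4g"],
)
print(webinar_plan.performance_indicators)
```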

An Example

Webinar Example: Planning
- Needs Assessment
  - Comments at Project Directors’ Meeting during self-evaluation breakout sessions
  - Conversations with state project directors, coordinators, and staff
  - NTAC Advisory Committee meeting
- Outcome Goal: The use of formative and summative evaluation of the systems change and/or capacity building has increased (S4)
- Performance Indicator: Uses outcomes measures (S4g)

Webinar Example: Delivery of Service
- Activity: Webinar
- Align the intensity of our evaluation with the intensity of the activity: a one-time activity results in a less intense evaluation than sustained professional development.
- Align needs with activity with evaluation

Webinar Example Evaluation: Change of awareness
Items (each rated SA / A / Neither A nor D / D / SD / NA, i.e., 5 4 3 2 1 NA; a sketch of computing an item mean follows below):
1. The presenter had the necessary expertise.
2. I increased my awareness of using outcome measures in evaluation. (S4g)
3. I felt this activity was a good use of my time.
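As a small sketch of how a single "m" value in the later data tables could be produced, the snippet below averages responses to one item on the 5-point agreement scale. Treating "NA" as missing (excluded from the mean) is an assumption; the slides do not specify NA handling.

```python
def item_mean(responses):
    """Mean of 5-point ratings (5 = strongly agree ... 1 = strongly disagree).

    "NA" responses are excluded from the calculation (assumed handling).
    """
    scores = [r for r in responses if isinstance(r, (int, float))]
    return sum(scores) / len(scores) if scores else None

# Hypothetical responses to item 2 ("I increased my awareness ... (S4g)"):
print(item_mean([5, 4, 4, "NA", 3, 4]))  # 4.0
```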

A slightly more complicated example…

Example 2
- Planning/Needs: Ongoing training and support
- Outcome: Use of formative and summative evaluation (Systems 4)
- Performance Indicators:
  - Uses satisfaction measures (S4d)
  - Uses awareness, knowledge, or skills measures (S4e)
  - Uses outcomes measures (S4g)
- Activity: Series of webinars for one region
- Evaluation: Change of knowledge & skill

“I learned how to develop, implement, and analyze…”
Items (each rated SA / A / Neither A nor D / D / SD / NA, i.e., 5 4 3 2 1 NA):
1. Satisfaction measures (S4d)
2. Awareness, knowledge, and skills measures (S4e)
3. Outcomes measures (S4g)

Another more complicated example…

Example 3
- Planning/Needs: Development, implementation, analysis & support in the use of formative and summative evaluation
- Outcome: Use of formative and summative evaluation (Systems 4)
- Performance Indicators:
  - Uses participant demographic data (S4c)
  - Uses satisfaction measures (S4d)
  - Uses change in awareness, knowledge, or skills measures (S4e)
  - Uses outcomes measures (S4g)
  - Uses formative and summative evaluation information for ongoing feedback and continuous improvement (S4j)
  - Disseminates evaluation results of the systems change or capacity building activities (S4k)

Sustained Professional Development
- Activity: Multiple visits by TAS and evaluation specialist to 2 states
- Evaluation: Change of knowledge & skill; follow-up

“As a result of the training, my ability to develop, implement, and analyze the following evaluation measures has seen…”
Rating scale: 3 = Substantial progress, 2 = Some progress, 1 = No progress
1. Use of demographic data (S4c)
2. Use of satisfaction data (S4d)
3. Use of A, K, or S measures (S4e)
… (the remainder of the performance indicators, each rated 3 2 1)

Aggregating the Data

Data Collection
What data did we collect?
- Number of events/activities (effort)
- Satisfaction (effort)
- Change of awareness (effect)
- Change of knowledge & skill (effect)
- Follow-up evaluation (effect)
On what did we collect data?
- Outcome: Use of formative & summative evaluation (Systems 4)
- Performance indicators: demographic data (S4c), satisfaction data (S4d), awareness, etc. (S4e), outcomes measures (S4g), ongoing feedback (S4j), dissemination (S4k)

Data (mean ratings by event; the Webinar, Series, and Visits were rated on a 5-point scale, the Follow-up on a 3-point scale)
- Demographic data: m = 4 (Visits); m = 3 (Follow-up)
- Satisfaction data: m = 4; m = 3 (Follow-up)
- A, K, or S data: m = 4; m = 3 (Follow-up)
- Outcomes data: m = 4 (Webinar), m = 3 (Series), m = 4 (Visits), m = 3 (Follow-up)
- Ongoing feedback: m = 3 (Visits); m = 2.5 (Follow-up)
- Dissemination: m = 3 (Visits); m = 2 (Follow-up)

Aggregate the data
- Each event carries equal weight, or
- Assign a different weight to each event because events have different levels of importance.
- The follow-up evaluation should carry more weight because the intensity of the effort was greater.

Conversion to a 4-point scale (a conversion sketch follows below)
- Convert the 5-point scale to 4-point:
  - 5 (strongly agree) = 4 (achieved)
  - 4 (agree) = 3 (nearly)
  - 3 (neither) = 2 (emerging)
  - 2 & 1 (disagree) = 1 (non-existent)
- Convert the 3-point scale to 4-point:
  - 3 (substantial) = 4 (achieved)
  - 2 (some) = 3 (nearly)
  - 1 (no) = 1 (non-existent)
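A minimal Python sketch of the two conversions as lookup tables. Only the integer mappings come from the slide; the function name and the linear interpolation used for non-integer means (such as the 2.5 follow-up mean) are assumptions made for illustration.

```python
# 5-point agreement scale -> 4-point attainment scale (from the slide)
FIVE_TO_FOUR = {5: 4, 4: 3, 3: 2, 2: 1, 1: 1}
# 3-point progress scale -> 4-point attainment scale (from the slide)
THREE_TO_FOUR = {3: 4, 2: 3, 1: 1}

def convert_mean(mean, mapping):
    """Convert a mean rating; non-integer means are interpolated between
    the two nearest mapped scale points (an assumed choice)."""
    lo = int(mean)
    hi = min(lo + 1, max(mapping))
    frac = mean - lo
    return mapping[lo] + frac * (mapping[hi] - mapping[lo])

print(convert_mean(4, FIVE_TO_FOUR))     # 3  (agree -> nearly)
print(convert_mean(2.5, THREE_TO_FOUR))  # 3.5, matching the converted ongoing-feedback follow-up mean
```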

All events converted to the 4-point scale:
- Demographic data: m = 3 (Visits); m = 4 (Follow-up)
- Satisfaction: m = 3; m = 4 (Follow-up)
- A, K, or S: m = 3; m = 4 (Follow-up)
- Outcomes: m = 3 (Webinar), m = 2 (Series), m = 3 (Visits), m = 4 (Follow-up)
- Ongoing feedback: m = 2 (Visits); m = 3.5 (Follow-up)
- Dissemination: m = 2 (Visits); m = 3 (Follow-up)
Subtotal each event, with each event weighted equally (.25).
Mean of the means = the sum of the event subtotals divided by 4 events = 3.03 (nearly achieved). (A sketch of this computation follows below.)
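A sketch of the equal-weight aggregation: each event's converted performance-indicator means are averaged into a subtotal, and the subtotals are averaged across events. The per-event values here are illustrative placeholders, not verified cell-by-cell readings of the table.

```python
def mean_of_means(event_means):
    """Equal-weight aggregation: average each event's means, then average the subtotals."""
    subtotals = [sum(ms) / len(ms) for ms in event_means.values()]
    return sum(subtotals) / len(subtotals)

# Hypothetical converted (4-point) means per event:
events = {
    "webinar":   [3],
    "series":    [3, 3, 2],
    "visits":    [3, 3, 3, 3, 2, 2],
    "follow-up": [4, 4, 4, 4, 3.5, 3],
}
print(round(mean_of_means(events), 2))  # 3.02 with these illustrative values; the slide reports 3.03 ("nearly achieved")
```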

What does this mean? Effort:
NTAC conducted one national webinar, a series of webinars for one region, and several onsite consultations with two states to increase the states’ capacity to use formative and summative evaluation systems. Across the trainings, participants indicated 90% satisfaction with the skill of the consultants and the content of the activities.

What does this mean? Effect:
Across the trainings and consultations, participants report that they are very near achieving the ability to develop, implement, and analyze formative and summative evaluation measures to increase capacity and systems change (m = 3.03 on a 4.0 scale). We could elaborate by listing performance indicators if needed.

Response to NTAC Site Review: 2004–2005 Field Test
1) Embed Outcomes and Performance Indicators (OPIs) in planning and delivery of service.
2) Embed OPIs in all evaluation measures.
3) Share our evaluation systems and data with our state/multi-state partners.

Addressing the constraints
- Aligning needs, delivery, and evaluation through the OPIs yields a more seamless system (Constraint 1)
- Sharing data facilitates the use of data (Constraint 2)
- Consistency helps diminish confusion (Constraint 3)

Questions?

Contact
- Region 1: Shawn Barnard, Paddi Davies
- Region 2: Jon Harding, Barb Purvis
- Region 3: Nancy Donta, Amy Parker
- Region 4: Kathy McNulty, Therese Madden Rose

Additional Examples The following information will not be shared during the discussion, but is being provided as additional material.

Weighting the events differently Using the previous examples, let’s say that we believe the follow-up data should carry more weight since it indicates more long-term implementation and attainment of the outcome. We want the follow-up evaluation to carry 40% of the weight.

All events on the converted 4-point scale:
- Demographic data: m = 3 (Visits); m = 4 (Follow-up)
- Satisfaction: m = 3; m = 4 (Follow-up)
- A, K, or S: m = 3; m = 4 (Follow-up)
- Outcomes: m = 3 (Webinar), m = 2 (Series), m = 3 (Visits), m = 4 (Follow-up)
- Ongoing feedback: m = 2 (Visits); m = 3.5 (Follow-up)
- Dissemination: m = 2 (Visits); m = 3 (Follow-up)
Subtotal each event and apply the weights, with the follow-up carrying 40% of the weight.
Mean of the means: 14.20 divided by 4 events = 3.55 (with rounding, achieved!). (A sketch of weighted aggregation follows below.)
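A sketch of the alternative weighting: the same per-event subtotals, combined with explicit weights rather than a simple average. The subtotals, the even split of the remaining 60% across the other three events, and the function name are assumptions made for illustration; the slide arrives at 3.55 through its own subtotal and weight choices, which are not fully shown.

```python
def weighted_mean(subtotals, weights):
    """Combine per-event subtotals with explicit weights (assumed to sum to 1)."""
    return sum(weights[name] * subtotals[name] for name in subtotals)

# Illustrative subtotals on the converted 4-point scale and assumed weights
# (follow-up carries 40%; the other events share the remainder equally):
subtotals = {"webinar": 3.0, "series": 2.67, "visits": 2.67, "follow-up": 3.75}
weights   = {"webinar": 0.2, "series": 0.2, "visits": 0.2, "follow-up": 0.4}
print(round(weighted_mean(subtotals, weights), 2))  # 3.17 with these illustrative numbers
```

Weighting the follow-up more heavily pulls the aggregate toward the follow-up means, which is the intent stated on the previous slide: long-term implementation data should count for more.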

Embed in basic evaluation (Service Provider 1a)
- Satisfaction data: “I was satisfied with my opportunity to learn about the impact of deaf-blindness on an individual’s overall development (i.e. social, emotional, cognitive).”
- Change of awareness: “I have increased my awareness about the impact of deaf-blindness on an individual’s overall development (i.e. social, emotional, cognitive).”
- Change of knowledge/skill: “As a result of the training, I can use my knowledge about the impact of deaf-blindness on an individual’s overall development (i.e. social, emotional, cognitive) to plan instruction.”

Embed in Follow-up
- Service Providers: “Based on the recent training provided on understanding how a combined vision and hearing loss impacts learning and social/emotional development, please indicate your progress in performing the following tasks…”
- Child change: “Three months ago, you received technical assistance on understanding how a combined vision and hearing loss impacts learning and social/emotional development. As a result of that training, please indicate any progress the student has made in the following skills…”