Gifted and Talented Academy Year 3, Session 3 – March 11, 2014

Wireless Connection
• haeanet
• Password: education

Agenda
• Welcome
• Processing Home Play
• Digging Deeper
• Developing an Evaluation Plan
• Team Time

Welcome Back!
• Form mixed-district triads
• Share:
  – What do you know about your program today that you didn’t know at the beginning of the year?
  – How did you acquire this knowledge?
  – How can this information be used to improve programming?

Team Time/Home Play
• Finish your program brochure(s)
• Choose one of your questions and develop a plan to answer it:
  – Audience
  – Data collected
  – Data analysis
  – Conclusions
  – Communicate
• Implement that plan and bring the results to session 3.

Processing Home Play
• Talk with your team/table about your focus question
• Share the question, why you needed an answer, what you did to find an answer, what you found, and what you’ll do/did with that information

Quality Questions
• What did you discover about the question you were asking?
  – The information it led you to
  – The quality of the question itself

Program Evaluation Is… …the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future programming. --Robinson, 2009

Types of Evaluations
• Planning
  – Needs Assessment
• Formative
  – “Miracle in the Middle”
• Summative
  – Impact

Program Performance Measures: Questions about Program Service Delivery
• Input (Effort)
  – Quantity (How much did we do?): How much service did we deliver?
  – Quality (How well did we do it?): How well did we deliver service?
• Output (Effect)
  – Quantity: How much effect/change did we produce? (#)
  – Quality: What quality of effect/change did we produce? (%) Is anyone better off?
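In practice, the Output row reduces to a count and a percentage. A minimal sketch of that arithmetic in Python, using hypothetical student records (the field names and the growth criterion are illustrative assumptions, not part of the Academy materials):

```python
# Hypothetical records for students in a gifted program.
students = [
    {"name": "A", "served": True, "met_growth_target": True},
    {"name": "B", "served": True, "met_growth_target": False},
    {"name": "C", "served": True, "met_growth_target": True},
    {"name": "D", "served": False, "met_growth_target": False},
]

served = [s for s in students if s["served"]]

# Input quantity: how much service did we deliver?
students_served = len(served)

# Output quantity (#): how much effect/change did we produce?
num_improved = sum(1 for s in served if s["met_growth_target"])

# Output quality (%): what quality of effect/change did we produce?
pct_improved = 100 * num_improved / students_served if served else 0.0

print(f"Served: {students_served}")          # 3
print(f"Improved (#): {num_improved}")       # 2
print(f"Improved (%): {pct_improved:.0f}%")  # 67%
```

The same counts and percentages answer the “Is anyone better off?” question year over year, as long as the growth criterion is held constant.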

Black Box Evaluation vs. Glass Box Evaluation
[diagram: Input → Output]

Worth
• The extent to which a program or activity is essential to a school’s, district’s, agency’s, or individual’s mission. Worth is an indication of the program’s or activity’s perceived value to constituents or to a single individual. --Assessing Impact training, 2003

Merit
• The value of the program is judged by comparing its performance against established standards of excellence in the profession. --Assessing Impact training, 2003

How Could a Program Have…
• …worth but no merit?
• …merit but no worth?
• …both worth and merit?

Thus
• A program may have great merit yet be of little worth because it is not aligned with the organization’s mission or needs, or it may have great worth and little merit… Programs can be evaluated on the basis of both their worth and their merit. --Assessing Impact training, 2003

Program Evaluation Models and Tools
• Arkansas Evaluation Initiative
• Borland Evaluation Template
• Maker’s Responsive Model
• Self-Audit/Reflection Tool

AEI (Arkansas Evaluation Initiative)
• Designed to support in-house (formative) evaluation
• You don’t need to use all of the data sources

Borland Template

Developing an Evaluation Plan
• Read Chapter 18:
  – Form groups of 3-4
  – Read a segment & discuss
  – Read and discuss
  – Repeat until done
• Make connections to previous learning about program evaluation.
• Identify key points to consider in developing your evaluation plan.

Team Time/Home Play
• Develop a program evaluation plan.
• Include formative and summative components.
• Use the template on p. to help you develop the framework.
• Exact format is up to you, but make it flexible enough that the evaluation question(s), data collection instruments and methods, and communication of results could change from year to year within the same framework (see the sketch after this list).
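One hypothetical way to get that flexibility is to fix the plan’s fields while letting their contents change each year. The field names and sample entries below are illustrative assumptions, not the template from the book:

```python
from dataclasses import dataclass

@dataclass
class EvaluationComponent:
    """One formative or summative strand of a program evaluation plan."""
    question: str            # evaluation question; may change year to year
    kind: str                # "formative" or "summative"
    data_sources: list[str]  # instruments and methods; may change year to year
    analysis: str            # how the data will be summarized
    audience: str            # who receives the results
    reporting: str           # how results are communicated

# The framework (the fields above) stays fixed; the entries can be
# rewritten each year without changing the plan's overall shape.
plan = [
    EvaluationComponent(
        question="Are identified students making expected growth?",
        kind="summative",
        data_sources=["pre/post assessment", "student portfolios"],
        analysis="percent of served students meeting a growth target",
        audience="school board",
        reporting="annual written report",
    ),
    EvaluationComponent(
        question="Is instruction appropriately differentiated?",
        kind="formative",
        data_sources=["classroom walkthroughs", "teacher survey"],
        analysis="rubric ratings summarized by building",
        audience="GT coordinator and teachers",
        reporting="mid-year debrief",
    ),
]
```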

Home Play
• Read “Collecting Student Impact Data in Gifted Programs: Problems and Processes” (found on the Wiki under session 3)
• Identify 5 key points from the chapter and offer a 3-5 sentence summary of each, along with a short explanation of why it’s important.
• Identify another aspect of programming you need to take a closer look at and develop some possible evaluation questions.

Next Meeting
• April 8, 2014
• 8:30 a.m. – 4:30 p.m.
• Room 14, Heartland AEA, Johnston