You have a ton of data about your 1:1 program, now what?
Jeni Corn, Elizabeth Halstead, Ruchi Patel, Danny Stanhope

Presentation transcript:

You have a ton of data about your 1:1 program, now what?
Jeni Corn, Elizabeth Halstead, Ruchi Patel, Danny Stanhope
NCLTI Winter Leadership Institute
Friday Institute for Educational Innovation, NC State University

Description School and district leaders of 1:1 learning initiatives need data from teachers and students to guide effective decision-making around planning and implementing successful projects. This session will provide opportunities to examine survey and observation data from real 1:1 schools and to discuss with colleagues what decisions should be made based on those results.

Welcome and Introductions
Session Agenda
o Our names, roles, and a description of the FI Evaluation Team
o Overview of Evaluation and Instruments
o Small Group Review of Results and Report Out
o Large Group Discussion

Evaluation Overview
Evaluation: finding what works and what does not work for your project.
Purpose of educational evaluation: helping schools/districts use evaluation to continuously monitor and make effective decisions to improve their education projects for the benefit of their students, staff, and community.

Evaluation Overview
“When the cook tastes the soup, that's formative evaluation. When the guests taste the soup, that's summative evaluation.” ~ Bob Stake

Evaluation Overview
Every evaluation has these components:
o Planning
o Data collection
o Data analysis
o Results

Evaluation Overview
Monitor and Adjust

Evaluation Overview
Evaluation is an iterative process.

Measures/Data Sources
Where do you get the data you need?
o Check whether data already available to you might help answer your questions.
o Determine the data sources you might use to meet remaining data needs.
o Note that data sources are not data. Example: teachers' lesson plans are a rich data source, but it is necessary to “mine” them for actual data, which means you need an instrument such as a Lesson Plan Checklist or Lesson Plan Rubric.

Measures/Data Sources
Common Data Traps
o “Biting off more than you can chew”
o Not collecting data needed to answer important questions
o Collecting data that is not really useful
o Jumping to decisions about sources of data
o Neglecting hard-to-quantify data
o Not formalizing “informal data” (e.g., anecdotes, unrecorded observations)

Measures/Data Sources
Common Data Traps (continued)
o Not using valuable data after it has been collected
o Neglecting implementation data (about activities and strategies)
o Neglecting impact data (about objectives)
o Making inferences about implementation from impact data, and vice versa
Remember: time spent planning saves more time later!

Measures/Data Sources
Collecting Evaluative Information: Data Sources and Methods, Analysis, and Interpretation
o The selection of sources and methods depends on the evaluation questions.
o Consider a wide array of methods to collect data.
o Quantitative data are analyzed using descriptive or inferential statistics.
o Qualitative data are analyzed for patterns or themes.
o Data must be interpreted, not just analyzed. (A brief illustration of both kinds of analysis follows below.)
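As a rough illustration of the quantitative/qualitative distinction above, here is a minimal Python sketch. It assumes a small, made-up survey table; the column names (q1_confidence, q2_frequency, theme) are hypothetical and are not actual STNA or 1:1 Student Survey fields.

```python
# Minimal sketch: summarizing evaluation data with pandas.
# All column names and values below are hypothetical examples.
import pandas as pd

responses = pd.DataFrame({
    "q1_confidence": [4, 5, 3, 4, 2, 5, 4],   # Likert items coded 1-5
    "q2_frequency":  [3, 4, 4, 5, 2, 3, 4],
    # Themes coded by hand from open-ended answers
    "theme": ["access", "training", "access", "time", "training", "access", "time"],
})

# Quantitative: descriptive statistics for each Likert item.
print(responses[["q1_confidence", "q2_frequency"]].describe())

# Qualitative: frequency of coded themes from open-ended responses.
print(responses["theme"].value_counts())
```

Output like this still has to be interpreted in light of the evaluation questions; the numbers themselves are not findings.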

Measures/Data Sources
Types of Data Sources
o Documents
o Records
o Observations
o Site visits
o Surveys
o Interviews
o Focus groups
o Tests

Measures/Data Sources
Some Examples for Evaluating Technology Projects
1. Documents: rubrics for lesson plans, student products
2. Records: logs of teacher technology use
3. Structured observations: LoFTI
4. Likert-scale surveys: STNA, 1:1 Student Survey collaborative/adminresources/
(A small data-aggregation sketch follows below.)
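Extending the examples above, structured-observation records can be tallied to show how technology is being used school-wide. This is only a sketch under assumed column names (classroom, tech_use); it does not reflect the real LoFTI data format.

```python
# Minimal sketch: aggregating structured observation records.
# The columns and values are hypothetical, not the actual LoFTI schema.
import pandas as pd

observations = pd.DataFrame({
    "classroom": ["A101", "A101", "B204", "B204", "C310"],
    "tech_use":  ["student research", "presentation",
                  "student research", "drill and practice", "presentation"],
})

# How often was each use of technology recorded across all observations?
print(observations["tech_use"].value_counts())

# In how many distinct classrooms did each use appear?
print(observations.groupby("tech_use")["classroom"].nunique())
```

Counts like these can then feed the guiding questions in the small-group activity: what do the patterns mean for the school, and what decisions follow from them?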

Session Activities – Small Groups
o Introductions
o Instruments: STNA, LoFTI, and 1:1 Student Survey
o Results
o Guiding Questions
o Share Out

Session Activities
Guiding Questions (15 min.):
1. What are the results? How would you summarize the data?
2. What do the results mean for your school or district?
3. What are you going to do now? What decisions would you make about things like PD, infrastructure, tools/resources, staffing, School Improvement Plans, submitting a grant, or partnerships?
Share Out

Session Activities
Whole Group Guiding Question: What might be next steps or decisions based on all three data sources?