Midwinter Inservice January 18, 2013. A WORD FROM OUR LEADER.

Survey Data Packets: An Explanation from Our Data Expert, Melissa Engel

This is what our raw data looks like…

And after the magic of the chart wizard… Totals represent the number of 4s, 3s, 2s, and 1s for that indicator.
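
For anyone curious how that roll-up could be reproduced without the chart wizard, here is a minimal Python (pandas) sketch. It assumes the raw export has one row per survey response with hypothetical "indicator" and "rating" columns; the actual packet layout may differ.

```python
# Minimal sketch (not the actual packet workflow): roll raw survey responses
# up into per-indicator totals of 4s, 3s, 2s, and 1s.
# The column names and the tiny inline data are illustrative assumptions.
import pandas as pd

raw = pd.DataFrame({
    "indicator": ["1.1", "1.1", "1.1", "1.2", "1.2"],
    "rating":    [4, 3, 4, 2, 1],
})

# Count how many 4s, 3s, 2s, and 1s each indicator received.
totals = (
    raw.groupby("indicator")["rating"]
       .value_counts()
       .unstack(fill_value=0)
       .reindex(columns=[4, 3, 2, 1], fill_value=0)
)
print(totals)
```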

1.1 The agency engages in a systematic, inclusive, and comprehensive process to review, revise, and communicate an agency purpose.

Let's Take A Look!
– Form groups of five.
– Each group member reviews one of the five standards.
– Summarize your review to your group.
– Large Group Debrief: one finding that surprises you, challenges you, or warms you.

Break Time

Review Stage 1: Ask Good Questions
Focus Area examples:
– "Improve Student Achievement"
– "Improve Communication to Stakeholders"
– "Improve School Needs Assessment Process"

Review Stage 1: Ask Good Questions
Ask questions that…
– data can answer
– are measurable
– we can do something about
– align with the Focus Area
– are clear

Review Stage 1: Ask Good Questions
Three types of questions:
– Descriptive
– Relational
– Causal

Review Stage 1: Ask Good Questions
Broad vs. sufficiently narrow: include who, what, and when.

Types of Data
– Demographic Data
– Perceptual Data
– Performance Data
  – Achievement
  – Outcomes
– Program Data

Demographic Data
Includes descriptors of organizations and of students.
Used to disaggregate focus data:
– Allows us to answer questions about subgroups
– Allows us to answer questions about equity
– Helps us focus later actions
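
As a rough illustration of disaggregation, the pandas sketch below groups a focus measure by a demographic descriptor; the column names ("subgroup", "proficient") are hypothetical, not actual district fields.

```python
# Minimal sketch: disaggregate a focus measure by a demographic descriptor
# so that subgroup and equity questions can be answered.
# "subgroup" and "proficient" are hypothetical column names.
import pandas as pd

students = pd.DataFrame({
    "subgroup":   ["A", "A", "B", "B", "B"],
    "proficient": [1, 0, 1, 1, 0],   # 1 = met the benchmark, 0 = did not
})

# Proficiency rate and student count for each subgroup.
by_group = students.groupby("subgroup")["proficient"].agg(rate="mean", n="count")
print(by_group)
```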

Perceptual Data
Includes observation data as well as stakeholder surveys and questionnaires.
Used to show correlations between perceptions and the question for inquiry.
This data is often gathered after the question for inquiry is developed, to ensure a direct connection between the data and the question.
Do not underestimate the validity and importance of perceptual data.
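
A minimal sketch of checking such a correlation, assuming perception and focus-measure values have already been summarized per building; both columns and all numbers here are illustrative assumptions.

```python
# Minimal sketch: relate a perceptual measure (mean survey rating per building)
# to the measure behind the question for inquiry (a made-up achievement rate).
# All numbers and column names are illustrative assumptions.
import pandas as pd

buildings = pd.DataFrame({
    "mean_survey_rating": [3.2, 2.8, 3.6, 3.0],
    "achievement_rate":   [0.72, 0.61, 0.80, 0.66],
})

# Pearson correlation between perception and the focus measure.
r = buildings["mean_survey_rating"].corr(buildings["achievement_rate"])
print(f"correlation between perception and achievement: {r:.2f}")
```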

Performance Data: Achievement
Includes state assessments, district assessments, CBMs (curriculum-based measurements), benchmark assessments, grades, and GPAs.
Used to improve or measure student achievement.
Be careful to avoid overreliance on any single source of achievement data; validate achievement data by using more than one source.
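
One way to act on that caution is to compare two sources side by side. The sketch below flags students whose results the two (hypothetical) sources do not corroborate; the column names and the 20-point threshold are assumptions for illustration only.

```python
# Minimal sketch: cross-check two achievement sources for the same students and
# flag large disagreements before trusting either source on its own.
# The column names and the 20-point threshold are illustrative assumptions.
import pandas as pd

scores = pd.DataFrame({
    "student":            ["s1", "s2", "s3", "s4"],
    "state_assessment":   [82, 65, 91, 70],
    "district_benchmark": [79, 88, 89, 68],
})

scores["gap"] = (scores["state_assessment"] - scores["district_benchmark"]).abs()
flagged = scores[scores["gap"] > 20]   # results the two sources do not corroborate
print(flagged)
```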

Performance Data: Outcomes
Includes any data that measures final results or conclusions: graduation rates, dropout rates, mobility rates, suspensions, expulsions, behavior rates, remediation rates, college acceptance, attendance, career readiness, and long-term career achievements or milestones.
Useful for determining whether a variable has an effect during a specific time period or at the end point of students' education.

Program Data
Includes descriptive data about how education and related activities are conducted within the organization: professional development, schedules, the physical setting, and policies and procedures.
Useful for considering the effects that systemic factors have on specific outcomes.

Stage 2: Find Trends & Make Observations

AdvancED: What's Next?
– Collect data based on your Data Action Plan and your Focus Questions.
– Bring this data to our next work day on May 15. At that session, you will learn to analyze, interpret, and display your data.
– You will receive bi-monthly to monthly reminders throughout the semester to complete your data collection.

Lunch: Literal Field Trip