What Does “Data Training” Look Like?

What the Research Says
Jimerson and Wayman (2011)
– There is little research on the best ways to provide effective data-related professional learning
– The admonition to “use data to inform instruction” implies an entire constellation of knowledge and skills

What the Research Says
Jimerson and Wayman (2011): educators need to learn
– Types of data and their interactions
– Triangulation of data (see the sketch below)
– Basic data concepts
These are best learned by examining data tied to questions that matter, which overcomes the worry of “what should we do with all this data?” Educators also want confidence that repeated practice will lead to understanding, and they ask for “cheat sheets” and templates.
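To make the “triangulation of data” idea concrete, here is a minimal Python sketch that compares one student's results on three hypothetical measures of the same skill and flags whether they tell a consistent story. The scores, cut points, and measure names are invented for illustration, not drawn from Jimerson and Wayman.

```python
# Minimal triangulation sketch: compare three hypothetical measures of the
# same skill and flag whether they agree. All data, cut scores, and labels
# below are invented for illustration.

def performance_band(score, cut_scores=(40, 70)):
    """Map a 0-100 score onto below / approaching / meeting."""
    low, high = cut_scores
    if score < low:
        return "below"
    if score < high:
        return "approaching"
    return "meeting"

# Three data sources for one (hypothetical) student on the same skill.
measures = {
    "state_test": 62,
    "district_benchmark": 58,
    "classroom_assessment": 71,
}

bands = {name: performance_band(score) for name, score in measures.items()}
print(bands)

# Triangulation question: do the measures agree?
if len(set(bands.values())) == 1:
    print("Measures agree; act on the shared picture.")
else:
    print("Measures disagree; gather more evidence before deciding.")
```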

What the Research Says
US Department of Education (2011) identifies five skill areas:
1. Data Location
2. Data Comprehension
3. Data Interpretation
4. Data Use in Instructional Planning
5. Question Posing

What the Research Says
US Department of Education (2011)
1. Data Location
– Find information in a table or graph to answer a question (sketched below)
2. Data Comprehension
– Manipulate data to answer a question
– Translate numbers into a verbal statement
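A small sketch of what “data location” and “data comprehension” can look like in practice, using an invented three-student roster; the column names and scores are illustrative only.

```python
# Minimal sketch of "data location" and "data comprehension":
# find a value in a small table, then restate the numbers in words.
# The roster and scores are invented for illustration.

roster = [
    {"student": "A", "grade": 4, "reading_score": 198},
    {"student": "B", "grade": 4, "reading_score": 205},
    {"student": "C", "grade": 4, "reading_score": 189},
]

# Data location: answer "What did student B score in reading?"
score_b = next(row["reading_score"] for row in roster if row["student"] == "B")

# Data comprehension: manipulate the data (compute a class mean)
# and translate the numbers into a verbal statement.
mean_score = sum(row["reading_score"] for row in roster) / len(roster)
print(f"Student B scored {score_b}, which is "
      f"{score_b - mean_score:+.1f} points relative to the class mean of {mean_score:.1f}.")
```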

What the Research Says
US Department of Education (2011)
3. Data Interpretation
– Difference between cross-sectional and longitudinal data
– Difference between status, improvement, and growth
– Impact of “N” size and of outliers (see the sketch below)
– Implications of measurement error
– Importance of subgroups
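Two of the cautions above, the impact of a single outlier and of a small “N,” can be illustrated with a short sketch using invented scores. The standard-error line is one common way to show that small groups carry more uncertainty; it is an illustrative choice, not a method prescribed by the Department of Education report.

```python
import math
import statistics as stats

# Minimal sketch of two interpretation cautions:
# (1) a single outlier can move a small group's mean noticeably, and
# (2) a small "N" means a larger standard error around that mean.
# The scores are invented for illustration.

scores = [72, 75, 78, 74, 76]          # small group, N = 5
scores_with_outlier = scores + [20]     # one outlying score added

for label, data in [("without outlier", scores), ("with outlier", scores_with_outlier)]:
    mean = stats.mean(data)
    se = stats.stdev(data) / math.sqrt(len(data))   # standard error of the mean
    print(f"{label:16s} N={len(data)}  mean={mean:5.1f}  SE={se:4.1f}")
```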

What the Research Says
US Department of Education (2011)
3. Data Interpretation (continued)
– Difference between mean, proportion, and range
– Difference between a continuous-data graph and a categorical-data graph
– Difference between a percentage and a percentile (see the sketch below)
– Categorical prediction based on population
– Role of multiple measures (“triangulation”)
– Validity, reliability, and fidelity
– Use of standard scores to determine degree of growth
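The percentage-versus-percentile distinction and the use of a standard score to gauge growth lend themselves to a brief worked example. All scores, the norm sample, and the “typical gain” figures below are hypothetical.

```python
# Minimal sketch: a percentage describes the share of items a student got
# right; a percentile describes where the student stands relative to a norm
# group. A standard (z) score expresses growth in standard-deviation units.
# All numbers are invented for illustration.

student_correct, items = 42, 60
percentage = 100 * student_correct / items          # share of items correct

norm_group = [35, 38, 40, 41, 43, 45, 47, 50, 52, 55]  # hypothetical norm sample
student_score = 42
percentile = 100 * sum(s < student_score for s in norm_group) / len(norm_group)

print(f"Percentage correct: {percentage:.0f}%  |  Percentile rank: {percentile:.0f}")

# Standard score for growth: this year's gain relative to the norm group's
# typical gain and its spread (both figures invented).
gain, typical_gain, gain_sd = 7, 5, 4
z = (gain - typical_gain) / gain_sd
print(f"Growth z-score: {z:+.2f} (above 0 means more growth than is typical)")
```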

What the Research Says
US Department of Education (2011)
4. Data Use in Instructional Planning
– Item analysis (see the sketch below)
– Disaggregation by skill clusters
– Implications for differentiated instruction
– Implications for reteaching
– Implications for interventions
5. Question Posing
– Questions that data can vs. cannot answer
– Questions that the available data can answer
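A minimal sketch of item analysis and disaggregation by skill clusters, assuming an invented response matrix (1 = correct, 0 = incorrect). A full item analysis would also look at discrimination and distractor patterns; this only shows per-item difficulty and cluster-level results.

```python
# Minimal item-analysis sketch: per-item difficulty (proportion correct) and
# results disaggregated by skill cluster. The response matrix is invented.

# Rows are students, columns are items; 1 = correct, 0 = incorrect.
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 1, 0, 0],
    [1, 1, 0, 1],
]
item_skills = ["fractions", "fractions", "geometry", "geometry"]

n_students = len(responses)
for item in range(len(item_skills)):
    p = sum(row[item] for row in responses) / n_students   # item difficulty
    print(f"Item {item + 1} ({item_skills[item]}): {p:.0%} correct")

# Disaggregate by skill cluster to guide reteaching and intervention decisions.
for skill in sorted(set(item_skills)):
    cols = [i for i, s in enumerate(item_skills) if s == skill]
    correct = sum(row[i] for row in responses for i in cols)
    print(f"{skill}: {correct / (n_students * len(cols)):.0%} correct overall")
```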

Resources
Dan Venables
– The Practice of Authentic PLCs
– How Teachers Can Turn Data into Action
– Data Action Model:
1. Ask questions of the data
2. Obtain additional data to triangulate
3. Identify learner gaps and instructional gaps (sketched below)
4. Set an improvement goal
5. Research and commit to improvement strategies
6. Implement the strategies
7. Evaluate fidelity and impact
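As a rough illustration of steps 3 and 4 of the Data Action Model (identifying gaps against an improvement goal), the sketch below compares invented proficiency rates with a hypothetical target. It is not part of Venables' materials; the skill names, rates, and goal are made up for illustration.

```python
# Illustrative sketch of Data Action Model steps 3-4: identify learner gaps
# relative to an improvement goal. All figures are invented.

current_proficiency = {"fractions": 0.48, "geometry": 0.71, "measurement": 0.62}
improvement_goal = 0.70   # hypothetical target proportion proficient

# Keep only the skills that fall short of the goal, with the size of the gap.
gaps = {skill: improvement_goal - p
        for skill, p in current_proficiency.items()
        if p < improvement_goal}

# Largest gaps first: candidates for improvement strategies (step 5).
for skill, gap in sorted(gaps.items(), key=lambda kv: -kv[1]):
    print(f"{skill}: {gap:.0%} below the goal -> candidate for an improvement strategy")
```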