Intro to NRS Data Diving
Mary A. Gaston, Ed.D. & Jennifer Cooper-Keels
February 4, 2011

2 Why Look at Data?
Data help us to:
- Replace hunches and anecdotes with facts concerning the changes that are needed;
- Identify root causes of problems;
- Identify whether student or program goals are being met; and
- Tell our stakeholders, including students, about the value of our programs and the return on their investments.

3 Data: A Carrot or a Stick?
Data may be used:
- To highlight, clarify, and explain what's happening in your program, OR
- To show what's not happening in your program.
"However beautiful the strategy, you should occasionally look at the results." –W. Churchill

4 Data Tell You
- Where you've been
- Where you are
- Where you're going
- How to get there
Data can help you design a quality program to help meet learners' goals.

5 Applied to Adult Education: What Can Data Do?
- Guide you to improve instruction
- Measure program success and effectiveness
- Tell you if what you are doing is making a difference
- Tell you which classes are getting the results you want, and which are not
- Get to the root of problems, such as poor retention, low educational gains, or low transition rates

Starting the Dive

7 Three Main Measures in Our Data System
- Attendance
- Educational Gain
- Transition Outcomes (Goals)
For this workshop, we will focus on attendance and educational gain.

8 Attendance
- Contact hours of instruction the learner receives (NRS)
- Includes intensity and duration
Can help to tell us whether:
- Instruction is successful
- Content and materials are relevant
- Students are motivated
- Students are reaching their goals
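Editor's illustration (not part of the original slides): the intensity and duration components of the attendance measure can be derived from raw contact-hour records. A minimal Python sketch, assuming hypothetical record fields (student ID, session date, hours) rather than any official NRS data layout:

```python
from datetime import date

def attendance_summary(records):
    """records: list of (student_id, date, hours) tuples.

    Returns {student_id: (total_hours, duration_weeks, intensity_hours_per_week)}.
    Duration is the span from first to last attended session; intensity is
    total contact hours divided by that span.
    """
    by_student = {}
    for student_id, day, hours in records:
        total, first, last = by_student.get(student_id, (0.0, day, day))
        by_student[student_id] = (total + hours, min(first, day), max(last, day))
    summary = {}
    for student_id, (total, first, last) in by_student.items():
        weeks = max((last - first).days / 7.0, 1.0)  # count at least one week
        summary[student_id] = (total, weeks, total / weeks)
    return summary

# Hypothetical records for two students
records = [
    ("s1", date(2011, 1, 3), 3.0),
    ("s1", date(2011, 1, 17), 3.0),
    ("s2", date(2011, 1, 3), 2.0),
]
summary = attendance_summary(records)
```

Two students with the same total hours can then be distinguished by intensity: one attending 3 hours every week shows a very different pattern than one logging the same total in a single week.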

9 Examples: What Increases Attendance
- Quality instruction and relevant content
- Well-trained teachers
- Clear goals set at intake, revisited regularly, and matched to teachers and content
- Reduction of obstacles: flexibility in programming, support services, and access to the site during off-hours
(NCREL; Lieb, 1991; Comings, 2007; Beder, 1988; Beder, 1991; Comings, Parrella, & Soricone, 1999; Kerka, 2005; Thoms, 2001; Porter, Cuban, & Comings, 2005)

10 Educational Gain
- Advancement through 12 educational functioning levels
- Core NRS measure
Can tell us:
- Whether the program/students are meeting goals
- Which sites/classes/teachers are most effective
- Extent of student progress
- Impact of changes
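Editor's illustration (not part of the original slides): educational gain is advancement through an ordered list of educational functioning levels, so a program's level-completion rate can be computed directly from entry and current levels. A minimal Python sketch; the level names below are an illustrative subset, not the official NRS level table:

```python
# Illustrative ordered levels (a subset; the NRS defines 12 in total)
LEVELS = ["ABE 1", "ABE 2", "ABE 3", "ABE 4", "ASE 1", "ASE 2"]

def made_gain(entry_level, current_level):
    """True if the student advanced at least one functioning level."""
    return LEVELS.index(current_level) > LEVELS.index(entry_level)

def completion_rate(students):
    """students: list of (entry_level, current_level) pairs."""
    gains = sum(made_gain(entry, current) for entry, current in students)
    return gains / len(students)

# Hypothetical cohort: 2 of these 3 students advanced a level
cohort = [("ABE 1", "ABE 2"), ("ABE 2", "ABE 2"), ("ABE 3", "ASE 1")]
rate = completion_rate(cohort)
```

Computing this rate per site, class, or teacher is one way to see which are most effective, as the slide suggests.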

11 Examples: What Increases Educational Gain
- Learner-centered classes
- Focus on relevant knowledge
- Opportunity for practice and application
- Coherence
- Sufficient intensity and duration
(NRC, 1999; Garet, Porter, Desimone, Birman, & Yoon, 2001)

12 Do You Trust Your Data?
Data analysis is only as good as the original data allow. Keys to a good data collection system include:
- Clear policies and procedures for data entry
- Data entered and reviewed daily, weekly, or monthly
- Teachers, staff, and administrators all having access to the data and reviewing them regularly
- Teachers sharing data with students
What does your program do to ensure data are accurate, reporting is timely, and staff have access to the data?
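Editor's illustration (not part of the original slides): the review steps above can be partly automated with simple record-level validity checks. A minimal Python sketch; the record fields and the valid score range are assumptions for the example, not values from any particular assessment:

```python
def check_record(record, score_range=(200, 600)):
    """Return a list of problems found in one student record.

    `score_range` is an assumed valid pretest range for illustration.
    """
    problems = []
    lo, hi = score_range
    if record.get("pretest") is None:
        problems.append("missing pretest score")
    elif not lo <= record["pretest"] <= hi:
        problems.append("pretest score out of range")
    if record.get("hours", 0) < 0:
        problems.append("negative attendance hours")
    return problems

# Hypothetical roster with one clean record and two problem records
roster = [
    {"id": "s1", "pretest": 512, "hours": 40},
    {"id": "s2", "pretest": 999, "hours": 12},
    {"id": "s3", "pretest": None, "hours": -2},
]
issues = {r["id"]: check_record(r) for r in roster}
```

Running checks like these before monthly reporting catches entry errors while they are still easy to correct.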

Take a Dip in the Data Pool

14 Dive into the Data Pool
For each of the next few slides, write down your observations for discussion:
- What do you see?
- What is interesting or unusual?
- Do any questions or hypotheses come to mind as a result?

15–21 Write Observations/Questions
(The data charts shown on these slides are not included in the transcript. Slide 20 compared classes labeled Satellite R, Main PM, Young Adult, and Main AM.)

22 Where to Go From Here?
Based on what was learned from this "data dive":
- What should I change or replicate?
- What data support this change?
- What additional data should be reviewed?
- What is the timeframe for change? Is it realistic?
- What obstacles/barriers will we encounter?
- What is the follow-up plan to measure and evaluate change?

23 You Just Did a Data Dive!

24 Questions: Attendance & Retention

Data collection & quality
- Sample questions: Who enters attendance data at each site? How often are attendance data entered?
- Further questions: Who checks the data? How often?

Students
- Sample questions: How does attendance differ by student type (ESL vs. ABE)?
- Further questions: When in the term do students tend to drop or stop out most? Is this the same across sites?

Teachers
- Sample questions: Which classes have very high (or low) attendance?
- Further questions: Do teachers with high attendance have greater educational gains?

Instruction
- Sample questions: Does attendance vary by instructional content (e.g., GED, workplace) or level?
- Further questions: How many hours does it take to achieve a goal, on average?

Program
- Sample questions: What is the average attendance for my program?
- Further questions: Are my program's attendance hours similar to other programs'?

Program policy
- Sample questions: Are my managed enrollment classes more successful than open classes?
- Further questions: Does managed enrollment result in higher educational gains or greater goal achievement?
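Editor's illustration (not part of the original slides): one of the sample questions above, which classes have very high or low attendance, can be answered with a simple grouped average. A minimal Python sketch; the class names and hour values are invented for the example:

```python
from collections import defaultdict

def average_hours_by_class(enrollments):
    """enrollments: list of (class_name, hours) pairs.

    Returns {class_name: average contact hours per enrolled student}.
    """
    totals = defaultdict(lambda: [0.0, 0])  # class -> [sum of hours, count]
    for cls, hours in enrollments:
        totals[cls][0] += hours
        totals[cls][1] += 1
    return {cls: total / n for cls, (total, n) in totals.items()}

# Hypothetical enrollment records
data = [("Main AM", 60), ("Main AM", 40), ("Main PM", 20), ("Main PM", 30)]
averages = average_hours_by_class(data)

# Rank classes from highest to lowest average attendance
ranked = sorted(averages, key=averages.get, reverse=True)
```

The same grouping pattern works for the other questions in the table: group by site, teacher, or student type instead of class.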

25 Questions: Educational Gain

Data collection & quality
- Sample questions: What is the range of pre/posttest scores in my program/site?
- Further questions: Are all the test scores within the correct range for the test and class level?

Students
- Sample questions: Which students are most likely to complete a level (student characteristics)?
- Further questions: Do students with higher contact hours have greater completion rates?

Teachers
- Sample questions: What teacher characteristics are most related to level completion?
- Further questions: How high is teacher turnover at each site? Which sites retain teachers longest/best?

Instruction
- Sample questions: Which instructional approaches have the greatest impact on gain?
- Further questions: Do assessments match course content?

Program
- Sample questions: How many hours of PD do our teachers participate in?
- Further questions: Which PD activities have the greatest impact on student learning?

Program policy
- Sample questions: Do placement policies differ among sites?
- Further questions: Which placement policies have an impact on educational gains?

26 Tools for Data Diving

27 What Do I Want to Know?
What questions do you want to answer about your own local program?