
1 Using Cost-Effective Processes to Develop Large-Scale Data-Driven Continuous Improvement Systems for Local Programs Laurie A. Van Egeren, Jamie Wu, Michigan State University; Angelina Garner, Charles Smith, David P. Weikart Center for Youth Program Quality. American Evaluation Association, Minneapolis, MN, October 26, 2012

2 21st Century Community Learning Centers

3 Michigan 21st CCLC in 2011-2012: TACSS provides quality assurance for 320 elementary, middle, and high school sites; 40,000+ students; an investment of over $50M

4 Evaluation Scope in 2003

5 Evaluation Scope Now

6 Continuous Improvement: Assess → Plan → Improve

7

8 Quality Improvement Support System: MDE (State Education Agency) → MSU (State Evaluator) → TACSS (Program Quality Improvement)

9 Standard Indicators of Quality

10 Cost-Effective Data Reports

11 X

12 Capacity-Building for Data Use

13 1. Standard quality indicators

14 Identify Indicators: MDE, TACSS, MSU, Advisory Board, Literature

15 1. Instructional Context

16 2. Organizational Context

17 3. Positive Relationships

18 Select Data Sources: youth survey, parent survey, staff survey, supervisor survey, administrator report, observational program self-assessment, attendance/activity data (web), school outcomes data

19 Comparability: Weighted 10-point scale

Measure                                                                 Weight
Academic activity participation                                          1.5
Homework help/tutoring participation for academically at-risk students   1.5
Academic enrichment participation                                        1.5
Activities informed by grade-level content standards                     1
Student reports of academic support quality                              1.5
Academics is top priority                                                0.5
Supervisor connection to school-day content                              1
Staff connection to school-day content                                   1.5
Total                                                                   10
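The weighted scale above can be sketched in a few lines. A minimal illustration: the weights are taken from the slide, but the short measure names and the assumption that each measure is pre-scaled to the 0-1 range are ours.

```python
# Weights from the slide's comparability table; keys are our own
# shorthand for the measures, not the evaluators' variable names.
WEIGHTS = {
    "academic_activity_participation": 1.5,
    "homework_help_at_risk": 1.5,
    "academic_enrichment_participation": 1.5,
    "grade_level_content_standards": 1.0,
    "student_reported_academic_support": 1.5,
    "academics_top_priority": 0.5,
    "supervisor_school_day_connection": 1.0,
    "staff_school_day_connection": 1.5,
}  # weights sum to 10

def indicator_score(measures):
    """Weighted sum of 0-1 measure scores -> 0-10 indicator score."""
    return round(sum(WEIGHTS[k] * measures[k] for k in WEIGHTS), 2)
```

Because the weights sum to 10, a site scoring the maximum on every measure lands exactly at 10, which is what makes indicators with different numbers of measures comparable.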

20

21

22

23

24

25 Go deeper: comparisons on measures (Sites, Org, State)

26 Go deeper: comparisons on measures. Measure scores, however defined, are mapped onto the 0-10 point indicator scale.
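A minimal sketch of putting measure scores, "however defined," onto the common 0-10 scale. Min-max rescaling is one simple option; the actual mapping the evaluators used is not specified on the slide.

```python
def rescale(value, lo, hi):
    """Map a raw measure score from its native range [lo, hi] onto 0-10.

    E.g. a mean of 3 on a 1-5 survey scale maps to 5.0.
    """
    return 10 * (value - lo) / (hi - lo)
```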

27 Go deeper: comparisons on measures across sites. Uh oh…

28 Even deeper – item data for sites

29 Uh oh…

30 Grantee Summary, Site Comparisons, Site Details

31 2. Cost-effective local report production

32 Assumption: You’re analyzing data anyway

33 Process
1. Collect data
2. Develop report template in Word
3. Analyze data to match [decisions]
4. Create Excel or .csv file of data
5. Use Word mail merge to populate reports (tweak if necessary)
6. Voilà!

34 Step 1: Collect Data: youth survey, parent survey, staff survey, supervisor survey, administrator report, observational program self-assessment, attendance/activity data (web), school outcomes data

35 Step 1: Collect Data: youth survey, parent survey, staff survey, supervisor survey, administrator report, observational program self-assessment, attendance/activity data (web), school outcomes data

36 Step 2: Develop report template (Grantee Summary, Site Comparisons, Site Details)

37 Step 3: Analyze for report – Decisions!

Indicator                                          MI    Org   1     2     3
1.4 Connection to School Day                       4.7   4.1   4.4   6.9   4.2
Formal policy for connecting with school day a,b   69%   75%
Supervisor communication with school e             46%   11%   0%
Staff communication with school d                  27%   21%   40%   25%
School investment in program b                     61%   80%   Yes

38 Step 3: Analyze for report – Decisions!

Indicator                                          MI    Org   1     2     3
1.4 Connection to School Day                       4.7   4.1   4.4   6.9   4.2
Formal policy for connecting with school day a,b   69%   75%
Supervisor communication with school e             46%   11%   0%
Staff communication with school d                  27%   21%   40%   25%
School investment in program b                     61%   80%   Yes

Impute to get indicator score
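One simple way to realize the "impute to get indicator score" step: when a site is missing a measure, fill it with the mean of that site's observed measures before applying the weights. This mean-imputation rule is our illustration; the slide does not specify which imputation method the evaluators used.

```python
def impute_and_score(measures, weights):
    """Mean-impute missing (None) measures, then compute the
    weighted indicator score. Assumes at least one observed measure."""
    observed = [v for v in measures.values() if v is not None]
    mean = sum(observed) / len(observed)
    filled = {k: (mean if v is None else v) for k, v in measures.items()}
    return sum(weights[k] * filled[k] for k in weights)
```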

39 Step 3: Analyze for report – Decisions! What is the minimum N? (varies) How are cut-offs determined?
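Those two decisions can be encoded directly in the analysis script. A sketch under stated assumptions: both the minimum N (which the slide says varies) and the cut-off values and category labels below are illustrative, not the project's actual rules.

```python
MIN_N = 10  # illustrative; the minimum N varies by data source

def report_category(score, n):
    """Suppress small-N estimates, then bucket a 0-10 indicator
    score into a reporting category (cut-offs illustrative only)."""
    if n < MIN_N:
        return "Not reported (N too small)"
    if score >= 7:
        return "Strength"
    if score >= 4:
        return "Typical"
    return "Focus area"
```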

40 Use syntax!

41 Step 4. Create Excel or .csv file
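Step 4 is just writing one row per report unit (here, per site) with column names that match the merge fields in the Word template. A minimal sketch; the site names, field names, and values are hypothetical.

```python
import csv

# One row per site; column headers must exactly match the merge
# fields defined in the Word report template.
rows = [
    {"site": "Site A", "connection_to_school_day": 6.9, "n_youth": 54},
    {"site": "Site B", "connection_to_school_day": 4.2, "n_youth": 31},
]

with open("merge_data.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
```

Word's mail merge (Step 5) then consumes this file directly as its data source, producing one populated report per row.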

42 Step 5. Mail merge Excel file into template

43

44

45 3. Capacity-building for data use

46 A Decade in the Making
Late 1990s – TA and quality assessment model developed in 2,000 MI School Readiness classrooms (preschool)
2003-2005 – Michigan Standards for OST; YPQA Self-Assessment piloted and mandated
2008 – Quality improvement planning and support from local evaluators mandated
2009 – TACSS begins

47 Why TACSS?

48 TACSS Goals
1. Grow a culture of performance accountability
2. Develop a low-stakes infrastructure for continuous quality improvement
3. Improve overall quality of 21st CCLC services and start-up for new sites
4. Improve the instructional quality for young people

49 Important Concepts Underpinning TACSS
- Policy setting: low-stakes accountability
- Organization setting: management skills for continuous quality improvement
- Instructional setting: instructional quality
- TA/coach values & methods

50 Important Concepts Underpinning TACSS: Management Skills for CQI
ASSESS: lead a team to assess the quality of instruction; provide real-time staff performance feedback
PLAN: lead the team to create an improvement plan based on data; select aligned methods training for direct staff
IMPROVE: carry out the plan to improve instructional quality; monitor progress and repeat

51 Important Concepts Underpinning TACSS: High-Stakes Accountability Policy. Objective Data → Publicity → Action → Improved Outcomes

52 Important Concepts Underpinning TACSS: Low-Stakes Accountability Policy for CQI. Objective Data → Meaningful Information → Action/Expertise → Improved Outcomes (low-stakes accountabilities, learning community, improvement efforts)

53 TACSS Project Model in Detail
Regional TA Coaches → Improve Service Quality & Child Outcomes
- 5-year project
- 5.5 FTEs (1 manager, 4 TA/Coaches, 1 support staff)
- 1 PTE (Contract Coach)

54 The TACSS Model: Comprehensive Support Sequence
1. MDE kickoff event
2. Introductory meeting with grantee
3. Data profile assembled
4. Onsite visit, data profile review, and prep for TA Planning Day
5. Director interview regarding CQI practices & red-flag issues
6. Team self-assessment of instructional quality
7. Planning with Data sessions; develop TA plan
8. Maintenance of TA plan with ongoing TA/coach support

55 Data Driving the System: leading indicators to program improvement (Grantee Profile, Site Profiles, Site Detail)

56 Technical Assistance Plan
- Co-created
- Linear/sequential
- Accountability
- Intentionality
- Scheduling
- Use of data to drive decision-making
- Living/working document

57 Core & Supplemental Services Menu

58 The TACSS Model: Comprehensive Supports are Multi-level (policy context, organizational context, instructional setting; higher to lower intensity)
- School district and union issues around staffing
- Understanding vendor and partnership relationships
- Training for conflict resolution
- Support continuation grant/renewal
- Support program self-assessment
- Site visits to provide quality coaching

59 TACSS Calendar Year (July through the following August)
- Letters: coach's reflection letters; personal invitation to kickoff orientation
- TACSS orientation at kickoff
- Introductory TACSS meeting
- TA planning
- Self-assessment process support (YPQA, PIP)
- External assessment scheduling/observation
- Leading indicator introduction/review (PD)
- External assessment review
- Data planning session (support PD to lead staff)
- Mission is Possible professional development opportunity
- Monthly follow-up communications

60 To Sum Up
- Leading indicators = roadmap to a quality program
- Founded in mass-reported data
- Decisions about changes are driven by data
- Technical assistance supports programs to use that data in ways they identify

61 Questions…
