1 It's All About Program Improvement
Part I: Finding the Red Flags in Your Data
Part II: Student Retention – Sharing What Works
Michigan Administrative Leadership Institute: Building World Class Programs
March 2004
2 It Is All About Program Improvement
Improving the quality of our services and the success of our students
3 Program Improvement: Three Starting Points
– Assessing current capabilities (programs and staff)
– Using data to pinpoint program improvement targets
– Integrating research
4 TODAY: Using Data to pinpoint program improvement targets
5 Warming Up to Data
When I talk about data and graphs, I feel like (a) _______________________ because _________________________.
6 Warming Up to Data
"I feel like Marcia Clark because I've got all kinds of data and the jury (my teachers) just won't believe it."
"I feel like I'm in the wrong room because I noticed the accountants meeting next door."
7 Part I Training Objectives
By the end of today's workshop, you will be able to use a data analysis process to promote continuous improvement by:
– Developing critical success factors for key program components,
– Identifying types and sources of available data for those components,
– Determining appropriate indicators for flagging potential problems/red flags,
– Generating appropriate questions to identify possible causes of red flags, and
– Developing a structure to apply what you learned.
8 OVERVIEW – Part I: Types of data you have at hand (Handout #1A)
– Performance data
– Local census data
– Enrollment and attendance data
9 OVERVIEW – Part II: Using data for decision making
1. What are the key program components of my program?
2. How do I know we are successful with each?
3. What data do I have to determine if we are successful?
10 OVERVIEW – Part II: Using data for decision making (continued)
4. Finding the red flags (What is not "good enough"?)
5. Isolating the problem
6. Finding and pilot testing possible alternatives
7. Integrating the new alternative throughout my program (going to scale)
11 OVERVIEW – Part III: Making it work
– Structure: Creating a structure and process to do the 7 items above
– Resources: Setting aside a few resources to perform the 7 items above
12 Data: A Carrot or a Stick?
Data can be used…
– To highlight, clarify, and explain what's happening in your program, OR
– To show what's not happening in your program
13 Using Data to Your Advantage (i.e., as a carrot) is up to you.
14 Data Tells You…
– What has happened,
– What is happening now, and
– Where to focus your program improvement energy.
15 What kinds of data do you have?
– Attendance/enrollment numbers
– Student test scores and learning gains
– Student drop-out/completion rates
– Teacher characteristics
– Student demographics
– Budgets
– Census data
– Other
16 What can the data tell me?
– Are students coming? (enrollment)
– Are they staying? (retention)
– Are they learning? (educational gains)
– Are they meeting their goals? (student transitions)
– Other?
17 Functions of Data
– Accountability: Help us identify whether goals are being met
– Program Management and Improvement: Help us isolate problems; help us replace hunches and hypotheses (gut and guru) with facts concerning the changes that are needed
– Marketing: Tell our funders, our boards, and our communities about the value of our programs and the return on their investments
18 Quote of the Day
"People without information cannot act. People with information cannot help but act." (Ken Blanchard)
19 Activity 1: Warming Up to Data
Data show that young adults in your program are dropping out at a significantly higher rate than older adults.
20 Activity 1: Warming Up to Data
– What data would have told you this?
– What drop-out rate would you consider to be a problem (e.g., 5%, 20%)?
– Are there additional data you would want to look at to explore this issue?
– What questions would you ask?
21 Program Improvement: Three Starting Points
– Assessing current capabilities (programs and staff)
– Using data to pinpoint program improvement targets
– Integrating research
22 Self Assessment:
1. Administer a program self assessment.
2. Engage the workgroup to review the self assessment, clarify and prioritize needs.
3. Set the vision for your program improvement initiative.
4. Gather and select promising alternative strategies.
Data Analysis:
1. Conduct an analysis of data.
2. Engage the workgroup to review the data analysis, clarify and prioritize needs.
3. Set the vision for your program improvement initiative.
4. Gather and select promising alternative strategies.
Research Findings:
1. Access research findings.
2. Understand the focus, questions, and implications.
3. Judge potential contributions and compatibility with your program policies and practices.
4. Use: Assess the impact and set the vision for your program improvement initiative.
From any starting point, the process continues:
5. Cost out and budget the initiative.
6. Set six-month benchmarks.
7. Pilot and adapt it to fit your program (for research, adapt the system to integrate the findings).
8. Link to other initiatives.
9. Design and provide staff development resources.
10. Design and provide support and other resources.
11. Bring on the first wave of classes.
12. Measure the impact on goals and tweak.
13. Bring on subsequent waves of classes.
14. Measure the impact on goals and tweak.
15. Fully integrate the initiatives.
16. Measure impact based on goals.
17. Celebrate success!
(Handout #1)
23 Data Analysis
1. Conduct an analysis of data.
2. Engage the workgroup to review the data analysis, clarify and prioritize needs.
3. Set the vision for your program improvement initiative.
Data Analysis Process:
– Identify your key program components.
– Determine the critical success factors for each component.
– Identify and collect the data you need.
– Analyze and interpret the data to determine problems/red flags.
– Develop probing questions to isolate possible causes.
– Return to the Trident.
24 Step 1: Identify the Key Program Components
What is most important to you? For today's workshop:
– Enrollment: Are the students coming?
– Retention: Are they staying?
– Federal Performance Measures: Are they learning? Are they meeting their goals?
  – Learning Gains
  – High School Credential
  – Transition to Postsecondary/Job Training
  – Employment Goals
25 Step 2: Determining Critical Success Factors
What criteria do you use to determine if your program components are successful?
26 Step 2: Determining Critical Success Factors (HO #3)
Program Component – Sample Critical Success Factors:
– Student Enrollment: Classes are full. Enrollment reflects the community. Our cost per student reflects the state's average for similar programs.
– Student Retention: Students are staying for at least 12 hours. Students are staying until they meet their goals.
– Learning Gains: Students are completing EFLs.
– Completion of HS Credential: Students with that goal are earning a high school credential.
– Transition to Postsecondary/Job Training: Students with that goal are enrolling in postsecondary education/job training.
– Employment: Students with employment goals are employed.
27 Step 2: Determining Critical Success Factors
Any other success factors?
28 Step 3: Identifying and Collecting the Data
1. What data do you already have that will answer your question?
2. What additional existing data, if any, will you need to answer your question?
3. Where are you going to get the additional data?
29 Activity 2: Data, Data Everywhere
Small groups:
– Group 1: Enrollment
– Group 2: Learning Gains
– Group 3: High School, Postsecondary, Employment
30 Activity 2: Data, Data Everywhere
For your group's designated program component(s) and critical success factors, determine:
1. What data do you already have that will answer your question?
2. What additional existing data, if any, will you need to answer your question?
3. Where are you going to get the additional data?
31 A Look at the Census: Making Sense of the Census
Types of data available:
– Data Tables
– Data Maps
32 Census Levels
1. US
2. State
3. County
4. County Subdivisions
5. Census Tracts
6. Zip Codes
33 NRS Tables: Up Close and Personal
A look at your MAERS reports:
– What information do they contain?
– What do they tell us?
– Do you trust the data?
34 Step 4: Analyzing and Interpreting the Data – Finding the Red Flags
Determining what is good enough:
– Are there specific performance benchmarks you must meet?
– Are there other program standards you have established?
– Do you know the state average for a particular component (e.g., average cost per student, average number of instructional hours per student)?
35 Sample Red Flag Indicators (HO #4)
Red flag indicators: What is good enough? What could indicate a problem?
Student Enrollment:
– (40%) or more of classes have less than 10 student hours for each teacher hour.
– There is a difference of (__%) or more between the percentage of a particular population trait enrolled in our program and the percentage of that population trait in our community.
– Our cost per student is (125%) or more of the state cost for similar programs.
Student Retention:
– (10%) or more of our students leave before reaching 12 hours.
– Less than (60%) of our students meet their primary/secondary goals.
36 Sample Red Flag Indicators (continued)
Learning Gains (HO #7):
– One or more of our EFL performance percentages is below the state target percentages.
Completion of HS Credential:
– Our performance is below the state target percentage.
Postsecondary/Job Training:
– Our performance is below the state target percentage.
Employment:
– Our performance is below the state target percentage.
– Less than (__%) of our unemployed students have an employment goal.
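The parenthesized percentages are placeholders that each program fills in for itself. As a loose illustration of how a few of these checks could be automated, here is a minimal Python sketch; every field name, figure, and threshold in it is a hypothetical assumption made for the example, not a MAERS field or a state-mandated value.

```python
# Hypothetical red-flag screen for the indicators above.
# All field names, figures, and thresholds are illustrative
# assumptions, not actual MAERS fields or state-mandated values.

classes = [
    {"name": "ESL AM",  "student_hours": 180, "teacher_hours": 20},
    {"name": "ABE PM",  "student_hours": 85,  "teacher_hours": 20},
    {"name": "GED Eve", "student_hours": 240, "teacher_hours": 20},
]

# Red flag: (40%) or more of classes have less than 10 student hours
# for each teacher hour.
low = [c for c in classes if c["student_hours"] / c["teacher_hours"] < 10]
if len(low) / len(classes) >= 0.40:
    print("RED FLAG - under-enrolled classes:", [c["name"] for c in low])

# Red flag: our cost per student is (125%) or more of the state cost.
cost_per_student, state_avg_cost = 610.0, 450.0   # illustrative figures
if cost_per_student >= 1.25 * state_avg_cost:
    print("RED FLAG - cost per student is 125%+ of the state average")

# Red flag: (10%) or more of students leave before reaching 12 hours.
total_students, left_early = 200, 26              # illustrative figures
if left_early / total_students >= 0.10:
    print("RED FLAG - early leavers at or above the 10% threshold")
```

A screen like this, rerun each reporting period, turns the red-flag table into a repeatable checklist rather than a one-time exercise.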
37 Presenting the Data
38 Presenting the Data
39 Data Presentation Examples: Line Chart
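A line chart is often the quickest way to make a trend, such as monthly enrollment, visible at a glance. A minimal matplotlib sketch follows; the monthly figures are invented purely for the example.

```python
# Minimal line chart of monthly enrollment (illustrative numbers only).
import matplotlib.pyplot as plt

months = ["Sep", "Oct", "Nov", "Dec", "Jan", "Feb"]
enrollment = [120, 135, 128, 96, 142, 138]   # hypothetical counts

plt.plot(months, enrollment, marker="o")
plt.title("Monthly Enrollment")
plt.xlabel("Month")
plt.ylabel("Students Enrolled")
plt.show()
```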
40 Activity 3: Red Light, Green Light
For your group's program component:
– Examine the data you have. Were there additional data that would have been helpful to have? What?
– Determine if there are any red flags. Were the data consistent with the critical success factors? Were there gaps?
– Graphically display your findings on the flip chart (graph, chart, etc.).
41 Focusing the Data
"If you know why, you can figure out how…" (W. Edwards Deming)
42 Step 5: Developing Probing Questions to Isolate Possible Causes
What questions do I need to ask? Resources:
– Red Flag chart
– 50 Questions handout
– Super Duper program self-assessment
43 Developing Measurable Questions
RED FLAG: Failed to meet state performance target for Low Intermediate students.
Poor question: Does my program have good teachers?
Good question: Does student learning differ by average number of instructional hours?
Better question: What are the differences between the average number of instructional hours for low intermediate students and high intermediate students?
44 Developing Measurable Questions
RED FLAG: Failed to meet state performance target for students earning a high school credential.
Poor question: Do teachers/directors know how to set realistic student goals for high school completion?
Good question: What entry EFL produces the most high school/GED completions?
Better question: How do the entry EFLs differ among students with high school/GED completion goals who actually earn a credential?
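Both of the "better questions" above can be answered with a couple of grouped summaries. Here is a hedged sketch in pandas; the column names, codes, and values are assumptions invented for the example, not an actual MAERS export layout.

```python
# Hypothetical exploration of the two "better questions" above.
# Column names, codes, and values are illustrative assumptions only.
import pandas as pd

students = pd.DataFrame({
    "entry_efl":           ["low_int", "low_int", "high_int", "high_int", "low_int"],
    "instructional_hours": [38, 55, 90, 75, 42],
    "goal":                ["hs_credential", "employment", "hs_credential",
                            "hs_credential", "hs_credential"],
    "earned_credential":   [False, False, True, True, False],
})

# Slide 43: how do average instructional hours differ by entering EFL?
print(students.groupby("entry_efl")["instructional_hours"].mean())

# Slide 44: among students with a high school/GED completion goal, how do
# entry EFLs differ between those who did and did not earn a credential?
hs_goal = students[students["goal"] == "hs_credential"]
print(hs_goal.groupby("earned_credential")["entry_efl"].value_counts(normalize=True))
```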
45 “Let’s do unto the data before the data gets done unto you.” (Edie Holcomb)
46 Sample Probing Questions (HO #5)
Student Enrollment:
– Are there differences in class characteristics (e.g., instructional setting, types of students, location, recruitment strategies, teacher preparation) between classes that meet the standard and those that don't?
– What strategies are we using to recruit each under-enrolled target group?
– What evidence do we have that demonstrates the most effective recruitment strategies for each under-enrolled target population?
– Are there particular circumstances (e.g., full-time teachers) that are driving our cost per student above the state average?
47 Sample Probing Questions (HO #6)
Student Retention:
– Is there a difference between the student intake and orientation procedures in classes with strong student retention versus weak student retention?
– Is there a difference in student demographics (e.g., age, gender, ethnicity, goals) between classes with strong and weak student retention?
– Is there a relationship between average hours of attendance and the percentage of students completing EFLs? (See the sketch after this slide.)
– Is a thorough process being used for realistic student goal setting? What is our process?
– Are the individual learning plan, curriculum, and instruction linked to student goals?
– Are there significant barriers to attendance (e.g., childcare, transportation)?
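The attendance-versus-completion question above can be checked directly once class-level figures are in hand. A minimal sketch using only the standard library; the per-class numbers are invented for the example, and statistics.correlation requires Python 3.10 or later.

```python
# Is there a relationship between average hours of attendance and the
# percentage of students completing EFLs? (class-level, illustrative data)
import statistics

# Hypothetical per-class pairs: (average attendance hours, EFL completion rate)
classes = [(35, 0.30), (60, 0.45), (80, 0.55), (95, 0.62), (120, 0.70)]

hours = [h for h, _ in classes]
completion = [c for _, c in classes]

r = statistics.correlation(hours, completion)  # Pearson r; Python 3.10+
print(f"attendance vs. EFL completion: r = {r:.2f}")
```

A strong positive r would support focusing retention efforts on attendance hours; a weak one would send the inquiry back to the other probing questions.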
48 Sample Probing Questions
Learning Gains:
– Is there a relationship between completion of EFLs and average instructional hours?
– Can our teachers identify what is not working (e.g., assessment, placement, instruction, materials, data input, special learning needs)?
Additional questions for class/teacher-specific data:
– Is there a relationship between completion of EFLs and instructional setting (e.g., learning lab versus classroom)?
– Is there a relationship between years of teacher experience and completion of EFLs?
– Is there a relationship between teacher status (part-time versus full-time) and completion of EFLs?
49 Sample Probing Questions
High School Credential:
– What criteria are teachers/directors using in setting GED completion goals? Are they realistic for the program year?
– How accurate is the process we are using to document completion of GEDs?
– How do the entry EFLs differ among students with high school/GED completion goals who do and don't earn a credential?
– Can our teachers identify what is not working (e.g., assessment, placement, instruction, materials, data input, special learning needs)?
50 Sample Probing Questions
Transition to Postsecondary Education/Job Training:
– What criteria are being used when setting postsecondary education goals? Are they realistic for the program year?
– What transitional services are being provided to assist students with the enrollment process?
– How reliable is our student follow-up process to document enrollment in college or job training programs?
– Can our teachers identify what is not working (e.g., goal setting, transition strategies, follow-up data, curriculum matched to goal)?
51 Sample Probing Questions
Employment:
– What criteria are being used when setting employment goals? Are they realistic for the program year?
– What transitional services are being provided to assist students with obtaining and/or retaining employment?
– How reliable is our student follow-up process to document employment?
– Can our teachers identify what is not working (e.g., goal setting, transition strategies, follow-up data, curriculum matched to goal)?
– What is the relationship between the number of unemployed students and those with employment goals?
52 Activity 4: What Questions Would You Ask?
In your small groups:
– Identify some possible reasons for the scenarios in the Activity 4 handout, all of which were gleaned from various data reports.
– What questions would you need to ask to isolate the problem?
– What might you do to address the problem?
53 Self Assessment:
1. Administer a program self assessment.
2. Engage the workgroup to review the self assessment, clarify and prioritize needs.
3. Set the vision for your program improvement initiative.
4. Gather and select promising alternative strategies.
Data Analysis:
1. Conduct an analysis of data.
2. Engage the workgroup to review the data analysis, clarify and prioritize needs.
3. Set the vision for your program improvement initiative.
4. Gather and select promising alternative strategies.
Research Findings:
1. Access research findings.
2. Understand the focus, questions, and implications.
3. Judge potential contributions and compatibility with your program policies and practices.
4. Use: Assess the impact and set the vision for your program improvement initiative.
From any starting point, the process continues:
5. Cost out and budget the initiative.
6. Set six-month benchmarks.
7. Pilot and adapt it to fit your program (for research, adapt the system to integrate the findings).
8. Link to other initiatives.
9. Design and provide staff development resources.
10. Design and provide support and other resources.
11. Bring on the first wave of classes.
12. Measure the impact on goals and tweak.
13. Bring on subsequent waves of classes.
14. Measure the impact on goals and tweak.
15. Fully integrate the initiatives.
16. Measure impact based on goals.
17. Celebrate success!
54 Data Analysis
1. Conduct an analysis of data.
2. Engage the workgroup to review the data analysis, clarify and prioritize needs.
3. Set the vision for your program improvement initiative.
Data Analysis Process:
– Identify your key program components.
– Determine the critical success factors for each component.
– Identify and collect the data you need.
– Analyze and interpret the data to determine problems/red flags.
– Develop probing questions to isolate the problems.
– Return to the Trident.
55 Planning the Work / Working the Plan: Creating a Structure and Process for Data Use
1. How will you engage staff in looking at data (e.g., annually during a staff meeting, creating a data use study group)?
2. When will this occur?
3. What resources, if any, will you need to do this?
4. How will you evaluate the process and results?
56 Part I Training Objectives
By the end of today's workshop, you will be able to:
– Identify types and sources of available data for use in program improvement,
– Generate appropriate questions to ask about the data,
– Determine appropriate indicators for flagging potential problems/red flags, and
– Establish a structure and process for using data to promote continuous improvement.
57 Part I Training Objectives
By the end of today's workshop, you will be able to use a data analysis process to promote continuous improvement by:
– Developing critical success factors for key program components,
– Identifying types and sources of available data for those components,
– Determining appropriate indicators for flagging potential problems/red flags,
– Generating appropriate questions to identify possible causes of red flags, and
– Developing a structure to apply what you learned.
58 Using Data to Your Advantage (i.e., as a carrot) is up to you.
59 Quote of the Day
"People without information cannot act. People with information cannot help but act." (Ken Blanchard)
60 Focusing the Data
"If you know why, you can figure out how…" (W. Edwards Deming)
61 Presenters
Always willing to help…
– Lennox McLendon, lmclendon@naepdc.org
– Kathi Polis, kpolis@cox.net
62 Thank you very much!
This project was developed by Florida Human Resources Development, Inc. (FHRD) and the National Adult Education Professional Development Consortium, Inc., in cooperation with the Michigan Department of Labor and Economic Growth, and funded through a grant under Section 222(a)(2) State Leadership Activities of the Adult Education and Family Literacy Act, Title II of the Workforce Investment Act of 1998, as amended.
For more information visit: http://www.maepd.org