Slide 1: A State Staff Guide to Intervention Strategies for Low-Performing Programs
COABE State Staff Pre-Conference
Developed for adult education state staff through the NAEPDC State Staff Workgroup
National Adult Education Professional Development Consortium
Polis and McLendon, 2008
Slide 2: The Role of State Staff
When it comes to low-performing programs:
- You don't need to know all of the answers.
- EVERY state staff member has a role.
State staff need to provide:
- Clear expectations for effective program performance
- A structure and process for defining, identifying, and prioritizing low-performing programs
- A technical assistance structure and resources to promote continuous improvement
Slide 3: Training Objectives
You will:
- Examine ten state-level decision points for intervening with low-performing programs
- Determine the most viable options for your state
- Create a structure and process for identifying, prioritizing, monitoring, and assisting low-performing programs
Slide 4: What Questions Would You Ask?
Activity 1
Scenario: You have been given the assignment of designing a state intervention structure and process for low-performing programs. Make a list of some of the questions you would ask yourself as you begin that task.
Slide 5: Introduction to Decision Points
1. Have we set clear expectations for effective program practices?
2. What are our criteria for defining low-performing programs?
3. How should we prioritize low performance? What is our capacity to provide assistance?
4. Who has the expertise to provide targeted technical assistance?
5. How can we get low-performing programs to feel ownership in the program improvement process?
(Packet pg. 4)
Slide 6: Introduction to Decision Points (continued)
6. What approach or method will we use to help low-performing programs identify and prioritize needs?
7. How do we match identified needs to best practices and appropriate resources?
8. How will local programs pilot and monitor the impact of their program improvement efforts?
9. How do we monitor program improvement and measure impact at the state level?
10. What is our exit strategy?
Slide 7: DP #1: Setting Clear Expectations
- What does an effective program look like?
- Set clear expectations through program standards and indicators of program quality.
- State samples are on the NAEPDC website: http://naepdc.org/resource_library/program%20planning%20library/QSProgram_Standards.html
- A sample is in your packet (Packet pg. 11)
Slide 8: DP #2: Criteria for Low-Performing Programs
What criteria will you use?
- Failure to meet core performance measures?
- An unacceptable on-site review?
- Results of annual desk monitoring?
- Other state-developed criteria?
State samples follow.
Slide 9: DP #2: Criteria for Low-Performing Programs (Michigan)
- Meeting or exceeding the state performance targets for completion of individual Educational Functioning Levels (EFLs) as determined by standardized assessments
- Meeting or exceeding the state's overall EFL completion rate
- Helping adult learners set realistic follow-up goals related to employment, postsecondary education/job training, and GED/high school diploma
- Meeting or exceeding the state's overall attainment rate of follow-up goals
- Meeting or exceeding the state's target for pre-testing and post-testing of students to determine level completion
- Meeting or exceeding the state's student attendance hour targets
Slide 10: DP #2: Criteria for Low-Performing Programs (Michigan, continued)
Michigan uses a weighting system:
- Some measures count more than others; for example, student outcomes are worth more than some of the other measures.
- The total score determines a rating: Exemplary, Superior, Acceptable, or Not Acceptable.
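To make the weighting idea concrete, here is a minimal sketch of how a weighted score could be rolled up into a rating. The measure names, weights, and cut-off scores are hypothetical illustrations, not Michigan's actual values; a real implementation would use the state's published weights and rating bands.

```python
# Hypothetical sketch of a weighted program-rating roll-up.
# Measure names, weights, and cut-off scores are illustrative only;
# they are NOT Michigan's actual values.

MEASURE_WEIGHTS = {
    "efl_completion": 3.0,            # student outcomes weighted most heavily
    "follow_up_goal_attainment": 2.0,
    "pre_post_testing_rate": 1.0,
    "attendance_hours": 1.0,
}

RATING_CUTOFFS = [  # (minimum weighted score, rating), checked in order
    (0.90, "Exemplary"),
    (0.75, "Superior"),
    (0.60, "Acceptable"),
]

def rate_program(scores: dict[str, float]) -> str:
    """scores maps each measure to 0.0-1.0 (share of the state target met)."""
    total_weight = sum(MEASURE_WEIGHTS.values())
    weighted = sum(w * scores.get(m, 0.0)
                   for m, w in MEASURE_WEIGHTS.items()) / total_weight
    for cutoff, rating in RATING_CUTOFFS:
        if weighted >= cutoff:
            return rating
    return "Not Acceptable"

print(rate_program({"efl_completion": 0.8,
                    "follow_up_goal_attainment": 0.7,
                    "pre_post_testing_rate": 0.9,
                    "attendance_hours": 0.6}))  # prints "Superior"
```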
Slide 11: DP #2: Criteria for Low-Performing Programs (Missouri)
Missouri Critical Success Factors and Targets, by program component:

Student Enrollment
- Factor 1: Enrollment or contact hours per student increase over three years.
  Target: Upward trends in enrollment or contact hours per student.

Student Retention
- Factor 2: Students are staying at least 12 hours.
  Target: 70% of enrollees attend at least 12 hours and are pre-tested.
- Factor 3: Students are staying until they post-test.
  Target: 55% of students are post-tested.
- Factor 4: Students on average are attending with sufficient duration.
  Target: The program's average student contact hours meet or exceed the state average.

Learning Gains
- Factor 5: Students are completing EFLs.
  Target: Program meets or exceeds state benchmarks.

High School Completion, Employment, and Postsecondary Goals
- Factor 6: Students with designated NRS follow-up goals are meeting those goals.
  Target: Program meets or exceeds state benchmarks.
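Below is a small sketch of how a state office might check a program's data against the retention targets in the chart above. The function and field names are hypothetical; the 70% and 55% thresholds come directly from the chart.

```python
# Sketch of a check against the Missouri-style retention targets.
# Function and field names are hypothetical; the 70% and 55%
# thresholds come directly from the Critical Success Factors chart.

def check_retention_targets(enrollees: int,
                            attended_12h_and_pretested: int,
                            post_tested: int,
                            avg_contact_hours: float,
                            state_avg_contact_hours: float) -> list[str]:
    """Return the retention targets the program failed to meet."""
    failures = []
    if attended_12h_and_pretested / enrollees < 0.70:
        failures.append("Under 70% of enrollees attended 12+ hours and were pre-tested")
    if post_tested / enrollees < 0.55:
        failures.append("Under 55% of students were post-tested")
    if avg_contact_hours < state_avg_contact_hours:
        failures.append("Average student contact hours below the state average")
    return failures

# Example: 200 enrollees, 130 retained 12+ hours, 100 post-tested
print(check_retention_targets(200, 130, 100, 48.0, 52.5))
```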
Slide 12: DP #2: Criteria for Low-Performing Programs (continued)
- What criteria will you use?
- Additional state samples (Packet pg. 17)
- Are your data accurate and reliable enough to identify programs against these criteria?
Slide 13: DP #2: Criteria for Low-Performing Programs (Activity 2)
- Think about the most reliable data you have on local programs.
- Review the list of data elements in your packet (Packet pg. 14).
- Which of these data would be most appropriate for the initial identification of low-performing programs?
- When would you examine these data to identify low-performing programs? Who would do this?
Slide 14: DP #2: Initial Data Checks for Red Flags
- Was valid and reliable pre-testing and post-testing conducted?
- What percentage of students were actually post-tested? If the percentage is low, why? Did teachers simply not post-test, or did students not remain in the program long enough to be post-tested?
- What percentage of students exited within the first 12-20 hours of instruction?
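As a sketch, the post-testing and early-exit checks above could be automated against student-level records. The record fields and the 50% flag threshold are assumptions chosen for illustration; the 12-20 hour early-exit window comes from the slide.

```python
# Sketch of automated red-flag checks on student-level records.
# Field names and the 50% flag threshold are hypothetical; the
# 12-20 hour early-exit window comes from the slide above.

def red_flags(students: list[dict]) -> list[str]:
    """Each record: {'post_tested': bool, 'contact_hours': float}."""
    n = len(students)
    post_test_rate = sum(s["post_tested"] for s in students) / n
    early_exit_rate = sum(s["contact_hours"] <= 20 for s in students) / n

    flags = []
    if post_test_rate < 0.50:  # hypothetical threshold for "low"
        flags.append(f"Only {post_test_rate:.0%} of students were post-tested")
    # A high early-exit rate suggests a low post-test rate reflects
    # retention problems rather than teachers failing to post-test.
    flags.append(f"{early_exit_rate:.0%} of students exited within ~12-20 hours")
    return flags

print(red_flags([{"post_tested": False, "contact_hours": 10.0},
                 {"post_tested": True,  "contact_hours": 60.0},
                 {"post_tested": False, "contact_hours": 15.0}]))
```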
Slide 15: DP #2: Initial Data Checks (continued)
What level of confidence do you have in the follow-up data?
- High school/GED completion
- Entry into postsecondary education/job training
- Employment
- Job retention
How confident are you that student outcomes were accurately entered into your state's data system?
Slide 16: DP #2: Initial Data Checks (continued)
How pervasive was the program's low performance?
- Did one or two classes affect the whole program, or did multiple classes have low performance?
- Was there a sufficient number of students enrolled in a particular functioning level, or did the low number of students negatively impact performance?
Slide 17: Initial Data Checks: Who Will Conduct Them?
Who will conduct the initial data checks on the identified low-performing programs?
Slide 18: DP #3: Prioritizing Low Performance
- Are there different levels of low performance? How do you prioritize?
- NGA recommendation: Target more intensive technical assistance to identified programs that are weaker performers, have low internal accountability, or have limited capacity to improve.
- Triage recommendation: Concentrate on programs that need only moderate assistance.
Slide 19: DP #3: Prioritizing Low Performance (West Virginia Example)
- At-Risk: failure to meet at least 60% of performance measures in the prior program year
- Targeted Technical Assistance: failure to meet at least 60% of performance measures in two of the prior three years
- Low-Performing: failure to meet at least 60% of performance measures for three consecutive years
(Packet pg. 16)
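A minimal sketch of this tiering logic, assuming each program's history is a list of yes/no results for whether it met at least 60% of its performance measures in each year. The function name, input format, and the "Meeting Expectations" label for programs outside all three tiers are assumptions; the 60% threshold and the year patterns come from the slide.

```python
# Sketch of West Virginia-style tier classification.
# met_60pct[i] is True if the program met at least 60% of its
# performance measures in year i (oldest first, prior three years).
# Function name and input format are hypothetical; the 60% threshold
# and the year patterns come from the slide.

def classify_tier(met_60pct: list[bool]) -> str:
    failed = [not met for met in met_60pct[-3:]]
    if all(failed):            # three consecutive years
        return "Low-Performing"
    if sum(failed) >= 2:       # two of the prior three years
        return "Targeted Technical Assistance"
    if failed[-1]:             # the prior program year only
        return "At-Risk"
    return "Meeting Expectations"

print(classify_tier([True, True, False]))    # At-Risk
print(classify_tier([True, False, False]))   # Targeted Technical Assistance
print(classify_tier([False, False, False]))  # Low-Performing
```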
Slide 20: DP #3: Prioritizing Low Performance (continued)
- Each tier has a different level of corrective action, technical assistance, and support.
- Important consideration: What types of incentives, resources, and assistance do you have the capacity to provide to low-performing programs? The answer may influence your tiered structure.
Slide 21: DP #3: Incentives, Resources, and Support
- NGA: Sanctions produced less program improvement than increased, focused TA and support did.
- TA and support require time, staff, and energy.
- What percentage of your resources (financial and human) can you dedicate to assisting low-performing programs?
- Who makes the initial contact with the program director?
Slide 22: DP #3: Prioritizing Low Performance (Activity 3)
- Should you prioritize low-performing programs? If so, what criteria would define each level?
- Do you want to target the programs most in need, which may require significant, prolonged assistance to improve, or the low-performing programs with the greatest chance of improving with nominal assistance?
Slide 23: DP #4: Expertise for Technical Assistance
- NGA: Provide extensive on-site follow-up support from expert educators to implement research-based instructional improvement strategies.
- NGA: Invest energy in training expert educators, instructional specialists, and assistance team members to work with programs.
Slide 24: DP #4: Expertise for Technical Assistance (continued)
Designate a lead state staff person to oversee the technical assistance process. His or her role:
- Take the lead in identifying low-performing programs based on the developed criteria
- Coordinate scheduled TA visits and meetings
- Match identified needs to TA sources
- Monitor program improvement efforts
Slide 25: DP #4: Identifying Your Own Expertise
- In what areas do state staff feel confident in providing direct technical assistance?
- In what areas would you prefer to use experts in the field, such as local directors and instructors with proven track records?
- How will you identify them? How will you train them for their new role? Will they be compensated for their efforts?
Slide 26: DP #5: Ownership in Program Improvement
"People don't argue with what they help to create." (Ron Froman)
Low-performing programs must feel ownership in the program improvement process.
Slide 27: DP #5: Ownership in Program Improvement (continued)
Program improvement at the local level requires:
- Leadership
- Time
- Skills
- Will
Slide 28: DP #5: Ownership in Program Improvement (continued)
- Programs don't get repaired unless questions are raised by those who know the program best.
- Create and nurture a culture of inquiry and continuous improvement.
Slide 29: DP #5: Local Program Effectiveness Teams (PETs)
A PET is a group of people who work together to develop, lead, and coordinate the program improvement process:
- Six to eight people
- A representative group
- A coordinated effort
- Commitment to the task
Slide 30: DP #5: Program Effectiveness Team Responsibilities
- Obtain input from other staff and incorporate it into the program improvement process
- Collect data
- Meet regularly to discuss progress, draw preliminary conclusions, and reflect on what the data show
- Assist with documentation and evaluation of the process
Slide 31: DP #5: The State Staff Role
- Facilitate a local meeting with all staff to provide an overview of the program improvement process
- Outline the role that each staff member plays in program improvement
- Facilitate the first meeting of the Program Effectiveness Team
Slide 32: DP #6: Identifying and Prioritizing Needs
You need a deliberate and strategic approach for identifying and prioritizing needs. Two approaches:
- The Possible Causes, Probing Questions, and Strategies Chart
- The Program Improvement Prioritization Process
Slide 33: DP #6: The Possible Causes, Probing Questions, and Strategies Chart
- Aligns with the criteria for identifying low-performing programs
- Possible causes and probing questions help programs isolate the root causes of low program performance
(Packet pg. 26)
Slide 34: DP #6: The Program Improvement Prioritization Process
- A more global approach
- Refer to the flowchart in your packet (Packet pg. 25)
- Prioritization charts (Packet pgs. 17-22)
- Plotting charts (Packet pg. 23)
Slide 35: DP #7: Matching Identified Needs to Best Practices
- A critical need, and the most difficult step
- Invest time and resources to match needs to strategies
- Sources of best practices: NCSALL, CAELA, TESOL, LINCS, NAEPDC
- Sample Causes, Probing Questions, and Strategies chart (Packet pg. 26)
Slide 36: DP #8: Pilot Testing Local Program Improvement Efforts
Local programs need to pick their best sites to verify that the new strategy corrects the problem. If the impact is positive:
- Build the professional development needed to implement the strategy program-wide.
- Recommend policy and procedure changes to support its use throughout the program.
- Propose the financial resources needed to scale it up.
- Identify the data that will need to be collected to monitor the impact program-wide.
Then scale it up!
Slide 37: DP #9: Monitoring Program Improvement and Measuring Impact
- Make sure you collect the right data.
- Engage the Program Effectiveness Team in collecting and analyzing the data to monitor the impact program-wide.
- Report the results to the agency head and the state office.
Slide 38: DP #10: The Exit Strategy
- Positive exit: How good is good enough?
- Negative exit: De-funding
Slide 39: Review of Decision Points
(Recap of Decision Points 1 through 5; see Slide 5.)

Slide 40: Review of Decision Points (continued)
(Recap of Decision Points 6 through 10; see Slide 6.)
Slide 41: Always Willing to Help
Lennox McLendon: lmclendon@naepdc.org
Kathi Polis: klpolis@suddenlink.net