Performance Measurement for Comprehensive Cancer Control Programs: An Update
RTI International
November 18, 2008
Presentation outline
- Background
- Results of the pilot year of Performance Measures
- Next steps
Background
Background
- Since 1998, the Division of Cancer Prevention and Control (DCPC) has provided funding to states, tribes and tribal organizations, territories, and jurisdictions to conduct comprehensive cancer control (CCC) in their areas
- In 2003, DCPC initiated a process of developing an evaluation framework and performance measures for all CCC programs
- Beginning in 2008, CDC is asking programs to submit information on performance measures
Initial statement of purpose of performance measurement
- To establish accountability for public investments in cancer prevention and control
- To document the outcomes and accomplishments of funded programs and partners
- To facilitate a quality improvement process at the national program level
Purposes of performance measures as the project developed
- General information on CCC nationally
  - Ex: the number of sectors, organizations, and individuals meeting around the CCC table
  - Ex: funds and in-kind resources other than from CDC
- General knowledge of the program to help in technical assistance
  - Ex: sharing of satisfaction surveys
  - Ex: proportion of the plan that is truly being implemented
  - Ex: proportion of interventions that are truly "evidence-based"
Further purposes of performance measures for CDC
- Monitoring of the most important activities
  - Ex: reviewing data at specified intervals
  - Ex: coalition representing "all" of your people
- Documenting processes and, eventually, outcomes of CCC programs to demonstrate accountability for CCC funding
Scope of performance measure domains
- Infrastructure for CCC
- Strength of partnerships
- Data used to monitor burden
- Implementation of a plan that
  - has evidence-based interventions
  - is evaluated systematically
- Policy changes effected
- Population-based measures achieved
Results of the Pilot Year
Timeline
- DCPC/RTI workgroup began meeting in June 2007
- Year 1 Performance Measures worksheet was sent out in December 2007 and due back to program consultants by February 15, 2008
- Data cleaning, coding, entry, and analysis occurred between February 2008 and May 2008
Methods (1)
- Prior work by CDC-Macro and a workgroup to develop a list of measures
- The CDC-RTI workgroup developed the Performance Measures Worksheet (PMW) over a 5-month period in 2007
- CDC distributed the PMW to programs in January 2008
- Programs were required to provide the data, but use of the worksheet itself was optional
- 61 of a possible 69 programs responded
Methods (2)
- Data entry of quantitative data into Access
- Extensive cleaning of data
- Data entry into Word of two types: qualitative data pertaining to each performance measure, plus measure-specific and general comments about the worksheet
- A few of the measures required extensive review of the qualitative data (e.g., 7.1, policies enacted)
- Analysis of the measure and worksheet comments
Preliminary results and issues raised: Year 01 Performance Measures
Sample and Response Rate
- RTI received performance measures from 88% of the program population: N=61 (out of 69)
  - States: N=47 (out of 51)
  - Tribes: N=6 (out of 7)
  - Territories/jurisdictions: N=8 (out of 11)
- The response rate for individual Performance Measure items ranged from 72% to 100% (n=44-61)
  - Performance Measure 5.3, evidence-based interventions, had the lowest response rate (n=44)
Interpretation of Results
- The following section provides an overview of each performance measure.
- The results displayed represent an average of the proportions across all programs.
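This "average of the proportions" can be sketched in a few lines of Python. The program names and scores below are hypothetical illustrations, not pilot data:

```python
# Sketch: how an "average of the proportions across all programs" is computed.
# Each program first gets its own percentage; the report then averages those
# percentages as an unweighted mean, rather than pooling numerators and
# denominators across programs. All values here are hypothetical.
program_scores = {
    "Program A": 0.67,  # e.g. 23 of 34 organization types represented
    "Program B": 0.15,
    "Program C": 0.97,
}

average = sum(program_scores.values()) / len(program_scores)
print(f"Average program score: {average:.0%}")  # -> Average program score: 60%
```

An unweighted mean treats each program equally regardless of the size of its denominator, which matches how the per-measure results in this deck are described.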
1.1 The percent of organization types represented in the CCC partnership
How it is measured:
- Denominator: programs were provided with a maximum of 34 different organization types*
- Numerator: programs reported how many of these organization types are members of the partnership
- The score for a program is the % of organization types represented in the partnership
* Note: Programs may strike out any sectors that are not relevant to their areas, creating a "shifting denominator"
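A minimal sketch of this scoring rule, assuming a hypothetical program; the counts are invented for illustration:

```python
# Sketch of the measure 1.1 score with a "shifting denominator": sectors a
# program strikes out as not relevant are removed from the denominator
# before the percentage is computed. Counts below are hypothetical.
ALL_ORG_TYPES = 34  # size of the full checklist provided to programs

def partnership_score(n_checked: int, n_struck_out: int) -> float:
    """Percent of relevant organization types represented in the partnership."""
    denominator = ALL_ORG_TYPES - n_struck_out
    return 100 * n_checked / denominator

# A program that struck out 4 irrelevant sectors and checked off 23:
print(round(partnership_score(23, 4)))  # 23/30 -> 77
```

The same numerator (23 organization types) would score only 68% against the full checklist of 34, which is why the striking-out instructions matter for comparability across programs.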
1.1 Results
- The average program reports that 67% of the organization types are represented in the partnership (n=61)
- On average, programs checked off 23 of a possible 34 (67%) organization types on the worksheet as participating in their partnership
- Range: 15%-97%
* Note: The numbers are the averages of all the programs' percentages
Comments
The next version will:
- Clarify instructions regarding striking out organizations
- Include suggestions from programs for additional organizations
1.2 The percent of minority population groups represented in the partnership
How it is measured:
- Denominator: programs indicate which major racial/ethnic minority groups are prevalent in their state/area (>2%)*
- Numerator: programs then indicate whether a member organization in the partnership represents each group
- The score for a program is the % of population groups represented in the partnership
* Note: The denominator is determined by the number of racial/ethnic populations checked off, creating a "shifting denominator"
1.2 Minority Population Results
- The average program reports that 86% of racial/ethnic population groups are represented in the partnership (n=60)
- Range: 0%-100%
* Note: The numbers are the averages of all the programs' percentages
1.2 The percent of urban/rural populations represented in the partnership
How it is measured:
- Denominator: programs indicate whether urban/rural populations are present in the state/area
- Numerator: programs then indicate whether a member organization represents the urban/rural population
- The score for a program is the % of urban/rural populations represented in the partnership
* Note: The denominator is determined by the number of urban/rural populations checked off, creating a "shifting denominator"
1.2 Urban/Rural Results
- The average program reports that 91% of urban or rural regions are represented in the partnership (n=58)
- Range: 0%-100%
* Note: The numbers are the averages of all the programs' percentages
1.2 The percent of regions represented in the partnership
How it is measured:
- Denominator: programs indicate the number of regions in the state/area (maximum of 5)*
- Numerator: programs indicate whether a member organization in the partnership represents each of these regions
- The program score is the % of regions represented in the partnership
* Note: The denominator is based on the number of regions indicated by the program, creating a "shifting denominator"
1.2 Region Results
- The average program reports that 98% of geographic regions (maximum of 5) are represented in the partnership (n=56)
- Range: 0%-100%
* Note: The numbers are the averages of all the programs' percentages
1.2 Region Results (2)
- Programs were also asked the total number of regions in their area (in order to refine the measure for the future)
- Mean = 5
- Range = 0-15
Comments
The structure of this question will change. Programs will be asked:
- How many regions do you have?
- How many of these regions have a member organization located in that region?
1.3 Assessment of coalition members' satisfaction
How it is measured: the program reports whether it has measured member satisfaction with the partnership in the last 2 years
1.3 Results
- 67% (41 of 61 programs) stated that they assessed members' satisfaction with various aspects of the partnership within the past two years (n=61)
2.1 Burden is assessed
How it is measured: programs report whether cancer burden was assessed using 4 prescribed data sources in the last 5-year grant cycle
* Note: Programs were able to strike out data sources not available in their areas, and thus were able to answer "yes" to this measure even if not all 4 data sources were used
2.1 Results
- 97% (59 of 61 programs) stated that they assessed the burden of cancer in their jurisdiction in the last 5-year grant cycle (n=61)
Comments
The next version will:
- Clarify instructions regarding striking out data sources
- Clarify the time frame for the most recent assessment
3.1 Amount of non-CDC funds
How it was assessed: programs reported the amount of all non-CDC funds in the current cooperative agreement
3.1 Results
- The median value reported by all programs that received non-CDC funds in the current cooperative agreement period was $119,315 (n=61)
- Range = $0-$28,068,000
- Interquartile range (IQR) = $8,669-$659,570
  - The IQR gives the 25th- and 75th-percentile values
  - It is used to report spread when the data are skewed
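The median and IQR statistics used throughout the funding results can be reproduced with Python's standard statistics module. The funding amounts below are hypothetical, not the pilot data:

```python
# Sketch: median and interquartile range (IQR) for skewed funding data.
# The IQR (25th to 75th percentile) describes spread without letting a few
# very large awards dominate, which is why it is reported here instead of
# a mean and standard deviation. Amounts below are hypothetical.
import statistics

funds = [0, 8_000, 50_000, 120_000, 650_000, 28_000_000]

median = statistics.median(funds)
q1, _, q3 = statistics.quantiles(funds, n=4)  # quartile cut points

print(f"median = ${median:,.0f}")
print(f"IQR    = ${q1:,.0f} to ${q3:,.0f}")
```

Note how the single $28 million outlier barely moves the median or IQR, while it would dominate a mean: that robustness is the rationale the slide gives for this choice of summary statistics.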
3.1 Figure 1: distribution of non-CDC funds across programs (histogram not reproduced here)
* Note: The final interval represents funding > $1.5 million
3.1 Programs with greater than $1.5 million in non-CDC funding

Program   Non-CDC funds
A         $1,799,603
B         $1,913,800
C         $2,180,000
D         $6,000,000
E         $7,120,000
F         $16,200,000
G         $16,700,000
H         $21,705,750
I         $28,068,000
3.1 Results
The most common agencies cited for providing non-DCPC funds to the state/tribe/territory/jurisdiction CCC programs are:
- State
- American Cancer Society
- Foundations
- C Change
- Private companies
- Universities
3.1 State funding received by programs
- The median value reported by all programs that received state funds in the current cooperative agreement period was $556,150 (n=26)
- Range = $0-$28,000,000
- Interquartile range (IQR) = $206,590-$1,632,750
  - The IQR gives the 25th- and 75th-percentile values
  - It is used to report spread when the data are skewed
Comments
The next version will:
- Clarify the timeframe for these funds
- Offer the most frequently cited categories as options for the program to fill in
3.2 Amount of in-kind resources
How it was calculated: programs reported the amount of in-kind resources for the current cooperative agreement
3.2 Results
- The median value reported by all programs that received in-kind resources in the current cooperative agreement period was $29,000 (n=61)
- Range = $0-$11,800,000
- Interquartile range (IQR) = $11,866-$83,785
  - The IQR gives the 25th- and 75th-percentile values
  - It is used to report spread when the data are skewed
3.2 Figure 2: distribution of in-kind resources across programs (histogram not reproduced here)
* Note: The final interval represents funding > $200,000
3.2 Programs with in-kind funding > $200,000

Program   In-kind amount
A         $208,454
B         $255,945
C         $276,185
D         $317,771
E         $1,211,204
F         $11,311,000
G         $11,800,000
Comments
The next version will:
- Clarify the timeframe for these funds
- Use standardized budget categories for data collection
5.1 Proportion of partner organizations implementing interventions
How it was calculated:
- Denominator: number of partners involved in the coalition
- Numerator: number of partners implementing at least one intervention focused on the priorities of the plan
- The program score is the % of partners implementing at least one intervention
5.1 Results
- The average program reports that 60% of its partners are implementing interventions (n=56)
- Range = 17%-100%
- Also, the range of partner organizations in the partnership (the denominator) = 18-1,678
* Note: The numbers are the averages of all the programs' percentages
Comments
- The next version will clarify the definition of partners
5.2 Proportion of plan objectives being addressed
How it was calculated:
- Denominator: programs reported the total number of objectives in the plan
- Numerator: programs reported the number of objectives currently being addressed
- The program score is the % of objectives currently being addressed
5.2 Results
- The average program reports that 64% of the plan objectives are being addressed (n=58)
- Range = 8%-100%
- Also, the range of objectives in the plan (the denominator) is 4-349
* Note: The numbers are the averages of all the programs' percentages
Comments
- The next version will refine the definition of objectives
5.3 Interventions are evidence-based
How it was measured:
- Denominator: programs reported the number of interventions in the plan that are being implemented
- Numerator: programs reported the number of interventions in the plan that are being implemented and are evidence-based
- The program score is the percentage of interventions being implemented that are evidence-based
5.3 Results
- The average program reports that 60% of interventions being implemented are evidence-based (n=44)
- Range = 0%-100%
- Also, the range of the number of interventions (the denominator) was 5-100
* Note: The numbers are the averages of all the programs' percentages
Comments
- There was a high rate of non-response for this question
6.1 Program has a written evaluation plan
How it was measured: programs reported whether they had any written evaluation plan
6.1 Results
- 66% (40 of 61 programs) have an evaluation plan (n=61)
Comments
- Results are based on a broad definition of "evaluation plan." This definition is currently being reviewed by the CDC-RTI workgroup.
7.1 Number of state- and local-level policy changes
How it was measured:
- Programs listed any state- or local-level policy changes achieved with a contribution from the CCC partnership
- Policies were then categorized as "enacted," "pending," or "does not qualify as policy"
7.1 Results
- The total number of policies submitted by all programs is 297
  - Policies enacted = 240
  - Policies pending = 20
  - Submitted but determined not to be policies = 37
- 75% (46 of 61 programs) listed at least one enacted policy (n=50)
Comments
- Due to the nature of qualitative data, results of this measure will require continued analysis
- The next version will provide categories for policies
Time to complete worksheet
- The median time for a program to complete the worksheet was 10 hours (n=54)
- Range = 2-100 hours
- Interquartile range = 6-20 hours
  - The IQR gives the 25th- and 75th-percentile values
Conclusions from the pilot
- Successful pilot process
  - CDC gained both important data on programs and critical feedback to improve the measures and the process
- Changes needed for future performance measure worksheets
- Revisions to the methods for year 2
Next steps
Next Steps – Presenting Year 01 Results
- RTI and CDC will develop a manuscript for publication in a peer-reviewed journal.
Next Steps – Refining Year 02 Measures
- Exhaustive review by the workgroup:
  - Face-to-face meeting with the program consultants and CDC staff
  - Detailed review of all the comments provided on the worksheet
- A draft of the Performance Measures worksheet has been developed
  - No change in the list of measures, but many clarifications and definitions
Year 02 Data Collection
- Year 2 Performance Measures worksheets will be completed by programs in September 2009
- Year 02 Performance Measures will encompass only the 2008-2009 fiscal year
- Worksheets will be incorporated into the electronic reporting system under development
Next Steps – Year 03
- The development of performance measures is ongoing
- RTI worked with CDC to draft the Year 3 Performance Measures in September 2008
- Official notice of any changes will come with the guidance for Interim Progress Reports sent from PGO
Final Comments
- Performance measurement is an important tool for public health agencies to establish accountability and a culture of quality improvement.
- With further refinement of the measures and evolution of a system of routine measurement, this activity has the potential to stimulate improvements in CCC at the program level and to document nationwide progress in CCC efforts.