
Evaluators In the Midst of Policy Makers




1 Evaluators In the Midst of Policy Makers
Presented By George F. Grob Center for Public Program Evaluation June 2010

2 What Does It Take?
What does it take for an evaluator to succeed in the world of policy making?

3 Winning Tactics for Evaluators Who Want to Improve Policy
Hit a home run?

4 Or Get In the Game?

5 Grand Strategy
Policy Mechanisms: Master Them!
Body of Work: Contribute to It!
Reporting: Keep It Simple; Tell Everyone
Thought Leaders: Join Them!

6 On Being A Professional
Evaluator
Subject Matter Specialist
Policy Development Expert
Mostly, we will talk about the latter.

7 What is Policy? A Really Big Decision
About: Several Programs or One Huge One; A Lot of Money
Affecting: Many People; A Large Economic Sector
Spanning: Several Years; A Large Geographic Area
Involving: High-Level Officials
Attracting Attention: Trade Press; National Media

8 Policy Mechanisms
Budget
Legislation
Regulation
Goals
Themes
Public Affairs
Key Appointments
Reorganization
Management Initiatives

9 How to Exploit Them
Use all of them
Master them
Get to know the gatekeepers
Respect their frantic lifestyle
BE ON TIME!

10 Befriend the Gatekeepers
Work with your organization's offices responsible for: Budget; Congress; Regulations; Public Affairs; Management
Never work around them
Listen to their advice
They will want to accompany you to meetings; welcome them
Always inform them of what you are doing

11 Body of Work
Know It
Fill In the Gaps

12 REPORTING
Simple Message: The Mom Test (say it so your Mom can understand it)
All Kinds of Formats: Short reports; Briefing charts
Distribution: Tell Everybody; Use Today's Electronic Technology

13 Features of a Successful Evaluation
Relevant Context
Persuasive Evidence: Relevance; Strength
Compelling Findings
Recommendations Geared for Impact
A Killer Executive Summary

14 Four Types of Evidence
Physical
Documentary
Testimonial
Analytical

15 TWO OVERARCHING PRINCIPLES
Use more than one type of evidence; especially, combine quantitative and qualitative evidence
Use more than one source
Be cautious about using anecdotes

16 Thought Leaders
Congress: Members; Staff
OMB
Secretary's Office: Budget; Planning and Evaluation; Program Office
GAO
CBO
Researchers
Interest Groups
Think Tanks
Reporters

17 Be Helpful
LISTEN TO THEM!
Tell them something they don't already know
Make PRACTICAL suggestions, and lots of them
ANSWER THEIR QUESTIONS!

18 As a Program Matures, the Evaluation Questions Change
Are people really better off because of this program? In what way?
How is the implementation going? Any issues?
Are grantees in compliance with program rules?
Are the required services being delivered?
Are beneficiaries satisfied with their benefits? Are they being treated respectfully?
Are some grantees doing better than others? How so?
How do the services stack up to goals and standards?

19 And So Do the Evaluation Methods
Some need to be done fast.
All must be done carefully, but some more so than others.
Some require more resources than others.
Discuss the pros, cons, and expectations with the stakeholders.

20 Sources of Distrust and Resentment
Lack of knowledge Irrelevance Unreasonableness

21 The Opposite Traits Lead to Buy-In, Action, and Gratitude
Know what you are talking about
Work on important matters
Be reasonable in your dealings
One of the best ways to gain respect for program knowledge is to develop the work plan around the strategic areas previously discussed. In a short time, the IGO staff will be regarded as experts on the strategic topics. The importance of issues is derived from a balance of perspectives, those of management and the IGO. Reasonableness is derived from a common interest in solving problems and the display of mutual respect.

22 The Key to Collaboration
Consultation at every step:
Developing an Annual Plan
Designing an Evaluation
Early Alerts if Appropriate
Working Draft
Formal Draft
Reacting to Official Comments
Final Report
Follow-up
Consultations will occur at different but appropriate staff levels throughout the process. For example, consultations on the working draft are likely to be at a lower level than consultations on the formal draft.

23 The Evaluator’s Soul and Character
What Personal Qualities Will Facilitate Access to Policy Makers and Stakeholders and Otherwise Promote Success in the Policy Development Arena?

24 Ethics
Professional Standards
Independence
Conflicts of Interest
Full Disclosure

25 Professional Standards
American Evaluation Association: Guiding Principles for Evaluators
President's Council on Integrity and Efficiency: Quality Standards for Inspections
Government Accountability Office: Performance Audits
Joint Committee on Standards for Educational Evaluation: Program Evaluation Standards
Association of Inspectors General: Principles and Standards for Offices of Inspector General
President's Council on Integrity and Efficiency and Executive Council on Integrity and Efficiency: Quality Standards for Inspectors General (2003); Quality Standards for Inspections (2005); Quality Standards for Investigations (2005)
The Institute of Internal Auditors: International Standards for the Professional Practice of Internal Auditing; Professional Practices Framework, including Code of Ethics and Standards

26 Working With Stakeholders --A Balancing Act--
Independence, Integrity, Credibility
versus
Cooperation, Relevance, Credibility

27 The Jewel of Independence
Never Jeopardize Independence. Never!
It Is Highly Valued By Management, Especially High-Level Officials
There are some exceptions worth discussing
There are no exceptions to the rule of not jeopardizing independence. Most managers value the IGO's independence because they know that its study results are not biased or self-serving. They have difficulty getting such information and appreciate it when they can. Higher-level managers especially appreciate the ability to get help from the IGO to look into areas that could cause them problems if scandals are brewing there. However, there are some exceptions to the observation that independence is highly valued by management. In some cases, the IGO's independence is denigrated. It is in these rare cases that it is most needed.

28 Requested Work
Requested work often leads to action
While requested, it should still be done independently
After initial discussions, the work should be requested in writing by a very high official
Not all such requests can or should be honored

29 What’s Negotiable? By Mutual Agreement, the Evaluation Can Be Tailored to Stakeholders’ needs The evaluators capacity The Following are Negotiable Scope Method Schedule But Not the Results

30 The Magic Formula for Balancing Independence and Cooperation
Start with a polite reminder of independence:
“As you know, professional evaluators are required to remain independent in deciding what evaluations to perform and how to do them.”
Then express openness to other viewpoints:
“But it would sure be helpful to get your insights on this project before we start.”

31 Conflicts of Interest
The key to maintaining independence is to actually be independent
Don't fall victim to conflicts of interest: Relatives; Gift taking; Potential for financial gain; Or even the appearance of conflict
Impediments are spelled out in professional standards, especially GAO's
Those impediments made their way into the standards because evaluators have lost their independence in the past

32 Full Disclosure
There are no secrets in policy making. Everyone will know:
What you said and wrote
Who you met with
What documents you shared
The shape of your eyebrow when you did it

33 Full Disclosure
Everyone has a right to an evaluator's evidence
Evaluators owe anyone who wants them: The report; The files
Freedom of Information is the most effective quality assurance method ever devised

34 The Evaluator As Advocate
Is it allowable, feasible, practical, and defensible for an evaluator to be an advocate?
Can an evaluator trust himself or herself to preserve independence while making strong recommendations?
DISCUSSION

35 HANG IN THERE!

36 Learn More
George F. Grob
President, Center for Public Program Evaluation
38386 Millstone Drive
Purcellville, VA 20132

37 References
Mohan, R., and Sullivan, K. (eds.). Promoting the Use of Government Evaluations in Policy Making. New Directions for Evaluation, no. 112. San Francisco: Jossey-Bass, 2006. See especially “Managing the Politics of Evaluation to Achieve Impact” (pp. 7–23) by Mohan and Sullivan, and “The Evaluator's Role in Policy Development” by George Grob.
Grob, G. “How Policy Is Made and How Evaluators Can Affect It.” Evaluation Practice, 1992, 13, 175–183.
Grob, G. “A Truly Useful Bat Is One Found in the Hands of a Slugger.” American Journal of Evaluation, 2003, 24, 499–505.
Grob, G. “Writing for Impact.” In J. Wholey, H. Hatry, and K. Newcomer (eds.), Handbook of Practical Program Evaluation (2nd ed.). San Francisco: Jossey-Bass, 2004.
Henry, G. T. “Why Not Use?” In V. J. Caracelli and H. Preskill (eds.), The Expanding Scope of Evaluation Use. New Directions for Evaluation, no. 88. San Francisco: Jossey-Bass, 2000.
Henry, G. T., and Mark, M. M. “Beyond Use: Understanding Evaluation's Influence on Attitudes and Actions.” American Journal of Evaluation, 2003, 24, 293–314.
Jonas, R. K. “Against the Whim: State Legislatures' Use of Program Evaluation.” In R. K. Jonas (ed.), Legislative Evaluation: Utilization-Driven Research for Decision Makers. New Directions for Program Evaluation, no. 81. San Francisco: Jossey-Bass, 1999.
Kirkhart, K. E. “Reconceptualizing Evaluation Use: An Integrated Theory of Influence.” In V. J. Caracelli and H. Preskill (eds.), The Expanding Scope of Evaluation Use. New Directions for Evaluation, no. 88. San Francisco: Jossey-Bass, 2000.

38 AEA Guiding Principles
Systematic Inquiry
Competence
Integrity/Honesty
Respect for People
Responsibilities for General and Public Welfare

39 PCIE Standards
Competency
Independence
Quality Control
Planning
Data Collection and Analysis
Evidence
Records Maintenance
Timeliness
Fraud, Illegal Acts, Abuse
Reporting
Follow-Up
Performance Measurement
Working Relationships and Communications

40 GAO Standards for Performance Audits
Chapter 3, General Standards: Independence; Professional Judgment; Competence; Quality Control
Chapter 7, Field Work: Planning; Supervision; Evidence; Documentation
Chapter 8, Reporting: Form; Contents; Quality Elements; Report Issuance and Distribution

41 Joint Committee on Standards for Educational Evaluation
The Program Evaluation Standards of the Joint Committee on Standards for Educational Evaluation

