
Dr. Julia H. Bryan
College of Public Affairs, University of Baltimore
Doctor of Public Administration (2013 Graduate)
October 19, 2013

» There has been an increase in the practice of collecting performance measurement information, but use of this information is less well understood
» There has been minimal research on performance measurement use at the federal level
» It remains unclear whether performance measurement information is used for decision-making and management practices
» This represents a gap in the literature on ways to improve and enhance performance measurement use in federal programs

» Led to three overarching research questions:
(1) What factors influence the use of performance measurement information, and how?
(2) Under what circumstances is performance measurement information being used?
(3) Who is using this information?

» Has the potential to positively influence decision-making and management practices
» Helps to inform policy, research, and practice
- Better equips federal managers and staff to use performance measurement information in a way that best meets their program needs

» Builds upon a performance measurement utilization model
» Several uses to consider in the decision-making process
+ Instrumental
+ Non-instrumental
» Factors influencing the use of performance measurement information
+ Rational/Technocratic Factors
+ Political/Cultural Factors
+ Organizational Complexity

H1: Use of performance measurement information consists of both instrumental and non-instrumental use, and each type of use is influenced by different factors.
H2: Organizations with increased complexity are more likely to use performance measurement information.
H3: Rational/technocratic factors are positively related with performance measurement use.
H4: Political/cultural factors are positively related with performance measurement use.
H4-a: Organizations that involve stakeholders in the performance measurement process are more likely to use performance measurement information.
H4-b: Organizations with high innovative and reward practices are more likely to use performance measurement information.

» Comparative case study
» Mixed methods approach
» Why two program offices?
+ Diversity in size, culture, and structure
+ Both have performance measurement systems

» 28-item cross-sectional survey
» Piloted in January 2012
» Population: 326 managers and staff
» Administered both web-based and paper-based between March 1 and May 1
» Survey introduction: "Thank you for your participation in this survey. The purpose of this survey is to examine how and why federal programs use performance measurement information at the program level (not individual grantee or personnel performance level). This study is important to your program office and the field of public administration because it will provide a better understanding of factors that influence the use of performance measurement information. Your knowledge and expertise can provide valuable insight on this topic and your participation is greatly appreciated."

» Dependent Variables
˃ Instrumental use: accountability, improvement
˃ Non-instrumental use: understanding, mobilization
» Independent Variables
˃ Organizational Complexity: levels of involvement, formulation, and specialization
˃ Rational/Technocratic: budget, staffing, training, and information
˃ Stakeholder Involvement: developing performance measurement, supporting change, and holding the office accountable
˃ Organizational Culture: innovative practices, rewarding managers/staff
» Control Variables
˃ Program office, position, length of time in Agency and program office

Descriptive statistics (N = 114):
- Managers = 29%; Staff = 70%
- Length of time in Office = 85.5%
- Overall program office distinctions: (1) non-instrumental use for understanding; (2) rational/technocratic factors; (3) organizational culture

Factor analysis:
- Dependent variables (3 constructs): Performance Measurement Use; Non-instrumental Use; Instrumental Use
- Independent variables (4 constructs): Organizational Complexity; Organizational Culture; Rational/Technocratic; Stakeholder Involvement
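The slides do not show how these constructs were estimated. As a minimal sketch only, assuming Likert-scale survey items with hypothetical column names (q1…q8) and a hypothetical input file, an exploratory factor analysis reducing items to four independent-variable constructs could look like the following; the dissertation's actual items and estimation method are not given here:

```python
# Sketch: deriving latent constructs from survey items via factor analysis.
# Column names (q1..q8), the file name, and the 4-factor solution are
# hypothetical placeholders, not the dissertation's actual analysis.
import pandas as pd
from sklearn.decomposition import FactorAnalysis

# Load survey responses: one row per respondent, one column per Likert item.
df = pd.read_csv("survey_responses.csv")  # hypothetical file
items = ["q1", "q2", "q3", "q4", "q5", "q6", "q7", "q8"]

fa = FactorAnalysis(n_components=4, random_state=0)  # 4 IV constructs
scores = fa.fit_transform(df[items])                 # factor scores per respondent

# Loadings show which items cluster on which construct.
loadings = pd.DataFrame(fa.components_.T, index=items,
                        columns=["complexity", "culture", "rational", "stakeholder"])
print(loadings.round(2))
```

In practice this step would typically be accompanied by a rotation and reliability checks (e.g., Cronbach's alpha) before the factor scores are carried into the regressions.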

Unstandardized B Coefficients

Independent Variables      | Performance Measurement Use | Non-instrumental Use | Instrumental Use
Organizational Complexity  | .259*                       | –                    | **
Organizational Culture     | .207*                       | .227*                | .055
Rational/Technocratic      | .587***                     | .412**               | .419**
Stakeholder Involvement    | –                           | –                    | –
Program Office             | –                           | –                    | –
Management                 | –                           | –                    | –
Time in Program Office     | –                           | –                    | –
Time in Agency             | –                           | –                    | –
R-squared                  | .476                        | .245                 | .341
Significance               | .000                        | .0001                | .000
N                          | 114                         | 114                  | 114

p<.05*, p<.01**, p<.001***
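For readers unfamiliar with tables of unstandardized B coefficients, here is a minimal sketch of how models of this shape are typically estimated, one OLS regression per dependent variable. The file name, variable names, and exact specification are assumptions for illustration, not the dissertation's actual code:

```python
# Sketch: OLS regressions producing unstandardized B coefficients, one model
# per dependent variable, as in the table above. All names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("analysis_dataset.csv")  # hypothetical file of construct scores

rhs = ("org_complexity + org_culture + rational_technocratic + "
       "stakeholder_involvement + C(program_office) + C(management) + "
       "time_in_program_office + time_in_agency")

for dv in ["pm_use", "non_instrumental_use", "instrumental_use"]:
    model = smf.ols(f"{dv} ~ {rhs}", data=df).fit()
    print(dv, model.params.round(3).to_dict())  # unstandardized B coefficients
    print("R-squared:", round(model.rsquared, 3), "N:", int(model.nobs))
```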

Interviews (N = 20):
- Length: – minutes
- Conducted 9/2012 – 1/2013
- General themes: program improvement; role of leadership; lack of resources; role of stakeholders; structure of program office
- Program-specific themes:
  - Program Office 1: stakeholders' role in technical assistance; timely/accessible data; policies/procedures
  - Program Office 2: need for understanding; challenges with developing performance measures; stakeholders' role in performance measurement development

Document Review:
- Data collection and review: 3/2012 to 2/2013
- Documents reviewed included annual performance reports, Congressional budget justifications, funding announcements, and annual program reports
- Themes: use and/or intended use of performance measurement information for accountability, program improvement, and understanding

Interview quotes:

"There are standard operational procedures for most of the work performed by POs [project officers]. While many of these documents are 'dense', they provide a great foundation in order to execute tasks from start to finish."

"Making performance information more readily accessible…"

"It's a relatively new system and … still identifying ways to help grantees actually utilize the performance data information versus just reporting it because it is a requirement."

» Instrumental use of performance measurement information was indicated by both program offices
» Rational/technocratic factors play a key role in performance measurement use
» Instrumental and non-instrumental use were influenced by both similar and different factors
» Stakeholders play a role in developing performance measures as well as providing technical assistance to grantee organizations
» The structure of the program office appears to affect the use of performance measurement information

» Communicating the value and importance of performance measurement information
» Continuously assessing performance measurement resources
» Cultivating and rewarding innovative practices for performance measurement use
» Broadening the interpretation of performance measurement use to include both instrumental and non-instrumental use
» Expanding research to include both managers and staff and to assess use at the federal program level

» Low response rate
» Self-reported data
» Response bias
» Organizational types

» Contributed to a limited body of knowledge on the use of performance measurement information at the federal level
» Added a unique perspective for examining the use of performance measurement information
» Provided a greater understanding that may help determine ways to enhance the linkage between performance measurement use and management practices in federal agencies

Thank you!
Julia Bryan, DPA, MPH, CHES