
Crossing Methodological Borders to Develop and Implement an Approach for Determining the Value of Energy Efficiency R&D Programs
Presented at the American Evaluation Association/Canadian Evaluation Society Joint Conference, Toronto, Canada, October 28, 2005
Scott Albert, GDS Associates; Helen Kim, NYSERDA; Rick Ridge, Ridge & Associates; Gretchen B. Jordan, Sandia National Laboratories

2 The NYSERDA Portfolio

3 R&D Budget Through 12/31/04

4 Objective
Develop and pilot-test an evaluation model for NYSERDA's R&D program area covering 1998 through 2004 that recognizes:
– R&D programs and their societal impacts are inherently difficult to evaluate.
– The outcomes are subject to multiple, uncontrollable influences that are difficult to foresee.
– The product development cycle runs 5 to 15 or even 20 years, so many of the energy and economic impacts of R&D projects may not be fully realized and measured for many years.
– Given the multiple and compounding effects that occur along the way, it is also very difficult to be exact about the attribution of impacts to any one particular effort.
– When evaluating an entire portfolio of R&D projects, objectives and outcomes vary by project.

5 R&D Portfolio Logic Model

6 Six Stages of the R&D Model
– Information for policy makers and R&D community
– Product development stage 1 – study and prove concepts
– Product development stage 2 – develop new or improved products
– Product development stage 3 – product testing
– Demonstration
– Pre-deployment
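A minimal sketch of how these six stages might be encoded as project-type tags for analysis, assuming a Python workflow; the enum and names below are illustrative, not NYSERDA's actual schema:

```python
from enum import Enum

class RDStage(Enum):
    """Hypothetical encoding of the six project types used to classify R&D projects."""
    POLICY_INFORMATION = 1   # information for policy makers and the R&D community
    CONCEPT_STUDY = 2        # product development stage 1: study and prove concepts
    PRODUCT_DEVELOPMENT = 3  # product development stage 2: develop new or improved products
    PRODUCT_TESTING = 4      # product development stage 3: product testing
    DEMONSTRATION = 5
    PRE_DEPLOYMENT = 6
```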

7 The Value/Cost Method Combines Two Approaches
Aggregate approach
– Analyzed data collected for each of NYSERDA's 638 R&D projects (since 1998) in the portfolio.
– Basic statistics – such as the number of projects, expenditures by technology type, leveraged funds, and stage of development – were calculated to describe the entire R&D portfolio.
Peer Review
– Analyzed using an adaptation of the Composite Performance Rating System (CPRS) used to evaluate the U.S. Department of Commerce's Advanced Technology Program (ATP).
– The peer review approach was applied to a small sample of successful R&D projects, covering each of the six R&D stages (project types).
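A minimal sketch of the kind of descriptive roll-up described for the aggregate approach, assuming a Python/pandas workflow; the table, column names (nyserda_funding, cofunding, technology, project_type), and dollar figures are illustrative, not NYSERDA's actual database:

```python
import pandas as pd

# Hypothetical portfolio table: one row per R&D project (NYSERDA's actual table has 638 rows).
projects = pd.DataFrame({
    "project_type":    ["concept study", "demonstration", "product testing"],
    "technology":      ["CHP", "wind", "buildings"],
    "nyserda_funding": [150_000, 800_000, 250_000],    # illustrative dollars
    "cofunding":       [100_000, 4_000_000, 500_000],  # illustrative partner contributions
})

portfolio_summary = {
    "number_of_projects": len(projects),
    "expenditures_by_technology": projects.groupby("technology")["nyserda_funding"].sum(),
    "projects_by_type": projects["project_type"].value_counts(),
    # Leverage ratio: partner co-funding per NYSERDA dollar across the portfolio.
    "leverage_ratio": projects["cofunding"].sum() / projects["nyserda_funding"].sum(),
}
print(portfolio_summary["leverage_ratio"])
```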

8 ATP: Composite Performance Rating System – Constructed Bottom-up, Used Top-down
[Diagram: individual project case studies (Project 1 … Project n) are each given a CPRS rating (CPRS 1 … CPRS n); these unique cases feed aggregate statistics, composite scores, performance distributions for the portfolio (by technology area, firm size, location, etc.), and minimum net portfolio benefits. Source: R. Ruegg, Nov. 2002.]
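The CPRS details are not reproduced here; as a rough illustration of the bottom-up/top-down idea, the sketch below forms a composite score for each case-studied project and then summarizes the distribution of composites across the sample (the project names, rating dimensions, scores, and weights are all assumptions):

```python
import statistics

# Hypothetical 0-4 ratings from project case studies, one dict per project.
case_study_ratings = {
    "Project 1": {"knowledge": 4, "commercialization": 3, "energy": 2},
    "Project 2": {"knowledge": 3, "commercialization": 4, "energy": 3},
    "Project 3": {"knowledge": 2, "commercialization": 2, "energy": 4},
}

def composite_score(ratings, weights=None):
    """Bottom-up step: collapse one project's ratings into a single composite (simple mean by default)."""
    weights = weights or {k: 1.0 for k in ratings}
    return sum(weights[k] * v for k, v in ratings.items()) / sum(weights.values())

# Top-down step: describe the performance distribution across the reviewed sample.
composites = {name: composite_score(r) for name, r in case_study_ratings.items()}
print(sorted(composites.values()), statistics.mean(composites.values()))
```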

9 Aggregate Analysis
Expanded and updated the R&D database in order to carry out a comprehensive descriptive analysis of the entire R&D portfolio.
Variables considered:
– Funding
– Technology Area
– Co-Funding Entity
– Project Status
– Expected Benefits from R&D Projects

10 Questions Addressed by Aggregate Analysis
– How does NYSERDA funding per project vary by project type?
– How does NYSERDA funding per project vary by program?
– What is the frequency of the various project types?
– What goals are being served by the various project types?
– What are the primary goals served by the portfolio?
– What are the sources of funding, by project type?
– What is the funding share contributed by partners?
– How do NYSERDA funding and co-funding vary by project type over time?
– How does the mix of technologies and issues examined change over time?
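Continuing the hypothetical `projects` table from the earlier sketch, two of these questions – funding per project by project type and the funding share contributed by partners – could be answered along these lines (again illustrative, not NYSERDA's actual analysis):

```python
# Average NYSERDA funding per project, by project type.
funding_per_project = projects.groupby("project_type")["nyserda_funding"].mean()

# Funding share contributed by co-funding partners, by project type.
totals = projects.groupby("project_type")[["nyserda_funding", "cofunding"]].sum()
partner_share = totals["cofunding"] / (totals["cofunding"] + totals["nyserda_funding"])
```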

11 Results: Aggregate Analysis

12 NYSERDA Funding, by Project Type

13 Funding by Goals

14 Co-Funding Sources

15 Percent of Projects by Technology and Year

16 Peer Review Focused on Six Success Stories as a Pilot Test

17 Indicator Variables
The choice of indicator variables for the R&D portfolio was guided by the R&D portfolio logic model. Six categories of outcomes identified in the logic model were selected:
– Knowledge creation
– Knowledge dissemination
– Commercialization progress
– Energy benefits
– Economic benefits
– Environmental benefits

18 Accomplishment Packets
Project-specific accomplishment packets were then developed to document objective evidence regarding the six outcomes:
– Knowledge creation
– Knowledge dissemination
– Commercialization progress
– Realized and potential energy benefits
– Realized and potential economic benefits
– Realized and potential environmental benefits
– Value versus cost (not a specific outcome, but this item was also included in the peer-reviewer response packet for a 0-to-4 rating)
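As an illustration only (the actual packets and assessment forms are NYSERDA documents not reproduced here), one reviewer's response for one project might be captured in a structure like the following, with each item rated on the 0-to-4 scale:

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class ReviewerResponse:
    """One peer reviewer's 0-4 ratings for one project (hypothetical structure)."""
    project_id: str
    reviewer_id: str
    ratings: Dict[str, Optional[int]] = field(default_factory=lambda: {
        "knowledge_creation": None,
        "knowledge_dissemination": None,
        "commercialization_progress": None,
        "energy_benefits": None,         # realized and potential
        "economic_benefits": None,       # realized and potential
        "environmental_benefits": None,  # realized and potential
        "value_versus_cost": None,       # not an outcome, but rated in the response packet
    })
```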

19 Review Process
Reviewers willing to participate were sent:
– Peer Review Instructions,
– a Conflict of Interest Form,
– a Peer Review Assessment Form, and
– the Peer Review Information Packet for their specific project.
Over a period of five weeks, the reviewers completed their assessments and returned them for data entry.

20 Results: Peer Review

21 Weighted Rating By Project
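The slides do not spell out the weighting scheme behind these ratings; one plausible reading is a weighted average of the 0-to-4 outcome ratings for each project, sketched below with equal weights purely for illustration:

```python
def weighted_rating(outcome_ratings, weights):
    """Weighted average of a project's 0-4 outcome ratings (hypothetical weighting scheme)."""
    total_weight = sum(weights[o] for o in outcome_ratings)
    return sum(weights[o] * r for o, r in outcome_ratings.items()) / total_weight

outcomes = ["knowledge_creation", "knowledge_dissemination", "commercialization_progress",
            "energy_benefits", "economic_benefits", "environmental_benefits"]
example = weighted_rating(
    dict(zip(outcomes, [4, 3, 3, 2, 3, 4])),     # made-up ratings for one project
    weights={o: 1.0 for o in outcomes},          # equal weights, for illustration
)
print(example)
```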

22 Overall Ratings by Outcome

23 Overall Ratings, by Project, by Outcomes

24 Conclusions: Aggregate Analysis
The NYSERDA R&D portfolio:
– Assumes more risk than the commercial sector in the earlier stages of technology development, while in the later stages the reverse is true.
– Covers a wide range of technologies aimed at achieving potentially significant energy, economic, and environmental benefits.
– Leverages funds at a ratio of 4.3 to 1.
– Partners with a wide range of public and private organizations and institutions.
– Evolves over time in response to societal needs and opportunities to address them (i.e., the technologies and issues addressed in the R&D portfolio are not static).

25 Conclusions: Peer Review
– Peer review scores from the pilot test averaged 3.34 (on a 0-to-4 scale) across all assessment categories.
– There are substantial benefits across all documented accomplishment areas for the five projects assessed.
– Significant progress is being made toward the eventual achievement of measurable 3-E (energy, economic, and environmental) benefits.

26 Conclusions: Peer Review Process
– The information provided in the review packets for the five selected projects was adequate.
– The instructions provided were clear.
– The criteria used in the assessments were clearly defined.
– The criteria used in the assessments were the right ones.
– It is very important for NYSERDA to assess the value of its R&D programs.
– The results of the peer review process should be useful for NYSERDA decision-makers.
– Reviewers can assess a fair amount of information if the information is presented in a clear and organized format.
– Statistical analyses revealed that the ratings provided by the peer reviewers were reliable.
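The slides do not say which reliability statistic was used; one common choice for this kind of check is Cronbach's alpha computed across reviewers, sketched below for illustration only (the data are made up):

```python
import numpy as np

def cronbach_alpha(ratings):
    """Cronbach's alpha for a cases-by-raters matrix of scores (illustrative reliability check)."""
    ratings = np.asarray(ratings, dtype=float)
    n_raters = ratings.shape[1]
    item_variances = ratings.var(axis=0, ddof=1)       # variance of each rater's scores
    total_variance = ratings.sum(axis=1).var(ddof=1)   # variance of the summed scores
    return (n_raters / (n_raters - 1)) * (1 - item_variances.sum() / total_variance)

# Example: 5 rated criteria (rows) scored by 3 reviewers (columns) on the 0-4 scale.
print(cronbach_alpha([[3, 4, 3], [4, 4, 4], [2, 3, 3], [3, 3, 4], [4, 4, 3]]))
```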

27 Next Steps
– Routinize the collection of key indicator data for all R&D projects.
– Perform aggregate analysis on all projects.
– Focus significant effort on a more representative sample of projects.