CSREES Reporting Web Conference, April 14, 2008. Questions to User Support: (202) 690-2910.


CSREES Reporting Web Conference April 14, 2008

Questions to User Support: (202) 690-2910. Do not contact Texas A&M support. FAQs and other information are on the CSREES Reporting Web Conference web page.

Format and logistics questions, as well as topic suggestions, go to User Support. Participants have the opportunity to vote on topics for the next conference. Conferences are recorded and will be available on the Reporting Web Conference web page.

Deb Hamernik. Deb is the CSREES National Program Leader for Animal Physiology and NRI Program Director for Bovine Genome Sequencing and Porcine Genome Sequencing. She has represented USDA on the NSTC-COS Research Business Models Subcommittee, which developed the Research Performance Progress Report (RPPR).

Standard Progress Report

Standard Reporting Across the Federal Government
- Implements the Federal Financial Assistance Management Improvement Act of 1999 (Public Law 106-107)
- Facilitates information collection in lieu of numerous agency-specific forms
- Does not change reporting requirements in OMB Circulars A-102 and A-110; provides a standard format for collecting the information
- A draft Research Performance Progress Report (RPPR) is available online

Standard Reporting Across the Federal Government
- Agencies will use the standard categories and instructions developed for each category
- Agencies may provide additional program-specific instructions to clarify program requirements
- Agencies may develop additional agency- or program-specific reporting categories and instructions

One Solution: CRIS Transition Standard Report
The revised CRIS AD 416 sections include:
- Goals/Objectives/Expected Outputs
- Methods
- Non-Technical Summary
The revised CRIS AD 421 sections include:
- Outputs
- Outcomes/Impacts
- Publications
- Participants
- Target Audiences
- Project Modifications

Outputs
Activities, events, services, or products that reach people. Examples: conferences, field days, videos, curricula, patent applications, germplasm, genetic maps, students graduated, etc. Do not report publications in this category.

Outputs: Dissemination
Dissemination refers to outreach activities that reach intended audiences to advance knowledge, encourage positive actions, or change conditions. If educational materials and resources were distributed, describe the distribution method and the intended audience.

Publications
Publications are outputs; for technical reasons, the CRIS system collects publications in a separate box. Include paper or electronic publications, and include the status of each publication (e.g., pending, accepted, in press).

Outcomes/Impacts
Changes in knowledge, actions, or conditions. Results of basic research projects should be described as a change in knowledge (rather than experimental/technical details). Results of extension activities should be described as a change in actions or conditions.

Participants
Provide information about the individuals who worked on the project: their roles and how they participated. If applicable, describe partner organizations, collaborators, and contacts, including collaborators outside the U.S. Describe opportunities for training or professional development (trainees, K-12 teachers, producers, farmers, staff, volunteers, etc.).

Target Audiences
Provide information on the target audiences of efforts designed to cause a change in knowledge, actions, or conditions. Include the individuals, groups, and communities served by the project, and the delivery of science-based knowledge to people through formal or informal educational programs.

Project Modifications
Describe major changes in approach and the reasons for the change. Examples: changes in Assurance Statements (animals, humans, or biohazards); major problems or delays that have a significant impact on the rate of expenditures.

Tips
- Do NOT re-enter the objectives and methods (already entered on the AD 416)
- Do NOT copy and paste abstracts for scientific meetings into the Standard Report
- Use general terms for a lay audience
- More information is not necessarily better information

Standard Progress Report: For More Information
Contact Deb Hamernik.

Questions? For more information, visit the One Solution web page.

Djimé Adoum. Djimé assists the Director, Office of Planning and Accountability, in developing monitoring and evaluation systems to analyze program activities funded by CSREES and implemented by our Land Grant System partners, and provides leadership in strategic planning and the CSREES Portfolio Review process.

Practical, Realistic Approaches to Measuring Impacts of Basic Research

Outline of the Presentation
- Reasons for Measuring the Impact of Basic Science
- Difficulties with Measuring the Impact of Basic Science
- Metrics and Efforts to Date
- R&D Criteria as a Starting Point
- Experience from CSREES PREP
- Use of the Logic Model
- Summary and Conclusions

Why Measure the Impact of Basic Research?
- Pressure due to limited resources
- Problems have become extremely complex and require multi-disciplinary collaboration
- To secure public buy-in
- To demonstrate public value

Stating the Obvious
- Measuring the impact of basic research is difficult
- Attempts have been made to identify ways to measure it
- A few approaches have been found to be of value

A Few Suggested Indicators

Inputs/Investments in S&T:
- Expenditures
- Comparison of expenditures
- Expenditures related to a product line
- Source of funding

Outputs:
- Publications
- Patents
- Training of scientific and technical people
- Special presentations and honors
- Development of new testing methodologies
- Intellectual challenges

Outcomes (things of value to the economy, the nation, and society):
- Vaccines
- New plant varieties
- Sustainable socio-economic and environmental development
- Less dependency on foreign oil

Metric Defined
Definition of a metric (Geisler 2000): a system of measurement that includes three elements:
- the item being measured
- the unit of measurement
- the value of the unit
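Geisler's three-element definition can be made concrete with a small sketch. The `Metric` class and the example values below are hypothetical illustrations, not part of any CSREES or CRIS system.

```python
from dataclasses import dataclass

# A minimal sketch of Geisler's (2000) three-element metric:
# the item being measured, the unit of measurement, and the value.
# Class and field names here are illustrative only.
@dataclass(frozen=True)
class Metric:
    item: str    # what is measured, e.g. "peer-reviewed publications"
    unit: str    # unit of measurement, e.g. "articles per year"
    value: float # the value of the unit

pubs = Metric(item="peer-reviewed publications",
              unit="articles per year",
              value=12)
print(f"{pubs.item}: {pubs.value} {pubs.unit}")
```

Keeping the three elements separate makes indicators comparable across programs: two metrics can only be meaningfully compared when both the item and the unit match.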

A Few Suggested Metrics Bibliometric Analysis refers to measures of scientific and technical published outputs from science and its disciplines. It measures both quantity and quality (Geisler 2000).

A Few Suggested Metrics Economic Analysis: a process that correlates financial measures for both investments/expenditures and outputs (source: Geisler 2000). It is extremely difficult to predict the outcomes of R&D.

A Few Suggested Metrics Peer Review: A process by which a selective jury of experts in a given scientific field is asked to evaluate the undertaking of scientific activity or its outcomes (e.g., research, projects, or scientific publications.) Source: Geisler 2000

Efforts to Date
A few models are selected to highlight evaluation of publicly funded research:
- The United Kingdom's Research Assessment Exercise (RAE)
- The Japanese model
- Australia's Relative Funding Model (RFM)
- The United States: GPRA and PART (NIH, NSF, ARS, CSREES, NAS)

Evaluation Based on the R&D Criteria
- Relevance: R&D investments must have clear plans, must be relevant to national priorities, agency missions, relevant fields, and "customer" needs, and must justify their claim on taxpayer resources.
- Quality: Programs should maximize the quality of the R&D they fund through the use of a clearly stated, defensible method for awarding a significant majority of their funding.
- Performance: R&D programs should maintain a set of high-priority, multi-year R&D objectives with annual performance outputs and milestones that show how one or more outcomes will be reached.
These criteria are to be used as broad guidelines applicable to all Federally funded R&D efforts.

CSREES Experience: Portfolio Review Expert Panel Process
R&D criteria and dimensions:
- Relevance: scope; focus; contemporary and/or emerging issues; solicitation of and/or receptiveness to stakeholder input; utilization of stakeholder input
- Quality: significance of results; usefulness and utilization of results; integration; interdisciplinary balance; alignment with the current state of science
- Performance: productivity; comprehensiveness of work produced; accountability; management
These criteria are to be used as broad guidelines applicable to all Federally funded R&D efforts.

The Logic Model: What is it?
- A roadmap, a conceptual framework, a program theory, a program theory of action (Weiss, 1998; Patton, 1997; Bickman, 1987)
- A concise way to show how a program is designed and will make a difference (Harvard Family Research Project)
- The core of program planning, evaluation, program management, and communications (NAS, Kellogg Foundation, and UNW)

Use of the logic model to:
- Set the context within which research takes place
- Consider the concept of public value
- Provide a conceptual roadmap
- Ascertain the extent to which outputs led to new knowledge, applications, and solutions, and are reasonably consistent with expenditures
- Provide a framework for evaluation

Generic Logic Model for CSREES Reporting (CSREES, Office of Planning & Accountability). (This model is intended to be an illustrative guide for reporting on CSREES-funded research, education, and extension activities. It is not a comprehensive inventory of our programs.)

Situation (description of the challenge or opportunity):
- Farmers face increasing challenges from globalization
- Opportunity to improve animal health through genetic engineering
- Insufficient number of trained and diverse professionals entering agricultural fields
- Youth at risk
- Invasive species are becoming an increasing problem
- Bioterrorism
- Obesity crisis
- Impaired water quality

Inputs (what we invest):
- Faculty, staff, and students
- Infrastructure
- Federal, state, and private funds
- Time
- Knowledge
- The collection of stakeholder opinions

Activities (what we do):
- Design and conduct research
- Publish scientific articles
- Develop research methods and procedures
- Teach students
- Conduct non-formal education
- Provide counseling
- Develop products, curricula, and resources

Participation (who we reach):
- Other scientists, extension faculty, teaching faculty, and students
- Federal, state, and private funders
- Scientific journal, industry, and popular magazine editors
- Agencies
- Policy and decision-makers
- Agricultural, environmental, life, and human science industries
- The public

Outputs:
- New fundamental or applied knowledge
- Scientific publications
- Patents
- New methods and technology
- Plant and animal varieties
- Practical knowledge for policy and decision-makers
- Information, skills, and technology for individuals, communities, and programs
- Participants reached
- Students graduated in the agricultural sciences

Outcomes: Knowledge. Occurs when there is a change in knowledge or the participants actually learn:
- New fundamental or applied knowledge
- Improved skills
- How technology is applied
- About new plant and animal varieties
- Increased knowledge of decision-making, life skills, and positive life choices among youth and adults
- Policy knowledge
- New and improved methods

Outcomes: Actions. Occur when there is a change in behavior or the participants act upon what they have learned:
- Apply improved fundamental or applied knowledge
- Adopt new and improved skills
- Directly apply information from publications
- Adopt and use new methods or improved technology
- Use new plant and animal varieties
- Increased skill by youth and adults in making informed life choices
- Actively apply practical policy and decision-making knowledge

Outcomes: Conditions. Occur when a societal condition is improved due to a participant's action taken in the previous column. For example, specific contributions to:
- Increased market opportunities overseas and greater economic competitiveness
- Better and less expensive animal health
- A vibrant and competitive agricultural workforce
- Higher productivity in food provision
- Better quality of life for youth and adults in rural communities
- A safer food supply
- Reduced obesity and improved nutrition and health
- Higher water quality and a cleaner environment

External factors: a brief discussion of the variables that affect the portfolio, program, or project but cannot be changed by its managers. For example, a plant breeding program's success may depend on the variability of the weather.

Assumptions: the premises, based on theory, research, evaluation knowledge, etc., that support the relationships among the elements shown above, and upon which the success of the portfolio, program, or project rests. For example, finding animal gene markers for particular diseases will lead to better animal therapies.

Logic model as a planning and an evaluation tool: a worked example.
- Situation: a problem of local and national interest, e.g., reduce the nation's dependency on foreign oil
- Inputs: expenditures/investments ($$$$); staff and partners
- Activities: genetically engineer a bacterium to convert cellulosic material into ethanol
- Outputs: one bacterium engineered; patents; new applications
- Outcomes (knowledge): advanced knowledge; high-efficiency conversion
- Outcomes (actions): cost savings; improved quality
- Outcomes (conditions): reduced dependency on foreign oil
- Plus the assumptions and external factors underlying the model
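The column structure of the ethanol example above can be sketched as a small data structure, which makes the planning use of the model concrete: before a plan is funded, every column should be filled in. This is only an illustrative sketch; the class and field names are invented here and are not part of any CSREES system.

```python
from dataclasses import dataclass

# Illustrative sketch only: a logic model as a plain data structure,
# following the Situation -> Inputs -> Activities -> Outputs -> Outcomes
# (knowledge, actions, conditions) columns described above.
@dataclass
class LogicModel:
    situation: str
    inputs: list
    activities: list
    outputs: list
    knowledge: list   # short-term outcomes
    actions: list     # medium-term outcomes
    conditions: list  # long-term outcomes

ethanol = LogicModel(
    situation="Reduce the nation's dependency on foreign oil",
    inputs=["Expenditures/investments", "Staff and partners"],
    activities=["Genetically engineer a bacterium to convert "
                "cellulosic material into ethanol"],
    outputs=["One bacterium engineered", "Patents", "New applications"],
    knowledge=["Advanced knowledge", "High-efficiency conversion"],
    actions=["Cost savings", "Improved quality"],
    conditions=["Reduced dependency on foreign oil"],
)

# Planning check: every column should be non-empty before the plan is funded.
complete = all([ethanol.inputs, ethanol.activities, ethanol.outputs,
                ethanol.knowledge, ethanol.actions, ethanol.conditions])
print("Model complete:", complete)
```

An evaluator could run the same check in reverse at reporting time: an empty conditions list would signal that long-term impact has not yet been documented.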

Summary and Conclusions
- It is fully recognized that measuring the impact of basic science is difficult (OMB and NAS)
- Impact might not be realized until long after studies are completed
- The debate is not over yet, but limited public resources have led to scrutiny of return on investment and the need to document the effectiveness and efficiency of investments
- The use of the logic model as a planning and evaluation tool has gained some ground
- The CSREES PREP process has been quite helpful

Questions? For more information, visit the Planning and Accountability web page.

Topics for Next Time …

See you in June! The next CSREES Reporting Web Conference will be on Thursday, June 12, from 2-4 pm (Eastern). Visit the conference web site for:
- The recording of this conference
- The slides from this conference
- Recordings and slides from past conferences
- Announcements