
1 CSREES Reporting Web Conference April 14, 2008

2 E-mail questions to rwc@csrees.usda.gov
- User Support: (202) 690-2910 or C2IT@csrees.usda.gov
- Do not contact Texas A&M support
- FAQs and other information are on the CSREES Reporting Web Conference web page at www.csrees.usda.gov/rwc

3 Format and Logistics
- E-mail questions to rwc@csrees.usda.gov
- Also e-mail topic suggestions to rwc@csrees.usda.gov
- Opportunity to vote for topics for the next conference
- Conferences are recorded and will be available on the Reporting Web Conference web page at www.csrees.usda.gov/rwc

4 Deb Hamernik
Deb is the CSREES National Program Leader for Animal Physiology and NRI Program Director for Bovine Genome Sequencing and Porcine Genome Sequencing. She has represented USDA on the NSTC-COS Research Business Models Subcommittee to develop the Research Performance Progress Report (RPPR) since 2004.
(202) 401-4202, dhamernik@csrees.usda.gov, www.csrees.usda.gov/onesolution

5 Standard Progress Report

6 Standard Reporting Across the Federal Government
- Implementation of the Federal Financial Assistance Management Improvement Act of 1999 (Public Law 106-107)
- Facilitates information collection in lieu of numerous agency-specific forms
- Does not change reporting requirements in OMB Circulars A-102 and A-110
- Provides a standard format for collecting the information
- Draft Research Performance Progress Report (RPPR) available at www.nsf.gov/bfa/dias/policy/rppr/draftformat.pdf

7 Standard Reporting Across the Federal Government
- Agencies will use the standard categories & instructions developed for each category
- Agencies may provide additional program-specific instructions to clarify program requirements
- Agencies may develop additional agency- or program-specific reporting categories & instructions

8 One Solution: CRIS Transition Standard Report
The revised CRIS AD 416 sections include:
- Goals/Objectives/Expected Outputs
- Methods
- Non-Technical Summary
The revised CRIS AD 421 sections include:
- Outputs
- Outcomes/Impacts
- Publications
- Participants
- Target Audiences
- Project Modifications

9 Outputs
- Activities, events, services, or products that reach people
- Examples: conferences, field days, videos, curricula, patent applications, germplasm, genetic maps, students graduated, etc.
- Do not report publications in this category

10 Outputs: Dissemination
- Dissemination refers to outreach activities to reach intended audiences to advance knowledge, encourage positive actions, or change conditions.
- If educational materials and resources were distributed, describe the distribution method and the intended audience.

11 Publications
- Publications are outputs. For technical reasons, the CRIS system collects publications in a separate box.
- Include paper or electronic publications
- Include the status of each publication (e.g., pending, accepted, in press)

12 Outcomes/Impacts
- Changes in knowledge, actions, or conditions
- Results of basic research projects should be described as a change in knowledge (rather than experimental/technical details)
- Results of extension activities should be described as a change in actions or conditions

13 Participants
- Provide information about individuals who worked on the project: their role and how they participated in the project
- If applicable, describe partner organizations, collaborators, and contacts. Include collaborators outside the U.S.
- Describe opportunities for training or professional development (trainees, K-12 teachers, producers, farmers, staff, volunteers, etc.)

14 Target Audiences
- Provide information on target audiences for efforts designed to cause a change in knowledge, actions, or conditions.
- Include individuals, groups, and communities served by the project
- Include delivery of science-based knowledge to people through formal or informal educational programs

15 Project Modifications
- Describe major changes in approach and the reason for the change
- Examples: changes in Assurance Statements (animals, humans, or biohazards); major problems or delays that have a significant impact on the rate of expenditures

16 Tips
- Do NOT re-enter the objectives and methods (already entered on the AD 416)
- Do NOT copy and paste abstracts for scientific meetings into the Standard Report
- Use general terms for a lay audience
- More information is not necessarily better information

17 Standard Progress Report: For More Information
Deb Hamernik, (202) 401-4202, dhamernik@csrees.usda.gov

18 Questions?
E-mail questions to rwc@csrees.usda.gov
For more information, visit the One Solution web page at www.csrees.usda.gov/onesolution

19 Djimé Adoum
Djimé assists the Director, Office of Planning and Accountability, in developing monitoring and evaluation systems to analyze program activities funded by CSREES and implemented by our Land Grant System partners, and provides leadership in strategic planning and the CSREES Portfolio Review process.
(202) 720-4564, dadoum@csrees.usda.gov, www.csrees.usda.gov/opa

20 Practical, Realistic Approaches to Measuring Impacts of Basic Research

21 Outline of the Presentation
- Reasons for Measuring Impact of Basic Science
- Difficulties with Measuring Impact of Basic Science
- Metrics and Efforts to Date
- R&D Criteria as Starting Point
- Experience from CSREES PREP
- Use of the Logic Model
- Summary and Conclusions

22 Why Measure the Impact of Basic Research?
- Pressure due to limited resources
- Problems have become extremely complex and require multi-disciplinary collaboration
- To secure public buy-in
- To demonstrate public value

23 Stating the Obvious
- Measuring the impact of basic research is difficult
- Attempts have been made to identify ways to measure it
- A few approaches have been determined to be of value

24 A Few Suggested Indicators
Inputs/Investments in S&T:
- Expenditures
- Comparison of expenditures
- Expenditures related to a product line
- Source of funding
Outputs:
- Publications
- Patents
- Training of scientific and technical people
- Special presentations and honors
- Development of new testing methodologies
- Intellectual challenges
Outcomes (measuring things of value to society, to the economy, and to the nation):
- Vaccines
- New plant variety
- Sustainable socio-economic and environmental development
- Less dependency on foreign oil

25 Metric Defined
Definition of a metric (Geisler 2000): a system of measurement that includes three elements:
- the item being measured
- the unit of measurement
- the value of the unit
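A minimal sketch, assuming a Python representation, of how these three elements could be captured as a simple record; the Metric class name and the example item, unit, and value are hypothetical illustrations, not part of Geisler's definition.

```python
from dataclasses import dataclass

@dataclass
class Metric:
    """Geisler's (2000) three elements of a metric."""
    item: str     # the item being measured
    unit: str     # the unit of measurement
    value: float  # the value of the unit

# Hypothetical example: one bibliometric-style indicator for a project.
pubs = Metric(item="peer-reviewed publications", unit="count per project year", value=12)
print(pubs)
```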

26 A Few Suggested Metrics: Bibliometric Analysis
Bibliometric analysis refers to measures of scientific and technical published outputs from science and its disciplines. It measures both quantity and quality (Geisler 2000).

27 A Few Suggested Metrics: Economic Analysis
Economic analysis is a process that correlates financial measures for both investments/expenditures and outputs (Geisler 2000). It is extremely difficult to predict the outcomes of R&D.

28 A Few Suggested Metrics: Peer Review
Peer review is a process by which a selective jury of experts in a given scientific field is asked to evaluate the undertaking of a scientific activity or its outcomes (e.g., research, projects, or scientific publications) (Geisler 2000).

29 Efforts to Date
A few models are selected to highlight evaluation of publicly funded research:
- The United Kingdom's Research Assessment Exercise (RAE)
- The Japanese model
- Australia's Relative Funding Model (RFM)
- The United States: GPRA and PART (NIH, NSF, ARS and CSREES, NAS)

30 Evaluation Based on the R&D Criteria
- Relevance: R&D investments must have clear plans, must be relevant to national priorities, agency missions, relevant fields, and "customer" needs, and must justify their claim on taxpayer resources.
- Quality: Programs should maximize the quality of the R&D they fund through the use of a clearly stated, defensible method for awarding a significant majority of their funding.
- Performance: R&D programs should maintain a set of high-priority, multi-year R&D objectives with annual performance outputs and milestones that show how one or more outcomes will be reached.
To be used as broad guidelines applicable to all Federally funded R&D efforts.

31 CSREES Experience: Portfolio Review Expert Panel (PREP) Process
R&D Criteria and Dimensions:
- Relevance: Scope; Focus; Contemporary and/or Emerging Issues; Solicitation of and/or Receptiveness to Stakeholder Input; Utilization of Stakeholder Input
- Quality: Significance of Results; Usefulness and Utilization of Results; Integration; Interdisciplinary Balance; Alignment with Current State of Science
- Performance: Productivity; Comprehensiveness of Work Produced; Accountability; Management
To be used as broad guidelines applicable to all Federally funded R&D efforts.

32 The Logic Model: What Is It?
- A roadmap, a conceptual framework, a program theory, a program theory of action (Weiss, 1998; Patton, 1997; Bickman, 1987)
- A concise way to show how a program is designed and will make a difference (Harvard Family Research Project)
- The core of program planning, evaluation, program management, and communications (NAS, Kellogg Foundation, and UNW)

33 Use of the Logic Model
The logic model is used to:
- Set the context within which research takes place
- Consider the concept of public value
- Provide a conceptual roadmap
- Ascertain the extent to which outputs led to new knowledge, applications, and solutions, and whether they are reasonably consistent with expenditures
- Provide a framework for evaluation

34 Generic Logic Model for CSREES Reporting (Version 1.2)
CSREES Office of Planning & Accountability. (This model is intended to be an illustrative guide for reporting on CSREES-funded research, education, and extension activities. It is not a comprehensive inventory of our programs.)
Situation (description of challenge or opportunity):
- Farmers face increasing challenges from globalization
- Opportunity to improve animal health through genetic engineering
- Insufficient number of trained & diverse professionals entering agricultural fields
- Youth at risk
- Invasive species is becoming an increasing problem
- Bioterrorism
- Obesity crisis
- Impaired water quality
Inputs (what we invest):
- Faculty, staff, and students
- Infrastructure
- Federal, state, and private funds
- Time
- Knowledge
- The collection of stakeholder opinions
Activities (what we do):
- Design and conduct research
- Publish scientific articles
- Develop research methods and procedures
- Teach students
- Conduct non-formal education
- Provide counseling
- Develop products, curriculum & resources
Participation (who we reach):
- Other scientists
- Extension faculty, teaching faculty, and students
- Federal, state & private funders
- Scientific journal, industry & popular magazine editors
- Agencies
- Policy and decision-makers
- Agricultural, environmental, life & human science industries
- Public
Outputs:
- New fundamental or applied knowledge
- Scientific publications
- Patents
- New methods & technology
- Plant & animal varieties
- Practical knowledge for policy and decision-makers
- Information, skills & technology for individuals, communities, and programs
- Participants reached
- Students graduated in agricultural sciences
Outcomes, Knowledge (occurs when there is a change in knowledge or the participants actually learn):
- New fundamental or applied knowledge
- Improved skills
- How technology is applied
- About new plant & animal varieties
- Increased knowledge of decision-making, life skills, and positive life choices among youth & adults
- Policy knowledge
- New improved methods
Outcomes, Actions (occur when there is a change in behavior or the participants act upon what they have learned):
- Apply improved fundamental or applied knowledge
- Adopt new improved skills
- Directly apply information from publications
- Adopt and use new methods or improved technology
- Use new plant & animal varieties
- Increased skill by youth & adults in making informed life choices
- Actively apply practical policy and decision-making knowledge
Outcomes, Conditions (occur when a societal condition is improved due to a participant's action taken in the previous column), for example, specific contributions to:
- Increased market opportunities overseas and greater economic competitiveness
- Better and less expensive animal health
- Vibrant & competitive agricultural workforce
- Higher productivity in food provision
- Better quality of life for youth & adults in rural communities
- Safer food supply
- Reduced obesity and improved nutrition & health
- Higher water quality and a cleaner environment
External Factors: a brief discussion of the variables that have an effect on the portfolio, program, or project but which cannot be changed by its managers. For example, a plant breeding program's success may depend on the variability of the weather, etc.
Assumptions: the premises, based on theory, research, evaluation knowledge, etc., that support the relationships of the elements shown above and upon which the success of the portfolio, program, or project rests. For example, finding animal gene markers for particular diseases will lead to better animal therapies.

35 Logic Model as a Planning and an Evaluation Tool
A worked example:
- Situation: a problem of local and national interest, for example, reduce the nation's dependency on foreign oil
- Inputs: expenditures or investments ($$$$); staff and partners
- Activities: genetically engineer a bacterium to convert cellulosic material into ethanol
- Outputs: one bacterium engineered; patents; new applications
- Outcomes (Knowledge): advanced knowledge; high-efficiency conversion
- Outcomes (Actions): cost savings; improved quality
- Outcomes (Conditions): reduced dependency on foreign oil
- Assumptions and external factors are noted alongside the model
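A minimal sketch, assuming a Python representation, of how the logic model columns (situation, inputs, activities, outputs, and the knowledge/actions/conditions outcomes) could be captured as a data structure. The class and field names are hypothetical, and the example assumptions and external factors are illustrative placeholders filled in around the foreign-oil example above.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Outcomes:
    knowledge: List[str] = field(default_factory=list)   # changes in knowledge
    actions: List[str] = field(default_factory=list)     # changes in behavior
    conditions: List[str] = field(default_factory=list)  # changes in societal conditions

@dataclass
class LogicModel:
    situation: str        # the challenge or opportunity
    inputs: List[str]     # what we invest
    activities: List[str] # what we do
    outputs: List[str]    # what is produced
    outcomes: Outcomes    # knowledge, then actions, then conditions
    assumptions: List[str] = field(default_factory=list)
    external_factors: List[str] = field(default_factory=list)

# Hypothetical entries echoing the cellulosic-ethanol example above.
example = LogicModel(
    situation="Reduce the nation's dependency on foreign oil",
    inputs=["Expenditures or investments", "Staff and partners"],
    activities=["Genetically engineer a bacterium to convert cellulosic material into ethanol"],
    outputs=["One bacterium engineered", "Patents", "New applications"],
    outcomes=Outcomes(
        knowledge=["Advanced knowledge", "High-efficiency conversion"],
        actions=["Cost savings", "Improved quality"],
        conditions=["Reduced dependency on foreign oil"],
    ),
    assumptions=["Cellulosic conversion can be made cost-competitive"],  # illustrative only
    external_factors=["Crude oil prices", "Weather variability affecting feedstock supply"],  # illustrative only
)
print(example.outcomes.conditions)
```

Keeping outcomes as three separate lists mirrors the knowledge, actions, conditions progression used in the generic model.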

36 Summary and Conclusions
- It is fully recognized that measuring the impact of basic science is difficult (OMB and NAS)
- Impact might not be realized until long after studies are completed
- The debate is not over yet, but limited public resources have led to scrutiny of the return on investments and the need to document the effectiveness and efficiency of investments
- The use of the logic model as a planning and evaluation tool has gained some ground
- The CSREES PREP process has been quite helpful

37 Questions?
E-mail questions to rwc@csrees.usda.gov
For more information, visit the Planning and Accountability web page at www.csrees.usda.gov/opa

38 Topics for Next Time …

39 See You in June!
- The next CSREES Reporting Web Conference will be on Thursday, June 12, from 2-4 pm (Eastern)
- Visit the conference web site at www.csrees.usda.gov/rwc for:
  - The recording of this conference
  - The slides from this conference
  - Recordings and slides from past conferences
  - Announcements

