
Slide 1: Impact Requirements Analysis Team (SAN DIEGO SUPERCOMPUTER CENTER)
Co-Chairs: Mark Sheddon (SDSC), Ann Zimmerman (University of Michigan)
Members: John Cobb (ORNL), Dave Hart (SDSC), Lex Lane (NCSA), Scott Lathrop (UC/ANL), Sergiu Sanielevici (PSC), Kevin Walsh (SDSC)
GDO Supervisor: Dane Skow (ANL, Deputy Director)
Final Report in the Forum: RATS: Science Impact and Metrics
Contact: mi-rat@teragrid.org

Slide 2: Impact Requirements Analysis Team
Purpose: Investigate and recommend measures to assess the short-, mid-, and long-term effects of the TG's capabilities and resources on scientific discovery.

Slide 3: Guiding Questions
- What impact has the TeraGrid had on the practice of scientific research?
- What impact has the TeraGrid had on the production of scientific knowledge?

Slide 4: Guiding Principles
- Strike a balance between the usefulness of the potential approaches, the effort required from TG users and reviewers to provide data, and the number and type of TG personnel necessary to collect, manage, and analyze data.
- Consider all aspects of TG, including people and both compute and non-compute resources.
- Consider concerns related to data privacy and confidentiality.

Slide 5: Summary of (Short- to Medium-Term) Impact RAT Recommendations
#1: Modify the Partnerships Online Proposal System (POPS) to make it more useful and mineable
#2: Create a "nugget" database
#3: Instrument compute and non-compute resources for usage data collection
#4: Categorize publications
#5: Look deeper into the user community
#6: Continue conducting an annual user survey to gain direct feedback
#7: Learn from others

Slide 6: #1: Modify POPS to make it more useful and mineable
- Standardize the collection of existing and new data gathered from users. Examples: store the PI name as "last name, first name"; select the funding agency from a stored list; specify a structure for proposal attachments.
- Improve the qualitative impact information requested from users during the renewal process. Examples: Why is the computational part hard? How did TeraGrid help you accomplish this part?
- Consider requesting standardized impact-related information from reviewers. Examples: (a) type of research (e.g., incremental; high-risk, high-payoff); (b) a numeric rating of impact quality.
(A sketch of the kind of input normalization involved follows this list.)
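To make the first bullet concrete, here is a minimal Python sketch of input normalization that a proposal system could apply at submission time. The function names, the name format, and the agency list are illustrative assumptions, not the actual POPS code or schema.

```python
# Illustrative normalization helpers; not the actual POPS schema.
FUNDING_AGENCIES = {"NSF", "NIH", "DOE", "DOD", "NASA"}  # hypothetical stored list

def normalize_pi_name(raw: str) -> str:
    """Store PI names uniformly as 'Last, First'."""
    raw = raw.strip()
    if "," in raw:  # already 'Last, First'
        last, first = (p.strip() for p in raw.split(",", 1))
    else:           # 'First [Middle] Last'
        parts = raw.split()
        first, last = " ".join(parts[:-1]), parts[-1]
    return f"{last}, {first}"

def validate_agency(agency: str) -> str:
    """Accept only agencies from the controlled vocabulary."""
    agency = agency.strip().upper()
    if agency not in FUNDING_AGENCIES:
        raise ValueError(f"unknown funding agency: {agency!r}")
    return agency

print(normalize_pi_name("Ann Zimmerman"))  # -> 'Zimmerman, Ann'
print(validate_agency("nsf"))              # -> 'NSF'
```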

Slide 7: #2: Create a "nugget" database
- Our current collection method is ad hoc, and NSF would like us to improve on our nugget submittals (NSF maintains its own nugget collection).
- Many people could contribute, and entries would not have to be limited to science successes (e.g., "practice" successes would also qualify).
- Components of a good nugget (Guy Almes): Why is this science important? Why is the computational/cyberinfrastructure part hard? What did the TeraGrid do to help accomplish the computational/cyberinfrastructure part?
(A possible schema sketch follows this list.)
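As a starting point, a minimal sketch of what a nugget table might look like, using SQLite from Python. The table and column names simply mirror the three questions above; this is a hypothetical design, not an agreed TeraGrid schema.

```python
import sqlite3

# Hypothetical nugget schema; columns mirror the three "good nugget" questions.
conn = sqlite3.connect("nuggets.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS nugget (
        id            INTEGER PRIMARY KEY,
        title         TEXT NOT NULL,
        submitter     TEXT NOT NULL,
        nugget_type   TEXT CHECK (nugget_type IN ('science', 'practice')),
        why_important TEXT,  -- why is this science important?
        why_hard      TEXT,  -- why is the computational/CI part hard?
        how_tg_helped TEXT,  -- what did TeraGrid do to help?
        submitted_on  DATE DEFAULT CURRENT_DATE
    )
""")
conn.commit()
```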

Slide 8: #3: Instrument compute and non-compute resources for usage data collection
- Focus particularly on resources related to the "grid" part of TG: cross-site runs, grid middleware, global file systems, and data collections.
- We already have a lot of SU-related (service unit) data for compute resources; the gap is on the non-compute side. (An example usage record follows.)
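One way to instrument non-compute resources is to have each of them emit a uniform usage event. The record below is an illustration: the resource categories come from this slide, but the field names and types are our assumptions.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class UsageEvent:
    """Hypothetical uniform usage record for non-compute resources."""
    resource_type: str   # e.g. 'cross-site-run', 'grid-middleware',
                         # 'global-file-system', 'data-collection'
    resource_name: str
    user_id: str
    site: str
    quantity: float      # bytes moved, files opened, jobs routed, ...
    unit: str
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

event = UsageEvent("global-file-system", "wan-gpfs", "user42",
                   "SDSC", 2.5e9, "bytes")
print(event)
```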

Slide 9: #4: Categorize publications
- Recommend additional analysis of the POPS publication list.
- Categorize citations according to journal (as applicable), discipline, and "ranking," and record the POPS proposal number associated with each publication.
- This gives greater detail on publication impact by showing the quality of the journal, and including the POPS proposal number provides a means to tie publications to the TG resources and capabilities used and to reviewer input. (A join sketch follows.)
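A small sketch of how recording the POPS proposal number lets a publication inherit its proposal's resource usage and reviewer input. All identifiers and records here are invented for illustration.

```python
# Toy records; field names and values are hypothetical.
publications = [
    {"doi": "10.1000/example1", "journal": "J. Comp. Sci.",
     "discipline": "Chemistry", "pops_id": "MCA001"},
]
proposals = {
    "MCA001": {"pi": "Doe, Jane", "resources": ["DataStar"],
               "reviewer_impact_rating": 4},
}

for pub in publications:
    prop = proposals.get(pub["pops_id"])
    if prop:  # publication inherits the proposal's resources and review data
        print(pub["doi"], "->", prop["resources"],
              "| reviewer impact rating:", prop["reviewer_impact_rating"])
```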

Slide 10: #5: Look deeper into the user community
- Improve the usage database so that it is possible to examine trends among "non-standard" users, such as the social sciences and Minority Serving Institutions.
- For all users, track: institution and type of institution (e.g., 2-year, 4-year, MSI); type of user (e.g., race, gender, and status); and history of allocations received.
- Over time, these data would help discern whether education, outreach, and training programs are having an impact; how usage changes over time; and whether users continue to use TG (helpful for understanding why users "leave"). (A toy trend computation follows this list.)
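Once the usage database carries these fields, the trend questions reduce to simple aggregations. Below is a toy computation over invented records, showing allocations per institution type per year and a simple check for users who "leave"; all field names and values are hypothetical.

```python
from collections import Counter

# Invented allocation records, purely to show the computation.
allocations = [
    {"user": "a", "inst_type": "MSI",    "year": 2005},
    {"user": "b", "inst_type": "4-year", "year": 2005},
    {"user": "a", "inst_type": "MSI",    "year": 2006},
]

# Allocations per (year, institution type): are EOT programs moving the numbers?
trend = Counter((r["year"], r["inst_type"]) for r in allocations)
for (year, inst_type), n in sorted(trend.items()):
    print(year, inst_type, n)

# Retention: users with a 2005 allocation but none in 2006 ("leavers").
by_year = {y: {r["user"] for r in allocations if r["year"] == y}
           for y in (2005, 2006)}
print("left after 2005:", by_year[2005] - by_year[2006])
```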

Slide 11: #6: Continue conducting an annual user survey to gain direct feedback
- A brief, focused survey minimizes the burden on users, and coordinating random samples among different surveys reduces the chance that the same users will be solicited more than once. TeraGrid should follow these and other guidelines to improve the reliability and validity of the surveys.
- In 2006, TG is doing this by participating in a survey conducted by the University of Michigan evaluation team.
- Smaller surveys directed toward particular audiences or topics should also be considered; for example, pre- and post-surveys of researchers who benefit from ASTA support could be very informative. (A sampling-coordination sketch follows.)
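Coordinating samples can be as simple as sharing one "already solicited" set across all surveys in a cycle. A minimal sketch, with made-up user IDs and sample sizes:

```python
import random

users = [f"user{i:03d}" for i in range(500)]  # made-up user pool
already_solicited = set()

def draw_sample(pool, k, exclude):
    """Sample k users not yet contacted this survey cycle."""
    eligible = [u for u in pool if u not in exclude]
    sample = random.sample(eligible, k)
    exclude.update(sample)
    return sample

annual = draw_sample(users, 100, already_solicited)  # annual survey
asta   = draw_sample(users, 30,  already_solicited)  # ASTA pre/post survey
assert not set(annual) & set(asta)  # no user is solicited twice
```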

Slide 12: #7: Learn from others
- We should share what we've learned and monitor what others are doing.
- Share this report with a broad range of individuals and institutions to gain their feedback: DOD and DOE, Science Gateways, representative users, NSF officials, and experts in the measurement of science and technology impacts.
- Hold a workshop.

Slide 13: Longer-Term Possibilities
- Topics: the social organization of research, economic impact, users and usage of HPC resources, etc.
- Potential methods: network analysis, other forms of peer review, ongoing interviews and focus groups, and historical case studies. (A toy network-analysis example follows.)
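To give a flavor of the network-analysis method, a toy co-authorship example using the networkx library; the graph edges are invented, and only the technique itself is standard.

```python
import networkx as nx

# Invented co-authorship graph among hypothetical PIs.
G = nx.Graph()
G.add_edges_from([
    ("PI-A", "PI-B"), ("PI-B", "PI-C"),
    ("PI-A", "PI-C"), ("PI-C", "PI-D"),
])

# Degree centrality highlights researchers who bridge collaborations.
for pi, score in sorted(nx.degree_centrality(G).items()):
    print(pi, round(score, 2))
```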

Slide 14: Impact Requirements Analysis Team
Co-Chairs: Mark Sheddon (SDSC), Ann Zimmerman (University of Michigan)
Members: John Cobb (ORNL), Dave Hart (SDSC), Lex Lane (NCSA), Scott Lathrop (UC/ANL), Sergiu Sanielevici (PSC), Kevin Walsh (SDSC)
GDO Supervisor: Dane Skow (ANL, Deputy Director)
Final Report in the Forum: RATS: Science Impact and Metrics
Contact: mi-rat@teragrid.org

