TeraGrid Allocations Discussion
John Towns
Director, Persistent Infrastructure
National Center for Supercomputing Applications, University of Illinois

Please help us with the following questions:
–How well does the current TeraGrid allocation process serve the needs of both the research community and the providers of TeraGrid resources and services?
–Does the SAB have any advice on how this process might evolve as both the nature of research problems and the nature of TeraGrid resources and services evolve?

Overview of Allocations Process (I)
Who Can Receive Allocations?
–any Ph.D.-level researcher (or NSF-eligible PI) from a U.S. academic or non-profit research institution
–NSF Graduate Research Fellows and Honorable Mention awardees
–any discipline, any institution, any funding agency support
What Can Be Requested?
–three request sizes: Small, Medium, or Large
–more than 20 resources: Computational, Data Storage, Visualization, Advanced User Support
–no cost to PIs

Overview of Allocations Process (II)
When Are Requests Accepted?
–Small requests: anytime (30,000 CPU-hours, 5 TB disk, 25 TB tape)
–Medium requests: quarterly (500,000 CPU-hours, 25 TB disk, 100 TB tape)
–Large requests: semi-annually
Where Do You Apply?
–POPS
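Reading the figures above as per-tier ceilings, a planned request can be mapped to the smallest tier that covers it. The following Python sketch is purely illustrative: the function name, interface, and the "ceiling" interpretation are assumptions for this example and are not part of POPS or any TeraGrid tool.

```python
# Illustrative only: thresholds taken from the slide above; the function
# name and interface are hypothetical, not part of any TeraGrid system.

TIER_LIMITS = {
    # (CPU-hours, disk TB, tape TB) ceilings per request size
    "Small":  (30_000, 5, 25),
    "Medium": (500_000, 25, 100),
}

def suggested_request_size(cpu_hours, disk_tb, tape_tb):
    """Return the smallest tier whose ceilings cover the planned usage."""
    for tier in ("Small", "Medium"):
        max_cpu, max_disk, max_tape = TIER_LIMITS[tier]
        if cpu_hours <= max_cpu and disk_tb <= max_disk and tape_tb <= max_tape:
            return tier
    return "Large"  # anything beyond the Medium ceilings goes to the semi-annual review

if __name__ == "__main__":
    print(suggested_request_size(20_000, 2, 10))       # Small
    print(suggested_request_size(200_000, 10, 50))     # Medium
    print(suggested_request_size(2_000_000, 40, 200))  # Large
```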

Overview of Allocations Process (III)
How Do You Get Started?
–for Small (start-up) requests, you just need an abstract of the work and a CV for the PI; allow 2 weeks to get everything set up
–for Medium or Large requests, a proposal is reviewed by an independent committee at quarterly meetings

Medium or Large Allocations
PIs need to be aware of the lead time for getting an MRAC or LRAC award
–requires a written proposal
–reviewed by domain experts
LRAC (Large)
–reviewed semi-annually
–awards begin April 1 and Oct. 1
MRAC (Medium)
–reviewed quarterly
–awards begin Jan. 1, April 1, July 1, and Oct. 1
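As a rough planning aid, the cycle start dates above can be turned into a small calculation of the earliest possible award start after a given submission date. This is a hypothetical sketch based only on the start dates listed here; it deliberately ignores submission deadlines and review lead time, which PIs should confirm with POPS.

```python
# Hypothetical planning sketch: maps a submission date to the next award
# start date for the MRAC (quarterly) and LRAC (semi-annual) cycles above.
from datetime import date

MRAC_STARTS = [(1, 1), (4, 1), (7, 1), (10, 1)]   # Jan 1, Apr 1, Jul 1, Oct 1
LRAC_STARTS = [(4, 1), (10, 1)]                   # Apr 1, Oct 1

def next_award_start(submitted: date, starts) -> date:
    """Earliest cycle start strictly after the submission date."""
    candidates = [date(submitted.year + dy, m, d)
                  for dy in (0, 1) for (m, d) in starts]
    return min(c for c in candidates if c > submitted)

if __name__ == "__main__":
    s = date(2008, 8, 15)
    print("MRAC:", next_award_start(s, MRAC_STARTS))  # 2008-10-01
    print("LRAC:", next_award_start(s, LRAC_STARTS))  # 2008-10-01
```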

Using Your Allocation
New User Packet
–provides your TeraGrid-wide and site-specific logins
TeraGrid User Portal
–online hub for monitoring and using your allocation
–portal.teragrid.org (guests welcome)
User Support

Allocations Policies
High-level description of the nuggets of the policy
–PI eligibility
–proposal format, page length, etc.
–review process; review panel
–review criteria
–award management
Who sets them and how are they evolved?
–ad hoc team to develop recommendations and comments: John Towns, Nancy Wilkins-Diehr, Ralph Roskies, Dave Hart, et al.
–community input to gain consensus: PIs, xRAC Panel, NSF Program Officers, RP representatives, etc.
–has evolved policies: community accounts/science gateways, storage, support services
–what's next?

Specific Issues (I)
How does the allocation panel review take into account the review of scientific merit and broader impacts of the proposed work already done by the NSF? How is "double jeopardy" between separate financial and resource allocation proposals handled?
–policy: if review of the science has already been done, it is NOT the purview of the xRAC Panel; the panel only reviews the science for requests not supported financially by grants that resulted from proposals that have had such review
–in practice, panelists are reminded of this at the start of every meeting, and it is reined in during panel reviews
How does the allocation process take into account the science impact, if at all, of the proposed TG usage?
–actually, that is not really the specific purview of the review panel, though they do consider it if there is ambiguity amongst the panel on how to handle a request; this gets at the review of the science, which is typically NOT what the xRAC reviews
–Question for SAB: Should this factor be given more or less (or any) emphasis?

Specific Issues (II)
How are the allocations reviewers selected? What process or criterion is used to ensure that they have adequate expertise? What term limits (if any) should be applied to their service?
–typically look for recommendations from outgoing reviewers
–also look for recommendations from NSF program officers
–try to assess the work they do from online information
–do not conduct a formal review process of potential reviewers
–from the policy document: "The committees consist of volunteers who are selected from the faculty and staff of U.S. universities, laboratories and other research institutions. All of the committee members have expertise in some area of computational science or engineering and serve a term of 2-5 years."
–typically try to keep most reviewers to 2-3 years
–some are kept 4-5 years for various reasons (reviewer availability, quality of reviews, etc.)

Proposal Counts
Typically 3-6 reviewers per proposal
–larger numbers for really big LRAC requests
Typically ~35 members on the review panel
High reviewer load:
–some reviewers will have more proposals when the MRAC and LRAC reviews coincide
[Chart: number of MRAC and LRAC proposals per review cycle, Sep-05 through Dec-07]
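The reviewer-load point can be made concrete with simple arithmetic from the figures on this slide. The proposal count below is an illustrative placeholder, not actual cycle data.

```python
# Back-of-the-envelope reviewer load, using the figures on this slide.

def reviews_per_panelist(n_proposals, reviewers_per_proposal=4, panel_size=35):
    """Average number of reviews each panelist writes in one cycle."""
    return n_proposals * reviewers_per_proposal / panel_size

# e.g., a combined MRAC+LRAC meeting with 100 proposals (hypothetical count)
print(round(reviews_per_panelist(100), 1))  # ~11.4 reviews per panelist
```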

Specific Issues (III)
Who should be serving as panel discussion chair at the allocations committee meetings, and what guidelines are in place for the conduct/role of this chairperson?
–in practice, selected by the Allocations Officers from among the panelists
–guidance given to the chair by the Allocations Officers
–we invite input from the SAB!

Specific Issues (IV)
What role, if any, should NSF program directors funding projects in large-scale computation have in the allocations process?
–in practice, they have only been observers in the process and have provided responses on NSF-level policy issues

Specific Issues (V)
Are the conflict-of-interest policies applied to allocation panelists more or less stringent than those used for NSF panels?
–the COI policy generally follows NSF COI rules: collaborators, advisors, etc.; institutional conflicts
–difference: proposers may be on the panel that reviews their proposal; conflicted reviewers are dismissed from the room during discussions involving their conflicts
–Question for SAB: Should they be?

Specific Issues (VI)
How are reviewers' areas of expertise matched to the allocation requests?
–reviewers provide a list of their primary and secondary fields of science (FOS)
–PIs indicate the primary and secondary FOS of the proposal
–this provides first-order matching (see the sketch below)
–Allocations Officers review and address unassigned proposals
–Question for SAB: Is this the most effective strategy?
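The sketch below is a minimal illustration of the first-order matching described above, assuming each reviewer and each proposal carries a primary and secondary FOS. The function names, scoring weights, and data layout are assumptions for this example, not the actual POPS logic.

```python
# Illustrative first-order FOS matching: score reviewers against a proposal
# by overlap of primary/secondary fields, leaving unmatched proposals for
# the Allocations Officers to assign by hand.

def match_score(reviewer_fos, proposal_fos):
    """reviewer_fos / proposal_fos: (primary, secondary) field-of-science tuples."""
    score = 0
    if reviewer_fos[0] == proposal_fos[0]:
        score += 2          # primary-primary match weighs most
    if reviewer_fos[0] == proposal_fos[1] or reviewer_fos[1] == proposal_fos[0]:
        score += 1          # cross primary/secondary match
    return score

def assign_reviewers(proposal_fos, reviewers, needed=3):
    """Return up to `needed` best-matching reviewers; an empty list flags a gap."""
    ranked = sorted(reviewers.items(),
                    key=lambda kv: match_score(kv[1], proposal_fos),
                    reverse=True)
    matched = [name for name, fos in ranked if match_score(fos, proposal_fos) > 0]
    return matched[:needed]

reviewers = {
    "R1": ("Astrophysics", "Fluid Dynamics"),
    "R2": ("Molecular Biology", "Chemistry"),
    "R3": ("Fluid Dynamics", "Climate"),
}
print(assign_reviewers(("Fluid Dynamics", "Astrophysics"), reviewers))  # ['R3', 'R1']
```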

Specific Issues (VII)
How does the allocation process enable individual projects to have dedicated access to large fractions of a given system for significant periods of time? How do TG management practices and metrics likewise enable such partitioning of the systems?
–PIs can specifically request dedicated access
–PIs are asked to provide a one-page additional description with information regarding these needs, allowing RPs to prepare
–fundamentally, availability is controlled by the RP; nothing specifically requires this, but it is seen as being in the RP's own best interest to provide when requested by the community and typically well justified
–Question for SAB: Are these sufficient?

Specific Issues (VIII)
How will the allocation process take into account the jumps in available resources and system size when each Track 2 system and, eventually, the Track 1 system come online? Do the Track 2 and Track 1 systems present any particular challenges to the allocation process?
–addressing this and related issues within the TG Extreme Scalability RAT
–fundamentally, the current process is sufficient
–NSF has initiated the "PRAC" solicitation, and this will impact the process for Track 1