Writing a Successful XSEDE Proposal


Writing a Successful XSEDE Proposal Ken Hackworth XSEDE Allocations Coordinator

Outline
References & Terms
Main Document and guidelines for an XSEDE Research (XRAC) Request:
- Research Objectives
- Computational Methodology (and Applications/Codes to be used)
- Application Efficiencies
- Computational Research Plan
- Justification for the SUs (TBs) requested
- Additional considerations
Why all of this? If you do it right, all goes well. If you don't, expect delays.
What are the costs? $40M/yr is $10M/quarter; with about 300M SUs delivered per quarter, that works out to roughly $0.03 per CPU-hour. At that rate, 10M SUs is about $300,000 and 1M SUs about $30,000.
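The cost figures above are simple arithmetic; a minimal sketch, using only the numbers quoted on this slide, shows where the per-SU dollar value comes from:

```python
# Back-of-the-envelope cost per SU, using only the figures quoted above:
# a $40M/yr program delivering roughly 300M SUs per quarter.
annual_budget = 40_000_000       # dollars per year
sus_per_quarter = 300_000_000    # service units delivered per quarter

quarterly_budget = annual_budget / 4           # $10M per quarter
cost_per_su = quarterly_budget / sus_per_quarter

print(f"Cost per SU: ${cost_per_su:.3f}")      # ~$0.033 (the slide rounds to $0.03)
print(f"10M SUs: ${10_000_000 * cost_per_su:,.0f}")  # ~$333,333 (slide rounds to $300,000)
```

The slide's $300,000 figure comes from rounding the rate to $0.03 first; either way, an allocation is real money, which is why reviewers expect it to be justified.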

Outline continued: Other Documents (*Progress Report and Publications for Renewals), Review Criteria, Overview of Proposal (Request) Types and Actions, XSEDE Awards (Allocations), XSEDE Systems (Resources), and Procedures for submitting an Allocation Request.

Allocation Request Types: The Lingo
Three types of XSEDE projects:
- Startup: development, testing, porting, benchmarking
- Education: classroom and training use
- Research: a research program (usually funded)
Terms:
- PI: Principal Investigator
- POPS: Partnerships Online Proposal System
- XRAC: XSEDE Resource Allocations Committee
- SU: Service Unit = 1 core-hour

"Traditional" v. Community
XRAC proposals are accepted in four general categories of research activity:
- Single principal investigators
- Large research collaborations (e.g., the MILC consortium)
- Community consortiums (e.g., NEES)
- Community services (e.g., XSEDE Gateways)
The general requirements for proposals of all four types remain largely the same, whether requesting compute, storage, visualization, advanced support, or some combination. For this discussion, I'll treat individual investigators and research collaborations as "Traditional" proposals, and community projects and services as "Community" proposals.
Examples: "This award is an outcome of the NSF 08-519 program solicitation, George E. Brown, Jr. Network for Earthquake Engineering Simulation (NEES) Research (NEESR)..." and "We use large-scale numerical simulations to study quantum chromodynamics (QCD), the theory of the strong interactions of subatomic physics."

General Proposal Outline
I. Research Objectives
II. Computational Methodology (Applications/Codes)
III. Application Efficiencies
IV. Computational Research Plan
V. Justification for the SUs (TBs) requested
VI. Additional considerations
Note: Sections III and IV are often integrated. Explain your science, and include a few lines about the benefits to society and broader impact. We'll go through these sections in turn. As noted, you can apply these guidelines (tweaked appropriately) to requests for all types of resources.

I. Research Objectives
Traditional proposals: describe the research activities to be pursued.
Community proposals: describe the classes of research activities that the proposed effort will support. Community projects (e.g., NEES) might support greater flexibility, while community services (e.g., ROBETTA) provide very specific capabilities.
Keep it short: you only need enough detail to support the methods and computational plan being proposed. TIP: Reviewers don't want to read the proposal you submitted to NSF/NIH/etc., but they do need to see that you have merit-reviewed (grant) funding. The same rule applies to both compute and storage allocation requests.

II. Computational Methods (and Applications/Codes used)
Very similar between traditional and community proposals.
For compute requests:
- Describe the applications and components you will use.
- Describe the methods/algorithms employed in your computational research.
- Describe code development, features, and advances for 'home-grown' codes.
For storage requests:
- Describe the data to be stored (organization, formats, collection mechanisms, permissions granted or received).
- Describe the amount and expected growth of the data to be stored.
Notes: This section matters more if you use 'home-grown' codes. If using widely known third-party codes (e.g., NAMD, CHARMM, AMBER), you can cut some corners here, although you should explain why you chose this code over the alternatives. Similarly for data storage: if it's your own data, describe it in more detail than if you are storing data in a well-documented format. Provide performance and scaling details on problems and test cases similar to those you plan to pursue, or that you expect the community to pursue. Describe why this code is a good fit for the resource(s) requested and/or list acceptable alternatives; for storage allocations, describe the methods to be applied to the stored data. Ideally, provide performance and scaling data that you collected on the specific resource(s) you are requesting. This gives reviewers additional confidence that you know what you're doing.

III. Application Efficiencies
Very similar between traditional and community proposals.
For compute requests:
- Explain why you chose specific resources for your applications.
- Provide performance and scaling details on problems and test cases similar to those being pursued. (What is the appropriate scale for your problem?)
- Ideally, provide performance and scaling data that you collected on the specific resource(s) you are requesting.
For storage requests:
- Explain the efficiency of your storage algorithms and protocols.
- Describe and estimate the expected costs of scaling to larger data sets and a larger number of clients.
The notes under Section II about 'home-grown' versus widely known third-party codes apply here as well.

IV. Computational Research Plan
Traditional proposals: explicitly describe the problem cases you will examine.
BAD: "...a dozen or so important proteins under various conditions..."
GOOD: "...7 proteins [listed here; include the scientific importance of these selections somewhere, too]. Each protein will require [X] runs, varying [x] parameters [listed here] [in very specific and scientifically meaningful ways]..."
Science Gateway proposals: explicitly describe the typical use case(s) that the gateway supports and the types of runs you expect users to make, and describe how you will help ensure that the community makes scientifically meaningful runs (if applicable).
BAD: "...the gateway lets users run NAMD on XSEDE resources..."
BETTER: "...users will run NAMD jobs on [biological systems like this]..."
BETTER STILL: "...the gateway allows users to run NAMD jobs on up to 128 processors on problem sizes limited [in some fashion]..."
Similarly for storage allocations: describe what data will be stored and how it will be used, as explicitly as possible.

V. Justification of SUs, TBs
Traditional (Research) proposals: if you've done Sections II, III, and IV well, this section should be a straightforward math problem. For each research problem, calculate the SUs required based on the runs (base units) defined in Section IV and the timings in Section III, broken out appropriately by resource. Reasonable scaling estimates from test-case timing runs to full-scale production runs are acceptable. Clear presentation here allows reviewers to award time or storage in a rational fashion. Analogous calculations apply for storage requests.
In poorly written proposals, the computational plan reads "We will study about a dozen proteins under a variety of conditions," followed by a "justification" that says only "We estimate we will need approximately 500,000 SUs for these computations."
The key: for your proposal to succeed, your justification should be a straightforward math problem, and you should "show your work." Every operand in that math problem should be supported scientifically and computationally elsewhere in the proposal. The reviewers will accept reasonably rounded figures.
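The "show your work" advice can be made concrete. Below is a minimal sketch of such a tabulation; every value (the problem cases, run counts, core counts, and wallclock hours) is a hypothetical placeholder standing in for numbers you would defend in Sections III and IV:

```python
# Hypothetical SU tabulation for a Traditional request. The structure is
# the point: each operand should be justified elsewhere in the proposal.
# 1 SU = 1 core-hour.
planned_runs = [
    # (problem case from Sec. IV, runs, cores per run, wallclock hours per run)
    ("Protein A: 3 force fields x 4 temperatures", 12, 256, 24),
    ("Protein B: 20-point parameter sweep",        20, 128, 12),
]

total_sus = 0
for label, n_runs, cores, hours in planned_runs:
    sus = n_runs * cores * hours
    print(f"{label}: {n_runs} runs x {cores} cores x {hours} h = {sus:,} SUs")
    total_sus += sus

print(f"Total request: {total_sus:,} SUs")  # 104,448 SUs
```

A table like this, with one row per problem case and per resource, is exactly the "straightforward math problem" reviewers want to see.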

V. Justification of SUs, TBs (continued)
Community (gateway-type) proposals: the first big trick is calculating SUs when you don't know the precise runs to be made a priori.
In Year 2 and beyond (renewals): start with an estimate of total usage based on the prior year's usage patterns and an estimate of the coming year's usage patterns. From this information, along with data from Sections III and IV, you can produce a tabulation of SU estimates.
Year 1 requires bootstrapping: pick conservative values (and justify them) for the size of the community and the runs to be made, and calculate the SUs. TIP: Start modestly. If you have ~0 users, don't expect the reviewers to believe that you will get thousands (or even hundreds) next year.
Analogous calculations apply for the TBs of storage needed. The same rule applies here: your justification should be a straightforward math problem, you should show your work, and you should justify all the operands used to reach the SU total.
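For a Year-1 gateway bootstrap, the same arithmetic applies to conservative community estimates. Every value in this sketch is an assumed placeholder; in a real proposal each one would be justified (pilot usage, benchmark timings, etc.):

```python
# Hypothetical Year-1 gateway estimate: conservative, justified values,
# multiplied out mechanically. All numbers are assumptions for illustration.
expected_users = 25            # assumed initial community size
jobs_per_user_per_month = 8    # assumed, e.g. from a pilot deployment
cores_per_job = 64             # assumed typical gateway job size
hours_per_job = 2              # assumed, from benchmark timings
months = 12

total_jobs = expected_users * jobs_per_user_per_month * months
total_sus = total_jobs * cores_per_job * hours_per_job  # 1 SU = 1 core-hour
print(f"{total_jobs:,} jobs -> {total_sus:,} SUs requested for Year 1")
```

Note how modest inputs (25 users, 8 jobs/month) still compound to a substantial request, which is why each operand needs its own justification.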

VI. Additional Review Considerations
Ability to complete the work plan described (more significant for larger requests):
- Sufficient merit-reviewed funding
- Staff, in both number and experience
- Local computing environment
- Special needs
- Other access to HPC resources (e.g., campus centers, DOE centers)
These considerations can be addressed briefly; the reviewers treat them more as a checklist than in great detail.

VI. Additional Considerations: Community (gateway) proposals
These components can provide key details:
- Community support and management plan: describe the gateway interface in terms of how it helps the community burn SUs or access TBs, and describe plans for growing the user community, "graduating" users to Research allocation awards, and regulating "gateway hogs."
- Progress report: the actual user community and usage patterns; manuscripts thanking this service, or articles referencing XSEDE.
- Local computing environment
- Other HPC resources

Renewals require a Progress Report
For Research project renewal and supplement requests, report:
- New scientific discoveries (a summary of scientific accomplishments).
- Computational accomplishments of the previous work plan (list the computations and the SUs used).
- Summary of publication information, including conference presentations, technical reports, etc. (Put the publication list in a separate document, but report the number of publications, proceedings, and reports in the Progress Report itself.)
- Contributions to other research efforts and fields of science (experimental/computational/instrumental, etc.).
For gateway renewals, include a community support and management plan instead of staff/experience: a brief description of the gateway interface, the fact that it has been used for production work, and relevant development effort, in terms of how it helps the community burn SUs. If you have a plan for growing the user community, or for "graduating" users from the gateway to their own Research (MRAC-type) awards, mention it; if you somehow regulate "gateway hogs," describe that. In the progress report, provide details of the actual user community and usage patterns seen in the prior award period, and list manuscripts published, accepted, submitted, or in preparation thanks to this service; this helps convince reviewers that the SUs haven't gone down a black hole. (This may be trickier depending on the nature of the community activities.) Local computing environment and other HPC resources: same as for traditional proposals.

Other Documents
Required:
- CVs for the PI and Co-PIs (2 pages each)
- List of publications resulting from the XSEDE allocation
Optional:
- Code performance & scaling (if it won't fit in the Main Document)
- Special requirements
- References (if they won't fit in the Main Document)
- Other

Proposal Review Criteria
Methodology: For compute requests, the choice of applications, methods, algorithms, and techniques to be employed to accomplish the stated objectives should be reasonably justified. While the accomplishment of the stated objectives in support of the science is important, it is incumbent on proposers to consider the methods available to them and to use those best suited. (For storage requests, the data usage, access methods, algorithms, and techniques to be employed to accomplish the stated research objectives should be reasonably justified. For shared collections, proposers must describe the public or community access methods to be provided.)
Appropriateness of computations for scientific simulations: The computations must provide a precise representation of the physical phenomena to be investigated. They must also employ the correct methodologies and simulation parameters (step size, time scale, etc.) to obtain accurate and meaningful results.
Efficiency of resource usage: The resources selected must be used as efficiently as is reasonably possible. To meet this criterion for compute resources, performance and parallel scaling data should be provided for all applications to be used, along with a discussion of the optimization and/or parallelization work to be done to improve the applications. (For storage resources, information on required performance and expected access patterns should be provided for all data and collections to be stored and used, along with a discussion of work done or planned to improve the efficiency of the data use.)
Computational research plan: Explain the computational steps needed to accomplish the science, and give details of the computational costs (the justification).
I copied and pasted the above from the policies document; you can read it later. To sum up: Are the computational methods or applications appropriate to the science objectives?
-AND-
Do you have a plan to use those methods and applications in a sensible fashion? Did you select the best resources for the methods/applications? Are your codes/applications efficient on the resources selected? (More important for 'home-grown' codes.) In general, these review criteria apply to compute, storage, and advanced support requests alike.

XSEDE Projects
An XSEDE Project is like a bank account for allocations. It is permanent, and there is only one per PI. It holds a year's worth of allocation (on one or more systems), and the PI requests an allocation renewal each year thereafter. An allocation awarded to a New Request creates an XSEDE Project. A PI's computational projects evolve over the years: they begin, end, and extend. In subsequent years, successful Renewal Requests provide allocations for new computational projects under the same XSEDE Project; your XSEDE Project remains the same. A Renewal Request is just like a New Request, but it must contain a Progress Report on the past year's computational projects and a list of publications from the past year's allocation.

Eligibility
The principal investigator (PI) must be a researcher or educator at a U.S.-based institution, including federal research labs and commercial organizations. (Commercial requests must guarantee that their results are publicly available, and the work must be in collaboration with an open-science organization.) A postdoctoral researcher is eligible to be a PI. A qualified advisor may apply for an allocation for his or her class, but a high school, undergraduate, or graduate student may not be a PI.

Overview: Research Request
portal.xsede.org → Allocations → Submit/Review Request
- Web forms: Investigator, Grants, Resource Request, ...
- Requires a Main Document (the "proposal", uploaded as a PDF) and CVs
- Reviewed by experts in the same field of science
- 2.5 months from deadline to award availability
Details:
- Allocation size: unlimited
- Reviewed: quarterly
- Deadlines: the 15th of October, January, April, and July
- Awards begin: the 1st of January, April, July, and October
Most importantly, PIs need to be aware of the lead time for getting a Research award: allow time for proposal submission, review, and award. It may take up to five months to start a Research award if you miss a deadline, so plan your Startup use and your expectations accordingly. (By comparison, NSF funding grants can take six months or more; we do better than that, but the process still takes time.) A written proposal is required (guidelines are in the allocation policies), and it is reviewed by domain experts.
(Legacy TeraGrid notes: the MRAC and LRAC were the same set of reviewers wearing different hats on different days, reviewing medium-scale or large-scale requests. MRAC and LRAC proposals let you request time on specific resources (e.g., SDSC DataStar p655s), on TeraGrid Roaming, or both. MRAC limit: 500,000 SUs; requests accepted and reviewed quarterly, with deadlines in January, April, July, and October, approximately 2.5 months prior to award start; awards began April 1, July 1, October 1, and January 1. LRAC requests, for more than 500,000 SUs, were accepted and reviewed semi-annually, with deadlines in January and July and awards starting April 1 and October 1.)

Overview: Startup/Education Requests
portal.xsede.org → Allocations → Submit/Review Request
- Web forms: Investigator, Resource Request, ...
- Requires only an abstract and a CV
- Reviewed by XSEDE staff (the Startup Allocations Committee)
- 2 weeks from submission to award availability
- For code development, performance evaluation, small-scale computations, and classroom & training instruction
Details:
- Request limit: 200,000 SUs in total, across all resources requested
- Reviewed: within 2 weeks of submission
- Deadlines: none
- Awards begin: within 2 weeks of submission
A Startup (formerly DAC) award is not required before submitting a Research (MRAC/LRAC) proposal, but it is usually a good idea, particularly to gain familiarity with codes and resources, collect timing and scaling data, and demonstrate knowledge to the reviewers. Startup requests are usually reviewed and awarded within 2 weeks.

Proposal Document(s)
https://www.xsede.org/web/xup/allocation-policies
- CV(s) are required for all requests.
- An abstract is required for a Startup/Education request (in the forms, or as a PDF document).
- A "Main Document" proposal is required for a Research request (and for renewals/supplements).
Keys to a successful review: adhere to the page limits, and justify the allocation request.
Page limits (figures and tables DO count against the limit):
- Progress report: 3 pages
- New or Renewal proposal: 10 pages
- Requests over 10 million SUs: 15 pages
The POPS forms are primarily for record keeping, award administration, and NSF reporting purposes; the proposal is where the reviewers focus their attention.

The Award = Allocation
- One per PI (generally)
- 1-year duration (4 quarters = one allocation period)
- Unused SUs are forfeited at the end of an award period
- A progress report is required for renewal requests
- Add users to a grant via the XSEDE User Portal
Awards are also known as "grants", "projects", and "allocations". The one-proposal-per-PI rule is designed to minimize the number of proposals that require review, and the effort required of the allocations staff. There are exceptions, but they are rare; contact the allocations staff before trying this.
[Award timeline figure: submission → review → award → time to renew → advance]

The Resources: Compute
https://www.xsede.org/resources/overview
- HPC systems: Kraken, Ranger, Lonestar, Steele, Trestles, Blacklight, Keeneland, Quarry, Gordon
- Advanced visualization systems: Longhorn, Nautilus, Spur
- HTC systems: Condor and OSG
- Storage systems: local resource storage
Speaker-note examples: Lincoln consists of 192 compute nodes (Dell PowerEdge 1950 dual-socket nodes with quad-core Intel Harpertown 2.33 GHz processors and 16 GB of memory) and 96 NVIDIA Tesla S1070 accelerator units; each Tesla unit provides 345.6 gigaflops of double-precision performance and 16 GB of memory. Dash: 245 TFLOPS, 64 TB of RAM, and 256 TB of flash memory. Core counts: Kraken 113K, Ranger 63K, Lonestar 23K (Abe: 9600, QB: 5344, Steele: 7216 — about 22K combined), Athena 18K, Trestles 10K, Blacklight 4096, Ember 2048, Pople 768 (3072 cores, 1/5 for TeraGrid, ends 3/2011).

The Resources: Extended Collaborative Support (ECS)
https://www.xsede.org/ecss
- Dedicated, but limited, XSEDE staff assistance to achieve specific computational objectives (request FTE-months)
- Five questions form part of the resource-request section of the application
- Reviewers rate the need for ECS (0-3)
Formerly called the Advanced Support Program in POPS (cf. TeraGrid ASTA, SDSC SAC, NCSA SAP). Resources are limited: reviewers rate candidate projects, and the resource providers factor in these ratings along with staff availability when making the final selections. The extra information in the proposal is a one-page "special request" that describes what you want to accomplish with the staff assistance, how it will benefit your scientific objectives, and who from your group will be collaborating with the staff.

The Process: Steps
1. Assess the systems: https://www.xsede.org/resources/overview
2. Determine the type of project.
3. Log in: portal.xsede.org → Allocations → Submit/Review Request (click "Create portal login" if it's your first time).
4. Select an Action (New, Renewal, Suppl/Just/Prog/Ext/Trans/Adv).
5. Select a project Type (Research; Startup/Edu < 200K SUs).
6. Fill in the forms: PI/Co-PI info, proposal info, supporting grants, resource request (allocation request per machine).
7. Upload the proposal document(s): Main Document, CVs, etc. You can update at any time and "Save to date".
8. Click "Final Submission" when finished (you can still make changes afterward).

Login at portal.xsede.org
Mostly straightforward once you get to the web-based data entry forms. Latest changes: supporting grant information; you can now log in to POPS with your TeraGrid Portal login. Coming soon: better TeraGrid portal integration.

Example Form: New Project

Example Form: PI entry page

Example Form: PI entry page, populating with Portal information Clicking on this box will auto-populate this page with your portal information

Example proposal submission: Title, Abstract, FOS and Keywords

Example proposal submission: Supporting grants

Example proposal submission: Resource request page

Example proposal submission: Resources request page (continued)

Example proposal submission: Document upload page

Example proposal submission: Document upload (continued) Check mark confirms I have uploaded CV

Example proposal submission: Saving and Final Submission

Example proposal submission: Saving and Final Submission

Example proposal submission: Successful submission

Pending Request

Approved Request

Interesting Facts
- ~600 research requests per year, plus ~800 other requests
- ~3.5B SUs requested (3.2B in research requests)
- ~1.8B SUs awarded (1.6B in research awards)

Asking for Help: help@xsede.org
The allocations staff want you to succeed; we can provide advice and guidance on proposals, answer policy questions, and so on.
- Multi-year awards: possible, but not recommended for new PIs. Only progress reports are required in subsequent years. It is highly recommended to confer with the allocations staff before submitting a multi-year request.
- Justifications: a way to address reviewer concerns and recover more of the requested SUs. Best for specific omissions, not for salvaging horrible proposals.
- Supplements: request additional SUs during a 12-month allocation period. Not for DACs! Reviewed by MRAC/LRAC members.
- Extensions: can extend the award period by an additional 6 months for cause. No additional SUs!
- Advances: once an MRAC/LRAC proposal is submitted, up to 10% of the request can be provided in advance IF RESOURCES ARE AVAILABLE. This can help cover the gap between proposal submission and award start.
Questions? help@xsede.org