Writing a Successful XSEDE Proposal


1 Writing a Successful XSEDE Proposal
Ken Hackworth, XSEDE Allocations Coordinator

2 Outline References & Terms
Main Document and guidelines for an XSEDE Research (XRAC) request:
Research Objectives
Computational Methodology (and applications/codes to be used)
Application Efficiencies
Computational Research Plan
Justification for the SUs (TBs) requested
Additional considerations
Why all of this? If you do this right, all goes well. If you don't, expect delays.
What are the costs? $40M/yr works out to about $10M/quarter; with roughly 300M SUs awarded per quarter, that is about $0.03 per CPU-hour. So 10M SUs is roughly $300,000, and 1M SUs is roughly $30,000.
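As a sanity check on those cost figures, here is a minimal sketch of the arithmetic. The budget and SU totals are the approximate values quoted above, not official program numbers:

```python
# Back-of-the-envelope cost per SU, using the approximate figures above.
annual_budget_usd = 40_000_000                 # ~$40M per year
quarterly_budget_usd = annual_budget_usd / 4   # ~$10M per quarter
sus_per_quarter = 300_000_000                  # ~300M SUs per quarter

cost_per_su = round(quarterly_budget_usd / sus_per_quarter, 2)  # ~$0.03 per CPU-hour
print(f"~${cost_per_su:.2f} per SU")
print(f"10M SUs ~= ${10_000_000 * cost_per_su:,.0f}")   # ~$300,000
print(f" 1M SUs ~= ${1_000_000 * cost_per_su:,.0f}")    # ~$30,000
```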

3 Outline continued:
Other Documents (*Progress Report and Publications for Renewals)
Review Criteria
Overview of Proposal (Request) Types and Actions
XSEDE Awards (Allocations)
XSEDE Systems (Resources)
Procedures for submitting an allocation request

4 Allocation Request Types
The Lingo:
Allocation Request Types: Startup (development, testing, porting, benchmarking), Education (classroom, training), and Research (program, usually funded). These are the 3 types of XSEDE projects.
PI: Principal Investigator
POPS: Partnerships Online Proposal System
XRAC: XSEDE Resource Allocations Committee
SU: Service Unit = 1 core-hour

5 “Traditional” v. Community
XRAC proposals are accepted in four general categories of research activities:
Single Principal Investigator
Large research collaborations (e.g., the MILC consortium, which uses large-scale numerical simulations to study quantum chromodynamics (QCD), the theory of the strong interactions of subatomic physics)
Community consortiums (e.g., NEES, an outcome of the NSF program solicitation George E. Brown, Jr. Network for Earthquake Engineering Simulation (NEES) Research (NEESR))
Community services (e.g., XSEDE Gateways)
The general requirements for proposals of all four types remain largely the same, whether you are requesting compute, storage, visualization, advanced support, or some combination. For this discussion, I'll consider individual investigators and research collaborations as "Traditional" proposals, and community projects and services as "Community" proposals.

6 General Proposal Outline
I. Research Objectives
II. Computational Methodology (Applications/Codes)
III. Application Efficiencies
IV. Computational Research Plan
V. Justification for the SUs (TBs) requested
VI. Additional considerations
Note: Sections III and IV are often integrated. Explain your science, and include a few lines about the benefits to society and broad impact. We'll go through these sections in turn. As noted, you can apply these guidelines (tweaked appropriately) to requests for all types of resources.

7 I. Research Objectives
Traditional proposals: Describe the research activities to be pursued.
Community proposals: Describe the classes of research activities that the proposed effort will support.
Keep it short: you only need enough detail to support the methods and computational plan being proposed.
TIP: Reviewers don't want to read the proposal you submitted to NSF/NIH/etc., but they do need to see that you have merit-reviewed (grant) funding.
The same rule applies to both compute and storage allocation requests. Community projects (e.g., NEES) might support greater flexibility, while community services (e.g., ROBETTA) provide very specific capabilities.

8 II. Computational Methods (and Applications/Codes used)
Very similar between traditional and community proposals.
For compute requests: Describe the applications and components you will use. Describe the methods/algorithms employed in your computational research. Describe code development, features, and advances for 'home-grown' codes.
For storage requests: Provide a description of the data to be stored (organization, formats, collection mechanisms, permissions granted or received). Describe the amount and expected growth of the data to be stored.
Notes: This section matters more if you are using 'home-grown' codes. If you are using widely known third-party codes (e.g., NAMD, CHARMM, AMBER), you can cut some corners here, although you should explain why you chose that code over the alternatives. Similarly for data storage: if it's your own data, describe it in more detail than if you are storing data in a well-documented format. Describe why the code is a good fit for the resource(s) requested and/or list acceptable alternatives; likewise for storage allocations, describe the methods to be applied to the stored data. (Performance and scaling details belong in Section III, next.)

9 III. Application Efficiencies
Very similar between traditional and community proposals.
For compute requests: Explain why you chose specific resources for your applications. Provide performance and scaling details on problems and test cases similar to those being pursued (what is the appropriate scale for your problem?). Ideally, provide performance and scaling data that you collected yourself on the specific resource(s) you are requesting.
For storage requests: Explain the efficiency of your storage algorithms and protocols. Describe and estimate the expected costs of scaling to larger data sets and a larger number of clients.
Notes: This section matters more for 'home-grown' codes; for widely known third-party codes (e.g., NAMD, CHARMM, AMBER) you can cut some corners, although you should still explain why you chose that code over the alternatives. Provide performance and scaling details on problems and test cases similar to those you plan to pursue, or that you expect the community to pursue. Collecting this data yourself on the specific resource(s) you are requesting gives reviewers additional confidence that you know what you're doing.
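To make the scaling discussion concrete, the sketch below computes speedup and parallel efficiency from strong-scaling timings. The core counts and wall-clock times are hypothetical placeholders, not measurements from any XSEDE system; in a real request they would come from your own benchmark runs on the requested resource.

```python
# Hypothetical strong-scaling timings: (core count, wall-clock seconds)
# for the same fixed test problem run at increasing core counts.
timings = [(64, 1000.0), (128, 520.0), (256, 280.0), (512, 170.0)]

base_cores, base_time = timings[0]
print(f"{'cores':>6} {'time (s)':>9} {'speedup':>8} {'efficiency':>11}")
for cores, seconds in timings:
    speedup = base_time / seconds                 # relative to the 64-core baseline
    efficiency = speedup / (cores / base_cores)   # 1.0 would be perfect scaling
    print(f"{cores:>6} {seconds:>9.1f} {speedup:>8.2f} {efficiency:>11.2f}")
```

A short table like this, showing where efficiency starts to fall off, also justifies the job size you later use in the SU calculation.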

10 IV. Computational Research Plan
Traditional proposals: Explicitly describe the problem cases you will examine.
BAD: "…a dozen or so important proteins under various conditions…"
GOOD: "…7 proteins [listed here; include the scientific importance of these selections somewhere, too]. Each protein will require [X] runs, varying [x] parameters [listed here] [in very specific and scientifically meaningful ways]…"
Science Gateway proposals: Explicitly describe the typical use case(s) that the gateway supports and the types of runs that you expect users to make. Describe how you will help ensure that the community makes scientifically meaningful runs (if applicable).
BAD: "…the gateway lets users run NAMD on XSEDE resources…"
BETTER: "…users will run NAMD jobs on [biological systems like this]…"
BETTER STILL: "…the gateway allows users to run NAMD jobs on up to 128 processors on problem sizes limited [in some fashion]…"
Similarly for storage allocations, describe what data will be stored and how the data will be used, as explicitly as possible.
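One way to stay on the GOOD side of that contrast is to enumerate the run matrix explicitly rather than describing it loosely. A minimal sketch, with hypothetical protein names and parameter values standing in for a real study design:

```python
from itertools import product

# Hypothetical study design: each protein is simulated at every combination
# of temperature and force field, with several replicas per combination.
proteins = ["proteinA", "proteinB", "proteinC"]   # placeholder names
temperatures_K = [300, 310, 330]
force_fields = ["ff14SB", "CHARMM36"]
replicas_per_combo = 3

runs = [(p, temp, ff, rep)
        for p, temp, ff in product(proteins, temperatures_K, force_fields)
        for rep in range(replicas_per_combo)]

print(f"{len(runs)} production runs = "
      f"{len(proteins)} proteins x {len(temperatures_K)} temperatures x "
      f"{len(force_fields)} force fields x {replicas_per_combo} replicas")
```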

11 V. Justification of SUs, TBs
Traditional (Research) proposals: If you've done Sections II, III, and IV well, this section should be a straightforward math problem. For each research problem, calculate the SUs required from the runs (base units) defined in Section IV and the timings in Section III, broken out appropriately by resource. Reasonable scaling estimates from test-case timing runs to full-scale production runs are acceptable. Clear presentation here allows reviewers to award time or storage in a rational fashion. Analogous calculations apply for storage requests.
Notes: In poorly written proposals, the computational plan reads as "We will study about a dozen proteins under a variety of conditions," followed by a "justification" that basically says "We estimate we will need approximately 500,000 SUs for these computations." The key: for your proposal to succeed, your justification should be a straightforward math problem, and you should show your work. Every operand in the math problem should be supported scientifically and computationally elsewhere in the proposal. The reviewers will accept reasonably rounded figures.
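As a concrete illustration of "showing your work," the sketch below turns run counts, job sizes, and per-run wall-clock times into an SU total. All of the numbers are hypothetical; in a real request each operand would be backed by the run matrix in Section IV and the timings in Section III, broken out per resource:

```python
# Hypothetical SU justification: SUs = runs x cores per run x hours per run.
# Each entry: (problem label, number of runs, cores per run, wall-clock hours per run).
problems = [
    ("proteinA equilibration", 3, 256, 12.0),
    ("proteinA production",    9, 256, 48.0),
    ("proteinB production",    9, 512, 36.0),
]

total_sus = 0.0
for label, runs, cores, hours in problems:
    sus = runs * cores * hours
    total_sus += sus
    print(f"{label:<24} {runs:>2} x {cores:>4} cores x {hours:>5.1f} h = {sus:>9,.0f} SUs")

print(f"{'Total request':<24} {total_sus:>32,.0f} SUs")
```

As the notes above say, reviewers will accept reasonably rounded figures, as long as every operand is supported elsewhere in the proposal.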

12 V. Justification of SUs, TBs
Community (gateway-type) proposals: The first big trick is calculating SUs when you don't know the precise runs to be made a priori.
In Year 2 and beyond (renewals): Start with an estimate of total usage based on the prior year's usage patterns and your estimate of the coming year's usage patterns. From this information, along with data from Sections III and IV, you can come up with a tabulation of SU estimates.
Year 1 requires bootstrapping: Pick conservative values (and justify them) for the size of the community and the runs to be made, and calculate SUs from those. TIP: Start modestly. If you have roughly zero users today, don't expect the reviewers to believe that you will get thousands (or even hundreds) next year.
Analogous calculations apply for the TBs of storage needed. The same rule applies here: your justification should be a straightforward math problem, you should show your work, and you should justify all the operands used to reach the SU total.
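For a Year-1 gateway request, the same arithmetic applies, but driven by conservative, justified estimates of community size and usage rather than a known run matrix. A minimal sketch with hypothetical values:

```python
# Hypothetical Year-1 gateway bootstrap estimate; every operand below must be
# justified in the proposal (pilot users, comparable gateways, benchmark timings).
expected_active_users = 40      # conservative estimate for the first year
jobs_per_user_per_year = 20     # typical use-case frequency
cores_per_job = 64              # gateway-imposed job size limit
hours_per_job = 4.0             # from benchmark timings in Section III

total_sus = expected_active_users * jobs_per_user_per_year * cores_per_job * hours_per_job
print(f"{expected_active_users} users x {jobs_per_user_per_year} jobs/user x "
      f"{cores_per_job} cores x {hours_per_job} h = {total_sus:,.0f} SUs")
```

Starting modestly like this, and then using actual Year-1 usage data for the Year-2 renewal, matches the tip above.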

13 VI. Additional Review Considerations
Ability to complete the work plan described (more significant for larger requests):
Sufficient merit-reviewed funding
Staff, both number and experience
Local computing environment
Special needs
Other access to HPC resources (e.g., campus centers, DOE centers, etc.)
These considerations can be addressed briefly; the reviewers treat them more as a checklist than as something to examine in great detail.

14 VI. Additional Considerations
For Community (gateway) proposals, these components can provide key details:
Community Support and Management Plan: Describe the gateway interface in terms of how it helps the community burn SUs or access TBs. Describe plans for growing the user community, "graduating" users to Research allocation awards, and regulating "gateway hogs."
Progress report: Describe the actual user community and usage patterns, and list manuscripts thanking this service or articles referencing XSEDE.
Local computing environment
Other HPC resources

15 Renewals require a Progress Report
For Research Project renewal and supplement requests, report:
New scientific discoveries.
Accomplishments of the previous computational work plan and its usage: the achievements of the computations in more detail than the summary, including the work completed and the SUs used.
A summary of publication information: specify the number of publications, conference presentations, technical reports, etc. that resulted from XSEDE support. (Put the publication list in a separate document, but report the counts in the Progress Report.)
Contributions to other research efforts and fields of science (experimental/computational/instrumental, etc.).
For gateways, a Community Support and Management Plan takes the place of the staff/experience discussion: include a brief description of the gateway interface, the fact that it has been used for production work, and relevant development effort, in terms of how it helps the community burn SUs. If you have a plan for growing the user community, or for "graduating" users from the gateway to their own Research awards, mention it. If you somehow regulate "gateway hogs," describe that.
Gateway progress reports should provide details of the actual user community and usage patterns seen in the prior award period, and list manuscripts published, accepted, submitted, or in preparation thanks to this service. This helps convince reviewers that the SUs haven't gone down a black hole. It may be trickier depending on the nature of the community activities.
Local computing environment and other HPC resources: same as for traditional proposals.

16 Other Documents
Required: CVs for PIs and Co-PIs (2 pages); a list of publications resulting from the XSEDE allocation.
Optional: Code Performance & Scaling (if it won't fit in the Main Document); Special Requirements; References (if they won't fit in the Main Document); Other.
For gateway requests, the Community Support and Management Plan and Progress Report guidance from the previous slide applies to these documents as well.

17 Proposal Review Criteria
Methodology: For compute requests, the choice of applications, methods, algorithms, and techniques to be employed to accomplish the stated objectives should be reasonably justified. While accomplishing the stated objectives in support of the science is important, it is incumbent on proposers to consider the methods available to them and to use the one best suited. (For storage requests, the data usage, access methods, algorithms, and techniques to be employed to accomplish the stated research objectives should be reasonably justified. For shared collections, proposers must describe the public or community access methods to be provided.)
Appropriateness of computations for scientific simulations: The computations must provide a precise representation of the physical phenomena to be investigated. They must also employ the correct methodologies and simulation parameters (step size, time scale, etc.) to obtain accurate and meaningful results.
Efficiency of resource usage: The resources selected must be used as efficiently as is reasonably possible. To meet this criterion for compute resources, performance and parallel scaling data should be provided for all applications to be used, along with a discussion of optimization and/or parallelization work to be done to improve the applications. (For storage resources, information on required performance and expected access patterns should be provided for all data and collections to be stored and used, along with a discussion of work done or planned to improve the efficiency of data use.)
Computational research plan: Explain the computational steps needed to accomplish the science, and give details of the computational costs (the justification).
Notes: This is copied from the policies document; you can read it later. To sum up: Are the computational methods or applications appropriate to the science objectives? Do you have a plan to use those methods and applications in a sensible fashion? Did you select the best resources for the methods/applications? Are your codes/applications efficient on the selected resources (more important for 'home-grown' codes)? In general, these review criteria apply to compute, storage, and advanced support requests alike.

18 XSEDE Projects
An XSEDE Project is like a bank account for allocations. It is permanent, and there is only one per PI. It holds a year's worth of allocation (on one or more systems); PIs request an allocation renewal each year thereafter. An allocation awarded to a New Request creates an XSEDE Project. A PI's computational projects evolve over the years: computational projects begin, end, and extend. In subsequent years, successful Renewal Requests provide allocations for new computational projects under the same XSEDE Project; your XSEDE Project remains the same. A Renewal Request is just like a New Request, but it must contain a Progress Report on the last year's computational projects and a list of publications from the past year's allocation.

19 Eligibility
The principal investigator (PI) must be a researcher or educator at a U.S.-based institution, including federal research labs or commercial organizations. (Commercial requests must guarantee that their results are publicly available, and the work must be in collaboration with an open science organization.) A postdoctoral researcher is eligible to be a PI. A qualified advisor may apply for an allocation for his or her class, but a high school, undergraduate, or graduate student may not be a PI.

20 Overview: Research Request
portal.xsede.org: Allocations -> Submit/Review Request **
Web forms: Investigator, Grants, Resource Request, ...
Requires a Main Document (the "proposal," uploaded as a PDF) and CVs.
Reviewed by experts in the same field of science.
About 2.5 months from deadline to award availability.
Details: Allocation size: unlimited. Reviewed: quarterly. Deadlines: the 15th of October, January, April, and July. Awards begin: the 1st of January, April, July, and October.
Notes: Most importantly, PIs need to be aware of the lead time for getting a Research award; allow for proposal submission, review, and award. It may take up to five months to start a Research award if you miss the deadline, so plan your Startup use and your expectations accordingly. (By comparison, NSF funding grants can take six months or more; we do better than that, but the process still takes time.) A written proposal is required, with guidelines in the allocations policies, and it is reviewed by domain experts. (The MRAC and LRAC are actually the same set of reviewers; they wear different hats depending on whether they're reviewing medium-scale or large-scale requests. MRAC and LRAC proposals allow you to request time on specific resources (e.g., SDSC DataStar p655s), on TeraGrid Roaming, or both. MRAC limit: 500,000 SUs; MRAC requests are accepted and reviewed quarterly, with submission deadlines in January, April, July, and October (approximately 2.5 months prior to the award start) and awards starting April 1, July 1, October 1, and January 1. LRAC requests, for more than 500,000 SUs, are accepted and reviewed semi-annually, with submission deadlines in January and July and awards starting April 1 and October 1.)

21 Overview: Startup/Education Requests
portal.xsede.org: Allocations -> Submit/Review Request **
Web forms: Investigator, Resource Request, ...
Requires only an abstract and a CV.
Reviewed by XSEDE staff (the Startup Allocations Committee).
About 2 weeks from submission to award availability.
For code development, performance evaluation, small-scale computations, and classroom/training instruction.
Details: Request limit: 200,000 SUs, in total or in combination across all resources requested. Reviewed: within 2 weeks of submission. Deadlines: none. Awards begin: within 2 weeks of submission.
Notes: A Startup (formerly DAC) award is not required prior to submitting a Research (MRAC/LRAC) proposal, but it is usually a good idea, particularly to gain familiarity with codes and resources, collect timing and scaling data, and demonstrate that knowledge to reviewers. Requests are usually reviewed and awarded within 2 weeks (formerly by TeraGrid RP staff).

22 Proposal Document(s)
CV(s) are required for all requests.
An abstract is required for Startup/Education requests (entered in the forms, or as a PDF document).
A proposal "Main Document" is required for Research requests (and renewals/supplements).
Keys to a successful review: adhere to the page limits, and justify the allocation request.
Page limits (the limits DO include figures and tables):
Progress report: 3 pages
New or Renewal proposal: 10 pages
Over 10 million SUs: 15 pages
Notes: The POPS forms are primarily for record keeping, award administration, and NSF reporting purposes. The proposal documents are where the reviewers focus their attention.

23 The Award = Allocation
One per PI (generally), with a 1-year duration (4 quarters = 1-year allocation period).
Unused SUs are forfeited at the end of an award period.
A Progress Report is required for renewal requests.
Add users to a grant via the XSEDE User Portal.
Awards are also known as "grants," "projects," or "allocations."
Notes: The one-proposal-per-PI rule is designed to minimize the number of proposals that require review, and the effort required of the allocations staff. There are exceptions, but they are rare; contact the allocations staff before trying this.
[Timeline graphic: Submission, Review, Advance, Award, Time to renew.]

24 The Resources: Compute
HPC systems: Kraken, Ranger, Lonestar, Steele, Trestles, Blacklight, Keeneland, Quarry, Gordon
Advanced visualization systems: Longhorn, Nautilus, Spur
HTC systems: Condor and OSG
Storage systems: local resource storage
Notes (system details as of this presentation): Lincoln consists of 192 compute nodes (Dell PowerEdge 1950 dual-socket nodes with quad-core Intel Harpertown 2.33 GHz processors and 16 GB of memory) and 96 NVIDIA Tesla S1070 accelerator units; each Tesla unit provides double-precision performance and 16 GB of memory. Dash: 245 TFLOPS, 64 TB of RAM, and 256 TB of flash memory. Approximate core counts: Kraken 113K, Ranger 63K, Lonestar 23K, Athena 18K, Trestles 10K, Steele 7,216, Pople 768 (3,072 cores, 1/5 for TeraGrid, ends 3/2011).

25 The Resources: Extended Collaborative Support (ECS)
Dedicated, but limited, XSEDE staff assistance (requested in FTE-months).
Five questions form part of the resource-request section of the application.
Reviewers rate the need for ECS on a 0-3 scale.
Notes: This is dedicated staff assistance to achieve specific computational objectives, called the Advanced Support Program in POPS (TeraGrid ASTA, SDSC SAC, NCSA SAP). Resources are limited: the MRAC/LRAC reviewers rate possible projects, and the RPs factor in these ratings along with staff availability when making the final selections. The extra information in the proposal is a one-page "special request" that describes what you want to accomplish with the staff assistance, how it will benefit your scientific objectives, and who from your group will be collaborating with the staff.

26 The Process: Steps
1. Assess the systems and determine the type of project.
2. Log in: portal.xsede.org -> Allocations -> Submit/Review Request ** ("Create portal login" if this is your first time).
3. Select an action (New, Renewal, Suppl/Just/Prog/Ext/Trans/Adv).
4. Select the project type (Research; Startup/Education, < 200K SUs).
5. Fill in the forms: PI/Co-PI info, proposal info, supporting grants, resource request (allocation request per machine).
6. Upload the proposal document(s): Main Document, CVs, etc.
7. Update at any time and "Save to date."
8. Click "Final Submission" when finished (you can still make changes afterward).

27 Login at portal.xsede.org
Straightforward (mostly), once you get to the Web-based data entry forms.
Latest changes: supporting grant information; you can now log in to POPS with your TeraGrid Portal login.
Coming soon: better TeraGrid portal integration.

28 Example Form: New Project

29 Example Form: PI entry page

30 Example Form: PI entry page, populating with Portal information
Clicking on this box will auto-populate this page with your portal information

31 Example proposal submission: Title, Abstract, FOS and Keywords

32 Example proposal submission: Supporting grants

33 Example proposal submission: Resource request page

34 Example proposal submission: Resources request page (continued)

35 Example proposal submission: Document upload page

36 Example proposal submission: Document upload (continued)
The check mark confirms that I have uploaded the CV.

37 Example proposal submission: Saving and Final Submission

38 Example proposal submission: Saving and Final Submission

39 Example proposal submission: Successful submission

40 Pending Request

41 Approved Request

42 Interesting Facts
~600 research requests per year; ~800 other requests.
~3.5B SUs requested (3.2B of these in research requests); ~1.8B SUs awarded (1.6B of these as research awards).

43 Asking for Help: help@xsede.org
Questions? The allocations staff want you to succeed; we can provide advice and guidance on proposals, answer policy questions, etc.
Multi-year awards: Possible, but not recommended for new PIs. Only Progress Reports are required in subsequent years. It is highly recommended to confer with the allocations staff before submitting a multi-year request.
Justifications: To address reviewer concerns and get more of the requested SUs. Best for specific omissions (not for salvaging horrible proposals).
Supplements: Request additional SUs during a 12-month allocation period. Not for DACs! Reviewed by MRAC/LRAC members.
Extensions: Can extend the award period an additional 6 months for cause. No additional SUs!
Advances: Once an MRAC/LRAC proposal is submitted, up to 10% of the request can be provided in advance IF RESOURCES ARE AVAILABLE. This can help cover the gap between proposal submission and award start.

