National Science Foundation, Directorate for Computer & Information Science & Engineering (CISE): Panel Charge, CAREER Proposals


Overview of the CAREER Program

Faculty Early Career Development (CAREER) Program
- An NSF-wide program to support junior faculty.
- Emphasizes outstanding research, excellent education, and the integration of education and research.
- Each award is at least $400K, spent over 5 years.

Presidential Early Career Awards for Scientists and Engineers (PECASE)
- Nominations (citizens and green-card holders) are made by participating federal agencies.
- NSF nominates about 20 per year from among the most meritorious new CAREER awardees.
- Final selection is made by the White House Office of Science & Technology Policy.
- NSF selections are strongly influenced by reviews/panel outcomes and by Broader Impacts activities post-award.

Emphasizing Transformative Research

Transformative research involves ideas, discoveries, or tools … that radically change our understanding of an important existing scientific or engineering concept or educational practice … or lead to the creation of a new paradigm or field of science, engineering, or education. Such research challenges current understanding or provides pathways to new frontiers … or allows the development of new practices in cyberinfrastructure (for ACI).

Conflicts of Interest
- The primary purpose is to remove or limit the influence (or appearance of influence) of ties to an applicant institution or investigator that could affect reviewer advice.
- The second purpose is to preserve the trust of the scientific community, Congress, and the general public in the integrity, effectiveness, and evenhandedness of NSF's peer review process.

Conflicts of Interest
- Sign and turn in the Conflict-of-Interest form.
- You must not participate in the discussion of any proposal for which you have a conflict.
- Please discuss any actual or perceived conflicts with the Program Directors (PDs) now.
- Any relationship with an institution or person involved in a proposal that might look questionable to a third party should be discussed with a PD.

Typical relationships that could lead to a conflict:

INSTITUTIONAL
- Current or previous employment (past 12 months), or seeking employment
- Award, honorarium, or travel payment (past 12 months)
- Officer or governing board member
- Any financial interest

PERSONAL
- Co-author of a paper or project collaborator (past 48 months)
- Co-edited journal or proceedings (past 24 months)
- Ph.D. thesis advisor or student (lifelong)
- Family member or close friend

Confidentiality
The process and results are confidential!
- Do not disclose the identities of your fellow reviewers.
- Do not discuss proposals out-of-band (in the absence of a PD), including by e-mail, Skype, text, etc.
- Do not disclose the identities of people associated with proposals (PI, Co-PIs, Consultants, etc.).
- Do not discuss results or recommendations with other people.
- Do not use the names of other reviewers in your review or Panel Summary (if you are the Scribe).
- Proposals contain sensitive information and are not in the public domain; do not copy, distribute, or quote from them.
- You can indicate (e.g., on a resume) that you served NSF on a review panel; just don't identify which panel(s).

Merit Review Principles
- All NSF projects should be of the highest quality and have the potential to advance, if not transform, the frontiers of knowledge.
- NSF projects, in the aggregate, should contribute more broadly to achieving societal goals.
- Meaningful assessment and evaluation of NSF-funded projects should be based on appropriate metrics, keeping in mind the likely correlation between the effect of broader impacts and the resources provided to implement projects.

Proposal Review Criteria
- The Intellectual Merit criterion encompasses the potential to advance knowledge.
- The Broader Impacts criterion encompasses the potential to benefit society and contribute to the achievement of specific, desired societal outcomes.

Proposal Review Criteria
The following elements should be considered in the review for both criteria:
- What is the potential for the proposed activity to (a) advance knowledge and understanding within its own field or across different fields (Intellectual Merit), and (b) benefit society or advance desired societal outcomes (Broader Impacts)?
- To what extent do the proposed activities suggest and explore creative, original, or potentially transformative concepts?
- Is the plan for carrying out the proposed activities well-reasoned, well-organized, and based on a sound rationale? Does the plan incorporate a mechanism to assess success?
- How well qualified is the individual, team, or institution to conduct the proposed activities?
- Are there adequate resources available to the PI (either at the home institution or through collaborations) to carry out the proposed activities?

Proposal Review Criteria
NSF staff will also give careful consideration to the following in recommending funding decisions:
- Integration of Research and Education: the level of engagement in joint efforts that infuse education with the excitement of discovery and enrich research through the diversity of learning perspectives. This is particularly emphasized in the CAREER solicitation.
- Integrating Diversity into NSF Programs, Projects, and Activities: broadening opportunities and enabling the participation of all citizens (women and men, underrepresented minorities, and persons with disabilities) is essential to the health and vitality of science and engineering.

Proposal Review Criteria
Integration of Research and Education: "All CAREER proposals must have an integrated research and education plan at their core."
- Does the PI think creatively about how his or her research will impact the education goals?
- Conversely, how do his or her education activities feed back into the research?
(These plans should reflect both the proposer's own disciplinary and educational interests and goals, as well as the needs and context of his or her organization.)

Panel Discussion Protocol (for some programs within CISE)
For each proposal:
- L (Lead) opens the discussion by presenting the objectives of the research, the strengths of the proposed research, and the weaknesses of the proposed research.
- R1, R2 (Reviewers 1 and 2) add to the comments or express any disagreements with the Lead reviewer if necessary.
- S (Scribe) takes notes on the panel discussion. The Scribe may also add comments or express any disagreements if necessary. The summary should be written in the third person.
- The floor is then open for discussion. A majority vote will be taken if there is no unanimous agreement, but the panel summary will note the disagreement.
- After all proposals are discussed and placed into categories, the Scribe drafts the Panel Summary and reads it back for comments, editing, and approval.

The CAREER Program and Your Reviews: Panel Outputs
Individual reviews in FastLane for each proposal:
- It is OK to modify reviews, including changing the rating.
- Ensure that the individual reviews for each proposal are on the electronic panel system and are "correct."
- Be sure any modifications to reviews are recorded in FastLane! These MUST be made BEFORE leaving your panel.
Panel summary for each proposal:
- Initially framed by one reviewer, who serves as Scribe, using the provided template.
- Should reflect the discussion (not just restate the individual reviews).
- Includes short, clear comments to help unsuccessful PIs improve their proposals in the next competition.
- Add a "Justification for Recommendation" heading at the end of the summary and write an informative, concise justification (1-2 sentences).
- Should be written in the third person and proofread by all assigned panelists.

The CAREER Program and Your Reviews: Panel Outputs
Panelist grades: E, V, G, F, P (Excellent, Very Good, Good, Fair, Poor)
- Avoid being overly harsh ("I never give an E") or overly generous.
- Be discriminating and use the entire spectrum, from P to E, where appropriate.
Panel recommendations:
- Highly Competitive (HC): a solid proposal, worth working on.
- Competitive (C): a good proposal, but some portions are unconvincing.
- Low Competitive (LC): as is, the proposal is weak, but it contains good ideas.
- Not Competitive (NC): there are major flaws, and the PI is discouraged from resubmitting.
The panel recommendation is based on insights gained during discussion. Funding a project with an F or P rating, or declining one with an E rating, requires an explanation by the PD.
And remember: CISE will compete for PECASE nominations. Identify potential PECASE candidates (i.e., proposals that are especially meritorious on both Intellectual Merit and Broader Impacts) and assign your ratings accordingly.

Panel Summary Outline
- Description of project (brief)
- Intellectual Merit: Strengths / Weaknesses
- Broader Impacts: Strengths / Weaknesses
- Constructive suggestions for improvement
- Data management plan
- Postdoc mentoring plan (if applicable)
- Justification(s) for the panel's recommendation, including key strengths and critical weaknesses
- The panel placed this proposal in the following category: ____ Highly Competitive ____ Competitive ____ Low Competitive ____ Not Competitive
- The summary was read by the panel, and the panel concurred that the summary accurately reflects the panel discussion.

Guidance:
- Please follow this format in writing panel summaries. Do not merely list the reviewers' comments; the summary should reflect the whole discussion to be useful to NSF and the PI.
- Include crisp comments to help unsuccessful PIs improve their proposals for the next competition.
- An HC summary should not contain excessively negative comments.
- Comments in reviews and panel summaries should be constructive, informative, non-inflammatory, and non-discriminatory.
- Only letters of collaboration that conform to the GPG format can be considered.
- Please address the educational components and activities as well as the research activities.

Please Remember!
Reviews and panel summaries are sent to Principal Investigators:
- Feedback, laudatory or critical, is important.
- Comments should be constructive, informative, and non-inflammatory.
Results are advisory and confidential:
- Do not discuss proposals or results.
- Proposals may contain sensitive information and are not in the public domain; do not copy, distribute, or quote from them.
PLEASE DELETE THE PROPOSALS AND NOTES FROM ALL ELECTRONIC DEVICES AND SHRED PAPER COPIES!

Minimizing Bias in Evaluation
- Implicit bias toward a group ("schemas"): non-conscious hypotheses/stereotypes, often about competence.
- Lack of critical mass leads to greater reliance on schemas: there are few women and minorities in the sciences.
- Accumulation of disadvantage: a small bias in the same direction has a large effect over time. Very small differences in treatment can have major consequences in salary, promotion, and prestige. [Valian (1998)]

Schemas are…
- Widely culturally shared: all people, even members of under-represented groups, hold schemas about these groups.
- Often non-conscious: people are often not aware of them.
- Applied more under circumstances of: lack of information, stress from competing tasks, time pressure, and lack of critical mass.
Fiske (2002). Current Directions in Psychological Science, 11.

Example: Impact of "Blind" Auditions
Based on the audition records of 14,000 individuals and on orchestra rosters:
- The audition data show that the use of a screen increases the probability that a woman will advance from the preliminary rounds by 50%.
- The roster data show that the switch to blind auditions accounts for 30% of the increase in the proportion of women among new hires.
Goldin & Rouse (2000). The American Economic Review, 90(4).

Evaluation of Identical CVs: Race
- "Jamal" had to send 15 resumes to get a callback, compared to the 10 needed by "Greg."
- The name "Greg" yielded as many additional callbacks as an additional eight years of experience did for "Jamal."
- The higher the resume quality, the larger the gap between the callbacks for "Greg" and "Jamal."
Bertrand & Mullainathan (2004). Poverty Action Lab, 3.

Evaluation of Fellowship Applications
"…the success rate of female scientists applying for postdoctoral fellowships at the [Swedish Medical Research Council] during the 1990s has been less than half that of male applicants."
Wenneras & Wold (1997). Nature, 387, p. 341.
- Women had to be 2.5 times more productive to receive the same competence score.
- Similar findings: GAO report on Peer Review in Federal Agency Grant Selection (1994); European Molecular Biology Organization Reports (2001).
*Cited by Richard Zare, Stanford chemistry professor and former NSB chair, in a 5/15/06 editorial in Chemical & Engineering News.

Ways to Mitigate Evaluation Bias
1. Increase awareness of how schemas might bias evaluation.
2. Decrease time pressure and distractions in the evaluation process.
3. Rate on explicit criteria rather than on global judgments.
4. Point to specific evidence supporting judgments.
Bauer & Baltes (2002). Sex Roles, 47(9/10).
Please incorporate (3) and (4) in your discussions.

Thank You!