PLDI 2018 Program-Chair Report
Dan Grossman

- Process [with some data]
- Data [with some reflections]
- Reflections [with some thank-yous]

Submissions
- Anonymous, better known as "double blind"
- Conflicts of interest (precisely defined) indicated by authors
  - Double-checked by Program Chair
- Reviewers could not "see" papers they were conflicted with
- PC members could not see submissions by other PC members (both visibility rules are sketched below)
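
A minimal sketch of how such visibility rules might be enforced, assuming a toy in-memory data model (the types and names here are hypothetical, not the actual submission-system code):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Paper:
    id: int
    authors: frozenset  # author names

def visible_papers(reviewer, papers, conflicts, pc_members):
    """Papers `reviewer` may see: drop declared conflicts, and for a PC
    member, drop any submission co-authored by another PC member."""
    visible = []
    for p in papers:
        if (reviewer, p.id) in conflicts:   # author-declared, chair-checked
            continue
        if reviewer in pc_members and (p.authors & pc_members) - {reviewer}:
            continue                        # another PC member is an author
        visible.append(p)
    return visible

papers = [Paper(1, frozenset({"alice"})), Paper(2, frozenset({"bob"}))]
print([p.id for p in visible_papers("carol", papers,
                                    conflicts={("carol", 1)},
                                    pc_members={"bob", "carol"})])  # []
```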

Reviewers
- PC: ~35 people, ~20 reviews each, decide on non-PC papers
- EPC: ~20 people, ~10 reviews each, decide on PC papers
- ERC: ~50 people, ~5 reviews each
- External reviewers to complement expertise: ~25 people, 1 review each (rough capacity math below)
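
A back-of-the-envelope check of what these approximate sizes imply; all figures are the "~" numbers above, and 258 is the count of submissions that actually received reviews, from a later slide:

```python
# (people, reviews each) -- the approximate figures from the slide
committees = {"PC": (35, 20), "EPC": (20, 10), "ERC": (50, 5), "External": (25, 1)}
total = sum(people * load for people, load in committees.values())
print(total)        # 1175 reviews in total
print(total / 258)  # ~4.6 reviews per reviewed submission
```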

Review Preferences
- Better known as "bidding"
- Give both "desire" and "likely expertise", e.g., 5X or -10Y (parsing sketched below)
  - Such combined preferences are much more useful than desire-only bids
  - Number and letter often correlated, but not always
- No automation here beyond sorting by topic scores
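
A small sketch of how such bids could be represented, assuming the signed number is desire and the letter is expertise ("X" meaning expert, per a later slide; the parser and the letter set are hypothetical):

```python
import re

BID = re.compile(r"(-?\d+)([XYZ])")  # hypothetical letter set

def parse_bid(bid: str):
    """Split a bid like '5X' or '-10Y' into (desire, expertise)."""
    m = BID.fullmatch(bid.strip())
    if not m:
        raise ValueError(f"malformed bid: {bid!r}")
    return int(m.group(1)), m.group(2)

print(parse_bid("5X"))    # (5, 'X'): wants it, and is an expert
print(parse_bid("-10Y"))  # (-10, 'Y'): knowledgeable, but really doesn't want it
```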

Review Assignments
- Round 1: all papers received 3 or 4 reviews
  - Aim: give each paper at least one "X", and avoid reviewers who "really don't want to do it"
  - Started with an automated assignment, then spent a dozen hours of "ad hoc hill-climbing" using review preferences (a toy version is sketched below)
- Round 2: additional 1-2 reviews for most papers (> 2/3)
  - Online discussion to identify complementary expertise
  - Assignments done manually over a couple of days
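
The real hill-climbing here was manual, but a toy automated analogue conveys the idea: repeatedly try swapping two reviewers' assignments and keep any swap that raises the total preference score. Everything below is a hypothetical sketch; it ignores conflicts, expertise coverage, and any real tool's data model:

```python
import random

def hill_climb(assign, prefs, steps=10_000, seed=0):
    """assign: paper -> set of reviewers; prefs: reviewer -> paper -> score.
    Greedy local search: keep any swap that improves total preference."""
    rng = random.Random(seed)
    papers = list(assign)
    for _ in range(steps):
        p, q = rng.sample(papers, 2)
        r = rng.choice(sorted(assign[p]))
        s = rng.choice(sorted(assign[q]))
        if r in assign[q] or s in assign[p]:
            continue  # swap would double-assign a reviewer
        gain = (prefs[r][q] + prefs[s][p]) - (prefs[r][p] + prefs[s][q])
        if gain > 0:
            assign[p].remove(r); assign[p].add(s)
            assign[q].remove(s); assign[q].add(r)
    return assign

# Reviewer A loves paper 1 (5) and hates paper 2 (-10); B mildly wants 2.
prefs = {"A": {1: 5, 2: -10}, "B": {1: 0, 2: 4}}
print(hill_climb({1: {"B"}, 2: {"A"}}, prefs, steps=100))
# -> {1: {'A'}, 2: {'B'}}
```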

Decisions
- Extensive online discussion after round-two reviews
- 2-day PC meeting
  - Not all papers discussed, but those discussed in almost-random order
- No target acceptance rate
- EPC conference call for PC papers: "clearly within acceptance envelope"
- "Double-blind until accept", with one slight, appropriate exception

Shepherding
- All acceptances were conditional, with a shepherd from the PC/EPC
- Mandatory vs. recommended changes from the discussion
- This year, all conditionally accepted papers were ultimately accepted
- Healthy process with no "issues"

- Process [with some data]
- Data [with some reflections]
- Reflections

- 262 submissions (but 4 not sent to reviewers)
- 55 acceptances
- For an acceptance rate of… I'm not going to write that down for you
  - Unclear to me what it's measuring (two candidate denominators are worked out below)
  - Well within historic norms, though somewhat higher than recent years
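
For what it's worth, the two obvious denominators give different answers, using only the counts above:

```python
accepted = 55
print(f"{accepted / 262:.1%}")  # 21.0% of all submissions
print(f"{accepted / 258:.1%}")  # 21.3% of submissions actually sent to reviewers
```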

Submissions were down
- Upsides:
  - No decrease in number of high-quality submissions (acceptances up)
  - More reasonable reviewing loads with manageable committee sizes
- Have some theories, but none with evidence
- PLDI seems plenty healthy

Year  Submissions
2018  262
2017  322
2016  304
2015  303
2014  287
2013  267
2012  255
2011  236
2010  206
2009  196
2008  184
2007  178

Numbers through the process
- 262 submissions
- 258 received round-one reviews
- 179 received round-two reviews
- 92 discussed at the PC meeting
- 47 accepted by the PC (8 more than last year)
- 8 PC papers accepted by the EPC (same as last year)

Expertise levels [mostly for fun, don't read much into this]
- 238 papers received >= 1 expert ("X") review; 52 accepted
- 20 papers received 0 expert reviews; 3 accepted
  - Often expected an X
  - Always discussed whether an X was needed
- 6 papers received all expert reviews; 0 accepted (the average would have been 1; estimated below)
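
Where that "1" plausibly comes from, assuming (my assumption; the slide doesn't state the baseline) that the 6 all-expert papers faced the overall acceptance rate among reviewed submissions:

```python
overall_rate = 55 / 258            # acceptances / reviewed submissions
print(f"{6 * overall_rate:.2f}")   # 1.28 expected acceptances among the 6
```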

Paper topics [grain of salt: all self-reported at time of submission]
- The list: started with last year's, made ~10-15 edits
  - Examples: split verification in three; removed unused topics
  - Incorporated additional feedback from PC/EPC/ERC
- Mostly used to assist with review preferences/assignments

[Three topic charts, binned by count: topics with >= 17 (turns out all >= 3), with 8-16, and with <= 7; the charts themselves are not preserved in the transcript.]

Choose your own narrative
- An infinite number of theories are consistent with the data
- Many don't hold one standard deviation away (illustrated below)
- This is entirely a post hoc analysis, mostly done in the last 48 hours
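
One way to make the standard-deviation remark concrete: if a topic's n submissions were each accepted independently at the global rate p, the accepted count has standard deviation sqrt(n*p*(1-p)), which is large relative to the mean at these sizes. The per-topic counts below are hypothetical:

```python
import math

p = 55 / 258                 # global acceptance rate among reviewed papers
for n in (7, 16, 30):        # hypothetical per-topic submission counts
    mean, sd = n * p, math.sqrt(n * p * (1 - p))
    print(f"n={n:2d}: expect {mean:.1f} +/- {sd:.1f} acceptances")
```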

- Process [with some data]
- Data [with some reflections]
- Reflections [with some thank-yous]

Jeff

PC (1 of 2)

PC (2 of 2)

EPC (1 of 1)

ERC (1 of 2)

ERC (2 of 2)

Also…
- Submitters
- Rest of organizing committee
- Student volunteers
- Attendees
- …

When you're PC chair…
- Committee selection is worth lots of time and effort
  - It is the primary influence on review expertise and quality
- Emphasize courteous and constructive reviews
  - There is literally no downside
- Emphasize accepting imperfect papers (else zero papers), but not flawed papers

Goal
- My goal has been to be a "cautious caretaker" for a healthy community
- If you enjoy the conference and learn something, I'm happy!

A long road
- I first attended PLDI in '98 [it had a lot of great papers!]
- Also attended '99, '01, '02, '04, '06, '07, '09, '10, '11, '13, '15, '17
- Jeff invited me to be PLDI 2018 PC Chair 756 days ago

Thanks!

Year  Submitted  Accepted  Rate
2000     173        30     17.3%
2001     144         -     20.8%
2002     169        28     16.6%
2003     131         -     21.4%
2004     127        25     19.7%
2005     135         -     20.7%
2006     174        36     20.7%
2007     178        45     25.3%
2008     184        34     18.5%
2009     196        41     20.9%
2010     206         -     19.9%
2011     236        55     23.3%
2012     255        48     18.8%
2013     267        46     17.2%
2014     287        52     18.1%
2015     303        58     19.1%
2016     304         -     15.8%
2017     322        47     14.6%
2018     262        55     21.0%

Average rate, 2000-2017: 19.4%
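
A quick sanity check of the quoted average, recomputed from the per-year rates in the table (2006's 20.7% is derived as 36/174, since the source omitted it; 2018's accepted count of 55 comes from the earlier data slide):

```python
rates = [17.3, 20.8, 16.6, 21.4, 19.7, 20.7, 20.7, 25.3, 18.5,
         20.9, 19.9, 23.3, 18.8, 17.2, 18.1, 19.1, 15.8, 14.6]  # 2000-2017
print(f"{sum(rates) / len(rates):.1f}%")  # 19.4%
```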