DPDK Community Survey Results

Presentation transcript:

DPDK Community Survey Results Honnappa Nagarahalli, Arm; Maxime Coquelin, Red Hat

Survey
- Changes to the DPDK development process using GitHub, Gerrit
- Discussion started after the 2018 North America Summit
- 2-part survey:
  - Understand the pain points in the current process
  - Receive feedback on new tools/methods, contingent on the outcome of the first survey
- Tech Board and volunteers helped deliberate the questions

Please remember that I am just a messenger. Brief background on why the survey was done: there have been discussions in the past on changing the development process to use GitHub, Gerrit, etc., and the question also came up from the community at the 2018 North America Summit, so the Tech Board decided to conduct a survey, in two parts. The first part was to understand whether we really have a problem with the current process and, if yes, what the pain points are; it would have taken about 10 minutes to answer all the questions. The second part was intended to gather feedback on new tools/methods and was contingent on the outcome of the first part.

Survey Response
- 1st survey sent out 23rd Feb; 27 responses in the 1st week
- Reminder sent out; deadline: 13th March; 41 responses
- Deadline extended to 23rd March; advertised on VPP/SPDK and even on Twitter; 59 responses!!

How was the response from the community? This is not exactly an overwhelming response, remembering that the survey was about identifying the pain points. There are about 119 authors with at least 10 patches since 18.02, and 92 of them (77%) did not respond!!

Survey Results – Ease of Sending Patches
- Of the 20% who did not find it easy: 58% said the process is too cumbersome
- Somebody said 'it is nerve-wracking'!!

We will go over some of the questions, starting with the key questions that define the outcome. Of the 20% who did not find sending patches easy, 58% said the process is too cumbersome, especially for infrequent contributors. Somebody said it is nerve-wracking to make sure nothing is missed. That is something to take note of; Travis CI helps contributors make sure a patch follows the guidelines.

Survey Results – Is Email-Based Review Helpful?
- 73% said yes, 27% said no
- Of the 27% who said no:
  - 22%: responses are slow
  - 34%: comments lost or not addressed
  - 44%: difficult to manage the large list of emails

Survey Results – Is Patchwork Helpful?
- 80% say yes; 20% responded with issues
- Of the 20%:
  - Can't link revisions of a patch
  - No way to get notified of patches under my maintainership
  - Hard to manage patch sets that have dependencies

Does patchwork help you in managing patches? 80% say yes; 20% responded with issues: it does not integrate well with the process, there is no way to link revisions of a patch or to get notified about patches under one's maintainership, it relies a lot on manual steps (such as acks), it is difficult to find one's own patch (no direct link, you need to search), and it is hard to manage patch sets that have dependencies. It would be better if patchwork could automatically mark an obsolete patch as superseded when a new version is posted. Respondents would also prefer to be able to leave and reply to comments online. A sketch of one possible workaround for the notification gap follows below.
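Not part of the survey itself, but one workaround for the "no notifications under my maintainership" and "hard to find my own patch" complaints is to poll Patchwork's REST API. The sketch below is a minimal, hypothetical Python example, assuming the public instance at patches.dpdk.org exposes the standard Patchwork 2.x /api/patches/ endpoint; the project name, state value, and whether the delegate filter accepts an email address rather than a user ID depend on the Patchwork version, so treat them as placeholders.

```python
# Hypothetical helper: poll the DPDK Patchwork REST API for open patches
# delegated to a given maintainer. Assumes the standard Patchwork 2.x API at
# https://patches.dpdk.org/api/patches/; filter value formats (email vs. user
# id, state names) vary across Patchwork versions and are assumptions here.
import requests

API_URL = "https://patches.dpdk.org/api/patches/"

def list_open_patches(delegate, project="dpdk", state="new"):
    """Return (id, name) tuples for patches matching the given filters."""
    params = {"project": project, "delegate": delegate, "state": state}
    resp = requests.get(API_URL, params=params, timeout=30)
    resp.raise_for_status()
    return [(p["id"], p["name"]) for p in resp.json()]

if __name__ == "__main__":
    # Placeholder address; run this from cron to build a daily digest.
    for patch_id, name in list_open_patches("maintainer@example.com"):
        print(patch_id, name)
```

Results come back one page at a time, so a real digest script would also have to follow the pagination links returned in the response headers.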

Survey Results – Other Questions
- Do you find the coding guidelines easy to follow? 90% said yes, 10% indicated a few issues
- Ease of using the scripts in 'devtools': 78% found them easy, 22% said not easy (most pointed to difficulty in setting up checkpatch.sh; see the sketch below)
- Do you get enough reviews on the patches? 61% yes, 39% no
- Is the rate of response enough? 49% yes, 51% no

A few other questions are worth mentioning. The review numbers show some work for the community to address: reviews are never enough, but here the majority seems to indicate concerns. Thomas's release statistics help encourage reviews, and maybe the member companies can help by asking their engineers to do more reviews.
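On the checkpatch.sh pain point: DPDK's devtools/checkpatches.sh is a wrapper around the Linux kernel's checkpatch.pl, and the setup step most people stumble on is telling it where checkpatch.pl lives via the DPDK_CHECKPATCH_PATH environment variable. The sketch below only illustrates that wiring; the repository paths and the script name are placeholders to adapt to your own layout.

```python
# Minimal sketch of the setup many respondents found confusing:
# devtools/checkpatches.sh expects DPDK_CHECKPATCH_PATH to point at the Linux
# kernel's checkpatch.pl. The paths below are placeholders, not real defaults.
import os
import subprocess
import sys

DPDK_DIR = os.path.expanduser("~/src/dpdk")                            # placeholder
CHECKPATCH = os.path.expanduser("~/src/linux/scripts/checkpatch.pl")   # placeholder

def check_patches(patch_files):
    """Run DPDK's checkpatches.sh on a list of patch files; return its exit code."""
    env = dict(os.environ, DPDK_CHECKPATCH_PATH=CHECKPATCH)
    cmd = [os.path.join(DPDK_DIR, "devtools", "checkpatches.sh"), *patch_files]
    return subprocess.run(cmd, env=env, cwd=DPDK_DIR).returncode

if __name__ == "__main__":
    sys.exit(check_patches(sys.argv[1:]))
```

Invoked as 'python check_patches.py 0001-foo.patch 0002-bar.patch', it simply forwards the patch files to the existing script with the environment already set up.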

Conclusion
- The current process will not change

The last three questions more or less decided the outcome of the survey: the current development process will not change.

But – Good Suggestions
- 26 unstructured comments were provided
- The Tech Board is looking at incorporating some of the feedback:
  - Advertise subtrees better
  - Elixir (source code cross-referencer) for DPDK
  - A process for integrating patches when maintainers do not respond
  - Not enough responses – maintainers could delegate to others?

However, there were 26 unstructured comments, and it also makes sense to implement some of the feedback from the survey questions that was unrelated to the process itself. These suggestions were presented to the Tech Board and are being debated; some are already implemented. The 'process for integrating patches when maintainers do not respond' part needs to be addressed, because otherwise people have to keep maintaining that code in their private repos.

Questions?