Contents
1 Session Goals
2 Design Goals
3 Design Levels
4 Known Issues
5 Picking a Specific Object of Study


Slide 1: ISERN Distributed Experiment: Planned Design
Forrest Shull, Fraunhofer Center for Experimental Software Engineering - Maryland

Slide 2: Goals for this Meeting
- Present a framework for the distributed experiment
  - The broad strokes that are not negotiable!
- Decide whether there is sufficient consensus to go forward with a distributed experiment
- If yes:
  - Gauge the interest in specific hypotheses that can be explored using this framework.
  - Decide on a specific set of hypotheses for study and sign up participants.

Slide 3: Distributed Design Goals
- Collect useful information about inspections for researchers and project managers
  - Each local experiment is interesting
  - "Meta-analysis" across a critical mass of experiments yields another level of information
- Understand and incorporate local differences
- Identify factors influencing effectiveness across
  - Different types of organizations
  - Different types of development environments
- Allow several options for collaboration
  - Allowing organizations to match required effort with anticipated benefits
  - All of which support the larger analysis
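The "meta-analysis" idea can be made concrete with a small sketch: combine the effect estimates from several local experiments into one pooled estimate using inverse-variance (fixed-effect) weighting. All site names and numbers below are hypothetical; this is an illustration of the combining step, not the deck's prescribed analysis.

```python
# Fixed-effect meta-analysis sketch: pool per-site effect estimates
# (e.g., improvement in defect-detection rate) by inverse-variance
# weighting. The effects and variances are invented placeholders.

def pooled_effect(effects, variances):
    """Inverse-variance weighted mean and the variance of that mean."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    mean = sum(w * e for w, e in zip(weights, effects)) / total
    return mean, 1.0 / total

# One (effect, variance) pair per participating organization.
effects = [0.12, 0.08, 0.15]
variances = [0.002, 0.004, 0.003]

mean, var = pooled_effect(effects, variances)
print(f"pooled effect: {mean:.3f} (variance {var:.6f})")
```

More precise local experiments (smaller variances) get proportionally more weight, which is why a critical mass of Level 2/3 sites adds information beyond any single study.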

Slide 4: Collaboration Options: Level 1
Industry Survey
- Descriptive analysis, no benchmark.
- Estimated effort for organization: one respondent, 4 to 6 staff hours.
- Output: a report characterizing the types of responding organizations and their inspection processes.
- Benefits:
  - Characterization of the state of the practice
  - Compilation of best practices
  - Understanding of problem areas that could be improved
  - [Measure distance between "standards" and local processes, local documents.]

Slide 5: Collaboration Options: Level 1
Industry Survey: Process
- Identify and contact a respondent.
- The respondent answers on behalf of his or her development team.
- Responses are aggregated across all respondents and reflected in the final report, along with a lessons-learned analysis.
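The aggregation step above can be sketched as a simple tally across respondents. The survey question and answer categories here are invented for illustration; the real instrument is not specified in the deck.

```python
# Hypothetical survey-aggregation sketch: count how responding teams
# answered one question. Field names and values are placeholders.
from collections import Counter

responses = [
    {"team": "A", "has_inspection_process": "yes, documented"},
    {"team": "B", "has_inspection_process": "yes, informal"},
    {"team": "C", "has_inspection_process": "yes, informal"},
    {"team": "D", "has_inspection_process": "no"},
]

tally = Counter(r["has_inspection_process"] for r in responses)
for answer, count in tally.most_common():
    print(f"{answer}: {count}")
```

The same tally, run per question, yields the descriptive characterization the final report is meant to contain.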

Slide 6: Collaboration Options: Level 2
Benchmark an inspection technique
- Pick some inspection variant and study it across different contexts.
- Estimated effort for organization: a contact person and more than 10 participants for 1 day; 65 to 85 staff hours total.
- Benefits:
  - Provides training to participants in a beneficial inspection approach.
  - Some understanding of the potential benefit of the process change in the local organization.
  - [Some understanding of the expected improvement due to the variant process in different contexts.]

Slide 7: Collaboration Options: Level 2
Benchmarking a technique: Process
- Complete the survey steps.
- Produce local documents (version control, seeding, …).
- Training.
- Inspection using the new technique.
- Analysis: compare inspection results to the organization's historical baseline (qualitative or quantitative).
- Feedback to participants.
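The quantitative variant of the analysis step might look like the sketch below: compute the detection rate of the Level 2 run and compare it with the organization's historical baseline. All figures are hypothetical placeholders.

```python
# Sketch of the baseline comparison: defect-detection effectiveness of
# the new technique vs. a historical average. Numbers are invented.

def detection_rate(found, known):
    """Fraction of known defects found in an inspection."""
    return found / known if known else 0.0

baseline_rate = 0.45                             # historical average
new_rate = detection_rate(found=14, known=20)    # one Level 2 run

improvement = new_rate - baseline_rate
print(f"new technique: {new_rate:.0%}, baseline: {baseline_rate:.0%}, "
      f"delta: {improvement:+.0%}")
```

When no quantitative baseline exists, the same comparison falls back to the qualitative reflections the slide mentions.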

Slide 8: Collaboration Options: Level 3
Controlled experiment
- Get very accurate data about the improvement due to some inspection variant.
- Estimated effort for organization:
  - 1 contact person, 8 to 10 hours
  - more than 10 participants for 1.5 days
- Benefits:
  - Provides training to participants in a beneficial inspection approach.
  - Accurate understanding of the potential benefit of the process change in the local organization.
  - ["Meta-analysis" across all organizations.]

Slide 9: Collaboration Options: Level 3
Controlled experiment: Process
- Complete the survey steps.
- Produce 2 local documents.
- Inspection of a "local" document using the usual technique.
- Training.
- Inspection of the "baseline" document using the new technique.
- Inspection of a "local" document using the new technique.
- Analysis:
  - Compare results on local documents for the new vs. the usual inspection technique.
  - Compare results for the new technique on local vs. baseline documents.
- Feedback to participants.
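The two comparisons in the analysis step can be sketched directly from the design's conditions. The per-condition detection rates below are invented placeholders; the point is which pairs of cells get compared.

```python
# Sketch of the two Level 3 comparisons. Keys are (document, technique);
# values are hypothetical fractions of known defects found.
rates = {
    ("local", "usual"): 0.45,
    ("local", "new"): 0.60,
    ("baseline", "new"): 0.55,
}

# Comparison 1: new vs. usual technique on the local documents
# (does the technique help in this organization?).
technique_effect = rates[("local", "new")] - rates[("local", "usual")]

# Comparison 2: new technique on local vs. baseline documents
# (does the benefit transfer to the organization's own artifacts?).
document_effect = rates[("local", "new")] - rates[("baseline", "new")]

print(f"technique effect: {technique_effect:+.2f}")
print(f"document effect:  {document_effect:+.2f}")
```

Because every site inspects the same baseline document, comparison 2 is also what links the local results into the cross-organization meta-analysis.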

Slide 10: Collaboration Options: Shadow Experiments
For industrial collaborators who want to lower risk before investing in Level 2 or Level 3.
- The collaborator must make a representative, "anonymized" local document available.
- ISERN will match the industrial partner to a university course willing to perform a first run of the same study.
- The industrial experiment will only occur if results from the academic environment are promising.

Slide 11: Known Issues
- Most organizations don't have a common inspection process.
  - Contact points and analysis must be handled at the level of development teams.
- Too little emphasis on accurate measures of effectiveness.
  - Some benefit can still be gained from qualitative reflections, even at the survey level.
- No "benchmark technique" will be equally effective in all environments.
  - Level 2 and the shadow experiments give an opportunity to test the hypothesis with less commitment.
- Seeding defects in any document is inaccurate.
  - We will emphasize using previously version-controlled documents with a defect history whenever possible, but must rely on an organization's own assessment of what the important issues are.

Slide 12: Goals for the technique to be benchmarked
- Require evidence of its likely benefit.
- Should be widely applicable (to maximize the potential pool of participants).
- Some version should be teachable in a "reasonable" time.
- Should be of genuine interest to the target audience.
- Results should be actionable.

Slide 13: Some options for the technique to be benchmarked
- PBR (requirements) / OORTs (UML)
  - Training materials (including webcasts) with a history of reuse are available to help keep training consistent.
  - PBR: reusable ICSE tutorial.
- Inspection meeting approaches
  - Video/teleconference; document circulation; NetMeeting; DOORS-based protocol.
  - Alternatives are commonly available and require minimal training.