
1 Copyright 2015, Robert W. Hasker

2 Continuous Inspection  Code reviews  Powerful tool  Difficult to ensure meaningful reviews take place  Static analysis tools  Clear win: easy to run, unambiguous results  One of the earliest: lint (C)  jlint, pmd  http://en.wikipedia.org/wiki/List_of_tools_for_static_code_analysis (long list; easy to be overwhelmed)  Demo: jlint in TeamCity with a failure condition
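To make the static-analysis point concrete, here is a sketch of the kind of defect checkers in the jlint/PMD family flag automatically: comparing Java Strings with `==` (reference identity) instead of `equals()` (content equality). The class and method names are illustrative, not from any tool's documentation.

```java
// Sketch: the sort of bug static analyzers catch without running the code.
public class StringCompare {
    // Buggy version: == compares references, so two distinct String
    // objects with the same contents are reported as unequal.
    static boolean buggyEquals(String a, String b) {
        return a == b;
    }

    // Fixed version: equals() compares contents.
    static boolean fixedEquals(String a, String b) {
        return a.equals(b);
    }

    public static void main(String[] args) {
        // new String(...) guarantees two distinct objects.
        String x = new String("build");
        String y = new String("build");
        System.out.println(buggyEquals(x, y));  // false
        System.out.println(fixedEquals(x, y));  // true
    }
}
```

A human reviewer can miss this in a large diff; a checker reports every occurrence, which is why the slide calls tools a "clear win."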

3 Code Metrics  How to measure the amount of code  SLOC  Statements  Useful for comparisons within a single developer, coding style, and problem complexity  Why shouldn't we grade on SLOC?  Cyclomatic Complexity Number (CCN)  Number of linearly independent paths through a routine  CCN > 10: high probability of defects
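CCN can be approximated as 1 plus the number of decision points in a routine. The sketch below estimates it by counting decision keywords and operators in source text; this is a rough approximation (it ignores comments and string literals), not what a real parser-based tool like PMD computes.

```java
// Rough cyclomatic-complexity estimator: CCN ≈ 1 + decision points.
public class CcnEstimate {
    private static final String[] DECISIONS =
        { "if", "for", "while", "case", "catch", "&&", "||", "?" };

    static int estimate(String source) {
        int ccn = 1;  // one path through straight-line code
        for (String d : DECISIONS) {
            int from = 0;
            while ((from = source.indexOf(d, from)) != -1) {
                ccn++;
                from += d.length();
            }
        }
        return ccn;
    }

    public static void main(String[] args) {
        String body = "if (x > 0 && y > 0) { while (x > y) { x--; } }";
        // Decision points: if, &&, while → CCN = 4
        System.out.println(estimate(body));  // 4
    }
}
```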

4 Cyclomatic Complexity  Textbook example: CCN of a routine = 114  How to respond?  Check that #tests >= 114  The book suggests 114 tests might be unreasonable. Do you agree?  Refactor  First rule of refactoring: write a test case before you change anything  Apply the extract method technique: split out smaller procedures
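A miniature version of the extract-method refactoring, following the slide's rule of testing before changing anything. The grading example and names are invented for illustration; the point is that decisions move into small named helpers, so each individual routine carries a lower CCN.

```java
// Extract method: split a branchy routine into smaller procedures.
public class ExtractMethod {
    // Before: all decision points live in one routine.
    static String gradeBefore(int score) {
        if (score >= 90) return "A";
        else if (score >= 80) return "B";
        else return "C";
    }

    // After: threshold checks extracted into named helpers.
    static String gradeAfter(int score) {
        if (isA(score)) return "A";
        if (isB(score)) return "B";
        return "C";
    }
    static boolean isA(int score) { return score >= 90; }
    static boolean isB(int score) { return score >= 80; }

    public static void main(String[] args) {
        // First rule of refactoring: verify behavior is unchanged.
        for (int s = 0; s <= 100; s++)
            if (!gradeBefore(s).equals(gradeAfter(s)))
                throw new AssertionError("disagree at " + s);
        System.out.println("before/after agree on 0..100");
    }
}
```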

5 Duplicated Code  Why is it a problem?  Increased maintenance costs  Uncertainty: were all occurrences fixed?  Testing: additional coverage needed  Real-world examples:  Linux kernel in 2002: 15–25% duplicated  Sun Java JDK: 21–29% duplicated  Tools: CPD (from the PMD package), Simian, others
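A toy clone detector in the spirit of CPD: it reports any line of code that occurs more than once. Real tools match token sequences of a configurable minimum length rather than single lines; this sketch only shows the idea.

```java
import java.util.*;

// Toy clone detector: report source lines that appear more than once.
public class DupFinder {
    static List<String> duplicatedLines(String[] lines) {
        Map<String, Integer> counts = new LinkedHashMap<>();
        for (String line : lines) {
            String t = line.trim();
            if (!t.isEmpty())
                counts.merge(t, 1, Integer::sum);  // count each distinct line
        }
        List<String> dups = new ArrayList<>();
        for (Map.Entry<String, Integer> e : counts.entrySet())
            if (e.getValue() > 1) dups.add(e.getKey());
        return dups;
    }

    public static void main(String[] args) {
        String[] code = {
            "total += price * qty;",
            "log(total);",
            "total += price * qty;"
        };
        System.out.println(duplicatedLines(code));  // [total += price * qty;]
    }
}
```

Each reported clone is exactly the maintenance hazard the slide describes: a fix applied to one copy may silently miss the others.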

6 Continuous Inspection  Reducing code complexity  Continuous (procedural) design reviews  Maintain standards  Reduce duplicate code  Assess testing coverage

7 Continuous Deployment  New model: the product is always evolving  Continuous deployment – release every night  What do you need to make this work?  CI, obviously  Automated labeling of released versions  Rollback support  Extreme CI  Bare install: OS, OS configuration, server components, third-party tools, custom software  Absolutely all tests run

8 A middle ground  Three branches:  master: released  stage: validation  dev: target for each sprint  Levels of testing  dev: core testing  stage: acceptance testing by stakeholders  also performance testing?  mutation testing?  A tool!
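Mutation testing, mentioned above, can be shown in miniature: a tool makes a small change (a "mutant") to the code, such as flipping a comparison operator, and a good test suite should "kill" the mutant by failing against it. The example below hand-writes one mutant; in practice a tool (PIT is a well-known one for Java) generates and runs mutants automatically.

```java
// Mutation testing in miniature: one hand-written mutant of max().
public class MutationDemo {
    // Original: returns the larger of a and b.
    static int max(int a, int b) { return a > b ? a : b; }

    // Mutant: comparison flipped from > to <, as a mutation tool would do.
    static int maxMutant(int a, int b) { return a < b ? a : b; }

    // A test "kills" the mutant if it passes on the original
    // but fails on the mutant.
    static boolean killsMutant() {
        return max(3, 1) == 3 && maxMutant(3, 1) != 3;
    }

    public static void main(String[] args) {
        System.out.println(killsMutant());  // true
    }
}
```

If no test distinguishes original from mutant, the suite has a coverage gap, which is exactly the signal mutation testing adds on top of ordinary coverage.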

9 Improving the process  Automatically build (and test) each pull request  Open the VCS Root, show advanced options  Branch specification: +:refs/pull/*/merge  The build will appear in the log as "refs/…master"

10 Continuous Feedback  What to do if build fails?  Just let someone discover the problem?  Post an alert  Who?  Not everyone!  Project manager, technical lead, developers, testers?  How?  Siren? Email? Text Message? Visual signal?

11 Continuous Feedback  Email: the obvious option  Problem: recipients may not be reading email; messages may be filtered as spam  Text message: still has the spam issue  Ambient Orb  Small globe whose color shows the last build status  Problem: no detailed information  Windows task bar  Monitor  Should "last to break the build" be shown on the screen?

12 Continuous Feedback in TeamCity  Open Build Configuration Settings, General Settings  Open advanced options  Build options: enable the status widget  HTML for the page head: @import "http://localhost:XXX/css/status/externalStatus.css";  HTML for the body:  Usage: open demos\local-ci-status.html

13 Review  Continuous Inspection  Code metrics, cyclomatic complexity  Identifying code clones  Continuous Deployment  staging branch  build on pull request  mutation testing  Continuous Feedback

