
1 CSCE 548 Code Review

2 Reading
This lecture:
– McGraw: Chapter 4
– Recommended: Best Practices for Peer Code Review, http://www.smartbear.com/docs/BestPracticesForPeerCodeReview.pdf
Next lecture:
– Architectural Risk Analysis – Chapter 5

3 Application of Touchpoints
[Diagram: McGraw's software security touchpoints applied to the development artifacts – requirements and use cases, architecture and design, test plans, code, tests and test results, and feedback from the field: 1. Code review (tools), 2. Risk analysis, 3. Penetration testing, 4. Risk-based security tests, 5. Abuse cases, 6. Security requirements, 7. Security operations, plus external review]

4 Code Review (Tool)
Artifact: code
Targets implementation bugs
Uses static analysis tools
White hat activity

5 Software Bugs
Programming bugs:
– The compiler catches the error, the developer corrects the bug, and development continues
Security-relevant bugs:
– May lie dormant for years
– Potentially far more costly than an ordinary programming error
Who should be responsible for security bugs?
– The software developer?
– A security expert?
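To make the distinction concrete, here is a minimal, hypothetical C snippet: it compiles cleanly and works for short inputs, so the ordinary "compiler flags it, developer fixes it" loop never sees a problem, yet it carries a classic security-relevant bug (an unbounded copy into a fixed-size buffer) that can lie dormant until someone supplies hostile input.

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical example: compiles without errors and behaves correctly
 * for short usernames, but overflows the 16-byte stack buffer as soon
 * as an attacker supplies a longer string. */
void greet(const char *username)
{
    char buf[16];
    strcpy(buf, username);      /* no bounds check: security-relevant bug */
    printf("Hello, %s\n", buf);
}

int main(int argc, char **argv)
{
    if (argc > 1)
        greet(argv[1]);         /* attacker-controlled input */
    return 0;
}
```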

6 Manual vs. Automated Code Review
Manual code review:
– Tedious, error prone, and exhausting
– Needs an expert with the mindset of an attacker!
Static analysis tools:
– Identify many common coding problems
– Faster than manual review
– Need a developer with a basic understanding of security problems and of how to fix the detected ones
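As a sketch of the kind of finding such a tool produces and the fix a security-aware developer would apply (the report wording is illustrative, not any specific tool's output), the unbounded copy shown above can be replaced with a bounded one:

```c
#include <stdio.h>

/* A static analyzer would typically flag the strcpy() call in greet()
 * (e.g., "unbounded copy into a fixed-size buffer"). A developer who
 * understands the issue applies a bounded alternative: */
void greet_fixed(const char *username)
{
    char buf[16];
    snprintf(buf, sizeof(buf), "%s", username);  /* truncates instead of overflowing */
    printf("Hello, %s\n", buf);
}
```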

7 Best Practices
Peer code review recommendations from SmartBear Software
Based on a code review study at Cisco:
– Over 6,000 programmers and 100 companies
– "Lessons learned" results
– Lightweight code review

8 Best Practices Recommendations 1.
1. Review fewer than 200-400 lines of code at a time
– Optimizes the number of detected vulnerabilities (70-90%)
2. Aim for an inspection rate below 300-500 lines of code per hour
– Faster is not better! Based on the number of detected vulnerabilities
3. Do not spend more than 60-90 minutes on a review at a time
– Effectiveness drops after about an hour of intense work
4. Make developers annotate their code before review (a small annotation sketch follows below)
– Encourages developers to "double-check" their work
– Reduces the number of vulnerabilities in the code
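Recommendation 4 is easiest to see with an example. The following is a hypothetical sketch of an author annotation in C (the function and its calling convention are invented for illustration; annotation conventions vary by team):

```c
#include <stddef.h>
#include <string.h>

/* REVIEW NOTE (author annotation): len is validated against dst_len by
 * the caller; it is re-checked here defensively so the invariant can be
 * verified locally by reviewers. */
int copy_field(char *dst, size_t dst_len, const char *src, size_t len)
{
    if (dst == NULL || src == NULL || len >= dst_len)
        return -1;              /* reject instead of overflowing */
    memcpy(dst, src, len);
    dst[len] = '\0';
    return 0;
}
```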

9 Best Practices Recommendations 2.
5. Establish quantifiable goals for code review
– External metrics: e.g., a reduced number of support calls
– Internal metrics: e.g., defect rate
6. Maintain a checklist
– Prevents omission of important security components
7. Verify that defects are actually fixed
– Requires good collaborative review of the software
8. Managers must support code review
– Supports team building and acceptance of the process

10 Best Practices Recommendations 3.
9. Beware of the "Big Brother" effect
– Use of metrics – the manager's role
10. The Ego effect
– Use code review to encourage good coding habits in developers
– Review at least 20-33% of the code
11. Lightweight style of review
– Tool assisted
– Just as effective as formal, heavyweight review, but in roughly one fifth of the time

11 Source Code vs. Binary Code Check
What to check: source code or binary code?
Source code:
– Shows the logic, control flow, and data flow
– Shows explicit code lines
– Fixes can be carried out directly on the source code
Compiled (binary) code:
– May need reverse engineering (disassembly, decompilation)
– Finding a few vulnerabilities is easy; finding all of them is difficult
– Fixes may have to be incorporated as binary modules or external filters

12 How Does Static Analysis Work?
Looks for a fixed set of patterns or rules:
– Syntactic matching
– Lexical analysis
– Flow analysis (control flow, call chains, data flow)
False negatives and false positives
Sound tool: given a set of assumptions, the static analysis tool produces no false negatives
Commercial tools: unsound
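A minimal sketch of why false positives and false negatives arise, assuming a naive pattern-based rule that flags every call to strcpy(): the first function below is reported even though the copy is provably safe (false positive), while the second can be missed by purely lexical matching because the overflow happens through memcpy() with an unchecked length rather than through a "dangerous" function name (potential false negative).

```c
#include <string.h>

/* Flagged by a naive strcpy() pattern rule, but actually safe:
 * the source is a fixed literal shorter than the buffer (false positive). */
void copy_literal(void)
{
    char buf[16];
    strcpy(buf, "hello");
}

/* No "dangerous" function name to match lexically, yet unsafe if n is
 * attacker-controlled and larger than 16 (a potential false negative
 * for tools that do not track data flow). */
void copy_n(const char *src, size_t n)
{
    char buf[16];
    memcpy(buf, src, n);
}
```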

13 Static Analysis
Identifies vulnerable constructs
Similar to a compiler: preprocesses the source file and evaluates it against known vulnerability patterns
Scope of analysis:
– Local: one function at a time
– Module-level: one class (or compilation unit) at a time; incorporates relationships between functions within the module
– Global: the entire program; all relationships between functions
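A small C example of why the scope of analysis matters (the functions are hypothetical): examined one at a time, neither function reveals whether the destination buffer can be exceeded, but a global, whole-program analysis that follows the call chain sees attacker-controlled argv[1] flowing into an unbounded copy.

```c
#include <stdio.h>
#include <string.h>

/* Looks unremarkable locally: local analysis cannot tell whether
 * src can ever be longer than the buffer behind dst. */
static void store_name(char *dst, const char *src)
{
    strcpy(dst, src);
}

/* Also looks unremarkable locally: it just forwards user input.
 * Global analysis connects the two and reports the tainted flow. */
int main(int argc, char **argv)
{
    char name[32];
    if (argc > 1) {
        store_name(name, argv[1]);  /* tainted argv[1] reaches strcpy() */
        printf("%s\n", name);
    }
    return 0;
}
```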

14 Rule Coverage
Taxonomy of coding errors:
– Language specific (e.g., C/C++, Java, etc.)
– Functions or APIs
Academia vs. industry
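To illustrate the two rule categories (the flagged constructs below are generic examples, not any particular tool's rule set): a language-specific C rule catches the format-string misuse, while a function/API rule catches the unbounded sprintf().

```c
#include <stdio.h>

/* Language-specific rule (C format strings): user-controlled data used
 * directly as the format string enables %x / %n style attacks. */
void log_message(const char *user_input)
{
    printf(user_input);             /* should be printf("%s", user_input) */
}

/* Function/API rule: sprintf() places no bound on the destination
 * buffer; tools typically suggest snprintf() instead. */
void format_id(char *out, int id)
{
    sprintf(out, "request-%d", id);
}
```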

15 Commercial Tools
Easy to use – but still need expert knowledge
Can process large code bases (millions of lines) efficiently
Require a competent reviewer to interpret the results
Encapsulate knowledge of known vulnerabilities and efficient flow analysis
Encourage efficient and secure coding

16 Tool Characteristics
A good static analysis tool should:
– Be designed for security
– Support multiple tiers
– Be extensible
– Be useful for both security analysts and developers
– Support the existing development process
– Make sense to multiple stakeholders

17 Next Class
Architectural Risk Analysis – Chapter 5

