CSCE 548 Secure Software Development
Risk-Based Security Testing
Reading
- This lecture: Risk-Based Security Testing, McGraw: Chapter 7
- Next lecture: Security Operations, McGraw: Chapter 9
Application of Touchpoints
[Figure: McGraw's touchpoints applied across the software artifacts (requirements and use cases, architecture and design, test plans, code, tests and test results, feedback from the field): 1. Code Review (Tools), 2. Risk Analysis, 3. Penetration Testing, 4. Risk-Based Security Tests, 5. Abuse Cases, 6. Security Requirements, 7. Security Operations, plus External Review. Risk analysis appears at both the design and the test-planning stages.]
Software Testing
- Running a program or system with the intent of finding errors
- Evaluating the capabilities of the system and determining whether its requirements are met
- Physical processes vs. software processes
- Testing purposes:
  - To improve quality
  - For verification and validation (V&V)
  - For reliability estimation
Quality Assurance
- External quality: correctness, reliability, usability, integrity
- Interior (engineering) quality: efficiency, testability, documentation, structure
- Future (adaptability) quality: flexibility, reusability, maintainability
Correctness Testing: Black Box
- Test data are derived from the specified functional requirements, without regard to the final program structure
- Also called data-driven, input/output-driven, or requirements-based testing
- Functional testing
- No implementation details of the code are considered
Correctness Testing: White Box
- The software under test is visible to the tester
- Testing plans are based on the details of the software implementation
- Test cases are derived from the program structure
- Also called glass-box testing, logic-driven testing, or design-based testing (contrasted with black-box testing in the sketch below)
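As a concrete illustration (not from the lecture), here is a minimal sketch contrasting the two approaches, assuming a hypothetical parse_age function: the black-box test is derived only from the stated requirement, while the white-box test targets a branch known only from the implementation.

```python
# Hypothetical parse_age function used to contrast black-box and white-box tests.

def parse_age(text: str) -> int:
    """Parse a non-negative age; raise ValueError on bad input."""
    value = int(text.strip())           # raises ValueError on non-numeric input
    if value < 0 or value > 150:        # implementation detail: upper bound of 150
        raise ValueError("age out of range")
    return value

def test_black_box():
    # Derived only from the requirement "accept a non-negative numeric age".
    assert parse_age("42") == 42
    for bad in ("", "abc", "-1"):
        try:
            parse_age(bad)
            raise AssertionError(f"expected rejection of {bad!r}")
        except ValueError:
            pass

def test_white_box():
    # Derived from the code: exercise the 'value > 150' branch explicitly.
    try:
        parse_age("151")
        raise AssertionError("expected rejection of out-of-range age")
    except ValueError:
        pass

if __name__ == "__main__":
    test_black_box()
    test_white_box()
    print("correctness tests passed")
```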
Performance Testing
- Goal: bottleneck identification, performance comparison and evaluation, etc.
- Explicit or implicit requirements
- "Performance bugs" – design problems
- Test: usage, throughput, stimulus-response time, queue lengths, etc. (a measurement sketch follows)
- Resources to be tested: network bandwidth requirements, CPU cycles, disk space, disk access operations, memory usage, etc.
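A minimal sketch of measuring throughput and stimulus-response time, assuming an invented handle_request workload (a real test would drive the deployed system under realistic load):

```python
# Throughput and response-time measurement sketch (hypothetical workload).
import time
import statistics

def handle_request(payload: str) -> str:
    # Stand-in for the operation under test.
    return payload.upper()

def measure(n_requests: int = 10_000) -> None:
    latencies = []
    start = time.perf_counter()
    for i in range(n_requests):
        t0 = time.perf_counter()
        handle_request(f"request-{i}")
        latencies.append(time.perf_counter() - t0)
    elapsed = time.perf_counter() - start

    print(f"throughput: {n_requests / elapsed:,.0f} req/s")
    print(f"median latency: {statistics.median(latencies) * 1e6:.1f} us")
    print(f"p95 latency: {sorted(latencies)[int(0.95 * n_requests)] * 1e6:.1f} us")

if __name__ == "__main__":
    measure()
```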
Reliability Testing
- Probability of failure-free operation of a system
- Dependable software: it does not fail in unexpected or catastrophic ways
- Difficult to test
Security Testing
- Test: finding flaws in software that can be exploited by attackers
- Quality, reliability, and security are tightly coupled
- Tests software behavior
- Need: a risk-based approach that uses system architecture information and an attacker's model
Risk-Based Testing
- Identify risks
- Create tests to address the identified risks
- Security testing vs. penetration testing:
  - Level of approach
  - Timing of testing
Penetration Testing
- Performed after the software is completed
- Evaluates the operational environment
- Dynamic behavior
- Outside-in activity – defending perimeters
- Cursory
Security Testing
- Can be applied before the product is completed
- Different levels of testing (e.g., component/unit level vs. system level)
- Testing environment
- Detailed
Risk Analysis
- Design-phase analysis:
  - Identifies and ranks risks
  - Discusses inter-component assumptions
- Component/unit testing tests for:
  - Unauthorized misuse of and access to the target assets
  - Violations of assumptions (see the sketch below)
- Breaking the system into a number of discrete parts: risk can be mitigated within the bounds of contextual assumptions
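As an illustration (hypothetical, not from McGraw), a component-level test that deliberately violates one contextual assumption: suppose an invented DocumentStore component assumes its callers pass an already-authorized user. The risk-based test checks whether the target asset stays protected when that assumption fails.

```python
# Hypothetical DocumentStore used to test one inter-component assumption.
class DocumentStore:
    """Assumes callers pass an already-authenticated, authorized user name."""

    def __init__(self):
        self._docs = {"salary.xlsx": "confidential numbers"}   # target asset
        self._acl = {"salary.xlsx": {"alice"}}                  # allowed users

    def read(self, user: str, name: str) -> str:
        if user not in self._acl.get(name, set()):
            raise PermissionError("access denied")
        return self._docs[name]

def test_assumption_violation_unauthorized_user():
    # Risk-based component test: violate the "caller is authorized" assumption
    # on purpose and confirm the asset is still protected inside this component.
    store = DocumentStore()
    try:
        store.read("mallory", "salary.xlsx")
        raise AssertionError("unauthorized access to the target asset succeeded")
    except PermissionError:
        pass   # risk mitigated within the bounds of the component

if __name__ == "__main__":
    test_assumption_violation_unauthorized_user()
    print("component-level risk-based test passed")
```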
System-Level Testing
- Focus on the properties of the integrated software system
- Penetration testing = security testing
- Using data flow diagrams, models, and inter-component documentation, identify:
  - Inter-component failures
  - Design-level security risks
- Use misuse cases to enhance the test plan
Behavior in the Presence of Malicious Attack
- What happens when the software fails?
- Safety-critical systems
- Track risk over time
- Security is relative to:
  - The information and services protected
  - The skills and resources of adversaries
  - The cost of protection
  - System vulnerabilities
Vulnerabilities
- Design-level:
  - Hardest to detect
  - Prevalent and critical
  - Require great expertise to detect – hard to automate
- Implementation-specific:
  - Critical
  - Easier to detect – some automation (see the example below)
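For example (an illustration, not from the lecture), a classic implementation-specific flaw that targeted tests and automated scanners can catch: SQL built by string concatenation. The vulnerable and fixed lookups below use Python's sqlite3 module with invented table and column names.

```python
# Implementation-specific vulnerability example: SQL injection.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cr3t')")

def lookup_vulnerable(name: str):
    # BUG: attacker-controlled data is spliced into the SQL statement.
    # Input such as "' OR '1'='1" returns every row.
    return conn.execute(
        f"SELECT secret FROM users WHERE name = '{name}'").fetchall()

def lookup_fixed(name: str):
    # Parameterized query: the driver treats `name` strictly as data.
    return conn.execute(
        "SELECT secret FROM users WHERE name = ?", (name,)).fetchall()

if __name__ == "__main__":
    payload = "' OR '1'='1"
    print("vulnerable:", lookup_vulnerable(payload))   # leaks alice's secret
    print("fixed:     ", lookup_fixed(payload))        # returns []
```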
Security Testing
- Functional security testing: testing the security mechanisms for their functional capabilities
- Adversarial security testing: risk-based security testing
  - Understanding and simulating the attacker's approach (contrasted in the sketch below)
- Both approaches must be used
- Attacks may ignore the security mechanisms entirely and exploit defects elsewhere in the software
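A hypothetical contrast using an invented LoginService: the functional security test confirms the lockout mechanism does what the requirement says, while the adversarial, risk-based test probes with attacker-style inputs the specification never mentions.

```python
# Hypothetical LoginService used to contrast the two kinds of security test.
class LoginService:
    MAX_ATTEMPTS = 3

    def __init__(self):
        self._passwords = {"alice": "correct-horse"}
        self._failures = {}

    def login(self, user: str, password) -> bool:
        if self._failures.get(user, 0) >= self.MAX_ATTEMPTS:
            return False                                    # account locked
        if self._passwords.get(user) == password:
            self._failures[user] = 0
            return True
        self._failures[user] = self._failures.get(user, 0) + 1
        return False

def test_functional_lockout():
    # Functional security test: the mechanism works as specified.
    svc = LoginService()
    for _ in range(LoginService.MAX_ATTEMPTS):
        assert not svc.login("alice", "wrong")
    assert not svc.login("alice", "correct-horse")          # still locked

def test_adversarial_inputs():
    # Adversarial test: simulate attacker inputs (empty, oversized, embedded NUL).
    svc = LoginService()
    for evil in ("", "A" * 100_000, "correct-horse\x00", None):
        assert svc.login("alice", evil) is False

if __name__ == "__main__":
    test_functional_lockout()
    test_adversarial_inputs()
    print("functional and adversarial security tests passed")
```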
Who Should Perform the Test?
- Standard testing organizations: functional testing
- Software security professionals: risk-based security testing
- Important: expertise and experience
How to Test?
- White-box analysis:
  - Understanding and analyzing source code and design
  - Very effective at finding programming errors
  - Can be supported by automated static analyzers
  - Disadvantage: high rate of false positives
- Black-box analysis:
  - Analyze a running program
  - Probe the program with various inputs, including malicious input (see the sketch below)
  - No need for any code – can be tested remotely
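A minimal black-box probing sketch (the target function here is a stand-in; a real effort would drive the running program through its network or command-line interface): feed randomly malformed inputs and watch for unexpected failures rather than checking functional correctness.

```python
# Black-box robustness probe (illustrative; the target is a stand-in).
import random

def target(data: bytes) -> None:
    # Stand-in for the running program under test; a real black-box test
    # would send the input to the deployed system instead.
    data.decode("utf-8").splitlines()

def random_input(max_len: int = 512) -> bytes:
    length = random.randrange(max_len)
    return bytes(random.randrange(256) for _ in range(length))

def probe(iterations: int = 1_000) -> None:
    failures = 0
    for _ in range(iterations):
        payload = random_input()
        try:
            target(payload)
        except UnicodeDecodeError:
            pass                        # rejected cleanly: acceptable behavior
        except Exception as exc:        # unexpected crash: a security-relevant finding
            failures += 1
            print(f"unexpected {type(exc).__name__} on {payload[:16]!r}...")
    print(f"{iterations} probes, {failures} unexpected failures")

if __name__ == "__main__":
    random.seed(548)
    probe()
```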
Malicious Input
- Software takes input
- Should that input be trusted?
- Malformed or malicious input may lead to a security compromise
- What is the input? Data vs. control (see the sketch below)
- Attacker toolkit
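An illustration of the data-vs.-control distinction, using invented list_directory helpers: when user input is spliced into a shell command string it can become control; passing it in an argument list keeps it data.

```python
# Data vs. control: the same user-supplied string, handled two ways.
import subprocess

def list_directory_unsafe(path: str) -> None:
    # BUG: the string is interpreted by the shell, so input such as
    # "; rm -rf ~" turns attacker-supplied data into control.
    subprocess.run(f"ls {path}", shell=True)

def list_directory_safe(path: str) -> None:
    # The argument list keeps `path` as plain data passed to one program.
    subprocess.run(["ls", "--", path])

if __name__ == "__main__":
    user_input = "; echo INJECTED"
    list_directory_unsafe(user_input)   # the injected echo command runs
    list_directory_safe(user_input)     # ls reports 'no such file'; nothing executes
```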
What Else?
- Testing for malicious input is necessary but NOT sufficient
- Risk-based security testing:
  - Plan tests using the forest-level view
  - Needs operational aspects: system state vs. applications used
  - Multithreaded systems – time-based attacks (see the sketch below)
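A hedged sketch of one time-based attack pattern a risk-based test plan should cover, the time-of-check-to-time-of-use (TOCTOU) race (file path and POSIX flags are assumptions for the example): the check and the use of its result are not atomic, so a concurrent attacker can swap the resource in between.

```python
# TOCTOU (time-of-check to time-of-use) sketch; paths are invented.
import os

SPOOL = "/tmp/report.txt"   # assumption: a world-writable spool location

def deliver_report_racy(path: str = SPOOL) -> None:
    # Time of check ...
    if os.access(path, os.W_OK):
        # ... time of use: between these two lines an attacker thread or
        # process can replace `path` with a symlink to a file the victim
        # can write but the attacker cannot.
        with open(path, "a") as f:
            f.write("report data\n")

def deliver_report_safer(path: str = SPOOL) -> None:
    # POSIX-only sketch: open with O_NOFOLLOW and operate on the file
    # descriptor, so the check and the use refer to the same object.
    fd = os.open(path, os.O_WRONLY | os.O_APPEND | os.O_CREAT | os.O_NOFOLLOW, 0o600)
    with os.fdopen(fd, "a") as f:
        f.write("report data\n")
```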
Next Class: Security Operations