
Software Control Flow Integrity: Techniques, Proofs, & Security Applications
Jay Ligatti, summer 2004 intern work with Úlfar Erlingsson and Martín Abadi

Motivation I: Bad things happen
DoS, weak authentication, insecure defaults, Trojan horses, back doors.
Particularly common: buffer overflows and machine-code injection attacks.
Source: http://www.us-cert.gov

Motivation II: Lots of bad things happen
[Chart from CERT statistics; ** = only Q1 and Q2 of 2004.]
Source: http://www.cert.org/stats/cert_stats.html

Motivation III: “Bad Thing” is usually UCIT
About 60% of CERT/CC advisories deal with Unauthorized Control Information Tampering (UCIT) [XKI03].
E.g., overflow a buffer to overwrite the return address; the injected attack code can be anything.
[Stack diagram: a local buffer in the previous function's stack frame is overflowed with garbage plus attack code, overwriting the return address so the hijacked PC points into the attack code.]
Buffer overflows are not the only route to UCIT; format strings, heap corruption, etc. can also divert control, and it would be nice to handle all of them.
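As a concrete illustration of the overflow described above, a minimal C sketch (the function name and buffer size are illustrative, not from the slides):

```c
#include <string.h>

/* Minimal sketch of the classic stack smash described above; the name
 * and buffer size are illustrative.  Copying attacker-controlled input
 * with no bounds check lets the write run past buf, over the saved
 * frame data, and into the return address, redirecting the PC when
 * vulnerable() returns. */
void vulnerable(const char *attacker_input) {
    char buf[64];                  /* local buffer on the stack */
    strcpy(buf, attacker_input);   /* no length check: bytes beyond 64
                                      spill into control information */
}
```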

Motivation IV: Previous Work
Ambitious goals, informal reasoning, flawed results.
Example: StackGuard of Cowan et al. [CPM+98] (used in Windows XP SP2): “Programs compiled with StackGuard are safe from buffer overflow attack, regardless of the software engineering quality of the program.” [CPM+98]
But: Why can't an attacker learn/guess the canary? What about function args? Also: what about heap buffers?
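To make the “function args” concern concrete, a hypothetical C sketch (not from [CPM+98]; it assumes the usual layout in which arguments sit above the canary and return address): the canary is only verified at function exit, so an argument corrupted by the same overflow can be abused before the check ever runs.

```c
#include <string.h>

/* Hypothetical illustration of the "what about function args?" question.
 * A StackGuard-style canary sits between the locals and the return
 * address and is verified only in the function epilogue; an overflow
 * that runs past the canary can also corrupt the argument out_fn,
 * which is then called before the epilogue's check can detect the
 * smash.  Names and sizes are illustrative. */
void log_message(const char *msg, void (*out_fn)(const char *)) {
    char buf[32];
    strcpy(buf, msg);   /* overflow may reach the out_fn argument */
    out_fn(buf);        /* hijacked call happens before the canary check */
}
```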

This Research
Goal: Provably correct mechanisms that prevent powerful attackers from succeeding, by protecting against all UCIT attacks.
What we can't handle: non-UCIT attacks.
Part of a new project: Gleipnir, which in Norse mythology is a magic cord used to bind the monstrous wolf Fenrir, thinner than a silken ribbon yet stronger than the strongest chains of steel. It was crafted for the Norse gods by the dwarves from “the sound of a cat's footfall and the woman's beard and the mountain's roots and the bear's sinews and the fish's breath and bird's spittle.”

Attack Model
Powerful attacker: can at any time arbitrarily overwrite any data memory and (most) registers.
- Attacker cannot directly modify the PC.
- Attacker cannot modify our reserved registers (in the handful of places where we need them).
Few assumptions:
- Data memory is Non-Executable *
- Code memory is Non-Writable *
Also… currently limited to whole-program guarantees (still figuring out how to do dynamic loading of DLLs).

Our Mechanism
CFG excerpt: function FA makes a computed call (“call fp”) at Acall; its intended target is FB's entry point B1, and FB's return at Bret should come back to Acall+1.
Forward edge: insert “nop IMM1” at B1, and before “call fp” check “if (*fp != nop IMM1) halt”.
Return edge: insert “nop IMM2” at Acall+1, and before FB's “return” check “if (**esp != nop IMM2) halt”.
NB: Need to ensure the bit patterns for the nops appear nowhere else in code memory.
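As a rough C-level model of the check inserted before “call fp” (purely illustrative: the real mechanism rewrites machine code, and the struct-with-label layout here is an assumption standing in for the embedded “nop IMM1”):

```c
#include <stdint.h>
#include <stdlib.h>

#define DEST_ID_1 0x12345678u     /* illustrative destination id (IMM1) */

/* Model each potential call target as an entry point tagged with its
 * destination id, standing in for the "nop IMM1" that prefixes real
 * targets in code memory. */
typedef struct {
    uint32_t dest_id;             /* embedded label ("nop IMM1") */
    void   (*entry)(void);        /* the function body           */
} labeled_fn;

/* Rendering of the guard inserted before "call fp": transfer control
 * only if the target carries the expected label, otherwise halt. */
static void checked_indirect_call(const labeled_fn *fp) {
    if (fp->dest_id != DEST_ID_1)
        abort();                  /* "halt": not a valid successor */
    fp->entry();
}
```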

More Complex CFGs
CFG excerpt: maybe statically all we know is that FA can call any int→int function, so succ(Acall) = {B1, C1}, the entry points of FB and FC.
Both B1 and C1 then carry “nop IMM1”, and the single check before “call fp” remains “if (*fp != nop IMM1) halt”.
Construction: All targets of a computed jump must have the same destination id (IMM) in their nop instruction.

Imprecise Return Information
Q: What if FB can return to many functions?
A: An imprecise CFG. If both FA and FD call FB, then succ(Bret) = {Acall+1, Dcall+1}; both return sites carry “nop IMM2”, and the check before FB's return is still “if (**esp != nop IMM2) halt”.
CFG Integrity: changes to the PC are only to valid successor PCs, per succ(). Even for a very imprecise CFG, where the succ set of every jump instruction is the set of entry points of all basic blocks, CFG integrity implies that basic blocks are only entered from the top.

No “Zig-Zag” Imprecision
CFG excerpt: Acall targets both B1 and C1, while Ecall targets only C1; since all targets of a call site must share one destination id, the shared target C1 creates a “zig-zag” of imprecision.
Solution I: Allow the imprecision.
Solution II: Duplicate code to remove zig-zags, e.g. split C1 into copies C1A (a target of Acall) and C1E (a target of Ecall).

Security Proof Outline
1. Define machine code semantics
2. Model a powerful attacker
3. Define instrumentation algorithm
4. Prove security theorem

Security Proof I: Semantics (an extension of [HST+02])
“Normal” steps: [transition rules shown as figures on the slide]
Attack step: [rule shown as a figure on the slide]
General steps: [rules shown as figures on the slide]
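The formal rules themselves did not survive the transcript. As a hedged sketch only, the attack step implied by the Attack Model slide might be rendered roughly as follows (notation assumed here: code memory M_C, data memory M_D, register file R, program counter pc):

```latex
% A plausible rendering of the attack step, not the authors' exact rule:
% the attacker may arbitrarily rewrite data memory and (most) registers,
% but not code memory, the pc, or the reserved registers.
\[
\frac{M_D' \in \mathit{DataMem} \qquad R' \in \mathit{RegFile} \qquad
      R'(r) = R(r)\ \text{for every reserved register } r}
     {(M_C,\, M_D,\, R,\, \mathit{pc}) \;\longrightarrow_{\mathrm{attack}}\;
      (M_C,\, M_D',\, R',\, \mathit{pc})}
\]
```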

Security Proof II: Instrumentation Algorithm
1. Insert a new illegal instruction at the end of code memory.
2. For all computed jump destinations d with destination id X, insert “nop X” before d.
3. Change every “jmp rs” into:
   addi r0, rs, 0
   ld   r1, r0[0]
   movi r2, IMMX
   bgt  r1, r2, HALT
   bgt  r2, r1, HALT
   jmp  r0
where IMMX is the bit pattern that decodes into “nop X”, such that X is the destination id of all targets of the “jmp rs” instruction.

Security Proof III: Properties
The instrumentation algorithm immediately leads to constraints on code memory, e.g. [constraints shown as figures on the slide].
Using such constraints plus the semantics, the security theorem follows [statement shown as a figure on the slide].
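The theorem statement is likewise only a figure in the transcript. As a hedged sketch, its shape should roughly be the CFG-integrity property from the earlier slide: along any execution from a properly instrumented initial state, even one containing attack steps, the PC only moves to valid successors.

```latex
% Hedged sketch of the theorem's shape, not the authors' exact statement:
% each normal step takes the pc to a member of succ(pc), and attack steps
% leave the pc unchanged.
\[
S_0 \longrightarrow S_1 \longrightarrow \cdots \longrightarrow S_n
\;\Longrightarrow\;
\forall i < n.\ \
\mathit{pc}(S_{i+1}) \in \mathit{succ}(\mathit{pc}(S_i)) \cup \{\mathit{pc}(S_i)\}
\]
```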

SMAC Extensions
In general, our CFG integrity property implies uncircumventable sandboxing (i.e., safety checks inserted by instrumentation before instruction X will always be executed before reaching X). Even with the most general succ() graph, we get enough guarantees to do uncircumventable sandboxing.
So we can remove the NX-data and NW-code assumptions from the language (and can do SFI and more!):
NX data (guard on computed jumps):
   addi r0, rs, 0
   bgt  r0, max(dom(MC)), HALT
   bgt  min(dom(MC)), r0, HALT
   [checks from orig. algorithm]
   jmp  r0
NW code (guard on stores):
   addi r0, rd, 0
   bgt  r0, max(dom(MD)) - w, HALT
   bgt  min(dom(MD)) - w, r0, HALT
   st   r0(w), rs
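As a C-level sketch of the NW-code store guard above (illustrative only: the fixed-size region stands in for dom(MD), and the real checks are machine-code sequences like the one shown):

```c
#include <stdint.h>
#include <stdlib.h>

/* Sketch of the sandboxed store: before every write, verify that the
 * effective address stays inside data memory, so instrumented code can
 * never overwrite code memory.  The region below stands in for dom(MD);
 * names and sizes are illustrative. */
#define DATA_REGION_SIZE 4096u

static uint8_t data_region[DATA_REGION_SIZE];   /* models M_D */

static void checked_store(size_t offset, uint8_t value) {
    if (offset >= DATA_REGION_SIZE)
        abort();                   /* "halt": write would leave data memory */
    data_region[offset] = value;   /* the store itself, now sandboxed */
}
```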

Runtime Precision Increase
Can use SMAC to increase precision: set up protected memory for dynamic information and query it before jumps.
E.g., returns from functions: when A calls B, B should return to A, not to D. Maintain a return-address stack that is untouchable by the original program.
(This is a frequent idea: similar extra-stack-memory schemes have been proposed in hardware, and in settings where the extra memory cannot be guaranteed uncorrupted.)
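A minimal C sketch of that protected return-address stack, assuming the instrumentation calls these helpers at every call and return (names and the fixed depth are illustrative; SMAC is what would keep shadow_stack unwritable by the original program):

```c
#include <stdint.h>
#include <stdlib.h>

#define SHADOW_DEPTH 1024          /* illustrative fixed depth */

/* Shadow return-address stack: the instrumentation pushes the return
 * address at each call, and before each return checks that the address
 * about to be used matches the top of this stack.  SMAC is assumed to
 * make this memory unwritable by the original program. */
static uintptr_t shadow_stack[SHADOW_DEPTH];
static size_t    shadow_top;

static void shadow_push(uintptr_t return_addr) {
    if (shadow_top == SHADOW_DEPTH)
        abort();                              /* shadow stack overflow */
    shadow_stack[shadow_top++] = return_addr;
}

static void shadow_check_pop(uintptr_t return_addr) {
    if (shadow_top == 0 || shadow_stack[--shadow_top] != return_addr)
        abort();                              /* return address was tampered with */
}
```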

Efficient Implementation?
Should be fast (makes good use of caches):
- Checks & IDs have the same locality as code.
- Static pressure on unified caches and the top-level iCache.
- Dynamic pressure on the top-level dTLB and dCache.
How to do checks on x86?
- Can implement the NOPs using x86 prefetching instructions, etc.
- Alternatively, add a 32-bit id and SKIP over it.
How to get the CFG and how to instrument?
- Use the magic of MSR's Vulcan and PDB files.

Microbenchmarks
Program calls a pointer to a “null function” repeatedly; preliminary x86 instrumentation sequences.
Normalized overheads:
- NOP IMM, PIII: Forward 11%, Return 11%, Both 33%; P4: Forward 55%, Return 54%, Both 111%
- SKIP IMM, PIII: Return 221%, Both 276%; P4: Forward 19%, Return 181%, Both 195%
PIII = XP SP2, Safe Mode w/CMD, Mobile Pentium III, 1.2 GHz
P4 = XP SP2, Safe Mode w/CMD, Pentium 4, no HT, 2.4 GHz

Future Work
Practical issues:
- Real-world implementation & testing
- Dynamically loaded code
- Partial instrumentation
Formal work:
- Finish proof of security for extended instrumentation
- Proofs of transparency (semantic equivalence) of instrumented code
- Move to proof for x86 code

References
[CPM+98] Cowan, Pu, Maier, Walpole, Bakke, Beattie, Grier, Wagle, Zhang, Hinton. StackGuard: Automatic adaptive detection and prevention of buffer-overflow attacks. In Proc. of the 7th Usenix Security Symposium, 1998.
[HST+02] Hamid, Shao, Trifonov, Monnier, Ni. A Syntactic Approach to Foundational Proof-Carrying Code. Technical Report YALEU/DCS/TR-1224, Yale Univ., 2002.
[XKI03] Xu, Kalbarczyk, Iyer. Transparent runtime randomization. In Proc. of the Symposium on Reliable Distributed Systems, 2003.

End