Next steps towards a Science of Security David Evans University of Virginia IARPA Visit 11 January 2010.

Washington Area Trustworthy Computing Hour


Things I won’t talk about… Data-Centric Policy Enforcement [MURI 2009]; Implementable RFID Privacy & Security [NSF]; User-Intent Based Access Control. [Slide diagram: producers attaching policies to code and data.]

Outline
Workshop recap: description, participants, and format; main ideas; outcomes
Discussion of what to do next

Workshop
Stated goals: identify scientific questions regarding computer security; stimulate new work toward defining and answering those questions; encourage work that goes beyond ad hoc point solutions and informal security arguments to discover foundational scientific laws for reasoning about and designing secure computing systems.
Two-day workshop in Berkeley, CA (November 2008).
Premise: we need new ideas from outside the traditional infosec research community.

Participant Background
David Bray (IDA); Fred Chang (U. of Texas/NSA); Byron Cook (MSR Cambridge); Son Dao (HRL Laboratories); Anupam Datta (CMU); John Doyle (Caltech); Anthony Ephremides (UMd); David Evans (organizer, UVa); Manfai Fong (IARPA); Stephanie Forrest (UNM); John Frink (ODUSD); Joshua Guttman (MITRE); Robert Herklotz (AFOSR); Alfred Hero (U. of Michigan); Sampath Kannan (NSF); Steven King (ODUSD); Carl Landwehr (IARPA); Greg Larsen (IDA); Karl Levitt (organizer, NSF); Brad Martin (organizer, NSA); John Mallery (MIT CSAIL); Roy Maxion (CMU); Robert Meushaw (NSA); John Mitchell (Stanford); Pierre Moulin (U. Illinois); Timothy Murphy (IARPA); Dusko Pavlovic (Kestrel/Oxford); Nick Petroni (IDA); Lisa Porter (IARPA); Michael Reiter (UNC); Phillip Rogaway (UC Davis); John Rushby (SRI); Stuart Russell (UC Berkeley); Fred Schneider (Cornell); Jim Silk (organizer, IDA); Dawn Song (UC Berkeley); Pieter Swart (Los Alamos NL); Vicraj Thomas (BBN/GENI); Hal Varian (Berkeley/Google); Cliff Wang (ARO); Rebecca Wright (Rutgers); Shouhuai Xu (UT San Antonio); Ty Znati (NSF); Lenore Zuck (NSF)
Participants spanned traditional infosec researchers, computer scientists from other areas, and researchers from outside computer science.

Format
Two days. Keynotes: Fred Schneider, Frederick Chang.
Panels: What is a science of security? What can we learn from other fields? How can we reason about impossible things? Are scientific experiments in security possible?
Breakout groups on topics including metrics, provable security, managing complexity, composition, and experimentation.

Philosophical Questions (Usually Not Worth Discussing) Is there science in computer system security? Yes, but of course there should be more.

Meaning of “Science” Systematization of Knowledge – Ad hoc point solutions vs. general understanding – Repeating failures of the past with each new platform, type of vulnerability Scientific Method – Process of hypothesis testing and experiments – Building abstractions and models, theorems Universal Laws – Widely applicable – Make strong, quantitative predictions

Realistic Goal Can we be a science like physics or chemistry? Unlikely – humans will always be a factor in security. How far can we get without modeling humans? How far can we get with simple models of human capabilities and behavior? Workshop charge: avoid human behavior issues as much as possible.

Alchemy (c. 700–1660)
Well-defined, testable goal (turn lead into gold)
Established theory (four elements: earth, fire, water, air)
Methodical experiments and lab techniques (Jabir ibn Hayyan, 8th century)
Wrong and unsuccessful... but it led to modern chemistry.

Questions for a “Science” of Security
Resilience: Given a system P and an attack class A, is there a way to prove that P is not vulnerable to any attack in A? Or to construct a system P′ that behaves similarly to P except is not vulnerable to any attack in A?
Establishing improvement: How can we determine whether a transformed system f(A) is more secure than the original system A (where f is some function that transforms an input program or system definition)?
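The two resilience questions can be stated compactly. A minimal formalization, where φ denotes the security policy and ≈ behavioral equivalence on benign inputs (this notation is mine, not from the workshop):

```latex
% Resilience: P resists every attack in class A
\forall a \in A:\quad \mathit{exec}(P, a) \models \varphi

% Construction: find P' equivalent to P on benign inputs,
% yet resistant to every attack in A
\exists P':\quad P' \approx_{\mathrm{benign}} P
  \;\wedge\; \forall a \in A:\; \mathit{exec}(P', a) \models \varphi
```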

Metrics
Scientific understanding requires quantification, but for most interesting security properties it is not clear what to measure or how to measure it.
Comparative metrics may be more useful and feasible than absolute metrics: How can we quantify the security of two systems A and B in a way that establishes which is more secure for a given context? How can we quantify the security of a system A relative to a transformed system f(A) (where f is some function that transforms an input program or system definition)?
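One toy way to operationalize such a comparative metric is to estimate attacker success frequency against each system over a fixed attack class. Everything named below (the attack class, both "systems", and `attack_success_rate`) is a hypothetical illustration, not anything from the workshop:

```python
import random

def attack_success_rate(system, attacks, trials=1000, seed=0):
    """Estimate P(attack succeeds) against `system` over a fixed attack class."""
    rng = random.Random(seed)   # same seed: both systems see the same sampled attacks
    wins = 0
    for _ in range(trials):
        if system(rng.choice(attacks)):   # system(a) -> True if attack a succeeds
            wins += 1
    return wins / trials

# Toy attack class and two toy "systems" (purely illustrative).
attacks = ["x" * n for n in range(1, 17)]   # payloads of length 1..16
A  = lambda a: len(a) > 4                   # A falls to any payload longer than 4
fA = lambda a: 4 < len(a) <= 8              # f(A) additionally filters long payloads

# Comparative claim, valid for this attack class only:
# f(A) is at least as secure as A.
assert attack_success_rate(fA, attacks) <= attack_success_rate(A, attacks)
```

Note the metric is relative to the attack class: a different class could reverse the ordering, which is exactly why "for a given context" matters in the slide's question.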

Metrics: Promising Approaches
Experimental metrics: more systematic “red team” approaches (hard to account for attacker creativity)
Economic metrics: active research community
Epidemiological metrics: good at modeling spread over a network, but need assumptions about vulnerabilities
Computational complexity metrics: define the attacker’s search space (promising for automated diversity)
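Epidemiological metrics can be made concrete with a compartmental model. A minimal discrete-time SIR-style sketch of malware spread (all parameters and rates are hypothetical, chosen only to illustrate the approach):

```python
def sir_step(s, i, r, beta, gamma):
    """One discrete step of a toy SIR model: s, i, r are the fractions of
    susceptible, infected, and recovered (cleaned/patched) hosts."""
    new_infections = beta * s * i   # contact-driven spread
    recoveries = gamma * i          # cleanup / patching rate
    return s - new_infections, i + new_infections - recoveries, r + recoveries

def simulate(beta, gamma, i0=0.01, steps=200):
    """Return (peak infected fraction, fraction ever infected)."""
    s, i, r = 1.0 - i0, i0, 0.0
    peak = i
    for _ in range(steps):
        s, i, r = sir_step(s, i, r, beta, gamma)
        peak = max(peak, i)
    return peak, 1.0 - s   # every host that left S was infected at some point

# Basic reproduction number R0 = beta/gamma; the outbreak dies out when R0 < 1.
peak_hi, total_hi = simulate(beta=0.4, gamma=0.1)    # R0 = 4: large outbreak
peak_lo, total_lo = simulate(beta=0.05, gamma=0.1)   # R0 = 0.5: fizzles
assert total_hi > total_lo and peak_hi > peak_lo
```

The slide's caveat shows up directly: beta encodes assumptions about how often a vulnerable host is reachable and exploitable, and the model's predictions are only as good as that assumption.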

Formal Methods and Security
Lots of progress in reasoning about correctness properties (Byron Cook: reasoning about termination and liveness).
Systems fail when attackers find ways to violate assumptions used in a proof. We need formal methods that make assumptions explicit in a useful way, and ways to combine formal methods with enforcement mechanisms that enforce those assumptions.
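One way to read that last point: turn each proof assumption into an explicit runtime guard at the system boundary. A toy sketch (the routine, its imagined proof, and `BUF_SIZE` are all invented for illustration):

```python
# Suppose a formal proof of memory safety for `copy_fixed` assumes
# len(data) <= BUF_SIZE. Attackers break systems by violating exactly
# such assumptions, so the assumption is enforced rather than left implicit.
BUF_SIZE = 64   # hypothetical bound from the (imagined) proof

def copy_fixed(data: bytes) -> bytearray:
    """Proved safe *under the assumption* len(data) <= BUF_SIZE."""
    buf = bytearray(BUF_SIZE)
    buf[:len(data)] = data
    return buf

def guarded_copy(data: bytes) -> bytearray:
    # Enforcement mechanism: reject inputs the proof never covered.
    if len(data) > BUF_SIZE:
        raise ValueError("input exceeds proof assumption len <= %d" % BUF_SIZE)
    return copy_fixed(data)
```

The guard makes the proof's environment assumption part of the deployed system, so the proof applies to every execution that actually reaches `copy_fixed`.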

Formal Methods vs. Complexity (loosely due to Fred Chang)
[Chart, pessimist’s view: over time, the complexity of deployed systems outpaces the capability of formal techniques, and the gap keeps widening as of 2010.]

Formal Methods vs. Complexity (loosely due to Fred Chang)
[Chart, optimist’s view: the capability of formal techniques catches up to the TCB of deployed systems, even as total deployed-system complexity keeps growing past 2010.]

Formal Methods Questions
Refinement: Can we use refinement approaches (design → … → implementation) that preserve security properties the way they are now used to preserve correctness properties?
Program analysis: What security properties can be established by dynamic and static analysis? How can computability limits be overcome using hybrid analysis, system architectures, or restricted programming languages?
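As one illustration of a security property that dynamic analysis can establish, here is a minimal taint-tracking sketch: untrusted values carry a mark that propagates through concatenation, and sensitive sinks refuse marked data. The `Tainted` class, `sink`, and `SecurityError` are all invented for illustration; real systems track flows at a much finer grain:

```python
class SecurityError(Exception):
    pass

class Tainted(str):
    """Marks untrusted input; the mark propagates through concatenation."""
    def __add__(self, other):
        return Tainted(str(self) + str(other))
    def __radd__(self, other):
        return Tainted(str(other) + str(self))

def sink(query):
    # Dynamic check: refuse to pass tainted data to a sensitive sink.
    if isinstance(query, Tainted):
        raise SecurityError("untrusted data reached the sink")
    return query

user = Tainted("'; DROP TABLE users; --")   # untrusted input
try:
    sink("SELECT name FROM t WHERE id=" + user)   # flow detected at runtime
except SecurityError as e:
    print("blocked:", e)
```

The computability limits the slide mentions show up here too: a purely dynamic check only sees executed paths, which is one motivation for the hybrid static/dynamic analyses it asks about.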

Experimentation
Security experiments require adversary models, and we need to improve them: coalesce knowledge of real adversaries, and develop canonical attacker models (cf. cryptography).
Design experiments for reproducibility.
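A sketch of what "canonical attacker models" plus "design for reproducibility" might look like operationally: the attacker's capabilities are an explicit, recorded data structure, and all experimental randomness flows from a recorded seed. Every field, number, and modeling choice below is hypothetical:

```python
import random
from dataclasses import dataclass

@dataclass(frozen=True)
class AdversaryModel:
    """Explicit, recorded attacker parameters (values are hypothetical)."""
    knows_source: bool   # crude stand-in for coalesced attacker knowledge
    seed: int            # recorded so every run of the experiment is repeatable

def guesses_to_break(adv: AdversaryModel, key_bits: int = 12, trials: int = 50) -> float:
    """Toy experiment: mean random guesses until the attacker hits a key."""
    rng = random.Random(adv.seed)   # all randomness flows from the recorded seed
    space = 2 ** key_bits
    if adv.knows_source:
        space //= 2                 # hypothetical: source knowledge halves the space
    total = 0
    for _ in range(trials):
        key = rng.randrange(space)
        g = 1
        while rng.randrange(space) != key:   # guessing with replacement
            g += 1
        total += g
    return total / trials

adv = AdversaryModel(knows_source=True, seed=42)
# Publishing (model, seed) makes the experiment reproducible bit-for-bit.
assert guesses_to_break(adv) == guesses_to_break(adv)
```

Publishing the `AdversaryModel` alongside results lets another lab rerun the identical experiment, or vary exactly one capability and compare, which is the kind of controlled comparison the slide is asking for.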

Recap Summary
Systematization of knowledge (ad hoc point solutions vs. general understanding; repeating past failures with each new platform and vulnerability type): valuable and achievable, but the community needs the right incentives.
Scientific method (hypothesis testing and experiments; building abstractions, models, and theorems): progress in useful models, but big challenges in constructing security experiments.
Universal laws (widely applicable; making strong, quantitative predictions): uncertain whether such laws exist; a long way to go for meaningful quantification.

Concrete Outcomes
INFOSEC Research Council (meeting focus, July 2009).
31st IEEE Symposium on Security and Privacy (Oakland 2010): Systematization of Knowledge track. “The goal of this call is to encourage work that evaluates, systematizes, and contextualizes existing knowledge. These papers will provide a high value to our community but would otherwise not be accepted because they lack novel research contributions.”
IEEE Security & Privacy magazine, special issue on the Science of Security (planned for May/June 2011); guest editors: Sal Stolfo and David Evans.

Discussion
What are the next steps? Would another meeting be useful? What kind, and how organized? Funding programs?
Focus areas: metrics, experimentation, formal methods, challenge problems?