Information Pooling Bias in Collaborative Cyber Forensics


Information Pooling Bias in Collaborative Cyber Forensics
Nancy J. Cooke, PhD; Prashanth Rajivan, PhD; Verica Buchanan; Jessica Twyford
November 18, 2014
This work has been supported by the Army Research Office under MURI Grant W911NF-09-1-0525.

ASU Project Overview

Objectives: Understand and improve team cyber situation awareness by
- Understanding the cognitive and teamwork elements of situation awareness in cyber-security domains
- Implementing a synthetic task environment that supports team-in-the-loop experiments for evaluating new algorithms, tools, and cognitive models
- Developing new theories, metrics, and models to extend our understanding of cyber situation awareness

Department of Defense Benefit:
- Metrics, models, and testbeds for assessing human effectiveness and team situation awareness (TSA) in the cyber domain
- A testbed for training cyber analysts and for testing (V&V) algorithms and tools that improve cyber TSA

Scientific/Technical Approach (Year 5): Explore and mitigate information pooling bias in cyber forensics through
- Empirical work in the CyberCog testbed
- A prototype visualization
- Agent-based modeling

Year 5 Accomplishments:
- Found a possible collaboration bias
- Developed a prototype tool for mitigating the bias
- Demonstrated the tool's utility in an experiment
- Replicated this benefit in an agent-based model

Challenge: Maintaining realism in testbed scenarios while still allowing novice participation and team interaction; now being addressed with CyberCog and DEXTAR.

Summary of FY14 ASU Accomplishments

PUBLICATIONS
- Rajivan, P., & Cooke, N. J. (under revision). A Methodology for Research on the Cognitive Science of Cyber Defense. Journal of Cognitive Engineering and Decision Making: Special Issue on Cybersecurity Decision Making.
- Champion, M., Jariwala, S., Ward, P., & Cooke, N. J. (2014). Using Cognitive Task Analysis to Investigate the Contribution of Informal Education to Developing Cyber Security Expertise. Proceedings of the 57th Annual Conference of the Human Factors and Ergonomics Society. Santa Monica, CA: Human Factors and Ergonomics Society.

COLLABORATION
- Sushil Jajodia & Max Albanese: DEXTAR
- Several MURI partners on an ARL proposal
- Neville Stanton: EAST modeling

TECH TRANSFER
- Working with Charles River Analytics and AFRL on team measures of cyber defense
- Presentation to ASU Information Assurance
- Boeing interest in the testbed

PROJECT: Information Pooling Bias in Collaborative Cyber Forensics

STUDENTS SUPPORTED
- Prashanth Rajivan (PhD), Verica Buchanan (MS), Jessica Twyford (MS), Ethan Cornell (HS), David Owusu (HS), Anirudh Koka (HS), Adriana Stohn (HS)

AWARD
- Human Factors and Ergonomics Society Alphonse Chapanis Student Paper Award: Rajivan, P., Janssen, M. A., & Cooke, N. J. (2013). Agent-based model of a cyber security defense analyst team. Proceedings of the 57th Annual Conference of the Human Factors and Ergonomics Society. Santa Monica, CA: Human Factors and Ergonomics Society.

Overall ASU Contributions

Over the course of this MURI, the ASU team has:
- Conducted cognitive task analyses
- Developed cyber testbeds and performance metrics
- Conducted experiments in those testbeds
- Developed EAST and agent-based models

And has found that:
- Cyber analysts do not collaborate
- Collaboration is essential for cyber situation awareness

The most recent work (presented today) asks whether collaboration could itself be biased, and demonstrates the value of an intervention based on a cognitive analysis of the problem.

Cyber attacks have evolved

Multi-step attacks, Advanced Persistent Threats (APTs), and stealth attacks are now common. Detecting them requires advanced forensics tools and efficient information sharing.

Cyber Defense: Forensic Analysis

Analyze past evidence; detect the larger story behind individual incidents.

There is a Lack of Context

A recent Raytheon survey revealed that "69 percent of the professionals surveyed said their security tools don't provide enough contextual information to determine the intent behind reported incidents."

Currently: analysis is done in isolation, tools don't offer cohesiveness or context, and analysts end up blinded and tunnel-visioned.

Solution: Machine Learning and Data Analytics?

Are machine learning and data analytics the answer? Humans are near-perfect context-computing machines; building context is a human survival mechanism. So why not use human teams themselves?

Cyber Defense Lacks Teamwork

- Low information exchange
- Little understanding of how analyst teams work and exchange information
- Existing collaboration tools: email, chat systems, and wikis
- Sitting together in an office and using chat clients is not teamwork

Problem Statements

- With growing attack sophistication, there is a need for timely knowledge sharing between cyber analysts.
- There is a lack of understanding about teamwork and information sharing in analyst teams; we can't simply ask analysts to work as a team and expect miracles.
- Tailor-made collaboration tools for cyber defense analysts are lacking, leaving analysts without a global view of attacks.

Human Cognitive Biases

Intelligence analysis is very similar to cyber defense, and its decision making is plagued with biases (Kahneman & Klein, 2009): information load, losses in communication and team process, and team-level biases such as the common knowledge effect (CKE) and confirmation bias.

Explored Social Psychology

We stumbled on the information pooling bias and were intrigued. The expectation of a team is that members work together, share all their knowledge, and build optimal decisions. In practice, different information distributions within a team lead to different information-sharing behavior: discussion is biased toward shared information over unshared (unique) information (Stasser & Titus, 1985).

Information Pooling Bias

Candidate A has 4 favorable attributes; Candidate B has 6. Member 1 holds A1-A4 plus B1-B2; Member 2 holds A1-A4 plus B3-B4; Member 3 holds A1-A4 plus B5-B6. The attributes of Candidate A will be repeatedly discussed, while the attributes of Candidate B are each mentioned only a few times, so Candidate A will be chosen even though B is objectively stronger.
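
The sampling account behind this example can be made concrete. Below is a minimal sketch, assuming a Stasser & Titus-style collective information sampling rule in which an item's chance of entering discussion grows with the number of members who hold it; the recall probability and trial count are illustrative assumptions, not values from the studies cited here.

```python
import random

def discussion_rates(holders, p_recall=0.4, trials=10_000):
    """Estimate how often each item surfaces at least once in a discussion.
    holders maps item -> number of team members who hold that item."""
    counts = {item: 0 for item in holders}
    for _ in range(trials):
        for item, n in holders.items():
            # The item surfaces if any one of its n holders recalls and shares it.
            if random.random() < 1 - (1 - p_recall) ** n:
                counts[item] += 1
    return {item: c / trials for item, c in counts.items()}

holders = {f"A{i}": 3 for i in range(1, 5)}        # A1..A4: held by all 3 members
holders.update({f"B{i}": 1 for i in range(1, 7)})  # B1..B6: held by 1 member each

for item, rate in discussion_rates(holders).items():
    print(f"{item}: surfaces in {rate:.0%} of discussions")
```

With these illustrative numbers, each shared A-attribute surfaces in roughly 78% of discussions (1 - 0.6^3) while each unique B-attribute surfaces in only 40%, so discussion tilts toward Candidate A despite B's larger set of merits.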

Information Pooling Bias

Groups with an unequal information distribution were found to be eight times less likely to find the solution than groups with full information (Lu et al., 2012). It is impossible for every team member to know all of the information (members must rely on each other's expertise), so simply asking cyber defense analysts to set up meetings and conduct forensics together will not help.

Resemblance to APTs and Multi-Step Attacks

Evidence for an attack may be uniquely available to different members of the team, alongside other attack evidence that most members observe. A team that dwells on the shared evidence fails to detect the APT early on. Information pooling bias?

Effect of Inefficient Information Sharing

In 1999 the Mars Climate Orbiter disintegrated after entering the upper atmosphere of Mars. The cost of the mission was $327.6 million!

A Quick Review

Effective information sharing is paramount to detecting advanced types of attacks, and teams in other domains have demonstrated the information pooling bias. It is therefore imperative to investigate and understand this bias in the cyber defense context, and to develop tools and interventions to mitigate it.

Research Question 1 Does information pooling bias affect cyber forensic analyst team discussions and decisions?

Research Question 2 Does a tailor-made collaboration tool lead to superior analyst performance compared to an off-the-shelf collaboration tool such as wiki software?

Procedure

- 30 teams of 3 (90 participants)
- Trained on cyber security concepts
- Practice mission followed by 2 main missions
- Attack evidence distributed across team members
- Pre-discussion reading, then discussion
- Goal: detect the large-scale attacks

Attack Kinds

- Shared: large scale, seen by most members
- Unique: large scale, with evidence distributed across members
- Isolated: one-off, isolated attacks

Attack Data Distribution in Missions

Attacks: Shared 1-5, Unique A1-A3, Unique B1-B3, Isolated 1-6

Analyst 1     Analyst 2     Analyst 3
Shared 1      Shared 1      Shared 1
Shared 2      Shared 2      Shared 2
Shared 3      Shared 3      Shared 4
Shared 5      Shared 4      Shared 5
Unique A1     Unique A2     Unique A3
Unique B1     Unique B2     Unique B3
Isolated 1    Isolated 3    Isolated 5
Isolated 2    Isolated 4    Isolated 6
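
Written as a data structure, the distribution above makes the scoring ceiling easy to verify. A minimal sketch (reconstructed from the table; the variable names are ours, not from the study) that counts the large-scale evidence pieces behind the "max possible = 18" figure used in the team-level measures below:

```python
# Each analyst sees 4 shared and 2 unique large-scale evidences plus 2 isolated
# attacks: 4 shared x 3 analysts + 2 unique x 3 analysts = 18 detectable pieces.
distribution = {
    "Analyst 1": ["Shared 1", "Shared 2", "Shared 3", "Shared 5",
                  "Unique A1", "Unique B1", "Isolated 1", "Isolated 2"],
    "Analyst 2": ["Shared 1", "Shared 2", "Shared 3", "Shared 4",
                  "Unique A2", "Unique B2", "Isolated 3", "Isolated 4"],
    "Analyst 3": ["Shared 1", "Shared 2", "Shared 4", "Shared 5",
                  "Unique A3", "Unique B3", "Isolated 5", "Isolated 6"],
}

large_scale = [e for evidence in distribution.values() for e in evidence
               if e.startswith(("Shared", "Unique"))]
print(len(large_scale))  # 18, the maximum detection score used later
```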

Information Distribution Used (Similar to Earlier Studies)

As in the candidate example: every member holds all four of Candidate A's attributes (A1-A4), but each member holds only two of Candidate B's six attributes (B1-B2, B3-B4, or B5-B6).

Collaborative Visualization Experiment Design

Mission 1 served as a baseline. In Mission 2, each team used one of three tool types: slide-based, wiki, or collaborative visualization.

Collaborative Visualization Tool

A collaborative visualization tool designed from a cognitive engineering perspective, with cognitively friendly visualizations, to mitigate the information pooling bias in cyber defense analysts and to improve information sharing and decision-making performance.

Measures

- Communication coding (discussion focus): number of mentions of each attack type
- Decision quality: were all large-scale attacks detected?
- Workload & demographics

Expected Results

- The number of mentions of shared attacks is significantly higher than of unshared attacks in the baseline condition (by one to two standard deviations)
- Decision quality is hampered by the bias
- Information coverage and decision quality are higher in the collaborative visualization condition
- Workload is lower in the collaborative visualization condition

Experiment Results

Team-Level Measures

- Shared percent: percentage of discussion focus spent on attacks that are shared among members
- Unique percent: percentage of discussion focus spent on attacks that are unique to one member but part of a large-scale attack
- Detection performance: number of attacks detected, both shared and unique; max possible = 18 (4 x 3 + 2 x 3)
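
These measures are straightforward to compute from coded transcripts. A minimal sketch, assuming mention labels coded as in the distribution table; the function and argument names are illustrative, not from the study:

```python
from collections import Counter

def team_measures(mentions, detected):
    """mentions: attack labels coded from the team's discussion.
    detected: set of large-scale attack labels the team reported."""
    kinds = Counter(label.split()[0] for label in mentions)  # Shared/Unique/Isolated
    total = sum(kinds.values()) or 1
    return {
        "shared_percent": 100 * kinds["Shared"] / total,
        "unique_percent": 100 * kinds["Unique"] / total,
        "performance": len(detected),  # out of the 18 possible
    }

print(team_measures(
    mentions=["Shared 1", "Shared 1", "Shared 2", "Unique A1", "Isolated 3"],
    detected={"Shared 1", "Shared 2", "Unique A1"}))
# {'shared_percent': 60.0, 'unique_percent': 20.0, 'performance': 3}
```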

Mission 1 Descriptives

The multivariate test (Hotelling's trace) yielded a non-significant result: F = 1.074, p = 0.398. Average shared percent = 63.5; unique percent = 16.2; performance = 11.5/18.

Mission 2 Descriptives

The multivariate test (Hotelling's trace) yielded a significant result: F = 3.341, p = 0.004. Average shared percent = 59.21; unique percent = 22.7; performance = 13/18.
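
For reference, a multivariate test of this shape can be run with statsmodels; this is a hedged sketch in which the file name, column names, and condition labels are assumptions, not artifacts of the study:

```python
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# One row per team: tool condition plus the three team-level measures.
df = pd.read_csv("mission2_team_measures.csv")  # hypothetical file

fit = MANOVA.from_formula(
    "shared_percent + unique_percent + performance ~ tool", data=df)
print(fit.mv_test())  # output includes the Hotelling-Lawley trace, F, and p
```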

Percentage of shared information discussed in Mission 2

Percentage of unique information discussed in Mission 2

Number of attacks detected (Performance) in Mission 2

Multiple Comparisons

Percentage of shared information discussed compared between Missions

Percentage of unique information discussed compared between Missions

Number of attacks detected (Performance) compared between Missions

Number of shared attacks detected (Performance) compared between Missions

Number of unique attacks detected (Performance) compared between Missions

Summary of Experiment Results

A significantly greater percentage of shared attack information was discussed: cyber defense analysts are subject to the information pooling bias, which prevents them from detecting APT-style attacks. Use of the cognitively friendly visualization reduces the bias and improves performance; off-the-shelf collaboration tools don't help.

Computational Model of the Experiment (in Brief)

The internal search process could be critical, so we theorize about the underlying cognitive search process: individual (cognitive) information search is primed by social interactions, with human memory represented as a search space.

Models

- Random search: random walks in search of information
- Local search (local search & uphill): the discussion topic is the cue; the agent searches information in the current neighborhood and climbs uphill
- Memory-aided search: uses recognition memory to identify regions to search, and marks off attacks already found
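
To make the contrast concrete, here is a minimal sketch of the three strategies over a toy one-dimensional memory space; the space layout, step rule, and parameters are illustrative assumptions, not the published agent-based model:

```python
import random
from collections import Counter

# Toy memory space: shared evidence occupies large contiguous regions,
# unique evidence sits in small pockets.
SPACE = ["Shared 1"] * 4 + ["Unique A1"] + ["Shared 2"] * 4 + ["Unique B1"]

def random_search(steps=10):
    """Random walk: sample anywhere in memory."""
    return [random.choice(SPACE) for _ in range(steps)]

def local_search(cue_index, steps=10, radius=1):
    """Stay near the discussion cue, so large shared regions dominate."""
    pos, found = cue_index, []
    for _ in range(steps):
        pos = max(0, min(len(SPACE) - 1, pos + random.randint(-radius, radius)))
        found.append(SPACE[pos])
    return found

def memory_aided_search(steps=10):
    """Use recognition to steer toward regions not yet visited; mark off finds."""
    seen, found = set(), []
    for _ in range(steps):
        unseen = [i for i, item in enumerate(SPACE) if item not in seen]
        if not unseen:
            break
        i = random.choice(unseen)
        seen.add(SPACE[i])
        found.append(SPACE[i])
    return found

print(Counter(local_search(cue_index=0)))  # dominated by "Shared 1"
print(Counter(memory_aided_search()))      # covers the unique evidence too
```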

Measures

Measures paralleled the experiment: discussion focus and performance.

How the Models Fared Against the Experiment

Bayesian statistics were employed to compare model output with the experimental data. The cognitive (memory-aided) search produced less-biased discussion; local search produced biased discussion.

Inferences from the Model

People use simple heuristics and search locally for information when they are subject to the bias: they lack a global view and show low recognition. It is essential to develop associations and build context over the data. Cognitively friendly visualizations yield higher recognition, lower load, and global search.

Conclusion

Cyber SA requires collaboration among analysts, and that collaboration can be biased. The visualization prototype mitigates the bias, and the agent-based model corroborates this. The work supports using an understanding of the cognitive underpinnings of cyber SA in the design of interventions.

Future Questions

- How can associations between reports be developed for such a visualization?
- What other cognitive biases operate in forensics? Confirmation bias is a strong candidate!

Questions? ncooke@asu.edu