Two-tiered, Multi-team Assessment of CSIRTs


Two-tiered, Multi-team Assessment of CSIRTs
Robin Ruefle, CERT Division, Software Engineering Institute, Carnegie Mellon University
26th Annual FIRST Conference, Boston, MA, June 2014

Copyright 2014 Carnegie Mellon University This material is based upon work funded and supported under Contract No. FA8721-05-C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research and development center sponsored by the United States Department of Defense. Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the United States Department of Defense. NO WARRANTY. THIS CARNEGIE MELLON UNIVERSITY AND SOFTWARE ENGINEERING INSTITUTE MATERIAL IS FURNISHED ON AN “AS-IS” BASIS. CARNEGIE MELLON UNIVERSITY MAKES NO WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED, AS TO ANY MATTER INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR PURPOSE OR MERCHANTABILITY, EXCLUSIVITY, OR RESULTS OBTAINED FROM USE OF THE MATERIAL. CARNEGIE MELLON UNIVERSITY DOES NOT MAKE ANY WARRANTY OF ANY KIND WITH RESPECT TO FREEDOM FROM PATENT, TRADEMARK, OR COPYRIGHT INFRINGEMENT. This material has been approved for public release and unlimited distribution. This material may be reproduced in its entirety, without modification, and freely distributed in written or electronic form without requesting formal permission. Permission is required for any other use. Requests for permission should be directed to the Software Engineering Institute at permission@sei.cmu.edu. Carnegie Mellon® and CERT Coordination Center® are registered marks of Carnegie Mellon University. DM-0001434

How Is My CSIRT Doing? A key struggle for CSIRT organizations today is determining how successful they are in meeting their mission of managing cybersecurity incidents. As teams mature in terms of operational longevity, they look for ways to evaluate their operations. Key outcomes are to identify strengths and weaknesses in processes, technologies, and methods and to use those findings to plan improvements. Teams are also interested in benchmarking themselves not only against similar external teams but also against their own internal incident management groups.

Available Instruments from CERT
Mission Risk Diagnostic for Incident Management Capabilities (MRD-IMC)
- New version just published: http://resources.sei.cmu.edu/library/asset-view.cfm?assetid=91452
- Replaces the Incident Management Mission Diagnostic.
Incident Management Capability Assessment (IMCA)
- Version 2 planned for development and publication
- Will replace the Incident Management Capability Metrics: http://resources.sei.cmu.edu/library/asset-view.cfm?assetid=8379

MRD-IMC
Purpose: Determine the extent to which an IM function is in position to achieve its mission and objective(s).
Overview:
- Evaluates a set of systemic risk factors (called drivers) to aggregate decision-making data and provide decision makers with a benchmark of an IM function's current state.
- Provides a high-level assessment of an IM function: a first-pass screening (i.e., "health check") and a high-level diagnosis of conditions.
- Complements detailed, deep-dive evaluations of IM functions.
Delivery Method:
- Expert-led assessment
- Self-assessment

Driver Question: Example

Incident Management Drivers: Detect and Respond
1. Incident Management Objectives
2. Stakeholder Requirements
3. Incident Management Plan
4. Organizational Environment
5. People
6. Roles and Responsibilities
7. Information Management
8. Tools and Technologies
9. Facilities
10. Information Collection
11. Detection
12. Analysis
13. Response
14. Information Dissemination
15. Coordination
16. Resilience

Incident Management Capability Assessment
Purpose: Determine how many IM capabilities are being adequately performed by an IM function.
Overview:
- Measures an organization's incident management functions against the CERT incident management capabilities, which define a benchmark of good practice.
- Provides a more detailed assessment of an IM function.
- Evaluates a set of indicators for each capability. There are three types of indicators: required, recommended best practices, and institutionalization.
- Complements detailed, deep-dive evaluations of IM functions.
Delivery Method:
- Expert-led assessment
- Could be used as a self-assessment, but the process still needs to be followed.
- Could also be used as guidance for creating an incident management framework.

Capability Example
1.1 Establish IM Function
1.1.2 An incident management function or CSIRT has been officially designated by the organization head or CIO through an official appointment order. (Priority II)
Each indicator is scored Yes or No, with supporting evidence recorded.
Required
- 1.1.2.1 Prerequisite: The constituency supported by the incident management function has been defined.
- 1.1.2.2 Control: Executives in the organization support the incident management mission.
- 1.1.2.3 Activity: A CSIRT, SOC, or other group has been established as the officially designated authority for incident management functions within the organization.
- 1.1.2.4 Activity: An entity or specific person has been designated as the incident management "lead."
Recommended Best Practices
- 1.1.2.5 Activity: A policy or other official designation is documented and distributed throughout the organization or otherwise made available.
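
For teams that want to track IMCA-style results in a lightweight way, here is a minimal, hypothetical Python sketch of how a capability such as 1.1.2 and its indicators could be recorded. The class names and the "all required indicators met" rule are illustrative assumptions only; in the actual IMCA the assessment team makes a qualified judgment rather than applying a formula.

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    ref: str          # e.g., "1.1.2.1"
    kind: str         # "Prerequisite", "Control", or "Activity"
    text: str
    required: bool    # True for Required, False for Recommended Best Practices
    met: bool = False
    evidence: list[str] = field(default_factory=list)

@dataclass
class Capability:
    ref: str          # e.g., "1.1.2"
    statement: str
    priority: str     # "I", "II", or "III"
    indicators: list[Indicator] = field(default_factory=list)

    def satisfied(self) -> bool:
        """Illustrative rule only: treat the capability as satisfied when every
        required indicator is met. The real IMCA relies on the assessment
        team's qualified judgment, not a mechanical check."""
        return all(i.met for i in self.indicators if i.required)

# Example loosely mirroring capability 1.1.2 from the slide above
cap = Capability(
    ref="1.1.2",
    statement="An incident management function or CSIRT has been officially designated.",
    priority="II",
    indicators=[
        Indicator("1.1.2.1", "Prerequisite", "Constituency has been defined.", True, True),
        Indicator("1.1.2.2", "Control", "Executives support the IM mission.", True, True),
        Indicator("1.1.2.3", "Activity", "A CSIRT, SOC, or other group is the designated authority.", True, False),
        Indicator("1.1.2.4", "Activity", "An IM 'lead' has been designated.", True, True),
        Indicator("1.1.2.5", "Activity", "A policy or official designation is documented and distributed.", False, False),
    ],
)
print(cap.ref, "satisfied:", cap.satisfied())  # False, because required indicator 1.1.2.3 is not met
```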

Incident Management Capability Categories
Five categories: Prepare, Protect, Detect, Respond, Sustain
Capability areas include: Establish IM Function; Core Processes and Tools; Risk Assessment; Prevention; Operational Exercises for CND; Training and Guidance; Vulnerability Management; Network and Systems Security Monitoring; Threat and Situational Awareness; Incident Reporting; Incident Analysis; MOUs and Contracts; Project/Program Management; IM Technology Development, Evaluation and Implementation; Personnel; Security Administration; IM Information Systems

Use of Assessment Instruments
Each instrument can be used alone to perform the requisite assessment. The MRD-IMC can be useful for small teams that do not have a lot of time or funding to perform a multi-week assessment activity. The instruments can also be used in combination to drill down into problem areas and areas for improvement.

Two-Tiered Multi-Team Assessment Approach

Who Should Use This Combined Assessment?
This approach is best for large organizations with distributed CSIRTs or incident management components. Examples:
- Global company with incident management capabilities in different countries
- Government agencies with incident management capabilities in different ministries
- Academic organizations with incident management capabilities at different campuses
- Large enterprise with incident management capabilities in different divisions or components of the organization

Approach Perspective – Tier One
Tier One uses the MRD-IMC to do a high-level check of the components or teams. The assessment can be completed by
- the team lead
- all members of a team, as a survey
- stakeholders and constituents who use the services of the team
Analysis looks for
- common problem areas across teams
- specific problem areas for a team, e.g., where score and rationale do not match
The Tier One assessment can also be used to establish an initial baseline of performance for yearly comparisons.
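
A minimal sketch of the Tier One cross-team analysis follows. It assumes a five-point response scale (Yes, Likely Yes, Equally Likely, Likely No, No), consistent with the "equally likely" response shown in a later example but stated here as an assumption, and the grouping of "non-success" answers and the majority threshold are likewise illustrative choices rather than rules from the MRD-IMC.

```python
from collections import Counter, defaultdict

# Hypothetical Tier One results: one MRD-IMC driver response per team.
# The scale values and the grouping below are illustrative assumptions.
NOT_SUCCESS = {"No", "Likely No", "Equally Likely"}

tier_one = {
    "Team A": {"Stakeholder Requirements": "Equally Likely", "Analysis": "Yes", "Detection": "Likely No"},
    "Team B": {"Stakeholder Requirements": "Likely No", "Analysis": "Likely Yes", "Detection": "No"},
    "Team C": {"Stakeholder Requirements": "No", "Analysis": "Yes", "Detection": "Yes"},
}

# Count how many teams lean away from the success state for each driver.
problem_counts = Counter()
per_team_problems = defaultdict(list)
for team, responses in tier_one.items():
    for driver, response in responses.items():
        if response in NOT_SUCCESS:
            problem_counts[driver] += 1
            per_team_problems[team].append(driver)

# Common problem areas: drivers that are weak in a majority of teams.
threshold = len(tier_one) / 2
common_problems = [driver for driver, count in problem_counts.items() if count > threshold]
print("Common problem drivers:", common_problems)
print("Problem drivers by team:", dict(per_team_problems))
```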

Approach Perspective – Tier Two
Tier Two then uses the results of the MRD-IMC to do a more focused evaluation. The focused evaluation can concentrate on
- capabilities that were scored poorly or identified weaknesses that were prevalent across the distributed components or teams
- specific teams that performed well or that performed poorly
Tier Two uses the IMCA to do a deeper-dive assessment by
- performing a complete IMCA on a team
- scoping the evaluation, as needed, to specific capabilities
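
Continuing the sketch, Tier Two scoping could be derived from the Tier One summary. The two-weak-driver cut-off and the direct use of weak drivers as IMCA focus areas are illustrative assumptions; in practice the assessment team decides the Tier Two scope.

```python
# Hypothetical Tier One summary (e.g., the per_team_problems output of the
# previous sketch): team -> drivers that leaned away from the success state.
per_team_problems = {
    "Team A": ["Detection"],
    "Team B": ["Stakeholder Requirements", "Detection"],
    "Team C": ["Stakeholder Requirements"],
}

# Rank teams by how many drivers were weak, then pick deep-dive candidates.
ranked = sorted(per_team_problems.items(), key=lambda item: len(item[1]), reverse=True)

tier_two_plan = [
    {"team": team, "focus_areas": weak_drivers}
    for team, weak_drivers in ranked
    if len(weak_drivers) >= 2          # illustrative cut-off for a scoped IMCA
]
print("Tier Two (scoped IMCA) candidates:", tier_two_plan)
```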

Benefits of Two-tiered Approach
- Allows trends and baselines across components to be captured.
- Assessment time and resources are not as extensive as performing an IMCA on each component.
- Allows for focusing on the most critical weaknesses or gaps.

Challenges and Issues
Most challenges and issues were not related to the two-tiered approach but to how the MRD-IMC drivers and the IMCA indicators were interpreted. The focus and perspective of the assessment need to be clarified; for example, if a team lead is completing the instrument, are they answering what they think or what they believe their team thinks? Very clear definitions of terms in the assessments should be provided if a self-assessment is performed. Also, because self-assessments tend to be biased, scoring should be based on analysis of the rationale the organization gives for its score rather than on the score alone.

Additional Activities That Can Be Done
- As a benchmark for identifying potential bias, the expert team can complete an MRD-IMC on the same group on which they performed an IMCA and compare their results to the group's original self-applied MRD-IMC.
- Create a consolidated report of all of the organization's MRD-IMC self-assessments to see how the organization performed across its components for all drivers.

MRD-IMC in More Detail

Driver
A factor that has a strong influence on the eventual outcome or result
- Direct connection to the mission and objectives
- Small number of drivers (10-25) provides insight into mission and objectives
Examples:
- Stakeholder Requirements: Are stakeholder requirements for the incident management function well understood?
- Incident Management Plan: Does the incident management plan enable achievement of objectives?
- Analysis: Does the incident management function analyze events and incidents sufficiently to enable an appropriate course of action for response?

Drivers: Success and Failure States A driver can guide the outcome toward key objectives (success state) or away from them (failure state).

Identifying Drivers: Basic Approach
Gather information from experts
- Experts need to be familiar with the mission and objective(s)
- Mission and objective(s) help focus discussions with experts
Questions answered by experts:
- What circumstances, conditions, and activities prevent an IM function from achieving each objective?
- What circumstances, conditions, and activities enable an IM function to achieve each objective?

Analyzing Drivers: Rationale and Supporting Evidence
Rationale and supporting evidence are recorded for each driver question. Evidence can come from:
- interview data
- documentation
- reports
- observations
- demonstrations
- measurement data
The publication includes a workbook. In a self-assessment, you need to balance your time and resource limitations against the need for objective (and sufficient) evidence.

Example: Rationale and Evidence
2. Are stakeholder requirements for the incident management function well understood?
Response: Equally likely
Rationale: Our overall response is "equally likely" due to equally compelling, conflicting data. The data do not favor a "yes" or "no" answer at this time.
Supporting Data
+ The CSIRT has a good sense of its requirements and responsibilities. (anecdotal evidence from a few quick queries of IM personnel)
+ Technical objectives sufficiently consider constituency needs. (anecdotal evidence from a conversation with a group of constituents)
- The current set of objectives for the standard services to be provided to constituents is not documented or well-communicated to the two contractors. (based on team knowledge)
- Plans for improving the IM function's services are documented to some extent, but the schedule is out of date. (based on quick team review of IM plans)
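
Captured as data, the example above might look like the following sketch. The record layout and field names (supports, source) are assumptions for illustration, not the format of the published MRD-IMC workbook.

```python
from dataclasses import dataclass, field

@dataclass
class EvidenceItem:
    supports: bool          # True for "+" items, False for "-" items
    statement: str
    source: str             # interview, documentation, observation, etc.

@dataclass
class DriverRecord:
    question: str
    response: str           # e.g., "Equally likely"
    rationale: str
    evidence: list[EvidenceItem] = field(default_factory=list)

stakeholder_reqs = DriverRecord(
    question="Are stakeholder requirements for the incident management function well understood?",
    response="Equally likely",
    rationale="Equally compelling, conflicting data; the data do not favor a yes or no answer at this time.",
    evidence=[
        EvidenceItem(True, "The CSIRT has a good sense of its requirements and responsibilities.",
                     "anecdotal: quick queries of IM personnel"),
        EvidenceItem(True, "Technical objectives sufficiently consider constituency needs.",
                     "anecdotal: conversation with constituents"),
        EvidenceItem(False, "Objectives for standard services are not documented or well communicated.",
                     "team knowledge"),
        EvidenceItem(False, "Improvement plans exist, but the schedule is out of date.",
                     "quick team review of IM plans"),
    ],
)
print(stakeholder_reqs.response, "with", len(stakeholder_reqs.evidence), "evidence items")
```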

Driver Profile
- Provides an indication of risk to the mission (i.e., mission risk)
- Dashboard for decision makers
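
A driver profile can be thought of as a roll-up of driver responses into a risk indication per driver. The sketch below assumes the same five-point response scale used earlier and a hypothetical mapping from responses to risk labels; the published MRD-IMC defines the actual profile and dashboard format.

```python
# Hypothetical mapping from driver responses to a risk-to-mission label.
RISK_LABEL = {
    "Yes": "low",
    "Likely Yes": "low-to-moderate",
    "Equally Likely": "moderate",
    "Likely No": "moderate-to-high",
    "No": "high",
}

# Responses for a few drivers; "Stakeholder Requirements" echoes the earlier example.
responses = {
    "Stakeholder Requirements": "Equally Likely",
    "Incident Management Plan": "Likely Yes",
    "Analysis": "Yes",
    "Detection": "Likely No",
}

# Build and print a simple dashboard-style profile.
profile = {driver: RISK_LABEL[answer] for driver, answer in responses.items()}
for driver, risk in sorted(profile.items()):
    print(f"{driver:28s} risk to mission: {risk}")
```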

IMCA in More Detail

Incident Management Capability Assessment Objectives The assessment measures an organization’s incident management functions against the CERT incident management capabilities which define a benchmark of good practice. The capabilities within the assessment are used to determine if an organization has all the necessary components, processes, and controls in place to perform the full range of incident management functions and services. The assessment can also be scoped to focus only on particular sets of capabilities based on the organization’s structure and operations.

Categories and Priorities for Incident Management Capabilities
Five major service categories: Prepare, Protect, Detect, Respond, Sustain
Three priorities:
- Priority I capabilities: critical services that an incident management function must provide
- Priority II capabilities: important services that should be provided
- Priority III capabilities: best practices that enhance operational effectiveness and quality

IMC Assessment Process
- Collect and Analyze Documentation
- Present Participants Briefing
- Conduct Interviews
- Observe Activities
- Present Overview Briefing
- Analyze Data
- Deliver Final Results

Types of Documents Reviewed
Documents reviewed include but are not limited to
- incident management capability organization chart and CONOPS or charter
- incident response/management plan
- communications plan
- incident management workflow processes
- incident management policies and procedures
- incident reporting forms and guidance
- incident management service descriptions
- job descriptions and training requirements for incident management staff

Types of Staff Interviewed
- executive management, such as chief information officers (CIOs), chief security officers (CSOs), and chief risk officers (CROs)
- managers of incident management operations, such as SOC managers or the CSIRT manager or lead
- SOC or CSIRT staff, such as help desk or hotline staff, incident analysts, vulnerability analysts, and malware analysts
- specialists, such as law enforcement liaisons, digital media analysts, system and network administrators, and staff responsible for firewall management, network monitoring, vulnerability scanning, threat assessment, patch management, and risk assessment
- other parts of the organization as required, including representatives from human resources, legal counsel, training, budgeting, and contracting

Types of Observations or Demonstrations
Observations or demonstrations of procedures, processes, mechanisms, tools, or systems may include but are not limited to
- IDS or other network monitoring activities
- vulnerability and threat assessment
- distributing and installing patches
- storing and analyzing incident and event data
- configuration and change management operations
- operational cyber exercises
- research and monitoring for situational awareness
- reacting to changes in threat levels
- establishing or working with trusted experts
- information dissemination and communication, including alerts and warnings
- secure communication and alternate communication paths
- sensitive and classified information handling

Capability Indicators
Each capability contains a set of indicators:
- prerequisites that are needed
- controls that are available or exist
- activities that are performed
- qualities that establish effective, quality service provision
The indicators are evaluated to determine the performance of the activity and to validate the ability of the CSIRT to meet the requirements for that capability. The assessment team uses the indicators to make a qualified judgment as to whether or not the capability has successfully been satisfied.

Analysis of Results
Capabilities are scored based on information collected from
- the documentation reviewed
- interviews
- observations or demonstrations
We document the rationale for the score given to each capability.

Scoring the Capabilities

Final Results The organization receives a report which reviews the score of each capability and a rationale for the score. Capabilities are analyzed to identify which priorities were met or where there are weaknesses in specific types or categories of capabilities.
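
The kind of roll-up described above can be illustrated with a short sketch. The capability names and categories are taken from earlier slides, while the met/not-met scores and the counting by category and priority are hypothetical and not the scoring method defined by the IMCA.

```python
from collections import defaultdict

# Hypothetical capability results: (category, priority, capability name, met?).
scores = [
    ("Prepare", "I", "Establish IM Function", True),
    ("Prepare", "II", "Training and Guidance", False),
    ("Detect", "I", "Network and Systems Security Monitoring", True),
    ("Detect", "II", "Threat and Situational Awareness", False),
    ("Respond", "I", "Incident Analysis", True),
    ("Respond", "I", "Incident Reporting", False),
]

# Roll up met/total counts per category and per priority (illustrative summary only).
by_category = defaultdict(lambda: [0, 0])
by_priority = defaultdict(lambda: [0, 0])
for category, priority, _name, met in scores:
    by_category[category][1] += 1
    by_priority[priority][1] += 1
    if met:
        by_category[category][0] += 1
        by_priority[priority][0] += 1

for category, (met, total) in by_category.items():
    print(f"{category}: {met}/{total} capabilities met")
for priority, (met, total) in sorted(by_priority.items()):
    print(f"Priority {priority}: {met}/{total} capabilities met")
```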

Questions or Comments?