1 Information System Security Engineering and Management
Dr. William Hery (whery@poly.edu, hery@nac.net)
CS 996, Spring 2005

2 Outline of Presentation
- Course Motivation
- Approach to Learning and Grading in This Course
- Main Course Topics
- Highlights of course topics to show linkage
- Term Project Structure

3 Initial Course Motivation
- For SFS students: fill in gaps in National Security Telecommunications and Information Systems Security Committee (NSTISSC) certification for NSA
  - NSTISSI 4011: National Training Standards for INFOSEC Professionals (http://www.nstissc.gov/html/library.html)
- Most technical topics are covered in other courses
  - Missing NSTISSI technical tidbits will be inserted as needed
- The "missing" topics are all related to management, policy, and systems engineering
  - The course will be a survey of information system security engineering and management topics over a system life cycle
- Although there is a "government" motivation in selecting the topics, all are broadly applicable to developing and managing commercial systems securely

4 Course Focus
- Broad management perspective applicable to DoD/NSA, civilian government agencies, and the corporate world: think like a manager
  - If you are a manager
  - If you have to deal with a manager
- System focus, not detail focus
  - Not about security products (crypto, firewalls, etc.), but how to use them in a system
- Many topics are subjective, not objective
  - There may be no "right way" or "right answer"
- Many topics should be courses in themselves
  - This course will teach you what to think about, not how to do everything!
- Secondary goal: gain experience in teamwork, government project organization, presentations, and report writing

5 Course Organization
- Weekly graded homework
- Each student will present a one-hour lecture on a topic and assign reading and homework for it
- Reading assignments and class discussion
  - Active participation in discussion is part of the grade!
- Student team projects (more later)

6 Grading
- Homework:
  - Due before the next class after assignment
  - Each graded on a 100-point scale
  - 10 points per week (or fraction thereof) deducted for any homework submitted after the due date
  - Ten highest individual HW grades averaged to get the overall HW grade
- Lecture: graded on the basis of discovering and understanding material, and organization of the presentation
- Team project: grades based on:
  - Depth of understanding of both the technical security issues and the application of systems engineering and management processes
  - Organization of the final presentation and report
  - All members of a team get the same grade
- Overall grade: 40% homework, 40% project, 20% lecture (see the sketch after this list)
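
For concreteness, here is a minimal sketch of how a grade could be computed under this scheme. The weights and the 10-points-per-week late penalty come from the slide; the function names and the example numbers are purely illustrative.

```python
def homework_grade(scores, weeks_late=None):
    """Average of the ten highest homework scores, after deducting
    10 points per full or partial week late (per the stated policy)."""
    weeks_late = weeks_late or [0] * len(scores)
    adjusted = [max(0, s - 10 * w) for s, w in zip(scores, weeks_late)]
    top_ten = sorted(adjusted, reverse=True)[:10]
    return sum(top_ten) / len(top_ten)

def overall_grade(hw_scores, project, lecture, weeks_late=None):
    """Overall grade: 40% homework, 40% project, 20% lecture."""
    hw = homework_grade(hw_scores, weeks_late)
    return 0.4 * hw + 0.4 * project + 0.2 * lecture

# Example: 12 homeworks, one submitted a week late
hw = [95, 88, 92, 75, 100, 85, 90, 97, 80, 93, 70, 88]
late = [0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0]
print(overall_grade(hw, project=90, lecture=85, weeks_late=late))
```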

7 References
Primary text:
- Ronald Krutz and Russell Vines, The CISM Prep Guide, Wiley, 2003, ISBN 0-471-45598-9
Supplementary material from:
- Ross Anderson, Security Engineering, Wiley, 2001, ISBN 0-471-38922-6
- Tipton and Krause, Information Security Management Handbook, 4th Edition, Auerbach, ISBN 0-8493-1518-2 (copy in ISIS Lab)
- Various web sites, etc.

8 Key Points About Security for a Manager
- Information security is a key part of the life cycle processes for any system:
  - System conception and design
  - System development
  - Operation of deployed systems
  - System "decommission"
- It is critical to include security considerations from the beginning of system conception and design
- Use of security technology (firewalls, crypto, IDS, patching, etc.) is only part of security: security is based on people, processes, and technology
- Actively managing the "security process" is a key part of achieving security
- People (developers, users, hackers) are, sometimes without knowing it, part of the security process

9 Key Points (Part II)
- Security is a wide array of options, not a yes/no choice
  - No security is probably not enough
  - Near-perfect security is difficult, expensive, hard to use, and takes a long time to achieve; it is probably too much
- The key is to find the "sweet spot" of enough security for your particular system. This is a key management decision.
- Many topics are subjective, not objective
  - There may be no "right way" or "right answer"

10 Extremes of Security
- A system in an open area, which allows anyone to use it and allows anything to come in or go out over multiple networks, provides virtually no security (e.g., an Internet café)
- A system in a locked room with 2-foot-thick concrete and metal walls (a Faraday cage), no windows, battery power (no power lines), a secure operating system, strongly encrypted data, armed guards, and only one person allowed in provides very strong security (e.g., the most secure NSA system...maybe)

11 Costs of Security
- Development cost
- Development time
- Equipment cost
- Maintenance cost
- Usability?
- Functionality?
- Performance?

12 Finding the "Sweet Spot"
- The sweet spot depends on what you need to protect and what environment you are operating in
- The key to finding the "sweet spot" is to understand the security requirements
  - Requirements may be hard to pin down
  - Different aspects of a system may have very different security requirements
  - The "sweet spot" is getting the right level of security for each aspect
- Two common errors in finding the "sweet spot":
  - Uniformly low security because management does not understand the risk
  - A uniformly strong and expensive approach to security everywhere, when less (e.g., no crypto) is enough in most places, with something stronger in selected places (e.g., protecting passwords or crypto keys)

13 What is Information Security?
- A set of properties of the information system, not a technology
- These properties are provided by a combination of processes and technologies
- The properties: CIA
  - Confidentiality: only permitted entities are allowed to "see" the information
  - Integrity: only permitted entities are allowed to modify the information (this includes creation and deletion)
    - Integrity preservation: you know it can't be changed
    - Integrity violation detection: you know when the information can't be trusted and you must go to a backup or alternate source (see the sketch after this list)
  - Availability: the information is available when needed
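
As one illustration of integrity violation detection (the technique is not prescribed by the slide, and the names below are illustrative), a hash of the information can be recorded at a trusted point in time and compared later; a mismatch signals that the stored copy can no longer be trusted and a backup or alternate source is needed.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Record a SHA-256 digest of the information at a trusted point in time."""
    return hashlib.sha256(data).hexdigest()

def integrity_intact(data: bytes, recorded_digest: str) -> bool:
    """Detect integrity violations: a mismatch means the data was modified
    and should be restored from a backup or alternate source."""
    return hashlib.sha256(data).hexdigest() == recorded_digest

original = b"account balance: 1000"
digest = fingerprint(original)
tampered = b"account balance: 9000"
print(integrity_intact(original, digest))   # True
print(integrity_intact(tampered, digest))   # False
```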

14 Related Security Concepts
- Identification: a means of saying who/what an entity is
- Authentication: a means to verify that an entity is who it claims to be, for decisions in support of confidentiality and integrity
- Access control: a means to enforce which entities have access to information, to support confidentiality and integrity
- Authorization: a combination of authentication (who) and access control (see the sketch after this list)
- Non-repudiation: integrity of the pair (information, creator of the information)
- Privacy: confidentiality of personal information
- Anonymity: confidentiality of identity
- Recovery: restoration of a system to a "correct" state after a security incident
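
To make the relationship between authentication, access control, and authorization concrete, here is a minimal toy sketch (the names, data structures, and plaintext secrets are illustrative assumptions, not part of the course material; a real system would store hashed credentials):

```python
# Toy authorization check: authentication (is the entity who it claims to be?)
# combined with access control (is the entity permitted to do this?).
CREDENTIALS = {"alice": "correct horse battery staple"}   # identity -> secret
ACL = {"payroll.db": {"alice": {"read"}}}                 # resource -> identity -> actions

def authenticate(identity: str, secret: str) -> bool:
    return CREDENTIALS.get(identity) == secret

def access_allowed(identity: str, resource: str, action: str) -> bool:
    return action in ACL.get(resource, {}).get(identity, set())

def authorize(identity: str, secret: str, resource: str, action: str) -> bool:
    """Authorization = authentication (who) + access control (what is permitted)."""
    return authenticate(identity, secret) and access_allowed(identity, resource, action)

print(authorize("alice", "correct horse battery staple", "payroll.db", "read"))   # True
print(authorize("alice", "correct horse battery staple", "payroll.db", "write"))  # False
```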

15 Methods to Provide Security Properties

16 DoD Terminology
- Communications Security (COMSEC)
  - Security of information (voice, data) while in transit. Includes switched circuits, radio links, microwave, satellite, packet nets, Asynchronous Transfer Mode (ATM), Synchronous Optical Networks (SONET), packet over fiber, free-space optics, etc.
- Computer Security (COMPUSEC)
  - Security of information while stored or being processed on a computer
- Information Security (INFOSEC)
  - COMPUSEC + COMSEC
- Transmission Security (TRANSEC)
  - Security of transmission media
- Operations Security (OPSEC)
  - Operational processes for protecting potentially sensitive unclassified material (people and technology)
- Automated Information Systems (AIS)
  - Computers + networks linking computers

17 Security vs. Reliability
- Security attacks, software flaws, and hardware failures can all lead to violations of "CIA"
- For some events, it may be hard to determine which class of flaws is the cause
- Some protection and recovery mechanisms are the same for both security attacks and hardware or software failures

18 Security vs. Reliability: Differences
Hardware failures
- No malicious cause
- Usually affect "A", sometimes "I" or "C"
- Typically independent events
- Testing is often a reliable way to find hardware failures in deployed systems
- Stochastic and temporal failure models (e.g., mean time between failures, MTBF) are useful metrics
- "Availability" is also a standard term in reliability
Software failures
- No malicious attack: design or coding errors
- Can affect "A", sometimes "I" or "C"
- Often correlated events from the same flaw, as similar state conditions arise in different instantiations
- Stochastic models are of limited value
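
For reference, the standard steady-state availability metric ties these terms together; MTBF appears on the slide, while MTTR (mean time to repair) and the example numbers are added here for completeness:

```latex
% Steady-state availability; the 1000 h / 2 h figures are illustrative only.
A = \frac{\mathrm{MTBF}}{\mathrm{MTBF} + \mathrm{MTTR}},
\qquad \text{e.g.,}\quad
\frac{1000\ \mathrm{h}}{1000\ \mathrm{h} + 2\ \mathrm{h}} \approx 0.998
```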

19 Security vs. Reliability: Differences (continued)
Security breaches
- Malicious attack
- Serious attacks often attempt to hide the event
- Can affect "A", "I", or "C"
- In most cases, the most serious impacts are attacks on "I" or "C"
- Many attacks are highly correlated worldwide, but some are very targeted and correlations may be hard to find

20 Survivable Systems
- Systems that provide both reliability and security are called survivable (or dependable)
- Reliability and security requirements are often similar in nature (particularly availability), and it makes sense to combine the requirements analysis for both
- Designing secure systems and designing reliable systems both depend on understanding what is at risk if there is a failure (security compromise or system failure), what the threats are (hackers, failure modes), and managing that risk
- It is sometimes hard to distinguish between a reliability failure and a security breach (e.g., the 2003 northeast blackout)
- Recovery from a failure and recovery from a security breach are sometimes similar, and it makes sense to combine the recovery plans
- Security is considered by some to be a subset of reliability, with security breaches just another form of failure...
  - But the malicious, planned, correlated, and hidden aspects of security breaches require a very different protection approach from most aspects of reliability

21 Where Are Security Flaws?
- In system design
  - Not planning for security (many things today)
  - Designing security incorrectly (the original WiFi encryption standard)
- In system implementation
  - Ambiguous/incomplete design documents
  - Implementation errors (buffer overflows, etc.)
- In system use
  - Configuration: security is often weakest "out of the box"
  - Failure to keep up with updates/patches
  - Physical security
- Ill-advised user actions
  - Poor passwords / passwords written down (see the sketch after this list)
  - Victims of "social engineering"
- Management needs to keep all of this in mind when designing, implementing, and deploying systems
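
As one small example of turning the "poor passwords" concern into an enforceable check, here is a minimal sketch; the specific length and character-class rules are illustrative assumptions, not a policy recommended by the course:

```python
import re

def password_acceptable(pw: str, min_length: int = 12) -> bool:
    """Illustrative password-policy check: minimum length plus character variety.
    Real policies would also screen against lists of known breached passwords."""
    if len(pw) < min_length:
        return False
    classes = [r"[a-z]", r"[A-Z]", r"[0-9]", r"[^A-Za-z0-9]"]
    return sum(bool(re.search(c, pw)) for c in classes) >= 3

print(password_acceptable("password123"))           # False: too short
print(password_acceptable("V3ry-long-passphrase"))  # True
```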

22 Management Security Concerns
- Classified information at DoD/NSA/other government agencies:
  - National security, loss of life, "sources and methods," political, and career impacts of a security breach
- Unclassified government information:
  - Political, financial, legal, and career impacts of a security breach
- Corporate:
  - Financial, intellectual property, legal, corporate image, and career impacts of a security breach
- Almost no managers: neat technology

23 Sources of Security Requirements
- Risk analysis (national security, lives, property, money)
- Legal (e.g., HIPAA, Sarbanes-Oxley, privacy laws)
- Higher-level government/corporate policies
- Corporate/agency/personal image
- Others derived from the above
- Requirements may change due to costs, a changing threat environment, etc.
  - Requirements may not be known or understood at the start of a project

24 Steps to Include Security in the System Life Cycle
- Risk analysis (for the most fundamental security requirements)
- Complete security requirements analysis
  - Security is a "non-functional" requirement, as is reliability
- High-level security policy (technology, management processes, personnel policies)
- Overall system engineering
  - Includes security design and development
  - Lower-level security requirements and policies developed
  - Security should be an integral element from the start
- Security management of the deployed system
- Incident response
- Business continuity planning
- Decommissioning of systems and components

25 Risk Analysis
- What is at risk (national security, lives, property, money)?
  - Some risk models are based on $ values (see the sketch after this list)
- Where does the threat come from?
  - Motivation (national security, money, fame, ...)
  - Capabilities (intellect, equipment, money)
- What vulnerabilities can be exploited?
  - Technical
  - Process
  - People
- Risk management
  - Eliminate/reduce risk (e.g., put in crypto, a firewall, ...)
  - Accept risk (with a recovery process)
  - Transfer risk (e.g., to an insurance company)
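
A minimal sketch of the kind of dollar-based risk model the slide alludes to: the annualized-loss-expectancy (ALE) formulation below is standard in the CISM/CISSP literature, but the numbers and names here are purely illustrative.

```python
def annualized_loss_expectancy(asset_value, exposure_factor, annual_rate_of_occurrence):
    """ALE = SLE * ARO, where SLE (single loss expectancy) = asset value * exposure factor."""
    sle = asset_value * exposure_factor
    return sle * annual_rate_of_occurrence

# Illustrative comparison used to drive a risk-management decision:
ale_without_control = annualized_loss_expectancy(1_000_000, 0.4, 0.5)   # $200,000/yr
ale_with_control    = annualized_loss_expectancy(1_000_000, 0.1, 0.2)   # $20,000/yr
control_cost = 50_000
# Reduce the risk only if the control costs less than the risk reduction it buys.
print(ale_without_control - ale_with_control > control_cost)            # True
```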

26 Security Policy
- Essentially a statement of security requirements
- Every security policy statement should have a corresponding enforcement mechanism
- Policies are at multiple levels; high-level policies flow down to multiple lower-level policies
  - High level, e.g., "company proprietary information shall be protected from release to unauthorized personnel"
  - Mid level, e.g., "there shall be no externally initiated ftp sessions"
  - Low level, e.g., a firewall rule blocking incoming traffic on ports 20 (ftp data), 21 (ftp control), and 69 (tftp)
    - The firewall is the enforcement mechanism (see the sketch after this list)
- Policies also define management processes (e.g., incident response actions) and personnel rules (e.g., don't write down passwords)
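
To illustrate how the low-level policy maps to an enforcement mechanism, here is a toy packet-filter check; an actual firewall would express the same logic in its own rule syntax, so this sketch only mirrors the decision the rule encodes.

```python
# Toy enforcement of the low-level policy "block externally initiated ftp/tftp".
BLOCKED_INBOUND_PORTS = {20, 21, 69}   # ftp-data, ftp-control, tftp

def permit_inbound(dest_port: int, from_external: bool) -> bool:
    """Return True if a connection to dest_port should be allowed through."""
    if from_external and dest_port in BLOCKED_INBOUND_PORTS:
        return False
    return True

print(permit_inbound(21, from_external=True))    # False: policy enforced
print(permit_inbound(443, from_external=True))   # True
print(permit_inbound(21, from_external=False))   # True: internally initiated
```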

27 Security Systems Engineering
- Part of the overall systems engineering process
- Iterates requirements, design, and review through multiple levels of detail
- Includes design and development
- Lower-level security policies are developed
- Security should be an integral element from the start

28 Student Talks
- Presentations will focus on management and processes, not technical details (you know them already)
- The presenter will be given basic references and other reference pointers, and is encouraged to search for more material
- The presenter assigns background reading the week before the talk
- Review the presentation with me for guidance as you develop it
- Prepare ~45 minutes of presentation material, but expect to use an hour or more with discussion
- Active participation of the audience is encouraged
- The presenter assigns homework on the topic

29 Course Schedule (tentative)
1/26  *Overview (Hery)
2/2   *Risk Analysis (Hery)
2/9   *Secure Systems Engineering (Hery)
2/16  ISO 17799 (taken); SSE-CMM (Systems Security Engineering Capability Maturity Model)
2/23  Policy (2 hours? Hery); Legal and Other Requirements
3/2   Security Management and Administration of Deployed Systems (2 hours)
3/9   Incident Response; Business Continuity Planning (merge with the above?)

30 Course Schedule (tentative, continued)
3/16  *Assessment/Assurance (Hery); *Architecture of Classified Systems (Hery)
3/30  Security Engineering for Software; TRANSEC/EMSEC/Tempest (EE background)
4/6   Physical Security/Tamper Resistance; Information System Security Officer
4/13  Government Key Management Policy; Security Audit (tentatively taken)
4/20  Certification and Accreditation; Ethical Issues in System Design/Management?

31 Student Team Project
- Teams of ~3 students
- Pick a system (discuss your choice with me)
  - Want simple functionality, security issues, and a whole system (e.g., client and server side)
- Submit a 1-2 page proposal to management (Dr. Hery)
- Assess risks, threats, and vulnerabilities
- Develop a security policy
- Do a high-level system security design
- Present a "preliminary design review" (PDR) to management (include risk analysis, policies, system architecture)
- Iterate on the risk assessment, policy, and design
- Present a final "critical design review" (CDR) to management and the class
- Write a final report to management on the above

