Chapter 19. Malicious Logic
Ko Jun-Hyuk
Contents
19.1. Introduction
19.2. Trojan Horses
19.3. Computer Viruses
19.4. Computer Worms
19.5. Other Forms of Malicious Logic
19.6. Defenses
19.1. Introduction
Definition: Malicious logic is a set of instructions that causes a site's security policy to be violated.
Example UNIX script:
cp /bin/sh /tmp/.xxsh
chmod u+s,o+x /tmp/.xxsh
rm ./ls
ls $*
Place this in a program called "ls" and trick someone into executing it. You now have a setuid-to-them shell!
19.2. Trojan Horses
Definition: A Trojan horse is a program with an overt effect and a covert effect.
Example: in the preceding script,
Overt purpose: to list the files in a directory.
Covert purpose: to create a shell that is setuid to the user executing the script.
19.2. Trojan Horses
Example: NetBus program
Allows an attacker to control a Windows NT workstation remotely.
The victim Windows NT system must be running a server with which the NetBus client can communicate.
This small server program was placed in several small game programs as well as in some other "fun" programs.
19.2. Trojan Horses
Definition: A propagating Trojan horse is a Trojan horse that creates a copy of itself.
Hard to detect: Karger and Schell, and later Thompson, constructed such a Trojan horse in a compiler. The Trojan horse modifies the compiler to insert itself into specific programs, including future versions of the compiler itself.
19.2. Trojan Horses
Example: Thompson's Compiler
Thompson had the compiler check the program being compiled. If that program was login, the compiler added code to accept a fixed password.
The extra code is visible in the compiler source, so the compiler is also modified to insert that code whenever it compiles itself; after recompiling the compiler, the code can be removed from the source.
Moral: "no amount of source-level verification or scrutiny will protect you from using untrusted code."
19.3. Computer Viruses
Definition: A computer virus is a program that inserts itself into one or more files and then performs some (possibly null) action.
Pseudocode:
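The pseudocode itself did not survive this transcript; a sketch after Bishop's formulation of a virus's structure (insertion phase, then action, then transfer of control):

```
beginvirus:
  if spread-condition then begin
    for some set of target files do begin
      if target is not infected then begin
        determine where to place virus instructions
        copy instructions from beginvirus to endvirus into target
        alter target to execute the added instructions
      end
    end
  end
  perform some action(s)            { possibly null }
  goto beginning of infected program
endvirus:
```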
19.3. Computer Viruses
The insertion phase must be present but need not always be executed.
Is a computer virus a type of Trojan horse?
Yes: the overt action is the infected program's action; the covert action is the insertion and execution phases.
No: the overt action is to infect and execute; there is no covert action.
In either case, defenses against Trojan horses inhibit computer viruses.
19.3.1. Boot Sector Infectors
Definition: A boot sector infector is a virus that inserts itself into the boot sector of a disk.
Example: Brain virus
Moves the disk interrupt vector from 13H to 6DH.
Sets the disk interrupt vector location (13H) to invoke the Brain virus.
When a new floppy is seen, checks for 1234H at location 4.
If it is not there, copies itself onto the disk after saving the original boot block.
19.3.2. Executable Infectors
Definition: An executable infector is a virus that infects executable programs.
How infection can occur: (figure omitted)
19.3.2. Executable Infectors
Example: Jerusalem virus
If the system is not infected, the virus sets itself up to respond to requests to execute files.
Checks the date: if it is not Friday the 13th, or the year is 1987, it sets itself up to respond to clock interrupts and then runs the program.
If it is Friday the 13th and the year is not 1987, the virus sets a destructive flag and will delete files instead of infecting them.
Checks all calls asking that files be executed: does nothing for COMMAND.COM; otherwise, infects or deletes.
Error: it doesn't set its signature when a .EXE executes, so .EXE files are continually reinfected.
19.3.3. Multipartite Viruses
Definition: A multipartite virus is one that can infect either boot sectors or applications.
Such a virus typically has two parts: an executable infector and a boot sector infector.
19.3.4. TSR Viruses
Definition: A terminate and stay resident (TSR) virus is one that stays active (resident) in memory after the application (or bootstrapping, or disk mounting) has terminated.
TSR viruses can be boot sector infectors or executable infectors.
19.3.5. Stealth Viruses
Definition: Stealth viruses are viruses that conceal the infection of files.
Example: the IDF virus modifies the DOS service interrupt handler.
On a request for the length of the file: returns the length of the uninfected file.
On a request to open the file: temporarily disinfects the file, and reinfects it on closing.
19.3.6. Encrypted Viruses
Definition: An encrypted virus is one that enciphers all of the virus code except for a small deciphering routine.
(Figure omitted: the ordinary virus code is at the left; the encrypted virus code is at the right.)
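A minimal sketch of the idea, with a one-byte XOR cipher standing in for the enciphering (the byte string and key are illustrative only). Note that the deciphering routine itself is stored in the clear, which is exactly what scanners target:

```python
# Toy model of an encrypted virus: all code except a small deciphering
# routine is stored enciphered. Different keys yield different stored
# bytes, but the decipher loop itself never changes.
body = b"infect(); payload()"            # stands in for the virus body

def encipher(data: bytes, key: int) -> bytes:
    return bytes(b ^ key for b in data)

def decipher(data: bytes, key: int) -> bytes:
    return bytes(b ^ key for b in data)  # XOR is its own inverse

stored = encipher(body, 0x5A)            # what sits in the infected file
assert stored != body                    # enciphered form looks different
assert decipher(stored, 0x5A) == body    # decipher routine recovers it
```

Because the small deciphering routine is constant across infections, it remains a detectable signature; removing that weakness is the motivation for polymorphic viruses, discussed next.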
Encrypted Viruses Example (figure omitted)
19.3.7. Polymorphic Viruses
Definition: A polymorphic virus is a virus that changes its form each time it inserts itself into another program.
With an encrypted virus, the unchanging deciphering routine can still be detected; a polymorphic virus instead changes the instructions in the virus to something equivalent but different.
Polymorphism can exist at many levels (instruction level, algorithm level).
19.3.7. Polymorphic Viruses
Example: all of the following instructions have exactly the same effect:
add 0 to operand
or 1 with operand
no operation
subtract 0 from operand
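A toy model of instruction-level polymorphism: a mutation engine picks a random but semantically equivalent operation each generation, so the bytes of each copy differ while the effect is identical. (The slide's "or 1 with operand" only preserves odd operands; the sketch below uses operations that leave any integer operand unchanged.)

```python
import random

# Each "generation" the virus swaps in a randomly chosen but
# value-preserving operation, so copies differ in form, not effect.
EQUIVALENT_OPS = [
    lambda x: x + 0,     # add 0 to operand
    lambda x: x - 0,     # subtract 0 from operand
    lambda x: x | 0,     # or 0 with operand
    lambda x: x ^ 0,     # xor 0 with operand
    lambda x: x,         # no operation
]

def mutate():
    return random.choice(EQUIVALENT_OPS)

# Every mutated variant computes exactly the same result.
results = {mutate()(42) for _ in range(100)}
assert results == {42}
```

A signature scanner matching fixed byte sequences cannot key on any one of these forms, which is why detection of polymorphic viruses moves to higher levels (emulation, algorithm-level analysis).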
19.3.8. Macro Viruses
Definition: A macro virus is a virus composed of a sequence of instructions that is interpreted, rather than executed directly.
A macro virus can infect either executables or data files.
Macro viruses are not bound by machine architecture, though their effects may differ across systems.
19.3.8. Macro Viruses Example: Melissa virus
Infected Word 97 and 98 documents on Windows and Macintosh systems. It is invoked when the program opens an infected file. It installs itself as the “open” macro and copies itself into the Normal template. It invokes a mail program and sends copies of itself to people in the user’s address book.
19.4. Computer Worms
Definition: A computer worm is a program that copies itself from one computer to another.
Origins: distributed computations (Shoch and Hupp: animations, broadcast messages).
A segment is the part of the program copied onto a workstation; it processes data and communicates with the worm's controller.
Any activity on the workstation caused the segment to shut down.
19.4. Computer Worms
Example: Internet Worm of 1988
Targeted Berkeley and Sun UNIX systems.
Used a virus-like attack to inject instructions into a running program and run them.
To recover, a system had to be disconnected from the Internet and rebooted.
To prevent reinfection, several critical programs had to be patched, recompiled, and reinstalled.
Analysts had to disassemble the worm to uncover its function.
Disabled several thousand systems in about 6 hours.
19.4. Computer Worms
Example: Christmas Worm
Distributed in 1987; designed for IBM networks.
An electronic letter instructed the recipient to save it and run it as a program.
It drew a Christmas tree and printed "Merry Christmas!"
It also checked the user's address book and the list of previously received mail, and sent copies of itself to each address.
Shut down several IBM networks.
Really a macro worm: written in a command language that was interpreted.
19.5.1. Rabbits and Bacteria
Definition: A bacterium or a rabbit is a program that absorbs all of some class of resource.
Example shell script:
while true
do
mkdir x
chdir x
done
This exhausts either the disk space or the inode table on a UNIX Version 7 system.
19.5.2. Logic Bombs
Definition: A logic bomb is a program that performs an action that violates the security policy when some external event occurs.
Example: a program that deletes the company's payroll records when one particular record is deleted.
The "particular record" is usually that of the person writing the logic bomb.
The idea is that if (when) he or she is fired and the payroll record deleted, the company loses all those records.
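The payroll example above can be modeled in a few lines (all names and records here are hypothetical, invented for illustration):

```python
# Toy model of the payroll logic bomb: deleting the author's own
# record triggers destruction of every payroll record.
payroll = {"alice": 90000, "mallory": 85000, "bob": 70000}

def delete_record(name: str) -> None:
    del payroll[name]
    if "mallory" not in payroll:   # trigger: the bomb author's record
        payroll.clear()            # bomb: destroy all payroll records

delete_record("bob")               # a normal deletion: nothing happens
assert len(payroll) == 2
delete_record("mallory")           # the author is fired: bomb goes off
assert payroll == {}
```

The point of the sketch is that the trigger condition is an ordinary, legitimate-looking check; nothing about the code is anomalous until the external event occurs.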
19.6.1. Malicious Logic Acting as Both Data and Instructions
Malicious logic is both: a virus is written to a program (data) and then executes (instructions).
Approach: treat "data" and "instructions" as separate types, and require a certifying authority to approve conversions between them.
Key assumptions: the certifying authority will not make mistakes, and the tools and supporting infrastructure used in the certification process are not corrupt.
19.6.1. Malicious Logic Acting as Both Data and Instructions
Example: LOCK (Logical Coprocessor Kernel)
Designed to be certified at TCSEC A1 level.
Compiled programs are of type "data".
A sequence of specific, auditable events is required to change the type to "executable".
"Executable" objects cannot be modified, so viruses can't insert themselves into programs (there is no infection phase).
19.6.1. Malicious Logic Acting as Both Data and Instructions
Example: users with execute permission for a file usually also have read permission, so require that files with execute permission be of type "executable" and that those without it be of type "data".
"Executable" files could be modified, but doing so changes their type to "data"; only a certifier can change them back.
So a virus can spread only if it is run as the certifier.
19.6.2.1. Information Flow Metrics
Goal: limit the distance a virus can spread.
Definition: define the flow distance metric fd(x) for some information x as follows. Initially, all information has fd(x) = 0. Whenever x is shared, fd(x) increases by 1. Whenever x is used as input to a computation, the flow distance of the output is the maximum of the flow distances of the inputs.
19.6.2.1. Information Flow Metrics
Example: the users' distance limits are Anne: 3, Bill: 2, Cathy: 2.
Anne creates a program dovirus containing a computer virus.
Bill executes it; when the virus infects Bill's file safefile, fd(dovirus) = 0 and fd(safefile) = 1.
Cathy executes safefile; when the virus tries to spread to her files, fd(safefile) = 1 would make fd(Cathy's files) = 2, which her limit of 2 blocks.
Problem: if Cathy executes dovirus directly, the resulting flow distance is only 1, so her files can be infected.
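The Anne/Bill/Cathy example can be traced with a small model. The sketch below treats each infection as a sharing (so the copy's flow distance is one more than the source's); the object names are from the slide, and the policy check shown is one plausible reading of the metric:

```python
# Model of the flow distance metric: infection copies the virus into
# a new file, which counts as a sharing, so fd rises by 1 per hop.
fd = {"dovirus": 0}                   # Anne creates the virus

def infect(source: str, target: str) -> None:
    fd[target] = fd[source] + 1       # sharing increases fd by 1

infect("dovirus", "safefile")         # Bill executes dovirus
infect("safefile", "cathy_files")     # Cathy executes safefile

assert fd["safefile"] == 1
assert fd["cathy_files"] == 2         # reaches Cathy's limit of 2

# The flaw noted on the slide: executing dovirus directly yields a
# flow distance of 1, which is below Cathy's limit of 2.
assert fd["dovirus"] + 1 < 2
```

The model makes the weakness concrete: the metric counts hops, not trustworthiness, so shortening the path defeats it.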
19.6.2.2. Reducing the Rights
The principle of least privilege.
Example: ACLs and C-Lists. Suppose s1 owns a file o1, and s2 owns a program o2 and a file o3.
ACL = { (s1, o1, r), (s1, o1, w), (s1, o2, x), (s1, o3, w), (s2, o2, r), (s2, o2, w), (s2, o2, x), (s2, o3, r) }
Program o2 contains a Trojan horse. If s1 executes o2 as process p12 and p12 tries to access o3, we want that access denied.
In fact, p12 inherits the access rights of s1, including write access to o3; and since s1 does not own o3, s1 cannot delete its own access rights over o3.
19.6.2.2. Reducing the Rights
Example: ACLs and C-Lists (cont'd)
Solution: define an authorization denial subset R(si) to contain those ACL entries that si will not allow others to exercise over the objects that si owns.
In this example, if R(s2) = { (s1, o3, w) }, then PD(p12) = { (p12, o1, r), (p12, o1, w), (p12, o2, x) }.
Problem: how to determine which entries should be in the authorization denial subsets.
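The construction of p12's protection domain can be written out directly: start from the rights s1 would pass to the process, and subtract anything in the owner's authorization denial subset. This is a sketch of the set computation only, using the slide's subjects and objects:

```python
# p12 inherits s1's rights, minus any rights listed in the owner's
# authorization denial subset R(s2).
acl = {("s1", "o1", "r"), ("s1", "o1", "w"), ("s1", "o2", "x"),
       ("s1", "o3", "w"), ("s2", "o2", "r"), ("s2", "o2", "w"),
       ("s2", "o2", "x"), ("s2", "o3", "r")}
denial = {("s1", "o3", "w")}          # R(s2): rights s2 withholds

pd_p12 = {("p12", obj, right)
          for (subj, obj, right) in acl
          if subj == "s1" and (subj, obj, right) not in denial}

# Matches the slide: the Trojan horse in o2 can no longer write o3.
assert pd_p12 == {("p12", "o1", "r"), ("p12", "o1", "w"),
                  ("p12", "o2", "x")}
```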
19.6.2.2. Reducing the Rights
Example: Karger proposes a knowledge-based subsystem to determine whether a program makes reasonable file accesses.
The subsystem sits between the kernel open routine and the application.
When the subsystem is invoked, it checks that the access is allowed. If not, it either denies the access or asks the user whether to permit it.
19.6.2.2. Reducing the Rights
Example: Lai and Gray have implemented a modified version of Karger's scheme on a UNIX system.
It allows programs to access files named on the command line and prevents access to other files.
Processes fall into two groups: trusted (not checked) and untrusted (checked against a valid access list, or VAL).
19.6.2.2. Reducing the Rights
Example (cont'd): when an untrusted process tries to access a file:
1. If the process is requesting access to a file on the VAL, the access is allowed if the effective UID and GID of the process allow the access.
2. If the process is opening the file for reading and the file is world-readable, the open is allowed.
3. If the process is creating a file, the creation is allowed if the effective UID and GID of the process allow the creation.
4. Otherwise, an entry in the system log reflects the request, and the user is asked if the access is to be allowed.
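The four rules above translate almost directly into a decision function. This is a simplified sketch: the `uid_ok` flag stands in for the real effective UID/GID permission check, and the file names come from the assembler example on the next slide:

```python
# Sketch of Lai and Gray's rules for an untrusted process.
def decide(op, fname, val, uid_ok, world_readable=False):
    if fname in val and uid_ok:                  # rule 1: file on VAL
        return "allow"
    if op == "read" and world_readable:          # rule 2: world-readable
        return "allow"
    if op == "create" and uid_ok:                # rule 3: file creation
        return "allow"
    return "log and ask user"                    # rule 4: everything else

val = {"x.s", "/tmp/cc2345"}
assert decide("write", "/tmp/cc2345", val, uid_ok=True) == "allow"
assert decide("read", "/etc/motd", val, uid_ok=True,
              world_readable=True) == "allow"
assert decide("create", "/tmp/as1111", val, uid_ok=True) == "allow"
assert decide("write", "stolen.s", val, uid_ok=True) == "log and ask user"
```

Rule 4 is the interesting case for malicious logic: any access outside the expected set is surfaced to the user, which is what exposes the Trojan horse in the assembler example that follows.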
19.6.2.2. Reducing the Rights
Example (cont'd): consider the assembler when invoked from the compiler.
The assembler is called as "as x.s /tmp/cc2345", and it creates the file /tmp/as1111; the VAL is then x.s, /tmp/cc2345, /tmp/as1111.
Now suppose a Trojan horse in the assembler tries to copy x.s to another file.
On creation, the file is inaccessible to all except the creating user, so the attacker cannot read it (rule 3).
If the file already exists and the assembler tries to write to it, the user is asked (rule 4), thereby revealing the Trojan horse.
19.6.2.3. Sandboxing
Sandboxes and virtual machines also restrict rights:
Modify the program by inserting instructions that cause traps when the policy is violated.
Replace dynamic load libraries with instrumented routines.
Example: race conditions. A race condition occurs when successive system calls operate on an object, both calls identify the object by name, and the name can be rebound to a different object between the first and second calls.
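The name-rebinding race can be shown with a toy in-memory "filesystem" (the dictionaries and file names below are invented for illustration): both "system calls" resolve the name independently, so rebinding the name between the two calls makes them operate on different objects.

```python
# Toy model of a check-then-use (TOCTTOU) race condition.
fs = {"ok.txt": "harmless", "secret.txt": "sensitive"}
binding = {"report": "ok.txt"}       # name -> object binding

def access_check(name):              # first call: checks the object
    return fs[binding[name]] == "harmless"

def read_file(name):                 # second call: uses the object
    return fs[binding[name]]

assert access_check("report")        # check passes against ok.txt
binding["report"] = "secret.txt"     # attacker rebinds the name
print(read_file("report"))           # prints "sensitive": wrong object
```

The real-world analogue is a setuid program calling access(2) and then open(2) on the same path while an attacker swaps in a symbolic link between the two calls; sandboxes that instrument these calls can detect or prevent the rebinding.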
19.6.3. Malicious Logic Crossing Protection Domain Boundaries by Sharing
Use the separation implicit in integrity policies.
Example: LOCK system. When users share procedures, the LOCK system keeps only one copy of the procedure in memory.
A master directory associates with each procedure a unique owner, and with each user a list of others whom that user trusts.
Before executing any procedure, the dynamic linker checks that the user executing the procedure trusts the procedure's owner.
19.6.3. Malicious Logic Crossing Protection Domain Boundaries by Sharing
Programs to be protected are placed at the lowest possible level of an implementation of a multilevel security policy.
Under mandatory access control, any process can read the programs but no process can write to them.
This can be combined with an integrity model to provide protection against viruses, preventing both disclosure and file corruption.
19.6.4. Malicious Logic Altering Files
Mechanisms using manipulation detection codes (MDCs) apply some function to a file to obtain a signature block and then protect that block.
If, after recomputing the signature block, the result differs from the stored signature block, the file has changed.
Example: Tripwire. The signature of each file consists of file attributes and various cryptographic checksums.
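A Tripwire-style signature can be sketched in a few lines: combine a file attribute (here, just the size) with a cryptographic checksum of the contents, record it, and later recompute and compare. This is a simplified illustration, not Tripwire's actual signature format:

```python
import hashlib
import os
import tempfile

# A file's signature combines an attribute (size) with a checksum.
def signature(path):
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return (os.path.getsize(path), digest)

handle, path = tempfile.mkstemp()
os.close(handle)
with open(path, "w") as f:
    f.write("original contents\n")

stored = signature(path)             # recorded when the file is trusted
assert signature(path) == stored     # unchanged file: signatures match

with open(path, "a") as f:           # simulate an infection
    f.write("virus code\n")
assert signature(path) != stored     # recomputed block differs: altered
os.remove(path)
```

As the next slide notes, the scheme only certifies that the file has not changed since signing; it says nothing about whether the file was clean when the signature was first recorded.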
19.6.4. Malicious Logic Altering Files
An assumption is that the signed file does not contain malicious logic before it is signed.
Example: Pozzo and Gray have implemented Biba's integrity model on LOCUS to make the level of trust in this assumption explicit.
Credibility ratings assign a measure of trustworthiness on a scale of 0 (unsigned) to N (signed and formally verified).
Each user (subject) has a risk level: if a program's credibility level is at least the user's risk level, the user can execute it; if the credibility level is lower, a special command must be used.
19.6.4. Malicious Logic Altering Files
Antivirus scanners check files for specific viruses.
If a virus is present, they either warn the user or attempt to "cure" the infection by removing the virus.
Each scanner must look for a known set of viruses; it cannot deal with viruses not yet analyzed.
19.6.5. Malicious Logic Performing Actions Beyond Specification
Treat execution and infection as errors and apply fault-tolerance techniques.
Example: break the program into sequences of nonbranching instructions; checksum each sequence and encrypt the result.
When the program runs, the processor recomputes each checksum, and at each branch a co-processor compares the computed checksum with the stored one; if they differ, an error has occurred.
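The checksum scheme can be sketched with a list of strings standing in for machine instructions (the instruction mnemonics are invented, and the encryption of the stored checksums is omitted for brevity):

```python
import hashlib

# Split the "program" into nonbranching sequences, checksum each at
# build time, then recompute and compare at run time.
program = ["load r1", "add r1, r2", "branch loop", "store r2", "halt"]

def basic_blocks(instrs):
    blocks, cur = [], []
    for ins in instrs:
        cur.append(ins)
        if ins.startswith("branch"):     # a branch ends a sequence
            blocks.append(cur)
            cur = []
    if cur:
        blocks.append(cur)
    return blocks

def checksum(block):
    return hashlib.sha256("\n".join(block).encode()).hexdigest()

stored = [checksum(b) for b in basic_blocks(program)]   # build time

program[1] = "add r1, r666"              # simulated infection
recomputed = [checksum(b) for b in basic_blocks(program)]
assert recomputed != stored              # mismatch detected at a branch
```

Any modification of the instruction stream, whether a fault or an inserted virus, changes some block's checksum and is caught at the next branch.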
19.6.5. Malicious Logic Performing Actions Beyond Specification
N-version programming: implement several different versions of an algorithm, run them concurrently, and check intermediate results periodically; if they disagree, the majority wins.
Assumptions: the majority of the programs are not infected, and the underlying operating system is secure.
Finding different algorithms with enough equal intermediate results may be infeasible, especially for malicious logic, where one would have to check file accesses.
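A minimal sketch of the voting idea, with three toy "independently written" versions of a summation routine, one of which plays the infected copy (all function names are illustrative; real N-version systems run the versions concurrently and vote on intermediate results too):

```python
from collections import Counter

def v1(xs):                     # version 1
    return sum(xs)

def v2(xs):                     # version 2: same spec, different code
    total = 0
    for x in xs:
        total += x
    return total

def v3(xs):                     # version 3: "infected", gives bad output
    return sum(xs) + 1

def majority(versions, xs):
    votes = Counter(v(xs) for v in versions)
    return votes.most_common(1)[0][0]

# The two correct versions outvote the infected one.
assert majority([v1, v2, v3], [1, 2, 3]) == 6
```

The sketch also exposes the assumptions on the slide: voting only helps if a majority of versions are clean, and only where the versions are expected to produce identical intermediate results.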
Proof-Carrying Code
The code consumer (user) specifies a safety requirement; the code producer (author) generates a proof that the code meets this requirement.
The proof is integrated with the executable code, so changing the code invalidates the proof.
The binary (code + proof) is delivered to the consumer, who validates the proof.
Example statistics on the Berkeley Packet Filter: proofs of 300–900 bytes, validated in 0.3–1.3 ms.
The startup cost is higher, but the runtime cost is considerably lower.
19.6.6. Malicious Logic Altering Statistical Characteristics
Example: an application had 3 programmers working on it, but statistical analysis shows code from a fourth person; that code may come from a Trojan horse or virus.
Other attributes to watch: more conditionals than in the original; identical sequences of bytes not common to any library routine; increases in file size; frequency of writing to executables; etc.
Denning: use an intrusion detection system to detect these.
The Notion of Trust The effectiveness of any security mechanism depends on the security of the underlying base on which the mechanism is implemented and the correctness of the implementation. “Secure”, like “trust”, is a relative notion. The design of any mechanism for enhancing computer security must attempt to balance the cost of the mechanism against the level of security desired and the degree of trust in the base that the site accepts as reasonable.