Intelligent Detection of Malicious Script Code
CS194, 2007-08
Benson Luk, Eyal Reuveni, Kamron Farrokh
Advisor: Adnan Darwiche

Goals for the Quarter

Phase I
- Set up a machine for the testing environment
- Ensure that the "whitelist" is clean

Phase II
- Modify the crawler to output only the necessary data. This means:
  - Grab only the necessary information from the web-crawling results
  - Listen in on Internet Explorer's JavaScript interpreter and output relevant behavior

Completed Tasks

Phase I
- Configured the machine with Norton Antivirus and the Heritrix web crawler
  - The web crawler will be used to grab additional URLs, and Norton Antivirus will be used to verify that a URL has not launched an attack
- Created a Python script to ensure that visited sites are clean
  - It captures Norton's web attack log before and after loading a site in Internet Explorer, then compares the logs for new entries and signals whether or not the site's data should be discarded (a sketch of this check follows below)

Phase II
- Configured Heritrix to run targeted crawls against a specific set of domains and output minimal information
  - The purpose is to gather as many URLs with scripts as possible for a large sample base
- Created a parser for Heritrix's crawl logs to filter out irrelevant URLs
  - For example, we omit URLs that point to images, since they will not contain scripts (a sketch of this filter follows after the first code block)
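
A minimal sketch of the cleanliness check described above. The attack-log path, its line-oriented format, and the use of pywin32 COM automation to drive Internet Explorer are assumptions for illustration; the project's actual script may read Norton's logs differently.

```python
# Sketch of the site-cleanliness check: snapshot Norton's web attack log,
# load the URL in Internet Explorer, then look for new log entries.
# ASSUMPTIONS: the log path, its format, and the pywin32/IE automation
# shown here are illustrative, not the project's exact setup.
import time
import win32com.client  # pywin32

ATTACK_LOG = r"C:\ProgramData\Norton\attack_log.txt"  # hypothetical path

def read_log_entries(path):
    """Return the set of lines currently in the attack log."""
    try:
        with open(path, "r", errors="ignore") as f:
            return set(f.readlines())
    except FileNotFoundError:
        return set()

def site_is_clean(url, settle_seconds=10):
    """Load `url` in IE and report False if Norton logged a new web attack."""
    before = read_log_entries(ATTACK_LOG)

    ie = win32com.client.Dispatch("InternetExplorer.Application")
    ie.Visible = False
    ie.Navigate(url)
    while ie.Busy:                 # wait for the page (and its scripts) to load
        time.sleep(1)
    time.sleep(settle_seconds)     # give Norton time to log any detection
    ie.Quit()

    new_entries = read_log_entries(ATTACK_LOG) - before
    return len(new_entries) == 0   # no new entries -> keep this site's data

if __name__ == "__main__":
    for url in ["http://example.com"]:
        print(url, "clean" if site_is_clean(url) else "discard")
```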

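A sketch of the Heritrix log filter mentioned above, assuming the default crawl.log layout in which the fetched URI and its MIME type appear as whitespace-separated fields (columns 4 and 7); the exact column positions and the skip lists are assumptions, not the project's final rules.

```python
# Sketch of the Heritrix crawl.log filter: keep only URLs that could carry
# scripts, dropping obvious image resources.
# ASSUMPTIONS: field positions follow the default crawl.log layout
# (URI in column 4, MIME type in column 7); the skip lists are illustrative.
import sys

IMAGE_EXTENSIONS = (".gif", ".jpg", ".jpeg", ".png", ".bmp", ".ico")
SKIP_MIME_PREFIXES = ("image/",)

def script_candidates(log_path):
    """Yield URLs from a Heritrix crawl.log that are worth fetching for scripts."""
    with open(log_path, "r", errors="ignore") as log:
        for line in log:
            fields = line.split()
            if len(fields) < 7:
                continue                    # malformed or truncated line
            url, mime_type = fields[3], fields[6]
            if url.lower().endswith(IMAGE_EXTENSIONS):
                continue                    # image by file extension
            if mime_type.lower().startswith(SKIP_MIME_PREFIXES):
                continue                    # image by reported MIME type
            yield url

if __name__ == "__main__":
    for url in script_candidates(sys.argv[1]):
        print(url)
```
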
Pending Tasks and Difficulties

Phase I
- Ensure the whitelist is clean
  - This can be a time-consuming task given the massive size of the list; we are going to start with a small subset of the list for now
  - With our scripts we can also check for cleanliness as we load URLs
- Acquire a larger hard drive for the computer so that we can store the data from the crawls
  - We have been unable to run a large crawl on the machine due to low hard drive space

Phase II
- Figure out how to "listen in" on the JavaScript interpreter in Internet Explorer and output relevant information about the scripts currently running
  - This requires intimate knowledge of Internet Explorer and would likely consume too much time to develop from the ground up

Direction for Next Quarter
- Obtain resources and/or software from Symantec for listening in on the JavaScript interpreter
- Install a larger hard drive, ~750 GB
- Design and create a database to store information about the scripts (one possible shape is sketched below)
- Research and design an intelligent learning algorithm to read in and analyze the data
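
One possible shape for the script database mentioned above, sketched with SQLite purely as an illustration; the engine choice and the table and column names are assumptions, since the actual design is still to be decided.

```python
# Sketch of a possible database for storing per-script information.
# ASSUMPTIONS: SQLite as the engine and these table/column names are
# illustrative only; the real schema has not been designed yet.
import sqlite3

SCHEMA = """
CREATE TABLE IF NOT EXISTS sites (
    site_id   INTEGER PRIMARY KEY,
    url       TEXT UNIQUE NOT NULL,
    is_clean  INTEGER NOT NULL          -- 1 if Norton logged no attack
);
CREATE TABLE IF NOT EXISTS scripts (
    script_id INTEGER PRIMARY KEY,
    site_id   INTEGER NOT NULL REFERENCES sites(site_id),
    source    TEXT,                     -- raw script text, if captured
    behavior  TEXT                      -- interpreter output / observed behavior
);
"""

def open_db(path="scripts.db"):
    """Create (if needed) and return a connection to the script database."""
    conn = sqlite3.connect(path)
    conn.executescript(SCHEMA)
    return conn

if __name__ == "__main__":
    conn = open_db()
    conn.execute("INSERT OR IGNORE INTO sites (url, is_clean) VALUES (?, ?)",
                 ("http://example.com", 1))
    conn.commit()
    conn.close()
```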