1
Web Mining
- Web mining is the use of data mining techniques to automatically discover and extract information from web documents and services
- Discovering useful information from the World-Wide Web and its usage patterns
- My definition: using data mining techniques to make the web more useful and more profitable (for some), and to increase the efficiency of our interaction with the web
2
Web Mining
Data mining techniques:
- Association rules
- Sequential patterns
- Classification
- Clustering
- Outlier discovery
Applications to the Web:
- E-commerce
- Information retrieval (search)
- Network management
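To make the first of these techniques concrete, here is a minimal Python sketch of pairwise association-rule mining over toy user "transactions". The service names, thresholds, and data are invented for illustration; a real web mining system would run an Apriori- or FP-growth-style miner over far larger logs.

```python
from itertools import combinations
from collections import Counter

# Toy "transactions": the set of services each user is observed to use.
transactions = [
    {"aol", "etrade", "amazon"},
    {"aol", "etrade"},
    {"aol", "etrade", "zagats"},
    {"amazon", "zagats"},
]

min_support = 0.5      # rule must cover at least half of all users
min_confidence = 0.9   # P(rhs | lhs) threshold

n = len(transactions)
pair_counts = Counter()
item_counts = Counter()
for t in transactions:
    for item in t:
        item_counts[item] += 1
    for a, b in combinations(sorted(t), 2):
        pair_counts[(a, b)] += 1

# Emit rules lhs -> rhs that clear both thresholds.
for (a, b), pair_n in pair_counts.items():
    if pair_n / n < min_support:
        continue
    for lhs, rhs in ((a, b), (b, a)):
        confidence = pair_n / item_counts[lhs]
        if confidence >= min_confidence:
            print(f"{lhs} -> {rhs}: support={pair_n / n:.2f}, confidence={confidence:.2f}")
```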
3
Examples of Discovered Patterns
- Association rules: 98% of AOL users also have E-trade accounts
- Classification: people with age less than 40 and salary > 40k trade on-line
- Clustering: users A and B access similar URLs
- Outlier detection: user A spends more than twice the average amount of time surfing on the Web
4
Web Mining
The WWW is a huge, widely distributed, global information service centre for:
- Information services: news, advertisements, consumer information, financial management, education, government, e-commerce, etc.
- Hyper-link information
- Access and usage information
The WWW provides rich sources of data for data mining.
5
Why Mine the Web?
Enormous wealth of information on the Web:
- Financial information (e.g., stock quotes)
- Book/CD/video stores (e.g., Amazon)
- Restaurant information (e.g., Zagats)
- Car prices (e.g., Carpoint)
Lots of data on user access patterns:
- Web logs contain the sequences of URLs accessed by users
Possible to mine interesting nuggets of information:
- People who ski also travel frequently to Europe
- Tech stocks have corrections in the summer and rally from November until February
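As an illustration of what mining web logs can mean, the following sketch groups hypothetical log lines into per-user click sequences and counts consecutive URL pairs. The log format and URLs are invented; a real system would apply a proper sequential-pattern algorithm to much larger data.

```python
from collections import defaultdict, Counter

# Hypothetical simplified log lines: "user_id timestamp url".
log_lines = [
    "u1 100 /home", "u1 101 /ski-gear", "u1 102 /travel/europe",
    "u2 200 /home", "u2 201 /ski-gear", "u2 202 /travel/europe",
    "u3 300 /home", "u3 301 /stocks/tech",
]

# Group hits into a per-user sequence ordered by timestamp.
sessions = defaultdict(list)
for line in log_lines:
    user, ts, url = line.split()
    sessions[user].append((int(ts), url))

# Count consecutive URL pairs across all users' click sequences.
pair_counts = Counter()
for user, hits in sessions.items():
    urls = [u for _, u in sorted(hits)]
    pair_counts.update(zip(urls, urls[1:]))

for (a, b), count in pair_counts.most_common(3):
    print(f"{a} -> {b} seen {count} times")
```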
6
Why is Web Mining Different?
The Web is more than a huge collection of documents; it also has:
- Hyper-link information
- Access and usage information
The Web is very dynamic:
- New pages are constantly being generated
Challenge: develop new Web mining algorithms and adapt traditional data mining algorithms to:
- Exploit hyper-links and access patterns
- Be incremental
7
Web Mining Applications
E-commerce (infrastructure):
- Generate user profiles
- Targeted advertising
- Fraud detection
- Similar image retrieval
Information retrieval (search) on the Web:
- Automated generation of topic hierarchies
- Web knowledge bases
- Extraction of schema for XML documents
Network management:
- Performance management
- Fault management
8
User Profiling
Important for improving customization:
- Provide users with pages and advertisements of interest
- Example profiles: on-line trader, on-line shopper
Generate user profiles based on their access patterns:
- Cluster users based on frequently accessed URLs
- Use a classifier to generate a profile for each cluster
Engage Technologies:
- Tracks web traffic to create anonymous user profiles of Web surfers
- Has profiles for more than 35 million anonymous users
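A minimal sketch of the cluster-then-profile idea, assuming scikit-learn is available. The URL access matrix is toy data, and the "profile" here is simply each centroid's top URLs rather than a separately trained classifier per cluster.

```python
import numpy as np
from sklearn.cluster import KMeans

# Rows = users, columns = URLs; each entry = number of accesses (hypothetical data).
urls = ["/quotes", "/trade", "/cart", "/checkout"]
access = np.array([
    [9, 7, 0, 0],   # behaves like an on-line trader
    [8, 6, 1, 0],   # behaves like an on-line trader
    [0, 0, 9, 8],   # behaves like an on-line shopper
    [1, 0, 7, 9],   # behaves like an on-line shopper
])

# Cluster users by their access patterns.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(access)

# Describe each cluster ("profile") by the URLs its centroid weights most heavily.
for c in range(2):
    top = np.argsort(km.cluster_centers_[c])[::-1][:2]
    members = np.where(km.labels_ == c)[0].tolist()
    print(f"cluster {c}: users {members}, top URLs {[urls[i] for i in top]}")
```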
9
Internet Advertising
- Ads are a major source of revenue for Web portals (e.g., Yahoo, Lycos) and e-commerce sites
- Plenty of startups doing internet advertising: DoubleClick, AdForce, Flycast, AdKnowledge
- Internet advertising is probably the "hottest" web mining application today
10
Fraud
- With the growing popularity of e-commerce, systems to detect and prevent fraud on the Web become important
- Maintain a signature for each user based on buying patterns on the Web (e.g., amount spent, categories of items bought)
- If the buying pattern changes significantly, signal possible fraud
- HNC Software uses domain knowledge and neural networks for credit card fraud detection
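A toy illustration of the signature idea: keep a per-user purchase history and flag amounts that deviate strongly from it. The threshold and amounts are invented, and this simple statistical check stands in for, rather than reproduces, an approach like HNC's neural-network system.

```python
import statistics

# Hypothetical purchase history (amounts in dollars) forming the user's signature.
history = [35.0, 42.5, 28.0, 50.0, 39.0, 45.5]

def looks_fraudulent(amount, history, k=3.0):
    """Flag a purchase that deviates from the user's signature by more than k std devs."""
    mean = statistics.mean(history)
    std = statistics.stdev(history)
    return abs(amount - mean) > k * max(std, 1e-9)

print(looks_fraudulent(48.0, history))    # False: in line with past behaviour
print(looks_fraudulent(900.0, history))   # True: far outside the signature
```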
11
Retrieval of Similar Images
Given: a set of images
Find:
- All images similar to a given image
- All pairs of similar images
Sample applications:
- Medical diagnosis
- Weather prediction
- Web search engine for images
- E-commerce
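One simple way to realize similar-image retrieval is to compare low-level feature vectors such as color histograms. The sketch below uses synthetic NumPy arrays in place of real images, and plain Euclidean distance, which is only one of many possible similarity measures.

```python
import numpy as np

def color_histogram(image, bins=8):
    """Concatenate per-channel intensity histograms as a simple image feature vector."""
    feats = [np.histogram(image[..., c], bins=bins, range=(0, 256), density=True)[0]
             for c in range(3)]
    return np.concatenate(feats)

# Synthetic stand-ins for real images (H x W x RGB arrays).
rng = np.random.default_rng(0)
reddish_a = rng.integers(0, 256, (32, 32, 3))
reddish_a[..., 0] = 230
reddish_b = rng.integers(0, 256, (32, 32, 3))
reddish_b[..., 0] = 235
bluish = rng.integers(0, 256, (32, 32, 3))
bluish[..., 2] = 240

database = {"reddish_b": reddish_b, "bluish": bluish}
query = color_histogram(reddish_a)

# Rank database images by feature-space distance to the query image.
ranked = sorted(database.items(),
                key=lambda kv: np.linalg.norm(color_histogram(kv[1]) - query))
print([name for name, _ in ranked])   # nearest first: ['reddish_b', 'bluish']
```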
12
Problems with Web Search Today
Today's search engines are plagued by problems:
- The abundance problem (99% of info is of no interest to 99% of people)
- Limited coverage of the Web (internet sources hidden behind search interfaces); the largest crawlers cover < 18% of all web pages
- Limited query interface, based on keyword-oriented search
- Limited customization to individual users
13
Problems with Web Search Today
Today's search engines are plagued by problems:
- The Web is highly dynamic: lots of pages are added, removed, and updated every day
- Very high dimensionality
14
Improve Search By Adding Structure to the Web
Use Web directories (or topic hierarchies):
- Provide a hierarchical classification of documents (e.g., Yahoo!)
- Searches performed in the context of a topic restrict the search to only the subset of web pages related to the topic
[Diagram: example topic hierarchy rooted at the Yahoo home page, with top-level topics Recreation, Business, Science, News and subtopics such as Travel, Sports, Companies, Finance, Jobs]
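A small sketch of topic-restricted search: a toy directory tree and a keyword search limited to the pages under a chosen topic. The hierarchy, page paths, and nesting are invented for illustration.

```python
# A tiny topic hierarchy in the style of a Web directory: topic -> (pages, subtopics).
directory = {
    "Recreation": {"pages": ["/rec/hiking"], "children": ["Travel", "Sports"]},
    "Travel":     {"pages": ["/travel/europe", "/travel/asia"], "children": []},
    "Sports":     {"pages": ["/sports/skiing"], "children": []},
    "Business":   {"pages": ["/biz/finance", "/biz/jobs"], "children": []},
}

def pages_under(topic):
    """All pages in the subtree rooted at `topic`."""
    node = directory[topic]
    pages = list(node["pages"])
    for child in node["children"]:
        pages += pages_under(child)
    return pages

def search(keyword, topic):
    """Keyword search restricted to the subset of pages under a topic."""
    return [p for p in pages_under(topic) if keyword in p]

print(search("europe", "Recreation"))   # ['/travel/europe']
print(search("europe", "Business"))     # [] - outside the topic's subtree
```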
15
Automatic Creation of Web Directories
In the Clever project, hyper-links between Web pages are taken into account when categorizing them:
- Uses a Bayesian classifier
- Exploits knowledge of the classes of the immediate neighbors of the document to be classified
- Shows that simply taking text from neighbors and using standard document classifiers to classify the page does not work
Inktomi's Directory Engine uses "Concept Induction" to automatically categorize millions of documents.
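The sketch below illustrates the general idea of letting neighbor classes influence classification, by adding pseudo-tokens for the classes of a page's link neighbors to its features and training a naive Bayes model (scikit-learn assumed). This is a simplification for illustration, not the Clever project's actual algorithm; the pages, labels, and neighbor tokens are invented.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Each page: its own text plus pseudo-tokens for the (known) classes of its link neighbors.
pages = [
    ("stock quotes portfolio trading", ["NBR_finance", "NBR_finance"], "finance"),
    ("mutual funds bonds trading",     ["NBR_finance"],                "finance"),
    ("ski resorts snow lifts",         ["NBR_travel"],                 "travel"),
    ("flights hotels europe tours",    ["NBR_travel", "NBR_travel"],   "travel"),
]

docs = [text + " " + " ".join(nbrs) for text, nbrs, _ in pages]
labels = [label for _, _, label in pages]

vec = CountVectorizer()
X = vec.fit_transform(docs)
clf = MultinomialNB().fit(X, labels)

# Classify a new page whose own text is ambiguous but whose neighbors are finance pages.
new_doc = ["europe trading NBR_finance NBR_finance"]
print(clf.predict(vec.transform(new_doc)))   # ['finance']
```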
16
Network Management
Objective: to deliver content to users quickly and reliably
- Traffic management
- Fault management
[Diagram: a service provider network with routers and servers]
17
Why is Traffic Management Important?
While annual bandwidth demand is increasing ten-fold on average, annual bandwidth supply is rising only by a factor of three.
The result is frequent congestion at servers and on network links:
- During a major event (e.g., Princess Diana's death), an overwhelming number of user requests can result in millions of redundant copies of data flowing back and forth across the world
- Olympic sites during the games
- NASA sites close to launch and landing of shuttles
18
Traffic Management: Key Ideas
- Dynamically replicate/cache content at multiple sites within the network and closer to the user
- Multiple paths between any pair of sites
- Route user requests to the server closest to the user or to the least loaded server
- Use the path with the least congested network links
Examples: Akamai, Inktomi
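A tiny sketch of the request-routing idea: prefer the closest replica, but skip servers that are overloaded. The server names, round-trip times, and load threshold are invented.

```python
# Hypothetical replica servers: "distance" to the user (RTT) and current load (0..1).
servers = [
    {"name": "us-east", "rtt_ms": 20,  "load": 0.9},
    {"name": "us-west", "rtt_ms": 70,  "load": 0.3},
    {"name": "eu-west", "rtt_ms": 120, "load": 0.2},
]

def pick_server(servers, max_load=0.8):
    """Prefer the closest server, but skip servers that are overloaded."""
    candidates = [s for s in servers if s["load"] < max_load] or servers
    return min(candidates, key=lambda s: s["rtt_ms"])

print(pick_server(servers)["name"])   # 'us-west': us-east is closer but overloaded
```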
19
Traffic Management
[Diagram: a service provider network with routers and servers, showing a user request, a congested link, and a congested server]
20
Traffic Management
Need to mine network and Web traffic to determine:
- What content to replicate?
- Which servers should store replicas?
- To which server should a user request be routed?
- What path to use to route packets?
Network design issues:
- Where to place servers?
- Where to place routers?
- Which routers should be connected by links?
One can use association rules and sequential pattern mining algorithms to cache/prefetch replicas at servers.
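For example, transition counts mined from access logs can drive prefetching decisions, as in this sketch. The counts and pages are invented; a real deployment would mine them with a sequential-pattern algorithm and refresh them continuously.

```python
from collections import Counter

# Transition counts mined from access logs (e.g., with a sequential-pattern miner):
# (current page, next page) -> number of times that transition was observed.
transitions = Counter({
    ("/home", "/news"): 120,
    ("/home", "/sports"): 45,
    ("/news", "/news/world"): 80,
})

def prefetch_candidates(current_page, transitions, min_count=50):
    """Pages worth prefetching/caching because they frequently follow the current page."""
    followers = [(nxt, n) for (cur, nxt), n in transitions.items()
                 if cur == current_page and n >= min_count]
    return [nxt for nxt, _ in sorted(followers, key=lambda x: -x[1])]

print(prefetch_candidates("/home", transitions))   # ['/news']
```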
21
Fault Management
Fault management involves:
- Quickly identifying failed/congested servers and links in the network
- Re-routing user requests and packets to avoid congested/down servers and links
Need to analyze alarm and traffic data to carry out root cause analysis of faults.
Bayesian classifiers can be used to predict the root cause given a set of alarms.
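A minimal sketch of alarm-based root-cause prediction with a Bernoulli naive Bayes classifier (scikit-learn assumed). The alarm types, past incidents, and labels are invented for illustration.

```python
import numpy as np
from sklearn.naive_bayes import BernoulliNB

# Columns = alarms (1 if raised), rows = past incidents with a known root cause.
alarms = ["link_down", "high_latency", "server_timeout", "packet_loss"]
X = np.array([
    [1, 1, 0, 1],   # fibre cut
    [1, 0, 0, 1],   # fibre cut
    [0, 1, 1, 0],   # server overload
    [0, 0, 1, 0],   # server overload
])
y = ["fibre_cut", "fibre_cut", "server_overload", "server_overload"]

clf = BernoulliNB().fit(X, y)

# Predict the most likely root cause for a new alarm pattern.
new_alarms = np.array([[1, 1, 0, 0]])
print(clf.predict(new_alarms))                                        # ['fibre_cut']
print(dict(zip(clf.classes_, clf.predict_proba(new_alarms)[0].round(2))))
```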
22
Web Mining Issues
Size:
- Grows at about 1 million pages a day
- Google indexes 9 billion documents
- Number of web sites: the Netcraft survey says 72 million sites
Diverse types of data:
- Images
- Text
- Audio/video
- XML
- HTML
23
Total Sites Across All Domains August 1995 - October 2007
[Chart: total sites and number of active sites across all domains, August 1995 - October 2007]
24
Systems Issues
Web data sets can be very large:
- Tens to hundreds of terabytes
- Cannot mine on a single server!
- Need large farms of servers
How to organize hardware/software to mine multi-terabyte data sets, without breaking the bank!