Presentation transcript: "Issues in Data Mining Infrastructure"

1. Issues in Data Mining Infrastructure
Authors: Nemanja Jovanovic (nemko@acm.org), Valentina Milenkovic (tina@eunet.yu), Prof. Dr. Veljko Milutinovic (vm@etf.bg.ac.yu)
http://galeb.etf.bg.ac.yu/~vm

2. Data Mining in a Nutshell
- Uncovering hidden knowledge
- Huge, NP-complete search space
- Multidimensional interface

3. A Problem...
You are a marketing manager for a cellular phone company.
- Problem: churn is too high
- Winning back a customer who has quit is both difficult and expensive
- Giving a new telephone to everyone whose contract is expiring is very expensive (as well as wasteful)
- You pay a sales commission of $250 per contract
- Customers receive a free phone (cost $125) with each contract
- Turnover (after a contract expires) is 40%

4. ...A Solution
- Three months before a contract expires, predict which customers will leave
- If you want to keep a customer who is predicted to churn, offer them a new phone
- Customers who are not predicted to churn need no attention
- If you don't want to keep the customer, do nothing
- How can you predict future behavior? Tarot cards? A magic ball? Data mining?
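The economics of this scenario can be sanity-checked with a back-of-the-envelope calculation. In the sketch below, only the $125 phone cost and the 40% turnover come from the slides; the customer count and the model's precision and recall are hypothetical assumptions.

```python
# Back-of-the-envelope churn economics. Only PHONE_COST and CHURN_RATE
# come from the slides; the remaining numbers are hypothetical.
CUSTOMERS = 100_000   # hypothetical: contracts expiring this quarter
PHONE_COST = 125      # $ cost of the free phone
CHURN_RATE = 0.40     # 40% turnover after contract expiry

# Blanket strategy: give every expiring customer a new phone.
blanket_cost = CUSTOMERS * PHONE_COST

# Targeted strategy: a model flags likely churners; only they get a phone.
RECALL = 0.70         # hypothetical: fraction of churners the model catches
PRECISION = 0.50      # hypothetical: fraction of flagged customers who would really churn
flagged = CUSTOMERS * CHURN_RATE * RECALL / PRECISION
targeted_cost = flagged * PHONE_COST

print(f"Blanket offer:  ${blanket_cost:,.0f}")   # $12,500,000
print(f"Targeted offer: ${targeted_cost:,.0f}")  # $7,000,000
```

Under these assumptions even a mediocre model cuts the phone budget roughly in half, which is the business case for the prediction step.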

5. Still Skeptical?

6. The Definition
The automated extraction of predictive information from (large) databases. Key terms:
- Automated
- Extraction
- Predictive
- Databases

7. History of Data Mining

8. Repetition in Solar Activity
- 1613 – Galileo Galilei
- 1859 – Heinrich Schwabe

9. The Return of Halley's Comet
Recorded appearances: 239 BC ... 1531, 1607, 1682, 1910, 1986; next predicted return: 2061
Edmund Halley (1656–1742)

10. Data Mining is Not
- Data warehousing
- Ad-hoc query/reporting
- Online Analytical Processing (OLAP)
- Data visualization

11. Data Mining is
- Automated extraction of predictive information from various data sources
- Powerful technology with great potential to help users focus on the most important information stored in data warehouses or streamed through communication lines

12. Data Mining Can
- Answer questions that were too time-consuming to resolve in the past
- Predict future trends and behaviors, allowing us to make proactive, knowledge-driven decisions

13. Focus of this Presentation
- Data Mining problem types
- Data Mining models and algorithms
- Efficient Data Mining
- Available software

14. Data Mining Problem Types

15. Data Mining Problem Types
- Six types
- Often a combination solves the problem

16. Data Description and Summarization
- Aims at a concise description of data characteristics
- At the lower end of the scale of problem types
- Provides the user with an overview of the data structure
- Typically a subgoal

17. Segmentation
- Separates the data into interesting and meaningful subgroups or classes
- Manual or (semi-)automatic
- A problem in its own right or just a step in solving a larger problem

18. Classification
- Assumption: there exist objects whose characteristics assign them to different classes
- Builds classification models that assign correct class labels in advance
- Found in a wide range of applications
- Segmentation can provide labels or restrict data sets

19. Concept Description
- Understandable description of concepts or classes
- Close connection to both segmentation and classification
- Similarities to and differences from classification

20. Prediction (Regression)
- Similar to classification; the difference: the discrete target becomes continuous
- Finds the numerical value of the target attribute for unseen objects

21. Dependency Analysis
- Finding a model that describes significant dependencies between data items or events
- Prediction of the value of a data item
- Special case: associations (see the sketch below)
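For the association special case, the two standard measures are support and confidence. A minimal sketch computing both for one candidate rule over hypothetical market-basket transactions (the data is invented for illustration):

```python
# Support and confidence for the candidate rule {bread} -> {milk},
# over hypothetical market-basket transactions.
transactions = [{"bread", "milk"}, {"bread", "butter"},
                {"bread", "milk", "butter"}, {"milk"}]

antecedent, consequent = {"bread"}, {"milk"}
n_antecedent = sum(antecedent <= t for t in transactions)           # baskets with bread
n_both = sum((antecedent | consequent) <= t for t in transactions)  # baskets with both

print(f"support    = {n_both / len(transactions):.2f}")  # P(bread and milk) = 0.50
print(f"confidence = {n_both / n_antecedent:.2f}")       # P(milk | bread)   = 0.67
```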

22. Data Mining Models

23. Neural Networks
- Characterize processed data with a single numeric value
- Efficient modeling of large and complex problems
- Based on biological structures: neurons
- A network consists of neurons grouped into layers

24. Neuron Functionality
Inputs I1, I2, ..., In enter the neuron with weights W1, W2, ..., Wn; an activation function f produces the output:
Output = f(W1*I1 + W2*I2 + ... + Wn*In)
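A minimal executable version of that formula, assuming a sigmoid activation and an optional bias term (the slide specifies neither):

```python
import math

def neuron(inputs, weights, bias=0.0):
    # Weighted sum of inputs, passed through the activation function f.
    s = sum(w * i for w, i in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-s))  # f = logistic (sigmoid); an assumption

# Output = f(W1*I1 + W2*I2 + ... + Wn*In)
print(neuron([0.5, 0.2, 0.9], weights=[0.4, -0.6, 0.3]))
```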

25. Training Neural Networks
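The slide does not name a training algorithm; the usual choice is gradient descent on the prediction error. A minimal sketch under that assumption, training one sigmoid neuron to learn the logical OR function:

```python
import math

# One sigmoid neuron trained by gradient descent (an assumption -- the
# slide does not specify an algorithm) to learn logical OR.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
w, b, lr = [0.0, 0.0], 0.0, 0.5

def forward(x):
    return 1.0 / (1.0 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))

for _ in range(5000):
    for x, target in data:
        out = forward(x)
        grad = (out - target) * out * (1 - out)  # chain rule through the sigmoid
        w = [wi - lr * grad * xi for wi, xi in zip(w, x)]
        b -= lr * grad

print([round(forward(x), 2) for x, _ in data])  # approaches [0, 1, 1, 1]
```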

26. Neural Networks – Conclusion
- Once trained, neural networks can efficiently estimate the value of the output variable for a given input
- Neurons and network topology are essential
- Usually used for prediction or regression problem types
- Difficult to understand
- Data pre-processing is often required

27. Decision Trees
- A way of representing a series of rules that lead to a class or value
- Iterative splitting of data into discrete groups, maximizing the distance between them at each split
- Algorithms: CHAID, CART, QUEST, C5.0
- Classification trees and regression trees
- Unlimited growth vs. stopping rules
- Univariate splits and multivariate splits

28. Decision Trees (example)
[Example tree: root splits on Balance (> 10 vs. <= 10), then on Age (<= 32 vs. > 32), then on Married (NO vs. YES)]
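A sketch of how such a tree can be induced, assuming scikit-learn is available; the toy records are invented to echo the slide's Balance/Age/Married splits, so the learned tree only roughly resembles the figure:

```python
# Toy decision tree (scikit-learn assumed available; data invented to
# echo the slide's Balance / Age / Married splits).
from sklearn.tree import DecisionTreeClassifier, export_text

# Features: [balance, age, married (1 = YES, 0 = NO)]
X = [[5, 25, 0], [8, 40, 1], [12, 30, 0], [15, 45, 1],
     [9, 33, 1], [11, 28, 0], [14, 35, 1], [7, 50, 0]]
y = ["no", "no", "yes", "yes", "no", "yes", "yes", "no"]  # hypothetical labels

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["balance", "age", "married"]))
```

With this data the root split falls on balance at roughly 10, mirroring the slide's first split.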

29. Decision Trees

30. Rule Induction
- A method of deriving a set of rules to classify cases
- Creates independent rules that are unlikely to form a tree
- Rules may not cover all possible situations
- Rules may sometimes conflict in a prediction

31. Rule Induction (example rules)
- If balance > 100,000 then confidence = HIGH (weight = 1.7)
- If balance > 25,000 and status = married then confidence = HIGH (weight = 2.3)
- If balance < 40,000 then confidence = LOW (weight = 1.9)
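A sketch of how these rules might be applied. A married customer with a balance of 30,000 matches both the second and third rule; the conflict is resolved here by picking the class with the larger total weight, which is one plausible scheme the slides leave open:

```python
# The slide's three rules; conflicting predictions are resolved by
# summing the rule weights per class (one plausible scheme -- the
# slides do not prescribe a conflict-resolution strategy).
rules = [
    (lambda c: c["balance"] > 100_000,                             "HIGH", 1.7),
    (lambda c: c["balance"] > 25_000 and c["status"] == "married", "HIGH", 2.3),
    (lambda c: c["balance"] < 40_000,                              "LOW",  1.9),
]

def classify(case):
    votes = {}
    for condition, label, weight in rules:
        if condition(case):
            votes[label] = votes.get(label, 0.0) + weight
    # None means no rule covers the case (rules need not cover everything).
    return max(votes, key=votes.get) if votes else None

print(classify({"balance": 30_000, "status": "married"}))  # HIGH (2.3 beats 1.9)
```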

32. K-Nearest Neighbor and Memory-Based Reasoning (MBR)
- Uses knowledge of previously solved, similar problems to solve a new problem
- Assigns the class to which most of the k "neighbors" belong
- First step: finding a suitable measure for the distance between attributes in the data (how far is black from green?)
- Pro: easy handling of non-standard data types
- Con: huge models
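A sketch of k-NN with a simple mixed-type distance: numeric attributes contribute their absolute difference, while categorical attributes (how far is black from green?) contribute 0 when equal and 1 otherwise, which is one common answer to that question. All records here are invented.

```python
from collections import Counter

def distance(a, b):
    # Numeric fields: absolute difference; categorical fields: 0/1 mismatch
    # (one common convention -- the slides only pose the question).
    total = 0.0
    for x, y in zip(a, b):
        total += abs(x - y) if isinstance(x, (int, float)) else (x != y)
    return total

def knn_predict(train, query, k=3):
    nearest = sorted(train, key=lambda record: distance(record[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# (age, favorite color) -> class; hypothetical records
train = [((25, "green"), "A"), ((30, "black"), "B"),
         ((27, "green"), "A"), ((45, "black"), "B")]
print(knn_predict(train, (26, "green")))  # majority of the 3 nearest -> "A"
```

The "huge models" drawback is visible here: the model is the entire training set.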

33. K-Nearest Neighbor and Memory-Based Reasoning (MBR)

34. Data Mining Models and Algorithms
- Logistic regression
- Discriminant analysis
- Generalized Additive Models (GAM)
- Genetic algorithms
- Etc.
- Many other models and algorithms are available
- Many application-specific variations of known models
- The final implementation usually involves several techniques
- Select the solution that best matches the desired results

35. Efficient Data Mining

36. [Humorous troubleshooting flowchart: "Is it working?", "Did you mess with it?", "Will it explode in your hands?", "Anyone else knows?", "Can you blame someone else?", with outcomes "Don't mess with it!", "You shouldn't have!", "Look the other way", "Hide it", "You're in TROUBLE!", and "NO PROBLEM!"]

37. DM Process Models
- CRISP-DM – tending to become the standard
- 5A – used by SPSS Clementine (Assess, Access, Analyze, Act, Automate)
- SEMMA – used by SAS Enterprise Miner (Sample, Explore, Modify, Model, Assess)

38. CRISP-DM
- CRoss-Industry Standard Process for Data Mining
- Conceived in 1996 by three companies: DaimlerChrysler (then Daimler-Benz), SPSS (then ISL), and NCR

39. CRISP-DM Methodology
Four-level breakdown of the CRISP-DM methodology:
1. Phases
2. Generic Tasks
3. Specialized Tasks
4. Process Instances

40. Mapping Generic Models to Specialized Models
- Analyze your specific context
- Remove any details not applicable to the context
- Add any details specific to the context
- Specialize generic contents according to the concrete characteristics of the context
- Possibly rename generic contents to give them more explicit meanings

41. Generalized and Specialized Cooking
Preparing food on your own (generic):
- Find out what you want to eat
- Find the recipe for that meal
- Gather the ingredients
- Prepare the meal
- Enjoy your food
- Clean up everything (or leave it for later)
Raw steak with vegetables (specialized):
- Check the cookbook or call mom
- Defrost the meat (if you had it in the fridge)
- Buy missing ingredients or borrow them from the neighbors
- Cook the vegetables and fry the meat
- Enjoy your food even more
- You were cooking, so convince someone else to do the dishes

42. CRISP-DM Model
The six phases, arranged in a cycle:
- Business understanding
- Data understanding
- Data preparation
- Modeling
- Evaluation
- Deployment

43. Customizing a Web Page
- User-friendly design
- Prediction of the user's interests
- Reduction of server workload
- Reduction of Web traffic

44. Customizing a Web Page

45. Business Understanding
- Determine business objectives
- Assess situation
- Determine data mining goals
- Produce project plan

46. Business Understanding – Outputs
- Background
- Business objectives and success criteria
- Inventory of resources
- Requirements, assumptions, and constraints
- Risks and contingencies
- Terminology
- Costs and benefits
- Data mining goals and success criteria
- Project plan
- Initial assessment of tools and techniques

47. Customizing a Web Page – Business Understanding Example
- Business objectives: make the user's surfing more comfortable; decrease overhead for users; reduce server workload and Web traffic
- Assess the situation
- Data mining goals: find patterns in user behavior
- Project plan

48. Data Understanding
- Collect initial data
- Describe data
- Explore data
- Verify data quality

49. Data Understanding – Outputs
- Data collection report: background of the data; list of data sources; for each data source, the method of acquisition; problems encountered during acquisition
- Data description report: detailed description of each data source; list of tables or other database objects; description of each field, including units, codes, etc.
- Data exploration report: expected regularities or patterns and methods of detection; regularities or patterns found (expected and unexpected); any other surprises; conclusions for data transformation, data cleaning, and other pre-processing; conclusions related to data mining goals or business objectives
- Data quality report: approach taken to assess data quality; results of the data quality assessment

50. Customizing a Web Page – Data Understanding Example
Tasks: collect the data; describe the data; explore the data; verify data quality. In this example:
- Update the server to monitor user behavior
- Record user activities into storage
- Analyze the recorded data
- Decide which data is usable for mining

51. Data Preparation
- Select data
- Clean data
- Construct data
- Integrate data
- Format data

52. Data Preparation – Outputs
- Dataset
- Dataset description report: background, including broad goals and the plan for pre-processing; description of pre-processing; detailed description of the resulting datasets; rationale for inclusion/exclusion of attributes; discoveries made during pre-processing and implications for further work

53. Customizing a Web Page – Data Preparation Example
- Decide from what period the users' monitored actions will be considered
- Make assumptions about unnecessary monitored data and discard it
- Classify user actions into categories, group interesting links, etc.
- If more information about the user is available from other sources, use it
- Transform the data into suitable forms so that several modeling techniques can be applied (a sketch of this step follows)
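A sketch of the transformation step: turning a raw click log into one feature row per user. The log layout and category names are hypothetical, since the slides do not fix a schema.

```python
# Turn a raw click log into per-user category-frequency features
# (hypothetical schema; the slides do not specify one).
from collections import Counter, defaultdict

clicks = [  # (user, category of the visited link)
    ("u1", "sports"), ("u1", "sports"), ("u1", "news"),
    ("u2", "music"),  ("u2", "news"),
]

CATEGORIES = ["sports", "news", "music"]
profiles = defaultdict(Counter)
for user, category in clicks:
    profiles[user][category] += 1

# One row per user: fraction of clicks per category, ready for modeling.
for user, counts in profiles.items():
    total = sum(counts.values())
    print(user, [round(counts[c] / total, 2) for c in CATEGORIES])
```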

54. Modeling
- Select modeling technique
- Generate test design
- Build model
- Assess model

55. Modeling – Outputs
- Test design: broad description of the type of model and the training data to be used; explanation of how the model will be tested or assessed; description of any data required for testing; description of any planned examination of models by domain or data experts
- Model description: type of model and its relation to the data mining goals; parameter settings used to produce the model; detailed description of the model and any special features; conclusions regarding patterns in the data
- Model assessment: overview of the assessment process, including deviations from the test plan; detailed assessment of the model; comments on the models by domain or data experts; insights into why a certain modeling technique and certain parameter settings lead to good or bad results

56. Customizing a Web Page – Modeling Example
- The problem is prediction of behavior
- Regression could be a good solution, given the nature of the data (a sketch follows)
- Create the software according to the project plan
- Observe the behavior of the software
- Tune the model after each evaluation phase if needed
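A sketch of that modeling step, assuming scikit-learn and feature rows like the ones built in the data-preparation example. Logistic regression is one regression-family choice suited to a binary target (will the user follow a link or not); all numbers are hypothetical.

```python
# Logistic regression on hypothetical per-user category profiles
# (scikit-learn assumed available; data invented for illustration).
from sklearn.linear_model import LogisticRegression

X = [[0.7, 0.3, 0.0],   # [sports, news, music] click fractions per user
     [0.0, 0.5, 0.5],
     [0.6, 0.4, 0.0],
     [0.1, 0.2, 0.7]]
y = [1, 0, 1, 0]        # 1 = user followed the promoted sports link

model = LogisticRegression().fit(X, y)
print(model.predict_proba([[0.8, 0.2, 0.0]])[0][1])  # P(click) for a new profile
```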

57. Evaluation
- Evaluate results (results = models + findings)
- Review process
- Determine next steps

58. Evaluation – Outputs
- Assessment of DM results with respect to business success criteria: review of business objectives and business success criteria; comparison between the success criteria and the DM results; conclusions about the achievability of the success criteria and the suitability of the data mining process; review of "project success"; are there new business objectives?
- Review of process
- List of possible actions

59. Customizing a Web Page – Evaluation Example
- Observe the model's behavior at work
- Collect responses from beta testers
- Check user satisfaction
- Check server and network load
- Classify the results
- Determine which parameters of the model should be changed
- Present new ideas and modifications
- Step back into previous phases as needed

60. Deployment
- Plan deployment
- Plan monitoring and maintenance
- Produce final report
- Review project

61. Deployment – Outputs
- Monitoring and maintenance plan: overview of deployment results and an indication of which results may require updating; description of how updating will be triggered; description of how updating will be performed
- Final report: summary of business understanding (background, objectives, and success criteria); summary of the data mining process; summary of the data mining results; summary of the results evaluation; summary of the deployment and maintenance plan; cost/benefit analysis; conclusions for the business; conclusions for future data mining

62. Customizing a Web Page – Deployment Example
- Make the feature available to all users
- Make a plan for maintenance and user feedback
- Analyze costs and benefits
- Summarize the whole documentation
- Summarize the additional network and server activity
- Collect new ideas
- Reward according to results
- Leave room for upgrades

63. At Last…

64. Available Software

65. Available Software
A discussion of data mining vendors and software is not included in this slide set.

66. Conclusions

67. WWW.NBA.COM

68. Se7en

69. CD-ROM

70. Credits
- Anne Stern, SPSS, Inc.
- Djuro Gluvajic, ITE, Denmark
- Obrad Milivojevic, PC PRO, Yugoslavia

71. References
- Bruha, I., "Data Mining, KDD and Knowledge Integration: Methodology and a Case Study", SSGRR 2000
- Fayyad, U., Piatetsky-Shapiro, G., Smyth, P., Uthurusamy, R., "Advances in Knowledge Discovery and Data Mining", MIT Press, 1996
- Glymour, C., Madigan, D., Pregibon, D., Smyth, P., "Statistical Themes and Lessons for Data Mining", Data Mining and Knowledge Discovery 1, 11-28, 1997
- Hecht-Nielsen, R., "Neurocomputing", Addison-Wesley, 1990
- Pyle, D., "Data Preparation for Data Mining", Morgan Kaufmann, 1999
- galeb.etf.bg.ac.yu/~vm
- www.thearling.com
- www.crisp-dm.com
- www.twocrows.com
- www.sas.com/products/miner
- www.spss.com/clementine

72. The END

