Adaptive Fraud Detection
IDS Lab Seminar, 2001/11/27
Advisor: Dr. Hsu
Graduate: Yung-Chu Lin
Source: Fawcett, Tom and Foster Provost. "Adaptive Fraud Detection." Data Mining and Knowledge Discovery, Volume 1, Issue 3, September 1997, pp. 291-316.
Outline
- Motivation & objective
- Definition
- What's cloning fraud
- Detriment of cloning fraud
- Strategies for dealing with cloning fraud
- The need to be adaptive
- Problems of learning algorithms
- The detector constructor framework
- How the framework works
- Experiments
- Conclusion
Motivation
- Cellular fraud costs hundreds of millions of dollars per year
- Existing methods are ad hoc
Objective
- Present a framework/system for automatically generating fraud detectors
Definition
- A customer's account = MIN + ESN
  - MIN: Mobile Identification Number
  - ESN: Electronic Serial Number
- Bandit: a cloned-phone user
- Carrier: the cellular service provider
What's Cloning Fraud
- A customer's MIN and ESN are used by someone who is not the customer
- A bandit makes virtually unlimited calls
- The attraction of free and untraceable communication makes cloning fraud popular
Detriment of Cloning Fraud
- Service may be denied to legitimate customers
- The crediting process is costly to the carrier and inconvenient to the customer
- Fraud incurs land-line usage charges
- Cellular carriers must pay costs to other carriers
Strategies for Dealing with Cloning Fraud
- Pre-call methods
- Post-call methods
- User profiling
Pre-call Methods
- Requiring a PIN (Personal Identification Number)
  - The PIN is entered before every call
- RF fingerprinting
  - Identifying cellular phones by their transmission characteristics
- Authentication
  - A reliable and secure private-key encryption method
Post-call Methods
- Collision detection
  - Analyzing call data for temporally overlapping calls
- Velocity checking
  - Analyzing the locations and times of consecutive calls
- Dialed digit analysis
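To make velocity checking concrete, here is a minimal sketch in Python (illustrative only, not from the paper; the record layout and the 900 km/h speed threshold are assumptions). It flags pairs of consecutive calls on the same account whose implied travel speed is physically implausible.

```python
from math import radians, sin, cos, asin, sqrt

# Hypothetical call records: (account_id, timestamp_hours, latitude, longitude)
CALLS = [
    ("555-0100", 10.0, 40.71, -74.01),   # New York
    ("555-0100", 10.5, 34.05, -118.24),  # Los Angeles, 30 minutes later
]

MAX_PLAUSIBLE_SPEED_KMH = 900  # assumed threshold (roughly airliner speed)

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def velocity_alarms(calls):
    """Yield pairs of consecutive same-account calls whose implied speed is implausible."""
    last_call = {}
    for call in sorted(calls, key=lambda c: (c[0], c[1])):
        prev = last_call.get(call[0])
        if prev is not None:
            hours = call[1] - prev[1]
            km = haversine_km(prev[2], prev[3], call[2], call[3])
            if hours > 0 and km / hours > MAX_PLAUSIBLE_SPEED_KMH:
                yield prev, call
        last_call[call[0]] = call

for a, b in velocity_alarms(CALLS):
    print("possible clone:", a, "->", b)
```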
User Profiling
- Analyzing calling behavior to detect usage anomalies suggestive of fraud
- Works well with low-usage customers
The Need to Be Adaptive
- The patterns of fraud are dynamic
  - Bandits constantly change their strategies
- The environment is dynamic in other ways
Problems of Learning Algorithms
- Context
  - The discovery of context-sensitive fraud: which call features are important?
  - The profiling of individual accounts: how should profiles be created?
- Granularity
  - Aggregating customer behavior smooths out the variation
  - Watching for coarser-grained changes that have better predictive power: when should alarms be issued?
The Detector Constructor Framework
How the Framework Works
Learning Fraud Rules
- Rule generation
  - Rules are generated locally for each account
  - Uses the RL program
- Rule selection
  - Most of the rules created by the generation step are specific to single accounts
  - A rule found in ("covering") many accounts is worth using (see the sketch below)
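A minimal sketch of the rule-selection idea (illustrative only; the rule representation and the coverage threshold are assumptions rather than the paper's exact procedure): count how many distinct accounts each generated rule appears in, and keep only the rules that cover enough accounts.

```python
from collections import defaultdict

# Hypothetical per-account rule sets: {account_id: {rules generated for that account}}
rules_by_account = {
    "acct-1": {"TIME-OF-DAY = night AND DURATION > 30min", "CELLSITE = A"},
    "acct-2": {"TIME-OF-DAY = night AND DURATION > 30min"},
    "acct-3": {"TIME-OF-DAY = night AND DURATION > 30min", "CELLSITE = B"},
}

MIN_ACCOUNTS_COVERED = 2  # assumed coverage threshold

def select_general_rules(rules_by_account, min_coverage):
    """Return rules that were generated for at least `min_coverage` distinct accounts."""
    coverage = defaultdict(set)
    for account, rules in rules_by_account.items():
        for rule in rules:
            coverage[rule].add(account)
    return [rule for rule, accounts in coverage.items() if len(accounts) >= min_coverage]

print(select_general_rules(rules_by_account, MIN_ACCOUNTS_COVERED))
# -> ['TIME-OF-DAY = night AND DURATION > 30min']
```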
Constructing Profiling Monitors (1/3)
- Sensitivity to different users is accomplished through profiling
- Profiling phase
  - The monitor is applied to a segment of an account's typical (non-fraud) usage
- Use phase
  - The monitor processes a single account-day at a time
Constructing Profiling Monitors (2/3)
- Profiling monitors are created by the monitor constructor, which employs a set of templates (a sketch of one such template follows part 3/3 below)
Constructing Profiling Monitors (3/3)
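To illustrate the profiling/use distinction, here is a minimal sketch of a threshold-style monitor (an assumed template, not necessarily the paper's exact one): during profiling it derives a threshold from an account's fraud-free account-days, and during use it reports how far a new account-day exceeds that threshold.

```python
class ThresholdMonitor:
    """Illustrative threshold-style monitor instantiated from a fraud rule.

    `condition` is a predicate over a single call record (the rule's conditions);
    the monitor counts matching calls per account-day.
    """

    def __init__(self, condition):
        self.condition = condition
        self.threshold = 0.0

    def profile(self, typical_days):
        """Profiling phase: learn the account's normal level from fraud-free account-days.

        `typical_days` is a list of account-days, each a list of call records.
        Here the threshold is simply the maximum daily count seen during profiling
        (an assumed choice of statistic).
        """
        daily_counts = [sum(1 for call in day if self.condition(call)) for day in typical_days]
        self.threshold = max(daily_counts) if daily_counts else 0.0

    def use(self, account_day):
        """Use phase: output how far today's count exceeds the profiled threshold."""
        count = sum(1 for call in account_day if self.condition(call))
        return max(0.0, count - self.threshold)


# Example: a monitor instantiated from a hypothetical rule about night-time calls.
def is_night_call(call):
    return call.get("time_of_day") == "night"

monitor = ThresholdMonitor(is_night_call)
monitor.profile([
    [{"time_of_day": "day"}, {"time_of_day": "night"}],  # typical day 1: one night call
    [{"time_of_day": "day"}],                            # typical day 2: none
])
print(monitor.use([{"time_of_day": "night"}] * 5))       # -> 4.0 (five night calls vs. threshold 1)
```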
Combining Evidence from the Monitors
- The outputs of the monitors are fed to a standard learning program
- A Linear Threshold Unit (LTU) is used
- In training, the monitors' outputs are presented along with the desired output
- The evidence-combining step weights the monitor outputs and learns a threshold on the sum (see the sketch below)
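A minimal sketch of the evidence-combining step (the LTU is from the paper, but the perceptron training rule and all data here are illustrative assumptions): one weight per monitor plus a learned bias, with an alarm issued when the weighted sum of the monitors' outputs crosses the threshold.

```python
import random

def train_ltu(examples, epochs=50, learning_rate=0.1, seed=0):
    """Train a Linear Threshold Unit with the classic perceptron update rule.

    `examples` is a list of (monitor_outputs, label) pairs, where monitor_outputs
    is a list of floats and label is 1 (fraud) or 0 (legitimate).
    Returns (weights, bias); the unit alarms when dot(weights, x) + bias > 0.
    """
    rng = random.Random(seed)
    n = len(examples[0][0])
    weights = [rng.uniform(-0.1, 0.1) for _ in range(n)]
    bias = 0.0
    for _ in range(epochs):
        for x, label in examples:
            activation = sum(w * xi for w, xi in zip(weights, x)) + bias
            prediction = 1 if activation > 0 else 0
            error = label - prediction
            if error:
                weights = [w + learning_rate * error * xi for w, xi in zip(weights, x)]
                bias += learning_rate * error
    return weights, bias

def alarm(weights, bias, monitor_outputs):
    """Issue an alarm if the weighted monitor evidence exceeds the learned threshold."""
    return sum(w * xi for w, xi in zip(weights, monitor_outputs)) + bias > 0

# Toy account-days: two monitor outputs per day, labelled fraud (1) or legitimate (0).
training_days = [([0.0, 0.1], 0), ([0.2, 0.0], 0), ([3.0, 2.5], 1), ([4.0, 1.0], 1)]
w, b = train_ltu(training_days)
print(alarm(w, b, [3.5, 2.0]))  # likely True for a high-evidence day
```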
The Data
- Records of cellular calls placed over four months by users in the New York City area
- Each call is described by 31 attributes
- 7 attributes are added (e.g., TIME-OF-DAY)
- Each call is given a class label of legitimate or fraudulent
Data Selection
- Rule learning: 879 accounts, 500,000 calls
- Profiling, training, and testing: 3600 accounts
  - 30 fraud-free days used for profiling
  - The remaining days yield 96,000 account-days
  - 10,000 account-days randomly selected for training and 5000 for testing (20% fraud, 80% non-fraud)
Experiments
- Rule learning generated 3630 rules
- The rule selection process yielded 99 rules
- Each of the 99 rules was used to instantiate 2 monitor templates, yielding 198 monitors
- The final feature-selection step reduced these to 7 monitors
Conclusion
- Fraud behavior changes frequently, and fraud detection systems should be adaptive as well
- To build usage monitors we must know which aspects of customers' behavior to profile
- This framework is not specific to cloning fraud