September 2003
Fraud Formalization and Detection
Bharat Bhargava, Yuhui Zhong, Yunhua Lu
Center for Education and Research in Information Assurance and Security and Department of Computer Sciences
Purdue University, W. Lafayette, IN, USA
{bb, zhong,

2 Introduction
Fraudsters can be classified into impersonators and swindlers.
–Impersonator: an illegitimate user who steals resources from the victims by "taking over" their accounts
–Swindler: a legitimate user who intentionally harms the system or other users by deception

3 Introduction
Fraud prevention
–Cryptographic technologies: prevent frauds caused by impersonators
–Separation of duty and dual-log bookkeeping: prevent frauds conducted by swindlers
Fraud detection
–Existing research efforts: identifying frauds caused by impersonators
–This paper: detecting frauds conducted by swindlers

4 Related Work
Fraud detection techniques
–Most fraud detection techniques address impersonator issues
–An adaptive, rule-based fraud detection framework (T. Fawcett and F. Provost)
–A neural network technique based on unsupervised learning for fraud detection (P. Burge and J. Shawe-Taylor)
–Rule-set generation and selection should combine both user-level and behavior-level attributes (S. Rosset)

5 Evaluation criteria
Receiver Operating Characteristics (ROC)
–An ROC graph shows the relationship between the true positive rate and the false positive rate
Accuracy
–The number of detected frauds over the total number of cases classified as fraud
Fraud coverage
–The number of detected frauds over the total number of frauds
False alarm rate
–The percentage of false alarms in the alarm set
Fraud detection rate
–The loss due to detected frauds over the total loss due to fraud
Cost-based metric
–If the loss resulting from a fraud is smaller than the investigation cost, the fraud is ignored
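As an illustration of the rate-based criteria above, here is a minimal sketch (not from the paper) that computes accuracy, fraud coverage, and false alarm rate from confusion counts; the function and variable names are assumptions for the example.

```python
def detection_metrics(true_pos, false_pos, false_neg):
    """Compute the slide's rate-based criteria from confusion counts.

    true_pos  : frauds that were flagged (detected frauds)
    false_pos : legitimate cases that were flagged (false alarms)
    false_neg : frauds that were missed
    """
    flagged = true_pos + false_pos                 # everything classified as fraud
    total_frauds = true_pos + false_neg            # all frauds, detected or not
    accuracy = true_pos / flagged if flagged else 0.0
    coverage = true_pos / total_frauds if total_frauds else 0.0
    false_alarm_rate = false_pos / flagged if flagged else 0.0
    return {"accuracy": accuracy,
            "fraud_coverage": coverage,
            "false_alarm_rate": false_alarm_rate}
```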

6 Formal Definitions
A swindler is an entity that has no intention to keep his commitment in a cooperation.
Commitment: a conjunction of expressions describing an entity's promise in a process of cooperation
–Example: (Received_by=04/01) ∧ (Price=$1000) ∧ (Quality="A") ∧ ReturnIfAnyQualityProblem
Outcome: a conjunction of expressions describing the actual results of the cooperation
–Example: (Received_by=04/05) ∧ (Price=$1000) ∧ (Quality="B") ∧ ¬ReturnIfAnyQualityProblem
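To make the commitment/outcome distinction concrete, the following minimal sketch (an assumed encoding, not the paper's) represents both as attribute maps and lists the expressions on which the outcome falls short of the promise.

```python
# Hypothetical encoding: a commitment and an outcome map attribute or
# predicate names to values; predicates are booleans.
commitment = {"received_by": "04/01", "price": 1000, "quality": "A",
              "return_if_any_quality_problem": True}
outcome    = {"received_by": "04/05", "price": 1000, "quality": "B",
              "return_if_any_quality_problem": False}

def unmet_expressions(commitment, outcome):
    """Return the attributes whose actual value differs from the promised one."""
    return [name for name, promised in commitment.items()
            if outcome.get(name) != promised]

print(unmet_expressions(commitment, outcome))
# ['received_by', 'quality', 'return_if_any_quality_problem']
```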

7 Formal Definitions
Intention-testifying:
–Predicate P: ¬P in an outcome ⇒ the entity making the promise is a swindler.
–Attribute variable V: V's expected value is more desirable than the actual value ⇒ the entity is a swindler.
Intention-dependent (indicates a possibility):
–Predicate P: ¬P in an outcome ⇒ the entity making the promise may be a swindler.
–Attribute variable V: V's expected value is more desirable than the actual value ⇒ the entity may be a swindler.
An intention-testifying variable or predicate is also intention-dependent; the converse is not necessarily true.

8 Model deceiving intentions
Satisfaction rating
–Associated with the actual value of each intention-dependent variable in an outcome
–Ranges over [0,1]; the higher the rating, the more satisfied the user
–Reflects both the deceiving intention and unpredicted factors
–Modeled as a random variable with a normal distribution
–The mean function fm(n) determines the mean of the normal distribution at the nth rating
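A minimal sketch of this rating model, assuming a normal distribution clipped to [0, 1] and a caller-supplied mean function fm(n); the standard deviation and the clipping are illustrative assumptions.

```python
import random

def satisfaction_rating(n, fm, sigma=0.05):
    """Sample the n-th satisfaction rating.

    fm    : mean function; fm(n) is the mean of the normal distribution
            at the n-th rating (this is where the deceiving intention enters)
    sigma : spread due to unpredicted factors (assumed value)
    """
    rating = random.gauss(fm(n), sigma)
    return min(1.0, max(0.0, rating))   # ratings range over [0, 1]
```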

9 Model deceiving intentions (Cont'd)
Uncovered deceiving intention
–The satisfaction ratings are stably low.
–The ratings vary in a small range over time.

10 Model deceiving intentions (Cont'd)
Trapping intention
–The rating sequence can be divided into two phases: preparing and trapping.
–A swindler behaves well to achieve a trustworthy image before he conducts frauds.

11 Model deceiving intentions (Cont'd)
Illusive intention
–A smart swindler attempts to cover the bad effects by intentionally doing something good after misbehaving.
–The process of preparing and trapping is repeated.
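The three deceiving intentions differ only in the shape of the mean function fm(n). The exact functions are not given on the slides; the sketch below shows plausible shapes, with the phase lengths and rating levels as assumptions, and each function can be passed to satisfaction_rating from the slide 8 sketch.

```python
def fm_uncovered(n, low=0.2):
    """Uncovered deceiving intention: ratings are stably low from the start."""
    return low

def fm_trapping(n, prepare=20, high=0.9, low=0.1):
    """Trapping intention: behave well for `prepare` interactions, then defraud."""
    return high if n < prepare else low

def fm_illusive(n, period=10, high=0.9, low=0.1):
    """Illusive intention: preparing and trapping phases repeat periodically."""
    return high if (n % period) < period // 2 else low
```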

12 Architecture for Swindler Detection

13 Architecture for Swindler Detection
Profile-based anomaly detector
–Monitor suspicious actions based upon the established patterns of an entity
State transition analysis
–Provide a state description when an activity results in entering a dangerous state
Deceiving intention predictor
–Discover deceiving intention based on satisfaction ratings
Decision making

14 Profile-based anomaly detector

15 Profile-based anomaly detector
Rule generation and weighting
–Generate fraud rules and the weights associated with them
User profiling
–Variable selection
–Data filtering
Online detection
–Retrieve rules when an activity occurs
–Retrieve the current and historical behavior patterns
–Calculate the deviation between the two patterns (see the sketch below)
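One way to read the "calculate the deviation" step: represent the current and historical behavior patterns as feature vectors and measure their distance, raising an alarm when it exceeds a threshold. The feature-map representation, the weighted Euclidean distance, and the threshold value are assumptions for this sketch, not the paper's method.

```python
import math

def pattern_deviation(current, history, weights=None):
    """Weighted Euclidean distance between the current and historical
    behavior patterns, both given as feature -> value dictionaries."""
    features = set(current) | set(history)
    weights = weights or {}
    return math.sqrt(sum(weights.get(f, 1.0) *
                         (current.get(f, 0.0) - history.get(f, 0.0)) ** 2
                         for f in features))

def is_suspicious(current, history, threshold=2.0, weights=None):
    """Flag the activity if its pattern deviates too far from the profile."""
    return pattern_deviation(current, history, weights) > threshold
```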

16 Deceiving intention predictor
Kernel of the predictor: the DIP algorithm
Belief in deceiving intention is the complement of trust belief.
Trust belief is evaluated from the satisfaction sequence.
Trust belief formation satisfies:
–Time dependence
–Trustee dependence
–Easy-destruction-hard-construction property (illustrated in the sketch below)
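The DIP algorithm itself is given in the paper rather than on the slides. The sketch below is only an illustration of the easy-destruction-hard-construction idea, using the construction factor Wc, destruction factor Wd, and foul-event threshold from the next slide; the update rule and the foul-event penalty are assumptions, not the authors' algorithm.

```python
def update_trust(trust, rating, wc=0.05, wd=0.1, f_threshold=0.18):
    """Update the trust belief from one satisfaction rating.

    Trust grows slowly on good ratings (construction factor wc) and drops
    quickly on bad ones (destruction factor wd), so it is easy to destroy
    and hard to construct.  A rating below f_threshold is treated as a
    foul event and penalized more heavily (assumed penalty form).
    """
    if rating < f_threshold:                 # foul event: sharp drop
        trust -= 2 * wd * (1.0 - rating)
    elif rating < trust:                     # disappointing rating
        trust -= wd * (trust - rating)
    else:                                    # better than expected
        trust += wc * (rating - trust)
    return min(1.0, max(0.0, trust))

def deceiving_intention_belief(trust):
    """Belief in deceiving intention as the complement of trust belief."""
    return 1.0 - trust
```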

17

18 Experimental study
Goal: investigate DIP's capability of discovering deceiving intentions
Initial parameter values:
–Construction factor (Wc): 0.05
–Destruction factor (Wd): 0.1
–Penalty ratio for the construction factor (r1): 0.9
–Penalty ratio for the destruction factor (r2): 0.1
–Penalty ratio for the supervision period (r3): 2
–Threshold for a foul event (fThreshold): 0.18
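Using the Wc, Wd, and fThreshold values listed above, a run over a trapping swindler's rating sequence might look like the hypothetical driver below. It reuses satisfaction_rating, fm_trapping, update_trust, and deceiving_intention_belief from the sketches on slides 8, 11, and 16 and is not the authors' experiment code; the initial trust value and sequence length are assumptions.

```python
trust = 0.5                     # assumed initial trust belief
ratings = [satisfaction_rating(n, fm_trapping) for n in range(40)]

for r in ratings:
    trust = update_trust(trust, r, wc=0.05, wd=0.1, f_threshold=0.18)

print("Deceiving-intention belief:", deceiving_intention_belief(trust))
```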

19 Discover swindler with uncovered deceiving intention
Trust values are close to 0.1, the minimum rating of the interactions.
The deceiving-intention belief is around 0.9.

20 Discover swindler with trapping intention
DIP responds to the sharp drop quickly.
It takes six interactions for the deceiving-intention confidence to increase after the sharp drop.

21 Discover swindler with illusive intention
DIP is able to catch this smart swindler: the belief in deceiving intention eventually increases to about 0.9.
The swindler's effort to cover a fraud with good behavior has less and less effect as the number of frauds grows.

22 Conclusion
Define concepts relevant to frauds conducted by swindlers
Model three deceiving intentions
Propose an approach for swindler detection and an architecture realizing the approach
Develop a deceiving intention prediction (DIP) algorithm