Spam Image Identification Using an Artificial Neural Network


Spam Image Identification Using an Artificial Neural Network 2008 MIT Spam Conference Jason R. Bowling, Priscilla Hope, and Kathy J. Liszka The University of Akron

We know it’s bad… In 2005, image spam made up roughly 1% of all email; by mid-2006 it had risen to 21%. J. Swartz, “Picture this: A sneakier kind of spam,” USA Today, Jul. 23, 2006.

The University of Akron, December 2007: 28,000,000 messages received, 24,000,000 of which were identified as spam and dropped.

Inspiration

FANN: the Fast Artificial Neural Network Library. Open source. Adaptive: it learns by example (given good input). Layers: input, hidden, output.

Image Preparation: an open-source tool converts images from virtually any format to another; each conversion involves tradeoffs.

image2fann.cpp converts input images into training data. Input: 150 × 150 pixel, 8-bit grayscale JPEG images.

Training data file format. The header line gives the number of images (input sets), the number of input nodes, and the number of output nodes:

500 22500 1
.128 .123 .156 .128 .156 .254 …
1
.156 .128 .128 .123 .156 .254 …
-1

Each image contributes one line of scaled pixel values followed by a line with the desired output: 1 for spam, -1 for ham.

Network topology: 22,500 input nodes, two layers of hidden nodes, 1 output node.

Training the Network: a fully connected back-propagation neural network, trained under a supervised learning paradigm.

Activation Function: takes the inputs to a node, applies a weight to each input, and determines the node’s output.

Steepness [plot of the activation function at steepness values 1.0, 0.5, and 0.0]

Widrow and Nguyen’s algorithm: distributes the initial weights evenly across each input node’s active region. Used at initialization.

Epoch: one cycle in which the weights are adjusted to match the outputs in the training file. Too many epochs can cause the ANN to fail (overtraining).

Learning Rate: train to a desired error. Step the learning rate down at preset intervals to avoid oscillation.

Training (22,604 nodes in network):

Max epochs 200. Desired error: 0.4
Epochs 1. Current error: 0.2800000012. Bit fail 56.
Learning rate is: 0.500000
Max epochs 5000. Desired error: 0.2000000030.
Epochs 20. Current error: 0.2800000012. Bit fail 56.
Epochs 40. Current error: 0.2251190692. Bit fail 56.
Epochs 60. Current error: 0.2074941099. Bit fail 65.
Epochs 71. Current error: 0.1479636133. Bit fail 48.

Pipeline: input images → image2fann.cpp → training data; train.c and test.c use FANN to classify images as ham or spam.

572 Trained Images 75 hidden nodes

572 Trained Images 50 hidden nodes

Corpus

Scaling: pixel values are scaled to a number < 1 by dividing by 1000. Grayscale intensity ranges over 0–255, so the training data is limited to 0–0.255.

Current Work: completing the corpus; multipart images; separate ANNs; number of hidden nodes; color; image size.

Priscilla Hope

Thank you!