Prediction of Voting Patterns Based on Census and Demographic Data Analysis Performed by: Mike He ECE 539, Fall 2005.


Abstract
- Prediction of voting patterns in the 2004 presidential election
- Multi-layer perceptron trained with back-propagation
- Inputs drawn from demographic data:
  - Population size
  - Gender composition
  - Racial composition
  - Age composition

Voting Representations
- Area-based winner-takes-all map
  - Strict red/blue binary color coding
  - Can misrepresent actual popular opinion
- Population-based winner-takes-all cartogram
  - Counties resized to reflect actual population
  - More accurately reflects popular opinion
  - Illustrates the high density of urban areas and their tendency to vote Democratic
- Linearly shaded vote-percentage map
  - Colors shaded according to vote percentages
  - Accurately portrays the closeness of most races and political homogeneity throughout the country

Experimental Procedures
- Data pre-processing
- Network structure determination
  - Number of hidden layers and neurons per layer
- Coefficient determination
- Training and training-error testing
  - Error measured both against vote percentages and in calling the county for a candidate
- Testing on the testing data set
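The slides do not say which pre-processing was applied, so the sketch below assumes a common choice, z-scoring each demographic feature column; the function name `standardize` is mine, not the project's.

```python
import numpy as np

def standardize(X):
    """Z-score each feature column: subtract the column mean and
    divide by the column standard deviation. A guard keeps
    constant columns from producing a divide-by-zero."""
    mu = X.mean(axis=0)
    sigma = X.std(axis=0)
    sigma[sigma == 0] = 1.0  # constant columns pass through as zeros
    return (X - mu) / sigma
```

Scaling all features to comparable ranges keeps no single demographic variable (e.g. raw population counts) from dominating the early training steps.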

Experimental Parameters
- 14 features, 3 outputs
- Hyperbolic tangent activation function for the hidden layers
- Sigmoid activation function for the output layer
- Learning coefficient α = 0.2
- Momentum coefficient μ = 0.5
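Putting the stated parameters together, a minimal NumPy sketch of the forward pass looks like this: 14 inputs, four hidden layers of 15 tanh units each (the structure chosen in Experiment 1), and a 3-unit sigmoid output layer. The weight-initialization scale and all variable names are assumptions; the slides specify only the architecture and activations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Layer sizes as described: 14 input features, four hidden layers
# of 15 neurons each, and 3 outputs.
sizes = [14] + [15] * 4 + [3]
weights = [rng.normal(scale=0.1, size=(m, n))
           for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    """Forward pass: tanh activations in the hidden layers,
    sigmoid activation at the output layer."""
    a = x
    for W, b in zip(weights[:-1], biases[:-1]):
        a = np.tanh(a @ W + b)
    return sigmoid(a @ weights[-1] + biases[-1])

y = forward(rng.normal(size=14))  # 3 outputs, each in (0, 1)
```

The sigmoid output layer keeps each of the three outputs in (0, 1), which suits vote-share-style targets.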

Experiment 1 – Network Structure
- Many different structures tested and ranked by total squared error
- Best performers isolated for further testing
- Error compared across multiple trials for the tested structures
- Winner: 4 hidden layers of 15 neurons each

Experiment 2 – Coefficients
- Goal: determine the optimal α and μ
- Candidate coefficient pairs evaluated by total squared error and by maximum squared error
- Chosen configuration: α = 0.2, μ = 0.5
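A back-propagation weight update with learning coefficient α and momentum coefficient μ can be sketched as below; `update` and its velocity argument are illustrative names, not from the slides.

```python
import numpy as np

alpha, mu = 0.2, 0.5  # chosen coefficients from Experiment 2

def update(W, grad, velocity):
    """One momentum step: v <- mu*v - alpha*grad, then W <- W + v.
    The momentum term carries a fraction of the previous step
    forward, smoothing the descent direction."""
    velocity = mu * velocity - alpha * grad
    return W + velocity, velocity
```

With μ = 0.5, half of each previous step is reused, which damps oscillation across successive gradient steps.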

Classification Results
- MLP applied to predict which candidate wins each county
- 100 training and prediction trials
- Wisconsin (training data): 77% classification rate
- Minnesota (testing data): 75% classification rate
- Standard deviation of the classification rate across trials under 3%
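The reported classification rate can be read as the fraction of counties whose largest network output matches the actual winner. A small sketch with hypothetical outputs (not the project's data):

```python
import numpy as np

def classification_rate(outputs, labels):
    """Fraction of rows where the index of the largest output
    equals the true winner's label."""
    return float(np.mean(np.argmax(outputs, axis=1) == labels))

# Hypothetical example: 4 counties, 3 candidate outputs each.
outputs = np.array([[0.9, 0.1, 0.0],
                    [0.2, 0.7, 0.1],
                    [0.3, 0.3, 0.4],
                    [0.8, 0.1, 0.1]])
labels = np.array([0, 1, 2, 1])
rate = classification_rate(outputs, labels)  # 3 of 4 correct -> 0.75
```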

Concluding Remarks
- Impressive overall predictive power
- Predictive power carries over between states: Wisconsin and Minnesota are demographically similar but politically different
- Predictions are based only on demographics; innocuous data yields powerful results
- Demonstrates the effectiveness of MLPs, as well as an element of truth in common generalizations about demographic voting tendencies