Slide 1: Intelligent Environments
Computer Science and Engineering, University of Texas at Arlington
Slide 2: Prediction for Intelligent Environments
Motivation
Techniques
Issues
Slide 3: Motivation
An intelligent environment acquires and applies knowledge about you and your surroundings in order to improve your experience.
"Acquires" → prediction; "applies" → decision making
Slide 4: What to Predict
Inhabitant behavior: location, task, action
Environment behavior: modeling devices, interactions
Slide 5: Example
Where will Bob go next? Location_{t+1} = f(…)
Independent variables: Location_t, Location_{t-1}, …, time, date, day of the week, sensor data, context, Bob's task
Slide 6: Example (cont.)

Time | Date  | Day     | Location_t  | Location_{t+1}
0630 | 02/25 | Monday  | Bedroom     | Bathroom
0700 | 02/25 | Monday  | Bathroom    | Kitchen
0730 | 02/25 | Monday  | Kitchen     | Garage
1730 | 02/25 | Monday  | Garage      | Kitchen
1800 | 02/25 | Monday  | Kitchen     | Bedroom
1810 | 02/25 | Monday  | Bedroom     | Living room
2200 | 02/25 | Monday  | Living room | Bathroom
2210 | 02/25 | Monday  | Bathroom    | Bedroom
0630 | 02/26 | Tuesday | Bedroom     | Bathroom
Slide 7: Example
Learned pattern:
If Day = Monday…Friday & Time > 0600 & Time < 0700 & Location_t = Bedroom
Then Location_{t+1} = Bathroom
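The learned rule maps directly onto a small predicate. The Python sketch below is my own illustration (the slides contain no code); the function and argument names are assumed, and times are written as integers (e.g. 0630 → 630).

```python
WEEKDAYS = {"Monday", "Tuesday", "Wednesday", "Thursday", "Friday"}

def predict_next_location(day, time, location):
    """Apply the learned rule: a weekday, between 0600 and 0700, currently in the bedroom."""
    if day in WEEKDAYS and 600 < time < 700 and location == "Bedroom":
        return "Bathroom"
    return None  # rule does not fire; fall back to some other predictor

# Example: Bob in the bedroom at 06:30 on a Monday
print(predict_next_location("Monday", 630, "Bedroom"))  # -> Bathroom
```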
Slide 8: Prediction Techniques
Regression
Neural network
Nearest neighbor
Bayesian classifier
Decision tree induction
Others
Slide 9: Linear Regression

x | y
1 | 3
2 | 5
3 | 7
4 | 9
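The table is consistent with y = 2x + 1. As an illustration only (not part of the slides), an ordinary least-squares fit with numpy recovers exactly those coefficients:

```python
import numpy as np

# The four (x, y) points from the slide
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.0, 5.0, 7.0, 9.0])

# Least-squares fit of y = b0 + b1*x via the design matrix [1, x]
X = np.column_stack([np.ones_like(x), x])
b0, b1 = np.linalg.lstsq(X, y, rcond=None)[0]
print(b0, b1)  # ~1.0 and ~2.0, i.e. y = 1 + 2x fits the data exactly
```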
Slide 10: Multiple Regression
n independent variables
Find b_i
System of n equations and n unknowns
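The slide's equations do not survive in this transcript. As a sketch of the standard least-squares setup (the lecture's exact notation is not preserved, so this notation is assumed), the coefficients b_i solve the normal equations:

```latex
% Multiple regression with n predictors (standard setup; notation assumed)
y = b_0 + b_1 x_1 + b_2 x_2 + \dots + b_n x_n
% Stacking the training examples into a design matrix X and target vector \mathbf{y},
% the coefficient vector \mathbf{b} solves the normal equations:
X^{\top} X \, \mathbf{b} = X^{\top} \mathbf{y}
```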
Slide 11: Regression
Pros:
Fast, analytical solution
Confidence intervals: y = a ± b with C% confidence
Piecewise linear and nonlinear regression
Cons:
Must choose the model beforehand (linear, quadratic, …)
Handles only numeric variables
Slide 12: Neural Networks
Slide 13: Neural Networks
On the order of 10^10 neurons, ~10^5 synapses per neuron
Synapses propagate electrochemical signals
Number, placement, and strength of connections change over time (learning?)
Massively parallel
Slide 14: Computer vs. Human Brain

                    | Computer                        | Human Brain
Computational units | 1 CPU, 10^8 gates               | 10^11 neurons
Storage units       | 10^10 bits RAM, 10^12 bits disk | 10^11 neurons, 10^14 synapses
Cycle time          | 10^-9 sec                       | 10^-3 sec
Bandwidth           | 10^9 bits/sec                   | 10^14 bits/sec
Neuron updates/sec  | 10^6                            | 10^14
Slide 15: Computer vs. Human Brain
Source: "The Age of Spiritual Machines," Kurzweil.
Slide 16: Artificial Neuron
Slide 17: Artificial Neuron
Activation functions
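The neuron diagram and activation-function plots are not preserved in this transcript. The sketch below illustrates the standard model of an artificial neuron (weighted sum of inputs plus bias, passed through an activation function); the function names are my own, not the lecture's.

```python
import math

def step(z):
    """Threshold activation used by the classic perceptron."""
    return 1.0 if z >= 0 else 0.0

def sigmoid(z):
    """Smooth, differentiable activation used by sigmoid units."""
    return 1.0 / (1.0 + math.exp(-z))

def neuron_output(weights, bias, inputs, activation=sigmoid):
    """Artificial neuron: weighted sum of inputs plus bias, passed through an activation."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return activation(z)

# Example: two inputs with arbitrary weights
print(neuron_output([0.5, -0.3], 0.1, [1.0, 2.0]))
```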
Slide 18: Perceptron
Slide 19: Perceptron Learning
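The learning-rule equation on this slide is not preserved. As an illustration (not the lecture's code), the sketch below applies the standard perceptron training rule w_i ← w_i + η·(target − output)·x_i; the data set and parameter names are assumptions.

```python
def train_perceptron(examples, n_inputs, lr=0.1, epochs=50):
    """Perceptron learning rule: w_i <- w_i + lr * (target - output) * x_i.

    `examples` is a list of (inputs, target) pairs with targets in {0, 1}.
    """
    weights = [0.0] * n_inputs
    bias = 0.0
    for _ in range(epochs):
        for inputs, target in examples:
            z = sum(w * x for w, x in zip(weights, inputs)) + bias
            output = 1.0 if z >= 0 else 0.0
            error = target - output
            weights = [w + lr * error * x for w, x in zip(weights, inputs)]
            bias += lr * error
    return weights, bias

# Example: learn logical AND, which is linearly separable (see the next slide)
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
print(train_perceptron(data, n_inputs=2))
```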
Slide 20: Perceptron
Learns only linearly separable functions
Slide 21: Sigmoid Unit
Slide 22: Multilayer Network of Sigmoid Units
Slide 23: Error Back-Propagation
Errors at the output layer are propagated back to the hidden layers
Error is proportional to link weights and activations
Gradient descent in weight space
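As an illustration of these three bullets (not taken from the slides), the sketch below trains a tiny two-layer network of sigmoid units with back-propagation; the architecture, learning rate, and XOR data are my own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Tiny 2-3-1 network trained on XOR by gradient descent
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 3))   # input -> hidden weights
b1 = np.zeros((1, 3))
W2 = rng.normal(size=(3, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))
lr = 1.0

for _ in range(10000):
    # Forward pass
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)

    # Backward pass: output error, then error propagated back to the hidden layer
    # in proportion to the link weights and the units' activations
    delta_out = (output - y) * output * (1 - output)
    delta_hidden = (delta_out @ W2.T) * hidden * (1 - hidden)

    # Gradient descent step in weight space
    W2 -= lr * hidden.T @ delta_out
    b2 -= lr * delta_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ delta_hidden
    b1 -= lr * delta_hidden.sum(axis=0, keepdims=True)

print(np.round(output, 2))  # typically approaches [[0], [1], [1], [0]]
```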
Slide 24: NN for Face Recognition
90% accurate at learning head pose for 20 different people
Slide 25: Neural Networks
Pros:
General-purpose learner
Fast prediction
Cons:
Best for numeric inputs
Slow training
Local optima
Slide 26: Nearest Neighbor
Just store the training data (x_i, f(x_i))
Given query x_q, estimate using the nearest neighbor x_k: f(x_q) = f(x_k)
k-nearest neighbor: given query x_q, estimate using the majority (or mean) of the k nearest neighbors
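A minimal sketch of k-nearest-neighbor prediction as described above (not from the slides; the data, distance function, and names are assumed for illustration):

```python
from collections import Counter

def knn_predict(training_data, query, k=3, distance=None):
    """k-nearest-neighbor prediction: majority label of the k closest stored examples.

    `training_data` is a list of (x, label) pairs, where x is a numeric tuple.
    """
    if distance is None:
        distance = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))  # squared Euclidean
    neighbors = sorted(training_data, key=lambda ex: distance(ex[0], query))[:k]
    labels = [label for _, label in neighbors]
    return Counter(labels).most_common(1)[0][0]

# Example with made-up 2-D points
data = [((1, 1), "Kitchen"), ((1, 2), "Kitchen"), ((8, 8), "Bedroom"), ((9, 8), "Bedroom")]
print(knn_predict(data, (2, 1), k=3))  # -> Kitchen
```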
Slide 27: Nearest Neighbor
Slide 28: Nearest Neighbor
Pros:
Fast training
Complex target functions
No loss of information
Cons:
Slow at query time
Easily fooled by irrelevant attributes
Slide 29: Bayes Classifier
Recall the Bob example
D = training data
h = a candidate rule (hypothesis), such as the one learned earlier
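The equations on this slide are not preserved in the transcript. In the standard Bayesian formulation (notation assumed, not necessarily the lecture's):

```latex
% Bayes rule for a hypothesis h given training data D (standard formulation; notation assumed)
P(h \mid D) = \frac{P(D \mid h)\, P(h)}{P(D)}
% The Bayes-optimal choice is the maximum a posteriori hypothesis:
h_{MAP} = \arg\max_{h \in H} P(D \mid h)\, P(h)
```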
Slide 30: Naive Bayes Classifier
Naive Bayes assumption
Naive Bayes classifier
y represents Bob's location
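The naive Bayes equations referenced here are likewise missing from the transcript. In standard form (notation assumed), with y the class (here, Bob's next location) and x_1, …, x_n the attributes such as time, day, and current location:

```latex
% Naive Bayes assumption: attributes are conditionally independent given the class
P(x_1, \dots, x_n \mid y) = \prod_{i=1}^{n} P(x_i \mid y)
% Naive Bayes classifier: pick the most probable class
y^{*} = \arg\max_{y} P(y) \prod_{i=1}^{n} P(x_i \mid y)
```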
Slide 31: Bayes Classifier
Pros:
Optimal
Discrete or numeric attribute values
Naive Bayes is easy to compute
Cons:
Full Bayes classifier is computationally intractable
Naive Bayes assumption is usually violated
Slide 32: Decision Tree Induction
[Decision tree diagram: the root tests Day (M…F / Sat / Sun); along the M…F branch, tests on Time > 0600, Time < 0700, and Location_t = Bedroom lead to the leaf Bathroom]
Slide 33: Decision Tree Induction
Algorithm (main loop):
1. A = best attribute for the next node
2. Assign A as the attribute for the node
3. For each value of A, create a descendant node
4. Sort training examples to the descendants
5. If the training examples are perfectly classified, then stop; else iterate over the descendants
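The five-step loop above can be written as a short recursive function. The sketch below is my own ID3-style illustration (not the lecture's code); `best_attribute` is assumed to be supplied by the caller, for example the information-gain criterion on the next slide.

```python
from collections import Counter

def build_tree(examples, attributes, best_attribute):
    """ID3-style main loop. `examples` are (attribute_dict, label) pairs;
    `best_attribute(examples, attributes)` picks the attribute to split on."""
    labels = [label for _, label in examples]
    if len(set(labels)) == 1:                 # step 5: perfectly classified -> stop
        return labels[0]
    if not attributes:                        # no attributes left: return the majority label
        return Counter(labels).most_common(1)[0][0]

    A = best_attribute(examples, attributes)  # step 1: best attribute for this node
    tree = {A: {}}                            # step 2: assign A to the node
    for value in {attrs[A] for attrs, _ in examples}:   # step 3: one descendant per value of A
        subset = [(attrs, label) for attrs, label in examples if attrs[A] == value]  # step 4
        remaining = [a for a in attributes if a != A]
        tree[A][value] = build_tree(subset, remaining, best_attribute)  # recurse on descendants
    return tree
```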
Slide 34: Decision Tree Induction
Best attribute: based on the information-theoretic concept of entropy
Choose the attribute that most reduces entropy (~uncertainty) from the parent node to the descendant nodes
[Figure: starting from 50 Bathroom / 50 Kitchen examples, splitting on attribute A1 yields pure descendants (v1: 0 Bathroom, 50 Kitchen; v2: 50 Bathroom, 0 Kitchen), while splitting on A2 leaves each descendant at 25 Bathroom / 25 Kitchen]
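As an illustration of the entropy criterion (not from the slides), the sketch below computes the information gain for the A1 and A2 splits in the figure; the helper names are my own.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label multiset, in bits."""
    total = len(labels)
    return -sum((n / total) * math.log2(n / total) for n in Counter(labels).values())

def information_gain(parent_labels, child_label_sets):
    """Entropy reduction from the parent node to its descendants (weighted by subset size)."""
    total = len(parent_labels)
    remainder = sum(len(c) / total * entropy(c) for c in child_label_sets)
    return entropy(parent_labels) - remainder

parent = ["Bathroom"] * 50 + ["Kitchen"] * 50
split_A1 = [["Kitchen"] * 50, ["Bathroom"] * 50]           # pure descendants
split_A2 = [["Bathroom"] * 25 + ["Kitchen"] * 25,
            ["Bathroom"] * 25 + ["Kitchen"] * 25]          # still 50/50 mixed

print(information_gain(parent, split_A1))  # 1.0 bit -> A1 is the better attribute
print(information_gain(parent, split_A2))  # 0.0 bits
```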
Slide 35: Decision Tree Induction
Pros:
Understandable rules
Fast learning and prediction
Cons:
Replication problem
Limited rule representation
Slide 36: Other Prediction Methods
Hidden Markov models
Radial basis functions
Support vector machines
Genetic algorithms
Relational learning
Slide 37: Prediction Issues
Representation of data and patterns
Relevance of data
Sensor fusion
Amount of data
Slide 38: Prediction Issues
Evaluation: accuracy, false positives vs. false negatives
Concept drift
Time-series prediction
Distributed learning