Vikramaditya R. Jakkula, G. Michael Youngblood and Diane J. Cook AAAI ’06 Workshop on Computational Aesthetics July 16, 2006.

Presentation transcript:

Vikramaditya R. Jakkula, G. Michael Youngblood and Diane J. Cook AAAI ’06 Workshop on Computational Aesthetics July 16, 2006

 A UN report predicts that the number of people aged 60 and over will triple, increasing from 606 million in 2000 to nearly 1.9 billion by 2050.
 Medicare home health agency benefit payments increased from $10.5 billion in 2004 to $12.5 billion in 2005.
 This creates a need for smart, cost-effective in-home health monitoring technology to replace existing home caregivers.
 Early prediction based on these observations would contribute to an improved quality of life at home.

 The evaluation environment is a student apartment with a deployed Argus and X-10 sensor network.
 Over 150 sensors are deployed in the MavPad, including light, temperature, humidity, and switch sensors.

Metric levels:
Low-level metrics: example sensor readings, including motion, temperature, light, and humidity.
Mid-level metrics: examples include motion on the bed and distance traveled at home.
High-level metrics: examples include instrumental activities and X-10 based activities.
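As a rough illustration of how one day of data might be organized into these three levels, here is a minimal Python sketch; the class, field names, and values are hypothetical and not the authors' actual MavPad schema.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical per-day record grouping MavPad data into the three metric
# levels named above; field names and values are illustrative only.
@dataclass
class DailyMetrics:
    low: Dict[str, List[float]] = field(default_factory=dict)   # raw sensor readings
    mid: Dict[str, float] = field(default_factory=dict)         # derived quantities
    high: Dict[str, int] = field(default_factory=dict)          # activity-level counts

day = DailyMetrics(
    low={"temperature_C": [21.5, 21.7, 22.0], "light_lux": [120.0, 300.0]},
    mid={"bed_motion_events": 42.0, "distance_traveled_m": 350.0},
    high={"kitchen_visits": 3, "x10_lamp_switches": 5},
)
print(day.mid["distance_traveled_m"])  # -> 350.0
```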

 First, we look for correlations and perform t-tests to identify relationships among the collected metrics.
 Second, we apply a machine learning technique, the k-nearest neighbor algorithm, to predict the state of wellness the inhabitant will experience on the following day.
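A minimal Python sketch of the first step, using SciPy for the correlation and t-test; the metric series and the happy/not-happy day labels below are invented for illustration, not the MavPad data.

```python
import numpy as np
from scipy import stats

# Hypothetical daily metric series (one value per monitored day); names and
# values are illustrative only, not the actual MavPad measurements.
bed_motion      = np.array([34, 40, 28, 55, 47, 31, 38], dtype=float)
bathroom_visits = np.array([ 4,  5,  3,  7,  6,  4,  5], dtype=float)
happy_day       = np.array([ 1,  1,  0,  0,  1,  1,  1])   # 1 = reported happy

# Correlation between two metrics
r, p_corr = stats.pearsonr(bed_motion, bathroom_visits)
print(f"Pearson r = {r:.2f} (p = {p_corr:.3f})")

# t-test comparing a metric on happy vs. not-happy days
t, p_t = stats.ttest_ind(bed_motion[happy_day == 1], bed_motion[happy_day == 0])
print(f"t = {t:.2f} (p = {p_t:.3f})")
```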

Experiment One Results: t-tests and correlations on the collected metrics.

Figure 2: Comparison of Upper, Lower and Lowest body motion in bed.

Figure 3: Comparison of the number of visits to the bathroom, closet, and kitchen.

Figure 4: Comparison of distance traveled in the home, calorie intake, and monitored pulse.

Figure 5: Comparison of the happy vs. healthy state of the inhabitant.

Figure 6: Comparison of X-10 readings and sensor firings, which constitute the high-level metrics.

Experiment Two Results: Prediction using k-nearest neighbor.

 The level of happiness is classified into three classes (yes = happy, ok = mediocre, no = not happy): scale values of 6 to 10 map to yes, 5 maps to ok, and values below 5 map to no.
 Weka was used to apply k-nearest neighbor and other techniques for comparison.
 k-nearest neighbor outperforms the other learning techniques, with an accuracy of % and an error of %.
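For illustration, a minimal Python sketch of the same setup using scikit-learn's KNeighborsClassifier in place of Weka; the score-to-class mapping follows the slide above, while the feature vectors and happiness scores are invented, and pairing day t's features with day t+1's class mirrors the next-day wellness prediction.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

def happiness_class(score: float) -> str:
    """Map a reported happiness score to the three classes used above."""
    if score >= 6:
        return "yes"   # happy
    if score == 5:
        return "ok"    # mediocre
    return "no"        # not happy

# Hypothetical daily feature vectors (e.g. distance traveled, kitchen visits,
# bed motion events) and reported happiness scores; values are made up here.
X      = np.array([[350, 3, 34], [120, 1, 28], [400, 4, 47], [90, 2, 31],
                   [300, 3, 40], [150, 1, 25], [380, 5, 55], [110, 2, 30]])
scores = np.array([8, 4, 7, 5, 9, 3, 8, 4])

# Predict the following day's wellness: pair day t's features with day t+1's class.
y = np.array([happiness_class(s) for s in scores[1:]])
X = X[:-1]

train, test = slice(0, 5), slice(5, 7)
knn = KNeighborsClassifier(n_neighbors=3).fit(X[train], y[train])
pred = knn.predict(X[test])
print("predictions:", pred, " accuracy:", accuracy_score(y[test], pred))
```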

Table 11: Sample predictions on test split.

Learning Algorithm – Accuracy (%)
J – %
IB – %
SVM (SMO) – 65%
KNN – %
Table 12: Comparison of accuracy of the prediction techniques.

 A simple sensor network in a smart home can be used to detect lifestyle patterns.
 The performance of the classifier would be expected to improve with a larger collected dataset.
 The k-nearest neighbor technique performed better than other commonly used techniques.

 The datasets still hold many interesting findings.
 Continue collecting data over a longer time period.
 Plans to address the problem of automating the SF-36® health survey form.

Thank You