EWSN08: European Workshop on Wireless Sensor Networks


Activity Recognition from On-Body Sensors: Accuracy-Power Trade-Off by Dynamic Sensor Selection. EWSN08: European Workshop on Wireless Sensor Networks. Piero Zappi, Clemens Lombriser, Thomas Stiefmeier, Elisabetta Farella, Daniel Roggen, Luca Benini, and Gerhard Tröster. Presenter: 20095376 최재운

Contents: Introduction, System Details, Evaluation, Conclusion

Contents: Introduction (Abstract, Background, Problem Statement, Solution Approach), System Details, Evaluation, Conclusion

Abstract In this paper, the authors present dynamic sensor selection, a method for using the available energy efficiently while achieving a desired activity recognition accuracy. To this end, they introduce an activity recognition method that relies on a meta-classifier fusing the information of classifiers running on the individual sensors. Sensors are selected according to their contribution to classification accuracy.

Background Wearable computing supports people by delivering context-aware services. Wearable technology has been used in behavioral modeling, health monitoring systems, information technologies, and media development. Gestures and activities are an important aspect of the user's context. Small and low-power wireless sensor nodes are used, which have limited memory and computational power.

Background Wearable computing

Problem Statement Wearable computing faces a trade-off, and a solution balancing both sides is needed. On one hand, high classification accuracy is required: a large number of sensors is distributed over the body, and many of them must be activated to classify accurately. On the other hand, energy use must be minimized: the sensors have battery limitations, because node size must stay small, so the only way to enhance lifetime is to reduce energy consumption.

Solution Approach Among related work on energy use, adaptive sampling rates and duty cycling are representative methods. Here, however, they cannot be used to minimize energy use: since user gestures can occur at any time, a fixed sensor sampling rate and continuous sensor node operation are needed. The authors therefore investigate how to extend network life in an activity recognition system while maintaining a desired accuracy.

Contents: Introduction, System Details (System Overview, Metaclassifier for Activity Recognition, Dynamic Sensor Selection), Evaluation, Conclusion

System Overview The system relies on classifier fusion to combine data from multiple sensors. Gesture classification is performed on the individual nodes using Hidden Markov Models (HMMs), and a naïve Bayes classifier fuses these individual classification results to improve the overall classification accuracy. The system introduces dynamic sensor selection to cope with dynamically changing networks: most sensor nodes are kept in a low-power state and are activated only when their contribution is needed to keep the desired accuracy.

Metaclassifier for Activity Recognition This activity recognition algorithm is based on a metaclassifier fusing the contributions from several sensor nodes.

Metaclassifier for Activity Recognition Hidden Markov Models (HMM) A hidden Markov model (HMM) is a statistical model in which the system being modeled is assumed to be a Markov process with unobserved (hidden) states.

Metaclassifier for Activity Recognition Features extracted from the sensor data are classified by competing Hidden Markov Models, one per gesture class. In this paper, the authors started with 15 random initial models and selected the one that shows the best classification accuracy on the training set.
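To make the competing-HMMs step concrete, here is a minimal Python sketch. It assumes the third-party hmmlearn library (the paper does not name an implementation), and it selects the best of the 15 random initial models by training-set log-likelihood, a simple proxy for the training-set classification accuracy the paper uses.

import numpy as np
from hmmlearn import hmm

N_INIT = 15  # random initial models per class, as in the paper

def train_class_hmm(sequences, lengths, n_states=4):
    """Train several randomly initialised HMMs on one gesture class; keep the best."""
    best_model, best_score = None, -np.inf
    for seed in range(N_INIT):
        model = hmm.GaussianHMM(n_components=n_states, random_state=seed, n_iter=50)
        model.fit(sequences, lengths)            # sequences: (n_samples, n_features)
        score = model.score(sequences, lengths)  # log-likelihood on the training data
        if score > best_score:
            best_model, best_score = model, score
    return best_model

def classify(models, sequence):
    """Competing HMMs: assign the class whose model likes the sequence most."""
    scores = {label: m.score(sequence) for label, m in models.items()}
    return max(scores, key=scores.get)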

Metaclassifier for Activity Recognition Finally, they fuse the class labels using a naïve Bayes technique.

Metaclassifier for Activity Recognition The naïve Bayes classifier is a probabilistic classifier based on Bayes' theorem and the hypothesis that the input features are independent. With $C$ the class and $A_1, \dots, A_n$ the $n$ input attributes, a typical decision rule is to classify an instance as belonging to the class that maximizes the a posteriori probability, $c^\ast = \arg\max_c P(C = c \mid A_1 = a_1, \dots, A_n = a_n)$, which is hard to compute directly.

Metaclassifier for Activity Recognition The naïve Bayes classifier Applying the hypothesis of independence and Bayes' theorem, the decision rule becomes $c^\ast = \arg\max_c P(C = c) \prod_{i=1}^{n} P(A_i = a_i \mid C = c)$. The evidence $P(A_1 = a_1, \dots, A_n = a_n)$ is common to all classes and drops out, and the prior $P(C = c)$ does not need to be computed by experiments (the classes are assumed equally probable). The likelihood $P(A_i = a_i \mid C = c)$ is therefore the only parameter that has to be calculated.

Metaclassifier for Activity Recognition The naïve Bayes classifier Define $t_c$ as the number of training instances for which $C = c$ and $A_i = a_i$, and $t$ as the number of training instances for class $c$. Some classes $c$ may have no sample for which $A_i = a_i$, so the plain estimate $t_c / t$ would be zero. For this reason, the authors define the likelihood as the m-estimate $P(A_i = a_i \mid C = c) = (t_c + m\,p) / (t + m)$, where $m$ is the number of virtual samples per class added to the training set and $p$ is the a priori probability of a certain value for an attribute.
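The fusion rule and the m-estimate can be put together in a short sketch. Here attribute $A_i$ is taken to be the class label emitted by sensor node $i$, the training counts are hypothetical placeholders, and equal class priors are assumed, as in the paper.

import math

def m_estimate(t_c, t, m, p):
    """P(A_i = a_i | C = c) = (t_c + m*p) / (t + m), the slide's smoothed likelihood."""
    return (t_c + m * p) / (t + m)

def fuse(node_labels, classes, counts, class_counts, m=1.0):
    """Naive Bayes fusion: pick the class maximising the sum of log-likelihoods.

    counts[(i, a, c)] -- training instances of class c where node i output label a
    class_counts[c]   -- training instances of class c
    Both are illustrative placeholders for counts gathered during training.
    """
    p = 1.0 / len(classes)  # attribute values are class labels, so a uniform prior
    best_c, best_ll = None, -math.inf
    for c in classes:
        ll = 0.0  # log P(C=c) omitted: the classes are assumed equally probable
        for i, a in enumerate(node_labels):
            ll += math.log(m_estimate(counts.get((i, a, c), 0),
                                      class_counts[c], m, p))
        if ll > best_ll:
            best_c, best_ll = c, ll
    return best_c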

Dynamic Sensor Selection Purpose: to achieve a desired classification accuracy while prolonging the system lifetime. The sensors that are combined to perform gesture classification are selected at run-time, and the system minimizes the number of sensors used.

Dynamic Sensor Selection Example A cluster of active sensors that achieves the desired classification accuracy is selected first (cluster size = D). All subclusters of size D-1 must still achieve the desired accuracy.

Dynamic Sensor Selection Example When a node fails, they first test whether the remaining nodes fulfill this condition (every subcluster of size D-1 must achieve the desired accuracy).

Dynamic Sensor Selection Example If not, all clusters of size D+1 that can be built by adding one idle node to the given cluster are tested, and the one that achieves the best performance is selected.

Dynamic Sensor Selection Example If the condition is still not fulfilled, the process is repeated until a cluster fulfills it or no idle nodes are left. In the latter case the system is no longer able to achieve the desired performance.
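The repair procedure from the preceding slides can be sketched as follows. The helper accuracy_of(cluster) is hypothetical: it stands in for whatever estimate of a candidate cluster's classification accuracy the system uses, which the slides do not specify.

from itertools import combinations

def meets_condition(cluster, accuracy_of, target):
    """The cluster and all its subclusters of size D-1 must reach the target."""
    if accuracy_of(cluster) < target:
        return False
    if len(cluster) <= 1:
        return True
    return all(accuracy_of(sub) >= target
               for sub in combinations(cluster, len(cluster) - 1))

def repair_cluster(cluster, idle_nodes, accuracy_of, target):
    """After a node failure, grow the cluster one idle node at a time."""
    cluster, idle = list(cluster), list(idle_nodes)
    while not meets_condition(tuple(cluster), accuracy_of, target):
        if not idle:
            return None  # desired accuracy is no longer achievable
        # Test every cluster built by adding one idle node; keep the best one.
        best = max(idle, key=lambda n: accuracy_of(tuple(cluster) + (n,)))
        cluster.append(best)
        idle.remove(best)
    return cluster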

Contents: Introduction, System Details, Evaluation (Evaluation of Activity Recognition Performance, Network Lifetime), Conclusion

Evaluation of Activity Recognition Performance

Evaluation of Activity Recognition Performance Purpose: evaluate the classification performance as a function of the number of nodes. The authors performed a set of experiments using 19 nodes placed on the two arms of a test subject and applied their algorithm to clusters of nodes of increasing size (one to 19 nodes). For each size, they created 200 clusters from randomly selected sensor nodes, and the average, maximum, and minimum classification accuracy was recorded.
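A sketch of this evaluation loop, again with the hypothetical accuracy_of helper standing in for a full train-and-test run on one cluster:

import random

N_NODES, N_CLUSTERS = 19, 200

def evaluate(accuracy_of, nodes=range(N_NODES)):
    """For each cluster size, sample 200 random clusters and record the accuracy spread."""
    results = {}
    for size in range(1, N_NODES + 1):
        accs = [accuracy_of(tuple(random.sample(list(nodes), size)))
                for _ in range(N_CLUSTERS)]
        results[size] = (sum(accs) / len(accs), min(accs), max(accs))
    return results  # size -> (average, minimum, maximum) classification accuracy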

Evaluation of Activity Recognition Performance Correct classification ratio among random cluster as a function of cluster size

Network Lifetime Dynamic sensor selection scheme vs all sensors (90% minimum correct classification ratio)

Network Lifetime Dynamic sensor selection scheme vs all sensors (85% minimum correct classification ratio)

Network Lifetime Dynamic sensor selection scheme vs all sensors (80% minimum correct classification ratio)

Network Lifetime Network life as a function of the minimum accuracy required

Network Lifetime Evolution of the network: on the left, in dark, the active nodes; on the right, the number of active nodes. (A) 80% minimum accuracy. (B) 90% minimum accuracy.

Contents: Introduction, System Details, Evaluation, Conclusion (Pros, Cons)

Conclusion Energy-aware design aims to extend sensor node life by using low-power devices and power-aware applications. The authors' method minimizes the number of nodes necessary to achieve a given classification ratio.

Conclusion 1. System Level Pros The network lifetime could be extended while a given classification ratio was satisfied, and the lifetime is far better than when all sensors are used. Because the data mining is performed in parallel on each node, the approach fits the characteristics of sensor networks well: parallel processing suits a sensor network in which each sensor has limited resources.

Conclusion 1. System Level Cons The assumption, made when computing the naïve Bayes classifier, that P(C=c) is the same for all classes lacks credibility. The experiments treated all classes as equally probable, but depending on the situation the user is in and other conditions, the class probabilities may well differ. If these probabilities were estimated in advance and included in the computation, the computational load would grow, but the accuracy could be expected to improve. The independence assumption on (a1, a2, ...) in the naïve Bayes computation is also questionable: the classifier was run under the assumption that the outputs a1, a2, ... produced by the individual sensors are independent, but assuming that the results obtained by sensors simultaneously analyzing the same motion are independent does not seem appropriate.

Conclusion 1. System Level Cons Further discussion is needed on how new nodes are added in dynamic sensor selection. In this paper, when a new node is added to a cluster, all combinations are tried and the best-performing one is selected. Since this can cause overhead when executed in real time, one could instead assign a priority to each node in advance and add new nodes according to that priority. If the focus is on extending the network lifetime further, assigning the idle node with the most remaining battery should also be considered.

Conclusion 2. Literature Level Pros Among existing data mining techniques, a highly reliable one was selected as the classifier. Cons The comparison with other algorithms is insufficient: the paper compares its selection scheme only against using all nodes. To better support the claim of extending network lifetime, a direct comparison with other lifetime-extension approaches is needed.