Learning Markov-Chain Embedded Recurrence Relations for Chaotic Time Series Analysis
Jiann-Ming Wu, Ya-Ting Zhou, Chun-Chang Wu
Department of Applied Mathematics, National Dong Hwa University, Hualien, Taiwan

Outline
- Introduction
- High-order Markov processes for stochastic modeling
- Nonlinear recurrence relations for deterministic modeling
- Recurrence relation approximation by supervised learning of radial or projective basis functions
- Markov-chain embedded recurrence relations
- Numerical simulations
- Conclusions

High-order Markov assumption
Let $z[t]$ denote the time series, where $t$ ranges over the positive integers. The high-order Markov assumption states that the given chaotic time series originates from a generative source well characterized by a high-order Markov process. An order-$\tau$ Markov process obeys a memoryless property: the current event depends only on the $\tau$ most recent events instead of all historic events,
$$P(z[t] \mid z[t-1], z[t-2], \ldots) = P(z[t] \mid z[t-1], \ldots, z[t-\tau]).$$

Recurrence relation
The conditional expectation of an upcoming event given the $\tau$ most recent events is expressed by a recurrence relation,
$$E\big[z[t] \mid z[t-1], \ldots, z[t-\tau]\big] = F\big(z[t-1], \ldots, z[t-\tau]\big).$$

Recurrence relation for time series modeling
Each time tag $t$ defines a predictor $x[t] = (z[t-1], \ldots, z[t-\tau])$ and a target $y[t] = z[t]$, so that modeling reduces to approximating $y[t] \approx F(x[t])$; a sketch of pair formation follows below.
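As a concrete illustration, a minimal numpy sketch of forming (predictor, target) pairs under an order-$\tau$ Markov assumption; the function name make_pairs and the argument tau are illustrative, not from the slides.

```python
import numpy as np

def make_pairs(z, tau):
    """Form (predictor, target) pairs from a scalar series z under an
    order-tau Markov assumption. Each row of X holds the tau values
    preceding the target, oldest first; y holds the next value."""
    z = np.asarray(z, dtype=float)
    X = np.stack([z[i : len(z) - tau + i] for i in range(tau)], axis=1)
    y = z[tau:]
    return X, y
```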

[Figures: Mackey-Glass (delay 30) chaotic time series; chaotic laser data from the SFI competition]

Recurrence relation approximation
Neural networks are trained to approximate the underlying recurrence relation: $F(\cdot; \theta)$ denotes a mapping realized by radial or projective basis functions, and $\theta$ denotes the adaptive network parameters.

Recurrence relation approximation
- Form paired predictors and targets by assigning $x[t] = (z[t-1], \ldots, z[t-\tau])$ and $y[t] = z[t]$.
- Define the mean square error of the approximation, $E(\theta) = \frac{1}{N} \sum_t \big(y[t] - F(x[t]; \theta)\big)^2$.
- Apply Levenberg-Marquardt learning to resolve the unconstrained optimization $\min_\theta E(\theta)$ (see the sketch below).
- Apply the proposed pair-data generative model to formulate $F$.
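A minimal sketch of this step, assuming a normalized-RBF form for $F$ (formalized on a later slide) and using SciPy's Levenberg-Marquardt solver as a stand-in for the slides' learning procedure; the names nrbf and fit_lm and the log-width parameterization are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

def nrbf(theta, X, K):
    """Normalized radial basis function network F(x; theta).
    theta packs K centers mu_k (K*d values), one log-width, and K local
    target means r_k -- an assumed parameterization, not the slides'."""
    n, d = X.shape
    mu = theta[: K * d].reshape(K, d)
    beta = np.exp(theta[K * d])                   # positive width via log-param
    r = theta[K * d + 1 :]
    a = -beta * ((X[:, None, :] - mu[None, :, :]) ** 2).sum(axis=2)
    a -= a.max(axis=1, keepdims=True)             # stabilize the normalization
    w = np.exp(a)
    w /= w.sum(axis=1, keepdims=True)             # normalized RBF weights
    return w @ r                                  # estimate of E[y | x]

def fit_lm(X, y, K=10, seed=0):
    """Levenberg-Marquardt minimization of the mean square error."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=K, replace=False)
    theta0 = np.concatenate([X[idx].ravel(), [0.0], y[idx]])  # data-driven init
    res = least_squares(lambda th: nrbf(th, X, K) - y, theta0, method="lm")
    return res.x
```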

Pair-data generative model (PGM): a mixture of K sub-models.

Mixtures of paired Gaussians
A stochastic model for emulating the formation of the given paired data: at each step, one of the K joined Gaussian pairs is selected according to a set of prior probabilities, and the selected paired Gaussians then generate a (predictor, target) pair; a sampling sketch follows below.
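A minimal sampling sketch of the PGM under stated assumptions (isotropic predictor Gaussians with scalar variances; all parameter names are illustrative):

```python
import numpy as np

def sample_pgm(priors, mu_x, var_x, r_y, var_y, n, seed=0):
    """Draw n (x, y) pairs from a mixture of K paired Gaussians.
    priors: (K,) mixing probabilities; mu_x: (K, d) predictor local
    means; r_y: (K,) target local means; var_x, var_y: (K,) variances."""
    rng = np.random.default_rng(seed)
    K, d = mu_x.shape
    ks = rng.choice(K, size=n, p=priors)                # pick a sub-model per pair
    x = mu_x[ks] + np.sqrt(var_x[ks])[:, None] * rng.standard_normal((n, d))
    y = r_y[ks] + np.sqrt(var_y[ks]) * rng.standard_normal(n)
    return x, y, ks
```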

Exclusive memberships
Each pair is generated by exactly one sub-model. Let $\delta[t] \in \{e_1, \ldots, e_K\}$ denote the exclusive membership of the pair $(x[t], y[t])$, where $e_i$ denotes a unitary vector with the $i$th bit active. Under exclusive membership, the conditional expectation of $y$ given $x$ is
$$E[y \mid x] = \sum_{k=1}^{K} P(\delta_k = 1 \mid x)\, r_k,$$
where $r_k$ denotes the local mean of the target variable for sub-model $k$.

Overlapping memberships
A Potts random variable is applied to encode overlapping membership. The probability of being in the $k$th state is set to
$$P(\delta_k = 1 \mid x) = \frac{\exp\big(-\beta \|x - \mu_k\|^2\big)}{\sum_{j=1}^{K} \exp\big(-\beta \|x - \mu_j\|^2\big)},$$
where $\beta$ modulates the overlapping degree and $\mu_k$ denotes the local mean of the predictor for sub-model $k$.

Normalized radial basis functions (NRBF)
Substituting the Potts state probabilities into the conditional expectation yields exactly a mapping realized by normalized radial basis functions,
$$F(x) = \sum_{k=1}^{K} \frac{\exp\big(-\beta \|x - \mu_k\|^2\big)}{\sum_{j} \exp\big(-\beta \|x - \mu_j\|^2\big)}\, r_k.$$
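A usage sketch tying the pieces together, assuming the helpers sketched earlier (make_pairs, fit_lm, nrbf) and some scalar chaotic series z:

```python
# z is any scalar chaotic series (e.g., Mackey-Glass samples)
X, y = make_pairs(z, tau=6)       # predictors and targets
theta = fit_lm(X, y, K=10)        # Levenberg-Marquardt training
y_hat = nrbf(theta, X, K=10)      # NRBF estimate of E[y | x]
```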

[Figure 4]

[Figure 9: Mackey-Glass (delay 17) chaotic time series data]

Multiple recurrence relations
Multiple recurrence relations are employed for modeling more complex chaotic time series.
[Figure: chaotic laser data from the SFI competition]

Markov-chain embedded recurrence relations
A Markov chain of PGMs (pair-data generative models): the transition matrix entry $a_{ij}$ denotes the probability of a transition from model $i$ to model $j$.

Data generation
Data generation is emulated by a stochastic Markov chain of PGMs, as sketched below.
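A generation sketch under stated assumptions: each PGM is reduced to its learned recurrence relation, phases have a fixed length seg_len, and A is the row-stochastic transition matrix; these names and the fixed phase length are illustrative, not from the slides.

```python
import numpy as np

def generate(A, models, z0, phases, seg_len, seed=0):
    """Emulate data generation by a Markov chain of PGMs.
    models[m] is a callable x -> next value (the m-th recurrence
    relation, oldest-first predictor as in make_pairs); A[m] gives
    the transition probabilities out of state m."""
    rng = np.random.default_rng(seed)
    z = list(z0)                            # seed window of length tau
    tau = len(z0)
    m = 0                                   # initial model/state
    for _ in range(phases):
        for _ in range(seg_len):            # run the active recurrence relation
            z.append(models[m](np.array(z[-tau:])))
        m = rng.choice(len(models), p=A[m])  # Markov transition
    return np.array(z)
```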

Inverse problem of Markov-chain embedded PGMs
Given only the observed time series, recover the constituent PGMs and the transition probabilities among them.

Segmentation for phase change
A time tag is regarded as a switching point if its moving-average approximation error is greater than a threshold value (sketch below).
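A minimal sketch, assuming err holds the per-tag approximation error of the currently fitted model; window and threshold are tuning parameters not specified on the slides.

```python
import numpy as np

def switching_points(err, window, threshold):
    """Flag time tags whose moving-average error exceeds a threshold.
    err: (T,) per-tag approximation error of the current model."""
    kernel = np.ones(window) / window
    ma = np.convolve(err, kernel, mode="same")   # moving-average error
    return np.flatnonzero(ma > threshold)        # candidate switching points
```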

A simple rule for merging two PGMs
Let $E_{i,j}$ denote the goodness of fitting the $i$th PGM to the paired data in segment $S_j$. Segments $S_i$ and $S_j$ are regarded as coming from the same PGM, and the two PGMs are merged, if $(E_{i,j} + E_{j,i})/2$ is less than a threshold value (sketch below).
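A sketch of the merging test, assuming $E_{i,j}$ is realized as the mean square error of PGM $i$'s recurrence relation on segment $S_j$'s pairs; the slides do not pin down the goodness-of-fit measure, so this choice is an assumption.

```python
import numpy as np

def merge_test(F_i, F_j, Si, Sj, threshold):
    """Symmetric cross-fitting test for merging two PGMs.
    F_i, F_j: callables X -> predictions; Si, Sj: (X, y) pair arrays."""
    Xi, yi = Si
    Xj, yj = Sj
    E_ij = np.mean((F_i(Xj) - yj) ** 2)   # fit model i to segment j
    E_ji = np.mean((F_j(Xi) - yi) ** 2)   # fit model j to segment i
    return (E_ij + E_ji) / 2 < threshold
```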

Numerical simulations – synthetic data

Temporal sequence generated by MC-embedded PGMs

Numerical results – original and reconstructed MC-embedded PGMs

[Figure: chaotic laser time series from the SFI competition and the time series generated by the learned Markov-chain embedded recurrence relations]
Learning settings: M = 60, [K, ·, ·, N_0] = [10, 10, 0.001, 500].

Conclusions
- This work has presented learning of Markov-chain embedded recurrence relations for complex time series analysis.
- Levenberg-Marquardt supervised learning of neural networks has been shown to be effective for extracting the essential recurrence relation underlying a given time series.
- Markov-chain embedded recurrence relations are shown to be applicable for characterizing complex chaotic time series.
- The proposed systematic approach integrates pattern segmentation, hidden state absorption, and transition probability estimation based on supervised learning of neural networks.