Forecasting Wavelet Transformed Time Series with Attentive Neural Networks
Yi Zhao¹, Yanyan Shen*¹, Yanmin Zhu¹, Junjie Yao²
¹Shanghai Jiao Tong University  ²East China Normal University
ICDM 2018
Outline
Motivation · Preliminaries · Model · Experiments · Conclusion
Motivation
Forecasting complex time series (e.g., stock prices, web traffic) demands both time-domain and frequency-domain information.
Various methods extract local time-frequency features, which are important for predicting future values: the Fourier Transform, the Short-Time Fourier Transform, and the Wavelet Transform.
Our idea: use the varying global trend to identify the most salient parts of the local time-frequency information, and thereby better predict future values.
Preliminaries
Problem Statement
Given a time series X = {x_t | t = 1, 2, …, T}, predict x_{T+k} (k ∈ ℤ⁺), the value k steps ahead, via a function f: x_{T+k} = f(X).
Wavelets
Given a basic wavelet function h(·), the wavelets are: h_{a,b}(t) = (1/√a) · h((t − b)/a), where a is the scale and b is the shift.
Continuous Wavelet Transform (CWT)
The continuous wavelet transform measures the "similarity" between the signal x(t) and the basis function h_{a,b}(t): CWT_x(a, b) = (1/√a) ∫ x(t) · h*((t − b)/a) dt, where h* denotes the complex conjugate of h.
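As a concrete illustration of the CWT formula above, here is a minimal NumPy sketch (not the authors' code) that approximates a single coefficient CWT_x(a, b) by a Riemann sum over the sampling grid; the Ricker ("Mexican hat") wavelet is an assumed example choice of h(·):

```python
import numpy as np

def ricker(t):
    # Ricker ("Mexican hat") wavelet, used here as an example basic wavelet h(t).
    return (1.0 - t**2) * np.exp(-t**2 / 2.0)

def cwt_coefficient(x, t, a, b):
    # Discretized CWT_x(a, b) = (1/sqrt(a)) * integral x(t) h*((t-b)/a) dt,
    # approximated by a Riemann sum on the sampling grid t.
    dt = t[1] - t[0]
    h = ricker((t - b) / a)        # Ricker is real-valued, so h* = h
    return np.sum(x * h) * dt / np.sqrt(a)

# Usage: one coefficient of a pure sinusoid at a single (scale, shift) pair.
t = np.linspace(0.0, 10.0, 1000)
x = np.sin(2 * np.pi * t)
c = cwt_coefficient(x, t, a=0.5, b=5.0)
```

Because the transform is linear, scaling the signal scales the coefficient by the same factor, which is a quick sanity check for the discretization.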
Model Overview
1. Input time series → 2. Scalogram → 3. CNN feature extraction → 4. Attention Module → 5. Fusion & Prediction
Preprocessing
Given the input time series X, we denote by CWT_x(a, b) the wavelet transform coefficient matrix. The scalogram X_s is defined as: X_s = ||CWT_x(a, b)||²
Source: Wavelet Tutorial by Robi Polikar
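The scalogram definition above can be sketched as follows; this is a minimal NumPy example (not the authors' code), assuming a Ricker wavelet as h(·) and a Riemann-sum discretization of the CWT over a grid of scales and shifts:

```python
import numpy as np

def ricker(t):
    # Example basic wavelet h(t); the choice of wavelet is an assumption here.
    return (1.0 - t**2) * np.exp(-t**2 / 2.0)

def scalogram(x, t, scales, shifts):
    # X_s[i, j] = |CWT_x(a_i, b_j)|^2 on a grid of scales a_i and shifts b_j.
    dt = t[1] - t[0]
    Xs = np.empty((len(scales), len(shifts)))
    for i, a in enumerate(scales):
        for j, b in enumerate(shifts):
            coef = np.sum(x * ricker((t - b) / a)) * dt / np.sqrt(a)
            Xs[i, j] = coef**2   # squared magnitude of the coefficient
    return Xs

# Usage: a two-tone signal, 16 log-spaced scales, 20 shifts.
t = np.linspace(0.0, 10.0, 500)
x = np.sin(2 * np.pi * t) + 0.5 * np.sin(8 * np.pi * t)
Xs = scalogram(x, t, scales=np.geomspace(0.1, 2.0, 16), shifts=t[::25])
```

The resulting matrix is what the model treats as an image and feeds to the CNN in step 3.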
[Architecture figure: the input series x_1, …, x_t, …, x_T is fed step by step into an LSTM, producing hidden states h_1, …, h_{T−1}, h_T; a VGG network produces local time-frequency features c_1, …, c_{C−1}, c_C from the scalogram; the AttentionNet combines the VGG output features with h_T in the Attention Module, and Fusion & Prediction outputs x_{T+k}.]
Model
CNN: extract local time-frequency features
Feed the scalogram X_s to a stack of convolution layers: X_c^(l) = σ(W_c^(l) ∗ X_s + b_c^(l))
LSTM: learn the global long-term trend and take the hidden state h_T at the last step
Attention module: dynamically discriminate the importance of local features
Given the time-frequency features X_c = {c_i | i ∈ [1, C]} and h_T:
Attention score: e_i = f_att(c_i, h_T) = vᵀ tanh(W_a [c_i ; h_T] + b_a); α_i = exp(e_i) / Σ_{j=1}^{C} exp(e_j)
Weighted sum of local time-frequency features: z = Σ_{i=1}^{C} α_i c_i
Fusion & Prediction: combine local and global features for prediction: x̂_{T+k} = w_fᵀ [z ; h_T] + b_f
Objective Function
Squared loss: L_mse = Σ_{i=1}^{N} (x̂_{t+k}^(i) − x_{t+k}^(i))² + λ ||W||²
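The attention and fusion steps above can be sketched in NumPy; this is a minimal illustration (not the authors' implementation), and the dimensions and random weights are assumptions for demonstration only:

```python
import numpy as np

rng = np.random.default_rng(0)
C, d_c, d_h, d_a = 6, 4, 8, 5      # illustrative: feature count and dimensions

c = rng.normal(size=(C, d_c))      # local time-frequency features from the CNN
h_T = rng.normal(size=d_h)         # last LSTM hidden state (global trend)

# Additive attention score: e_i = v^T tanh(W_a [c_i ; h_T] + b_a)
W_a = rng.normal(size=(d_a, d_c + d_h))
b_a = np.zeros(d_a)
v = rng.normal(size=d_a)
e = np.array([v @ np.tanh(W_a @ np.concatenate([c_i, h_T]) + b_a) for c_i in c])

# Softmax normalization: alpha_i = exp(e_i) / sum_j exp(e_j)
alpha = np.exp(e - e.max())
alpha /= alpha.sum()

# Weighted sum of local features: z = sum_i alpha_i c_i
z = alpha @ c

# Fusion & prediction: x_hat = w_f^T [z ; h_T] + b_f
w_f = rng.normal(size=d_c + d_h)
b_f = 0.0
x_hat = w_f @ np.concatenate([z, h_T]) + b_f
```

The softmax guarantees that the attention weights are non-negative and sum to one, so z is a convex combination of the local features, steered by the global trend in h_T.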
Datasets
Stock opening prices
Collected from Yahoo! Finance: daily opening prices of 50 stocks across 10 sectors from 2007 to 2016; each stock has 2,518 daily opening prices.
Daily opening prices from 2007 to 2014 are used as training data, and those in 2015 and 2016 are used for validation and testing, respectively.
Power consumption
Electric power consumption of one household over 4 years, sampled at a one-minute rate; 475,023 data points in year 2010.
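A chronological split like the one described for the stock data (earlier years for training, the next period for validation, the last for testing) can be sketched as follows; the fractions and the `chronological_split` helper are hypothetical, chosen only to mirror the roughly 8/1/1-year layout:

```python
def chronological_split(series, train_frac=0.8, val_frac=0.1):
    # Split a time-ordered series into contiguous train/val/test segments,
    # never shuffling, so no future information leaks into training.
    n = len(series)
    i = int(n * train_frac)
    j = int(n * (train_frac + val_frac))
    return series[:i], series[i:j], series[j:]

# Usage: a stand-in for one stock's 2,518 daily opening prices.
prices = list(range(2518))
train, val, test = chronological_split(prices)
```

Keeping the segments contiguous and in time order is what makes the evaluation a genuine forecast rather than an interpolation.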
Main Results
Metric
Mean Squared Error: MSE = (1/N) Σ_{i=1}^{N} (x̂_{t+k}^(i) − x_{t+k}^(i))²
Baselines
Naïve: take the last value in the series as the predicted value.
Ensemble of LSTM & CNN: feed the concatenation of the VGGnet features and the last LSTM hidden state directly into fusion and prediction.
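The metric and the naïve baseline above are simple enough to sketch directly; this is an illustrative snippet, not the paper's evaluation code:

```python
import numpy as np

def mse(y_true, y_pred):
    # MSE = (1/N) * sum_i (x_hat_i - x_i)^2
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.mean((y_pred - y_true) ** 2))

def naive_forecast(series):
    # Naive baseline: predict the last observed value for any future step.
    return series[-1]

# Usage: forecast one step ahead and score it.
series = [10.0, 11.0, 12.0, 13.0]
pred = naive_forecast(series)   # -> 13.0
error = mse([14.0], [pred])     # (13 - 14)^2 / 1 = 1.0
```

Because the naïve baseline costs nothing to compute, it is a useful floor: any learned model should beat it on MSE to justify its complexity.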
Case Study
Illustration of the attention mechanism
Given an input of 20 stock prices, we show the scalogram and the attention weights. The model attends to the local features that are similar to the global trend, which helps in predicting the future value.
Conclusion
The wavelet transform explicitly discloses the latent components at different frequencies in a complex time series.
We develop a novel attention-based neural network that leverages a CNN to extract local time-frequency features and simultaneously applies an LSTM to capture the long-term global trend.
Experimental results on two real-life datasets verify the usefulness of the time-frequency information in wavelet-transformed time series and the prediction accuracy of our method.
Thank you! Q&A