Data Mining: Neural Network Applications
by Louise Francis, CAS Convention, Nov 13, 2001
Francis Analytics and Actuarial Data Mining, Inc.
louise_francis@msn.com
www.francisanalytics.com
Objectives of Presentation
- Introduce insurance professionals to neural networks
- Show that neural networks are a lot like some conventional statistics
- Indicate where the use of neural networks might be helpful
- Show practical examples of using neural networks
- Show how to interpret neural network models
Conventional Statistics: Regression
- One of the most common methods is linear regression
- Models the relationship between two variables by fitting a straight line through the points
A Common Actuarial Example: Trend Estimation
The Severity Trend Model
Severity_t = Constant * (1 + trend)^(t - t0)
log(Severity_t) = log(Constant) + (t - t0) * log(1 + trend) + error
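Because the model is linear in log(severity), the trend can be recovered with ordinary least squares. A minimal pure-Python sketch (the helper name and synthetic data are illustrative, not from the presentation):

```python
import math

def fit_trend(years, severities):
    """Least-squares fit of log(severity) = a + b*t.

    The slope b estimates log(1 + trend), so the annual trend rate
    is exp(b) - 1. Illustrative helper, not the presentation's code.
    """
    ys = [math.log(s) for s in severities]
    n = len(years)
    t_bar = sum(years) / n
    y_bar = sum(ys) / n
    b = sum((t - t_bar) * (y - y_bar) for t, y in zip(years, ys)) / \
        sum((t - t_bar) ** 2 for t in years)
    a = y_bar - b * t_bar
    return a, math.exp(b) - 1.0  # intercept on log scale, estimated trend

# Synthetic example: base severity 1000 growing at a 5% annual trend
years = list(range(10))
sev = [1000 * 1.05 ** t for t in years]
intercept, trend = fit_trend(years, sev)
```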
Solution to Regression Problem
Neural Networks
- Also minimize the squared deviation between fitted and actual values
- Can be viewed as a non-parametric, non-linear regression
The MLP Neural Network
The Activation Function: the sigmoid (logistic) function
The Logistic Function
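As a concrete reference, the logistic activation is only a few lines of code (an illustrative sketch; the presentation itself contains no code):

```python
import math

def logistic(x):
    """Sigmoid activation: maps any real input into the interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))
```

It is symmetric about 0, where it takes the value 0.5, and saturates toward 0 and 1 in the tails.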
Simple Trend Example: One Hidden Node
Logistic Function of Simple Trend Example
Fitting the Curve
Typically uses a procedure that minimizes the squared error, just as regression does
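For the one-hidden-node trend example, that squared-error minimization can be sketched with plain gradient descent. This is a toy implementation, not the software used in the presentation; the learning rate, epoch count, and initialization are assumptions:

```python
import math
import random

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_one_node(xs, ys, lr=0.1, epochs=2000):
    """Gradient-descent fit of y ~ w2 * logistic(w1*x + b1) + b2.

    Minimal sketch of a one-hidden-node MLP minimizing squared error;
    all parameter names and hyperparameters are illustrative.
    """
    random.seed(0)
    w1, b1, w2, b2 = [random.uniform(-0.5, 0.5) for _ in range(4)]
    n = len(xs)
    for _ in range(epochs):
        gw1 = gb1 = gw2 = gb2 = 0.0
        for x, y in zip(xs, ys):
            h = logistic(w1 * x + b1)          # hidden-node output
            err = (w2 * h + b2) - y            # residual
            gw2 += 2 * err * h                 # d(err^2)/d(w2)
            gb2 += 2 * err
            gw1 += 2 * err * w2 * h * (1 - h) * x
            gb1 += 2 * err * w2 * h * (1 - h)
        w1 -= lr * gw1 / n; b1 -= lr * gb1 / n
        w2 -= lr * gw2 / n; b2 -= lr * gb2 / n
    return lambda x: w2 * logistic(w1 * x + b1) + b2
```

With more hidden nodes, the same update rules apply per node, which is how the 2- and 3-node trend examples refine the fit.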
Trend Example: 1 Hidden Node
Trend Example: 2 Hidden Nodes
Trend Example: 3 Hidden Nodes
Universal Function Approximator
- The backpropagation neural network with one hidden layer is a universal function approximator
- Theoretically, with a sufficient number of nodes in the hidden layer, any nonlinear function can be approximated
How Many Hidden Nodes?
- Too few nodes: don't fit the curve very well
- Too many nodes: over-parameterization; may fit the noise as well as the pattern
How Do We Determine the Number of Hidden Nodes?
- Hold out part of the sample
- Cross-validation
- Resampling
  - Bootstrapping
  - Jackknifing
- Algebraic formula
Hold Out Part of the Sample
- Fit the model on 1/2 to 2/3 of the data
- Test the fit of the model on the remaining data
- Requires a large sample
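The holdout procedure amounts to a shuffled split. A small sketch (function name and seed are illustrative):

```python
import random

def holdout_split(data, train_frac=0.5, seed=0):
    """Shuffle and split into train/test sets.

    The slides suggest fitting on 1/2 to 2/3 of the data and testing
    on the rest; train_frac controls that proportion.
    """
    rng = random.Random(seed)
    idx = list(range(len(data)))
    rng.shuffle(idx)
    cut = int(len(data) * train_frac)
    train = [data[i] for i in idx[:cut]]
    test = [data[i] for i in idx[cut:]]
    return train, test
```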
Cross-Validation
- Hold out 1/n (say 1/10) of the data
- Fit the model to the remaining data
- Test on the portion of the sample held out
- Do this n (say 10) times and average the results
- Used for moderate sample sizes
- Jackknifing is similar to cross-validation
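The steps above can be sketched as a generic n-fold loop. The `fit` and `score` callables stand in for whatever model-fitting and goodness-of-fit routines are used; all names here are illustrative:

```python
def kfold_cv(data, fit, score, n_folds=10):
    """n-fold cross-validation: hold out 1/n of the data, fit on the
    rest, score on the held-out fold, and average the n scores.

    `fit(train)` returns a model; `score(model, fold)` returns a number.
    Folds are taken by striding for determinism; real use would
    usually shuffle the data first.
    """
    folds = [data[i::n_folds] for i in range(n_folds)]
    scores = []
    for i in range(n_folds):
        train = [x for j, f in enumerate(folds) if j != i for x in f]
        scores.append(score(fit(train), folds[i]))
    return sum(scores) / n_folds
</```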
Bootstrapping
- Create many samples by drawing observations, with replacement, from the original data
- Fit the model to each of the samples
- Measure overall goodness of fit and create a distribution of results
- Used for small and moderate sample sizes
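The resampling step looks like this in a pure-Python sketch (the `statistic` callable stands in for whatever fit measure is bootstrapped; names are illustrative):

```python
import random

def bootstrap(data, statistic, n_samples=1000, seed=0):
    """Draw resamples with replacement, compute `statistic` on each,
    and return the resulting distribution of results."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_samples):
        sample = [rng.choice(data) for _ in data]  # same size, with replacement
        results.append(statistic(sample))
    return results

# e.g. a bootstrap distribution of the sample mean:
# dist = bootstrap([1, 2, 3, 4, 5], lambda s: sum(s) / len(s))
```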
Jackknife Result
Result for Sample Hold Out
Interpreting a Complex Multi-Variable Model
- How many hidden nodes?
- Which variables should the analyst keep?
Measuring Variable Importance
- Look at the weights to the hidden layer
- Compute sensitivities: a measure of how much the predicted value's error increases when each variable is excluded from the model, one at a time
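One common way to approximate "excluding" a variable without refitting is to neutralize it by fixing it at its column mean and measuring how much the squared error grows. A sketch of that sensitivity calculation (an assumed proxy, not necessarily the exact procedure used in the presentation):

```python
def sensitivities(model, X, y):
    """Rank inputs by the squared-error increase when each variable is
    set to its column mean, one at a time. `model` maps a row (list of
    feature values) to a prediction; larger score = more important."""
    def sq_err(rows):
        return sum((model(r) - t) ** 2 for r, t in zip(rows, y))
    base = sq_err(X)
    n_vars = len(X[0])
    means = [sum(r[j] for r in X) / len(X) for j in range(n_vars)]
    scores = []
    for j in range(n_vars):
        neutralized = [r[:j] + [means[j]] + r[j + 1:] for r in X]
        scores.append(sq_err(neutralized) - base)
    return scores
```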
Technical Predictors of Stock Price A Complex Multivariate Example
Stock Prediction: Which Indicator is Best?
- Moving averages
- Measures of volatility
- Seasonal indicators
  - The January effect
- Oscillators
The Data
- S&P Index since 1930: close only
- S&P 500 since 1962: open, high, low, close
Moving Averages
- A very commonly used technical indicator
- 1-week MA of returns
- 2-week MA of returns
- 1-month MA of returns
- These are trend-following indicators
- A more complicated time-series smoother based on running medians, called T4253H
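A trailing moving average of returns is straightforward to compute; a minimal sketch (the window lengths above would be 5, 10, and roughly 21 trading days):

```python
def moving_average(returns, window):
    """Trailing moving average over the last `window` returns.
    Positions without a full window are returned as None."""
    out = [None] * (window - 1)
    for i in range(window - 1, len(returns)):
        out.append(sum(returns[i - window + 1:i + 1]) / window)
    return out
```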
Volatility Measures
- The finance literature suggests that the volatility of the market changes over time
- A more turbulent market implies higher volatility
- Measures: standard deviation of returns, range of returns, and moving averages of the above
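The two volatility measures named above can be sketched as rolling-window calculations (illustrative helpers, not the presentation's code):

```python
import statistics

def rolling_std(returns, window):
    """Rolling sample standard deviation of returns: a simple
    volatility proxy over each trailing window."""
    return [statistics.stdev(returns[i - window + 1:i + 1])
            for i in range(window - 1, len(returns))]

def rolling_range(returns, window):
    """High-minus-low range of returns over each trailing window."""
    return [max(returns[i - window + 1:i + 1]) -
            min(returns[i - window + 1:i + 1])
            for i in range(window - 1, len(returns))]
```

Smoothed versions follow by applying a moving average to either output.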
Seasonal Effects
Oscillators
- May indicate that the market is overbought or oversold
- May indicate that a trend is nearing completion
- Some oscillators: moving average differences, stochastic
Stochastic
- Based on the observation that as prices increase, closing prices tend to be closer to the upper end of the range
- In downtrends, closing prices are near the lower end of the range
- %K = (C - L5)/(H5 - L5), where C is the closing price, L5 is the 5-day low, and H5 is the 5-day high
- %D = 3-day moving average of %K
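The %K and %D formulas translate directly into code; a sketch, assuming parallel lists of daily closes, highs, and lows (the oscillator is often scaled by 100, omitted here to match the slide's formula):

```python
def stochastic_oscillator(closes, highs, lows, lookback=5, d_window=3):
    """%K = (C - L_n) / (H_n - L_n) over an n-day lookback,
    %D = moving average of %K, following the slide's formulas."""
    k = []
    for i in range(lookback - 1, len(closes)):
        lo = min(lows[i - lookback + 1:i + 1])   # n-day low
        hi = max(highs[i - lookback + 1:i + 1])  # n-day high
        k.append((closes[i] - lo) / (hi - lo))
    d = [sum(k[i - d_window + 1:i + 1]) / d_window
         for i in range(d_window - 1, len(k))]
    return k, d
```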
Neural Network Result
Variable importance:
1. Month
2. %K (from stochastic)
3. Smoothed standard deviation
4. Smoothed return
5. 2-week %D (from stochastic)
6. 1-week range of returns
7. Smoothed %K
R² was 0.15, i.e., 15% of the variance explained
What are the Relationships between the Variables?
Neural Network Result for Seasonality
Neural Network Result for Oscillator
Neural Network Result for Seasonality and Oscillator
Neural Network Result for Seasonality and Standard Deviation
How Many Nodes?
Conclusions
- Neural networks are a lot like conventional statistics
- They address some problems of conventional statistics: nonlinear relationships, correlated variables, and interactions
- Despite their black-box aspect, we can now interpret them
- Further information, including the paper, is at www.casact.org/aboutcas/mdiprize.htm
- The paper and presentation can be found at www.francisanalytics.com