A Taylor Rule with Monthly Data


A Taylor Rule with Monthly Data
A.G. Malliaris and M.E. Malliaris, Loyola University Chicago

[Figure: Fed Funds Rate, 1957-2005]

[Figure: Unemployment Rate, 1957-2005]

[Figure: CPI, All Items, 12-month logarithmic change rate, Jan 1957 to Nov 2005]

[Figure: CPI, All Items, 1957-2005]

Standard Approaches

Random Walk:
    r_t = α + β*r_t-1 + ε

Taylor Model:
    r_t = α + β1*CPI + β2*(CPI - 2) + β3*(Un - 4) + ε
    (or, r_t = 1 + 1.5*CPI + 0.5*Gap + ε)

Econometric Model:
    r_t = α + β1*r_t-1 + β2*(CPI - 2) + β3*(Un - 4) + ε
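The three specifications can be sketched as ordinary least squares fits. A minimal illustration on synthetic monthly series; the series `cpi`, `un`, and `r` and all generated values are assumptions for demonstration, not the paper's data:

```python
import numpy as np

# Illustrative synthetic data, not the paper's series.
rng = np.random.default_rng(0)
n = 240                                   # 20 years of monthly data
cpi = 2.0 + rng.normal(0, 1.0, n)         # inflation rate, percent
un = 5.0 + rng.normal(0, 0.5, n)          # unemployment rate, percent
r = np.empty(n)
r[0] = 5.0
for t in range(1, n):
    # a persistent rate that partially adjusts toward a Taylor-type target
    target = 1 + 1.5 * cpi[t] + 0.5 * (un[t] - 4)
    r[t] = 0.9 * r[t - 1] + 0.1 * target + rng.normal(0, 0.1)

def ols(X, y):
    """OLS coefficients, with an intercept column prepended to X."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Random Walk: r_t on r_t-1
b_rw = ols(r[:-1, None], r[1:])
# Taylor (simplified form): r_t on CPI and Gap = Un - 4
b_taylor = ols(np.column_stack([cpi[1:], un[1:] - 4]), r[1:])
# Econometric: r_t on r_t-1, (CPI - 2), and Gap
b_econ = ols(np.column_stack([r[:-1], cpi[1:] - 2, un[1:] - 4]), r[1:])
```

The Taylor fit here uses the simplified CPI/Gap form, since regressing on both CPI and (CPI - 2) alongside an intercept would be perfectly collinear in estimation.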

Neural Network Architecture
Input, hidden, and output layers, with a sigmoid function applied to each node's weighted sum of inputs: node output = F(sum of inputs * weights).
[Diagram: network with connection weights w1 through w21.]
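A forward pass through such a network can be sketched as follows. The layer sizes and weight values below are illustrative assumptions, not the weights from the slides:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative weights for a 2-input, 2-hidden, 1-output network.
W1 = np.array([[0.5, -0.3],
               [0.8,  0.2]])   # input -> hidden weights
b1 = np.array([0.1, -0.1])     # hidden biases
W2 = np.array([0.7, -0.4])     # hidden -> output weights
b2 = 0.05                      # output bias

def forward(x):
    h = sigmoid(x @ W1 + b1)       # hidden activations: F(sum inputs*weights)
    return sigmoid(h @ W2 + b2)    # output node, again in (0, 1)

y = forward(np.array([1.0, 2.0]))
```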

Network Process
The neural network adjusts the weights and recalculates the total error. This process continues until some specified stopping point is reached (an error threshold, a training-time limit, or a maximum number of weight changes). The final network is the one with the lowest error among the sets of weights tried during training.
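A minimal sketch of this loop, assuming a toy data set, squared-error loss, and plain gradient descent (none of these settings come from the paper): weights are adjusted, the error is recalculated, training stops at an error threshold or an iteration cap, and the lowest-error weights seen are kept.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (50, 2))                 # toy inputs
y = 0.5 * X[:, 0] - 0.25 * X[:, 1] + 0.3        # toy target

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W1 = rng.normal(0, 0.5, (2, 4)); b1 = np.zeros(4)   # input -> hidden
W2 = rng.normal(0, 0.5, 4);      b2 = 0.0           # hidden -> output
lr, best_err, best = 0.5, np.inf, None

for epoch in range(2000):                       # cap on weight changes
    h = sigmoid(X @ W1 + b1)
    out = h @ W2 + b2                           # linear output node
    err = np.mean((out - y) ** 2)
    if err < best_err:                          # keep lowest-error weights
        best_err, best = err, (W1.copy(), b1.copy(), W2.copy(), b2)
    if err < 1e-4:                              # error threshold reached
        break
    # backpropagate the squared-error gradient, then adjust the weights
    g_out = 2 * (out - y) / len(y)
    g_h = np.outer(g_out, W2) * h * (1 - h)     # through the sigmoid
    W2 -= lr * (h.T @ g_out)
    b2 -= lr * g_out.sum()
    W1 -= lr * (X.T @ g_h)
    b1 -= lr * g_h.sum(axis=0)
```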

Variable Designations
r_t : the Fed Funds rate at time t, the dependent variable
CPI_t-1 : the Consumer Price Index at time t-1
Adjusted CPI_t-1 : CPI minus 2, at time t-1
Un_t-1 : the Unemployment Rate at time t-1
Gap_t-1 : the Unemployment Rate minus 4, at time t-1
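Constructing these lagged variables from raw monthly series might look like this; the series values are made up for illustration:

```python
import numpy as np

# Illustrative monthly readings, not the paper's data.
fed_funds = np.array([5.0, 5.25, 5.5, 5.25, 5.0])
cpi = np.array([3.0, 3.2, 3.1, 2.9, 2.8])
un = np.array([5.5, 5.4, 5.6, 5.7, 5.8])

r_t = fed_funds[1:]          # dependent variable r_t
cpi_lag = cpi[:-1]           # CPI_t-1
adj_cpi = cpi_lag - 2        # Adjusted CPI_t-1 = CPI - 2
un_lag = un[:-1]             # Un_t-1
gap = un_lag - 4             # Gap_t-1 = Un - 4
```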

Variables Per Model
Model          r_t-1   CPI_t-1   Gap_t-1
Random Walk      X
Taylor                    X         X
Econometric      X        X         X
Neural Net       X        X         X

Data Sets
Data Set                          Training   Validation   Total
PreGreenspan (Jan 58 to Jul 87)      319         36        355
Greenspan (Aug 87 to Nov 05)         197         22        219
Low (r_t-1 from 0 to 5)              219         24        243
Medium (r_t-1 from 5.01 to 10)       243         27        270
High (r_t-1 over 10)                  55          6         61

Random Walk
Data Set       Intercept   Coefficient of r_t-1
PreGreenspan     0.177          0.973
Greenspan        0.006          0.995
High             1.481          0.879
Medium           0.021          --
Low              0.022          --

Taylor Equation
Original equation: r_t = 1 + 1.5*CPI + 0.5*Gap
Calculated equations:
Data Set       Intercept    CPI      Gap
PreGreenspan     2.334     0.789    0.296
Greenspan        1.797     1.477   -0.935
High             5.005     0.564    0.910
Medium           5.755     0.197    0.161
Low              2.837     0.496   -0.490
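Applying the original equation from the slide to sample readings (the input values here are illustrative, not observations from the data):

```python
def taylor_rate(cpi, unemployment):
    """Fed Funds rate prescribed by the slide's original Taylor equation."""
    gap = unemployment - 4                 # Gap = Un - 4, per the slides
    return 1 + 1.5 * cpi + 0.5 * gap

# 3% inflation and 5% unemployment prescribe a 6% rate:
r = taylor_rate(cpi=3.0, unemployment=5.0)   # -> 6.0
```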

Econometric Model
Data Set       Intercept   Fed Funds   Adj. CPI     Gap
PreGreenspan     0.291       0.965       0.019    -0.035
Greenspan        0.047       0.994      -0.007    -0.024
High             1.442       0.862       0.066    -0.027
Medium           0.007       1.002      -0.003    -0.019
Low              0.125       0.983       0.018    -0.022

Neural Networks: Significance of Variables
[Chart: relative significance of Fed Funds, CPI, and UnRate in each data set (PreGreenspan, Greenspan, Low, Medium, High).]

Mean Squared Error Comparisons on Validation Sets
Model / Data Set   PreGreenspan   Greenspan    Low     Medium     High
Random Walk            0.676        0.034     0.122    0.271     0.574
Taylor                10.036        8.392     6.651    9.701    16.754
Taylor2                6.793        3.001     0.985    2.221     1.263
Econometric            0.657        0.030     0.124    0.262     0.613
Neural Network         1.121        0.129     0.104    0.269     0.372
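The figures above are mean squared errors on the validation observations: the average squared gap between each model's predicted rate and the actual Fed Funds rate. With illustrative numbers (not the paper's data):

```python
import numpy as np

# Actual and model-predicted Fed Funds rates on a small validation set.
actual = np.array([5.0, 5.25, 5.5, 5.75])
predicted = np.array([5.1, 5.2, 5.6, 5.6])

mse = np.mean((actual - predicted) ** 2)   # mean squared error
```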