A Taylor Rule with Monthly Data
A.G. Malliaris and Mary E. Malliaris, Loyola University Chicago

Fed Funds (chart)

Unemployment Rate (chart)

CPI, All Items: 12-month logarithmic change rate, Jan 1957 to Nov 2005 (chart)

CPI, All Items (chart)

Standard Approaches
Random Walk: r_t = α + β·r_{t-1} + ε
Taylor Model: r_t = α + β_1·(CPI − 2) + β_2·(Un − 4) + ε
Econometric Model: r_t = α + β_1·r_{t-1} + β_2·(CPI − 2) + β_3·(Un − 4) + ε
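
As a concrete illustration, the three benchmarks can be estimated by ordinary least squares. This is a minimal sketch, assuming a pandas DataFrame `df` with monthly columns `fedfunds`, `cpi` (12-month CPI change), and `unrate`; the column names are illustrative, not from the original study.

```python
import pandas as pd
import statsmodels.formula.api as smf

# df: monthly DataFrame loaded elsewhere, e.g. via pd.read_csv(...)
df["r_lag"] = df["fedfunds"].shift(1)   # r_{t-1}
df["adj_cpi"] = df["cpi"] - 2           # CPI minus 2
df["gap"] = df["unrate"] - 4            # unemployment rate minus 4
df = df.dropna()

random_walk = smf.ols("fedfunds ~ r_lag", data=df).fit()
taylor = smf.ols("fedfunds ~ adj_cpi + gap", data=df).fit()
econometric = smf.ols("fedfunds ~ r_lag + adj_cpi + gap", data=df).fit()
print(econometric.params)               # intercept and slope estimates
```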

Neural Network Architecture
Input, hidden, and output layers; each connection carries a weight (w_1 through w_21 in the diagram), and each hidden or output node applies a sigmoid to the weighted sum of its inputs: F(Σ inputs × weights) = node output.
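
A toy sketch of a single node's computation under this architecture; the inputs and weights below are made-up numbers, not trained values from the study.

```python
import math

def node_output(inputs, weights, bias=0.0):
    """Sigmoid of the weighted input sum: F(sum of inputs*weights)."""
    s = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-s))

# e.g. a hidden node fed by r_{t-1}, adjusted CPI, and the unemployment gap
print(node_output([5.2, 1.1, 0.8], [0.4, -0.2, 0.1]))  # about 0.87
```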

Network Process
The neural network adjusts the weights and recalculates the total error. This process continues until some specified stopping point is reached: an error threshold, a training-time limit, or a maximum number of weight changes. The final network is the one with the lowest error among the sets of weights tried during training.
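
A minimal sketch of this loop on a toy one-weight model, with plain gradient descent standing in for the network's weight-update rule; all numbers are illustrative.

```python
# (input, target) pairs; roughly y = 2x
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]

def total_error(w):
    return sum((w * x - y) ** 2 for x, y in data)

w, best_w, best_err = 0.0, 0.0, float("inf")
for step in range(1000):                              # cap on weight changes
    grad = sum(2 * (w * x - y) * x for x, y in data)  # dE/dw
    w -= 0.01 * grad                                  # adjust the weight
    err = total_error(w)
    if err < best_err:                                # keep the lowest-error weights
        best_w, best_err = w, err
    if err < 1e-3:                                    # error-threshold stopping point
        break
print(best_w, best_err)
```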

Variable Designations
r_t: the Fed Funds rate at time t (the dependent variable)
CPI_{t-1}: the Consumer Price Index at time t−1
Adjusted CPI_{t-1}: CPI minus 2 at time t−1
Un_{t-1}: the Unemployment Rate at time t−1
Gap_{t-1}: the Unemployment Rate minus 4 at time t−1

Variables Per Model
Model          r_{t-1}   CPI_{t-1}   Gap_{t-1}
Random Walk    X
Taylor                   X           X
Econometric    X         X           X
Neural Net     X         X           X

Data Sets
Data Set       Range                 Training   Validation   Total
PreGreenspan   Jan 58 to Jul 87      …          …            …
Greenspan      Aug 87 to Nov 05      …          …            …
Low            r_{t-1}: 0 to 5.00    …          …            …
Medium         r_{t-1}: 5.01 to …    …          …            …
High           r_{t-1}: over …       …          …            …
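
A sketch of how these splits might be coded, reusing `df` and the `r_lag` column from the earlier snippet and assuming a monthly DatetimeIndex; the medium/high boundary below is a placeholder, since the transcript does not preserve the actual cutoff.

```python
HIGH_CUTOFF = 10.0   # placeholder; the slide's actual medium/high boundary is not given above

pre_greenspan = df.loc["1958-01":"1987-07"]
greenspan = df.loc["1987-08":"2005-11"]

low = df[df["r_lag"] <= 5.00]
medium = df[(df["r_lag"] >= 5.01) & (df["r_lag"] <= HIGH_CUTOFF)]
high = df[df["r_lag"] > HIGH_CUTOFF]
```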

Random Walk
Data Set       Intercept   Coefficient of r_{t-1}
PreGreenspan   …           …
Greenspan      …           …
High           …           …
Medium         …           …
Low            …           …

Taylor Equation
Original equation: r_t = 1 + 1.5·CPI + 0.5·Gap
Calculated equation, estimated by data set:
Data Set       Intercept   CPI   Gap
PreGreenspan   …           …     …
Greenspan      …           …     …
High           …           …     …
Medium         …           …     …
Low            …           …     …
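
A worked example of the original rule with made-up inputs: 3 percent inflation and 5 percent unemployment (so Gap = 1).

```python
cpi = 3.0               # 12-month CPI inflation, percent
gap = 5.0 - 4.0         # unemployment rate minus 4
r = 1.0 + 1.5 * cpi + 0.5 * gap
print(r)                # 1 + 4.5 + 0.5 = 6.0
```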

Econometric Model
Data Set       Intercept   Fed Funds   Adj. CPI   Gap
PreGreenspan   …           …           …          …
Greenspan      …           …           …          …
High           …           …           …          …
Medium         …           …           …          …
Low            …           …           …          …

Neural Networks: Significance of Variables
Data Set       Significant Variables
PreGreenspan   Fed Funds
Greenspan      CPI, UnRate
Low            CPI, UnRate
Medium         CPI, UnRate
High           Fed Funds

Mean Squared Error Comparisons on Validation Sets
Model / Data Set      PreGreenspan   Greenspan   Low   Medium   High
Random Walk           …              …           …     …        …
Taylor (original)     …              …           …     …        …
Taylor (calculated)   …              …           …     …        …
Econometric           …              …           …     …        …
Neural Network        …              …           …     …        …
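
For reference, the comparison metric itself is straightforward; a small sketch with illustrative numbers, not the study's actual validation values.

```python
def mse(actual, predicted):
    """Mean squared error over a validation set."""
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

actual = [5.25, 5.50, 5.75]      # observed Fed Funds rates (illustrative)
predicted = [5.30, 5.45, 5.80]   # one model's validation predictions
print(mse(actual, predicted))    # 0.0025
```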

Summary
Several approaches to modeling the Fed Funds rate were compared.
The econometric approach performs best when applied to the PreGreenspan and Greenspan samples.
The neural network performs best when the sample is divided into low, medium, and high rate regimes.