Back to square one: Identification issues in DSGE models


Back to square one: Identification issues in DSGE models
Fabio Canova, Luca Sala
Presented by Marc Goñi – 19th April

Motivation
In recent years, DSGE models have evolved considerably, with the aim of better forecasting and of deriving policy implications:
- developments in the specification of DSGE models;
- models compared with the data through their ability to match conditional dynamics in response to structural shocks.
However, this inference depends crucially on identification, which has been largely ignored.
This paper investigates identifiability issues in DSGE models within the class of minimum distance estimators.

Literature review
- Choi and Phillips (1992), Stock and Wright (2000), Rosen (2006), Kleibergen and Mavroeidis (2008)
- Beyer and Farmer (2004), Moon and Schorfheide (2007)
- Christiano et al. (2006), Fernandez-Villaverde et al. (2007), Chari et al. (2008)

Outline
1. Generics of identification
2. Population identification: Christiano et al. (2005), Smets and Wouters (2003)
3. Sample identification
4. Dealing with identification

Generics of identification
Identification is the ability to draw inference about the parameters of the model from the data.
Identification requires the objective function to have:
- a unique extremum at the true parameter value;
- sufficient curvature in all the relevant dimensions.
The mapping from structural parameters to the objective function is usually non-linear or has no closed-form solution.

Problems
- Under-identification: the objective function is independent of certain structural parameters.
- Partial identification: the parameters enter the objective function only proportionally and cannot be analyzed separately.
- Weak identification: the objective function does not have enough curvature in all the relevant dimensions.
These problems can induce observational equivalence, i.e. different models with different theoretical implications become indistinguishable.

Source of the problems
1. Location of the true parameters
2. Choice of the objective function
Consider the optimality conditions of a DSGE model. The unique stable RE solution, and its state-space representation, take the generic forms sketched below.
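The equations shown on the slide are not reproduced in the transcript. A minimal sketch of the generic forms, in notation assumed here rather than taken from the paper:

$$A_0(\theta)\, E_t x_{t+1} + A_1(\theta)\, x_t + A_2(\theta)\, x_{t-1} + A_3(\theta)\, e_t = 0$$

Unique stable RE solution:

$$x_t = J(\theta)\, x_{t-1} + K(\theta)\, e_t$$

State-space representation, with $H$ selecting the observables $y_t$:

$$x_t = J(\theta)\, x_{t-1} + K(\theta)\, e_t, \qquad y_t = H x_t$$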

2. Choice of the objective function
The likelihood function provides a natural upper bound on the identification information available in the data. Using the Kalman filter and assuming normally distributed errors, the likelihood can be computed recursively, giving an identification upper bound that can be compared with a minimum distance objective function (both sketched below).
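A sketch in assumed generic notation (the slide's own equations are not reproduced): with Gaussian errors the Kalman filter delivers prediction errors $v_t(\theta)$ and their covariances $F_t(\theta)$, so

$$\log L(\theta) = -\frac{1}{2}\sum_{t=1}^{T}\Big[\log\lvert F_t(\theta)\rvert + v_t(\theta)'\, F_t(\theta)^{-1} v_t(\theta)\Big] + \text{const},$$

whereas a minimum distance objective uses only part of this information,

$$Q(\theta) = \big(\hat g - g(\theta)\big)'\, W\, \big(\hat g - g(\theta)\big),$$

where $g(\theta)$ stacks, for example, model impulse responses and $\hat g$ their data-based counterparts.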

3. Mapping of structural parameters into the sample objective function
- Solution mapping: links the structural parameters to the coefficients of the solution. Parameters may disappear from the solution or lack independent variability.
- Moment mapping: links the coefficients of the solution to the function of interest. Selecting a particular impulse response may poorly identify the coefficients.
- Objective function mapping: links the function of interest to the population objective function. The function may not have a unique minimum or may not display enough curvature.
- Data mapping: links the population objective function to the sample objective function. Estimated VAR responses may not reflect the population ones.
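Schematically, in the notation assumed above, the chain of mappings is

$$\theta \;\xrightarrow{\text{solution}}\; \big(J(\theta), K(\theta)\big) \;\xrightarrow{\text{moment}}\; g(\theta) \;\xrightarrow{\text{objective}}\; Q(\theta) \;\xrightarrow{\text{data}}\; Q_T(\theta),$$

and an identification failure at any link propagates to the sample objective function $Q_T(\theta)$.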

A simple example: solution mapping
Parameters a1, a3 and a5 disappear from the solution.

A simple example: moment mapping
The impulse responses take the form shown on the slide (not reproduced in the transcript). Even if we pick responses to all shocks, some parameters remain under-identified: a2 and a4 enter the response to e3 only jointly.

A simple example: objective mapping
Choosing a minimum distance (MD) objective function leads to weak identification problems.

Solutions
1. Calibration
Calibrate some of the parameters based on micro evidence.
Problem: if the calibrated parameters are only partially identified, small differences in calibration may shift the estimates.

2. Bayesian methods
Estimate the structural parameters with Bayesian techniques.
Identification problems can in principle be detected by setting a more diffuse prior and checking whether the posterior also becomes more diffuse; however, if the parameter space is not variation free, this outcome can be driven by the restrictions.
Bayesian methods combined with a tight prior produce well-behaved posteriors even when the objective function behaves poorly.

3. Serially correlated disturbances
Allowing for serially correlated disturbances maintains the forward-looking coefficients in the solution.
However, separating internal and external propagation parameters may be difficult.

Population identification
Even when the true model is known, identification problems may make inference infeasible.
Consider a standard DSGE model (Christiano et al., Dedola and Neri, Smets and Wouters). The analytical mapping between the structural parameters and the objective function is no longer available; instead, examine the slope of the distance function in a neighborhood of the true parameters.

Model
(The model equations shown on the slide are not reproduced in the transcript.)

Preliminary identification evidence
For each parameter, compute the elasticity of the distance function with respect to that parameter, varying it while holding the remaining parameters fixed at their true values.
Preliminary evidence of weak identification: although the distance functions have a unique minimum at the true parameter, variation within a neighborhood of it is small.
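A minimal numerical sketch of this one-parameter-at-a-time exercise; `distance` and `theta_true` are placeholders for a model-specific objective function and the true parameter vector, not the paper's code.

```python
import numpy as np

def curvature_check(distance, theta_true, eps=0.01):
    """Change in the distance function when each parameter is perturbed by a
    small fraction eps, one at a time, with the others held at their true
    values.  Entries close to zero flag flat, weakly identifying dimensions."""
    theta_true = np.asarray(theta_true, dtype=float)
    base = distance(theta_true)            # roughly zero at the true parameter
    changes = np.zeros_like(theta_true)
    for i in range(theta_true.size):
        up, down = theta_true.copy(), theta_true.copy()
        up[i] *= 1.0 + eps
        down[i] *= 1.0 - eps
        changes[i] = 0.5 * (distance(up) + distance(down)) - base
    return changes
```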

Size of identification problems
Check how severe weak identification actually is, and whether it can yield observational equivalence.
1. Construct the distribution of the distance function: randomly draw 100,000 parameter vectors from the ranges used before and compute the distance between their impulse responses and 5 benchmark impulse responses. True models: the benchmark model with monetary shocks; the benchmark with either price stickiness, wage stickiness or indexation removed; and the benchmark with monetary and technology shocks.
2. Pick the draws in the 0.1 percentile of the distribution.
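A sketch of this Monte Carlo exercise; `model_irf`, `draw_theta` and `theta_benchmark` are placeholder names for the model's impulse-response function, the sampler over the assumed parameter ranges, and one benchmark parameterization.

```python
import numpy as np

def distance_distribution(model_irf, theta_benchmark, draw_theta,
                          n_draws=100_000, pct=0.1):
    """Distribution of the IRF distance to a benchmark model, and the draws
    falling in its lowest pct percentile (0.1 by default)."""
    irf_bench = model_irf(theta_benchmark)
    draws, dists = [], []
    for _ in range(n_draws):
        theta = draw_theta()                            # one random parameter vector
        dists.append(np.sum((model_irf(theta) - irf_bench) ** 2))
        draws.append(theta)
    dists = np.asarray(dists)
    cutoff = np.percentile(dists, pct)                  # 0.1 percentile cutoff
    closest = [t for t, d in zip(draws, dists) if d <= cutoff]
    return dists, closest
```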

- Case 1: large intervals make it difficult to infer how important price stickiness, wage stickiness and indexation are.
- Cases 1-4: since the intervals are similar, it is difficult to infer which friction matters (observational equivalence).
- Case 5: the same results hold when more shocks are added.
So weak and partial identification problems are severe and may induce observational equivalence.

Remark
These poorly identified DSGE models may be good for forecasting but are not optimal for policy inference.
In the presence of such weak and partial identification problems, one needs to bring in information external to the model's dynamics in order to interpret the estimates.

Sample problems
What are the effects of these population identification issues when the analysis has to be conducted with sample data rather than population moments?
- Simulate 500 time series from the true model.
- Estimate a 6-variable VAR with 6 lags, identify monetary shocks and construct the data-based impulse responses (avoiding non-invertibility, so the shocks are correctly identified).
- Estimate the structural parameters by minimizing the distance to the VAR-based impulse responses (a sketch of this step follows).
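A minimal sketch of the impulse-response-matching step, assuming placeholder names `model_irf` (model responses as a function of theta), `irf_hat` (stacked VAR-based responses), a weighting matrix `W` and a starting value `theta0`; none of these come from the paper.

```python
import numpy as np
from scipy.optimize import minimize

def irf_match(model_irf, irf_hat, W, theta0):
    """Minimum distance estimation by impulse-response matching: choose theta
    so that the model responses are as close as possible to the VAR-based
    ones in the metric defined by W."""
    def objective(theta):
        gap = model_irf(theta) - irf_hat   # stacked IRF discrepancy
        return float(gap @ W @ gap)
    result = minimize(objective, theta0, method="Nelder-Mead")
    return result.x, result.fun
```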

Results
- Mean estimates do not depend on the sample size.
- Standard errors and biases decrease with the sample size but remain large.
- The distributions of the parameter estimates are bimodal, with peaks at the boundaries.
That is, standard asymptotic approximations do not seem reliable.

Results (continued)
- Model-based responses fall within the confidence bands of the VAR-based responses.
- Yet mean estimates are statistically and economically different from the true ones (observational equivalence).
That is, the technique of plotting model responses inside VAR-based confidence bands may lead to wrong inference in the presence of population identification problems.

Alternatives
- Under certain conditions, asymptotic methods for computing estimates are robust to identification problems (Stock and Wright; Kleibergen and Mavroeidis).
- Rosen's (2006) methodology yields similar results here.

Diagnosing identification: theory
Consider the mapping from the structural parameters to the sample objective function and take a first-order Taylor expansion around the true parameters. To translate information from the objective function back to the parameters, the Jacobian of this mapping needs to be invertible:
- if its rank is not full: under-identification;
- if its eigenvalues are small: weak and partial identification.
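A sketch in assumed generic notation: write the function of interest as a composition $g(\theta) = m(s(\theta))$ of the solution mapping $s$ and the moment mapping $m$, and let

$$G = \frac{\partial s(\theta)}{\partial \theta'}\bigg|_{\theta_0}, \qquad F = \frac{\partial m(s)}{\partial s'}\bigg|_{s(\theta_0)}.$$

Then, to first order,

$$g(\theta) - g(\theta_0) \approx F G\,(\theta - \theta_0), \qquad Q(\theta) \approx (\theta - \theta_0)'\, G'F'WFG\,(\theta - \theta_0),$$

so recovering $\theta$ requires $G'F'WFG$ (with $W = I$, the matrix $G'F'FG$ used in the application below) to have full rank and eigenvalues that are not too small.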

2. Split the problem in two
First find the θ that minimizes the distance between the data-based and the model-based reduced-form (VAR) parameters (solution mapping); then find the reduced-form parameters that bring the impulse responses close (moment mapping).
Check the rank and the eigenvalues of the corresponding Jacobians, G and F.

3. Split the problem in three
When only one estimate is available, the problem can be split into a solution mapping, a moment mapping and a data mapping.
Compute F and G by calibrating θ and then performing a sensitivity analysis.
Compare the fixed and the estimated parameters for identifiability issues (as before).

Practical issues
Methods to test for the rank and the size of the eigenvalues:
- Anderson (1984): estimates of the eigenvalues have an asymptotic normal distribution, so test whether the smallest eigenvalue differs from 0. The test can be normalized by using the ratio of the sum of the smallest eigenvalues to the sum of all eigenvalues (a numerical sketch follows the list).
- Concentration statistic: measures the curvature of the objective function around θ0. For large values of the statistic, accept that the objective function has an optimum at θ0.
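A minimal sketch of the normalized eigenvalue diagnostic, assuming a numerically computed Jacobian `G` of the mapping being checked (a placeholder, not the paper's code).

```python
import numpy as np

def eigenvalue_ratio(G, k=1):
    """Share of the trace of G'G accounted for by the k smallest eigenvalues.
    Values near zero for several eigenvalues point to rank deficiency or weak
    identification in the corresponding directions."""
    eig = np.sort(np.linalg.eigvalsh(G.T @ G))   # eigenvalues of symmetric G'G, ascending
    return eig[:k].sum() / eig.sum()
```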

Standard errors are not useful
- Relatively small standard errors can coexist with identification issues.
- Identification analysis should precede estimation.
- Standard errors do not reveal whether the problem is model-based or data-based.

An application
Applying these methods to our example:
- The G'G matrix has one eigenvalue representing 99.9% of the trace.
- In G'F'FG, the smallest 13 eigenvalues account for 0.001% of the trace.
- Thus the solution mapping (G'G) is the source of the problem.
- Gs'Gs (mapping the structural coefficients into the law-of-motion coefficients) has one eigenvalue representing 99.9% of the trace.
- Thus the source of the identification problem is the insensitivity of the law-of-motion coefficients to the structural parameters.

Solutions
- Model re-specification: reparametrize commonly used functional forms.
- Choose different pivotal points around which to log-linearize.
- Use higher-order approximations.

Conclusion
- Identification problems have been ignored for a long time, with consequences for policy recommendations.
- The tools presented here should be applied before structural estimation; Monte Carlo methods may help.
- When choosing the objective function, choose the most informative one.
- Bringing in more data only helps if the problem is data-based.
- Robust methods exist, but they only deliver interval estimates.