
1 Estimating Macroeconomic Uncertainty Using Information Measures from SPF Density Forecasts
July 2017
Kajal Lahiri and Wuwei Wang, State University of New York at Albany

2 Motivation and Background
Uncertainty is important to policy makers and market participants alike. Many studies now document the real effects of uncertainty on business planning and other economic activities (Bloom et al. 2007, Bloom 2009, 2014, Baker et al. 2013, Shin et al. 2016, Carriero et al. 2016, Rossi et al. 2016, Mumtaz et al. 2016, Piffer et al.). Thus, in an ex ante sense, uncertainty in forecasts is a variable of interest, and it has often been proxied by the variance of point forecasts. Other approaches include Jurado et al.'s (2015) uncertainty measure extracted from a large set of economic variables and Baker et al.'s (2013) economic policy uncertainty index, built from real-time media news counts and stock market volatility.

3 Motivation and Background
US-SPF data are of great help in estimating benchmark uncertainty directly from the densities. Many economists have utilized the density forecasts in the SPF, starting with Zarnowitz and Lambros (1987), Lahiri, Teigland and Zaporowski (1988), Lahiri and Liu (2007), Boero, Smith and Wallis (2008, 2013), and others. Different approaches have been used to handle the density histograms: probability mass lumped at the midpoint of each interval (D'Amico et al. 2008, Shoja et al. 2015), a uniform distribution over each interval (Abel et al. 2015), a normal distribution (Giordani & Soderlind 2003, Boero et al. 2013), and a generalized beta distribution (Engelberg et al. 2006).

4 Motivation and Background
Aggregate economy-wide uncertainty from density forecasts has been measured by:
Simple averages of individual variances
Interquartile ranges of individual variances (Abel et al. 2015)
Variances of pooled average distributions (Boero et al. 2013)
Scoring methods (Rossi et al. 2015)
Regressions on measures derived from densities (D'Amico et al. 2008)
Entropy and information measures (Lahiri et al. 2005, Rich et al. 2010, Shoja et al. 2015)

5 Motivation and Background
Several of the uncertainty measures above are formally related. Lahiri et al. (1988) first proposed a decomposition of the variance of the consensus distribution, with total variation split into within variation and between variation. Boero et al. (2013) use a similar identity. Rich et al. (2010) identify disagreement as a component of overall uncertainty. Shoja and Soofi (2015) decompose overall entropy into average entropy and a Kullback-Leibler information measure.

6 Data Density forecasts from the Survey of Professional Forecasters (SPF) conducted by the Philadelphia Fed, including output forecasts from 1981 to 2016 and inflation forecasts from 1968 onward. These forecasts have fixed targets and horizons that shorten as the end of the target year approaches. There are on average 36 forecasters per quarter. We carefully fit generalized beta distributions or symmetric triangular distributions to the density histograms, similar to Engelberg et al. (2006). These distributions match the histograms better than normal or discrete distributions.

7 Uncertainty We measure uncertainty from the individual density forecasts and also from the aggregate density forecast (aggregated across forecasters); the difference between the two is disagreement. These three measures are obtained by applying information-theoretic tools, namely entropy and information measures (information divergence). To allow comparison with conventional methods, we first construct the three series using the variance approach.

8 Uncertainty and its components using variances
We use the variance of the aggregate distribution as a proxy for overall uncertainty. Assume forecaster $i$ ($i = 1, \dots, n$) reports a probability forecast in the form of a histogram $p_i$ (a $k \times 1$ vector), and a continuous distribution $f_i$ with mean $\mu_i$ and variance $\sigma_i^2$ is fitted to it. The aggregate distribution is the equal-weight mixture $f_c = \frac{1}{n}\sum_{i=1}^{n} f_i$, and disagreement in forecast means is the variance of the means, $\frac{1}{n}\sum_{i=1}^{n}(\mu_i - \bar{\mu})^2$ with $\bar{\mu} = \frac{1}{n}\sum_{i=1}^{n}\mu_i$. It can be proved that

$$\operatorname{Var}(f_c) \;=\; \underbrace{\frac{1}{n}\sum_{i=1}^{n}(\mu_i - \bar{\mu})^2}_{\text{disagreement in means}} \;+\; \underbrace{\frac{1}{n}\sum_{i=1}^{n}\sigma_i^2}_{\text{average individual variance}}.$$
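As a quick numerical check (a minimal sketch, not the authors' code; all values are made up), the mixture-variance identity can be verified from the fitted means and variances alone:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical fitted means and variances for n forecasters
n = 36
mu = rng.normal(2.0, 0.5, size=n)    # individual density means
var = rng.uniform(0.2, 1.0, size=n)  # individual density variances

# Components of the decomposition
disagreement = np.mean((mu - mu.mean()) ** 2)  # variance of the means
avg_variance = var.mean()                      # average individual variance

# Variance of the equal-weight mixture from its first two moments:
# E[X^2] under the mixture is the average of (sigma_i^2 + mu_i^2)
mix_second_moment = np.mean(var + mu ** 2)
mix_variance = mix_second_moment - mu.mean() ** 2

assert np.isclose(mix_variance, disagreement + avg_variance)
print(mix_variance, disagreement + avg_variance)
```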

9 Differing bin sizes Density forecasts from 1981 to 1991 have a bin size of 2 percentage points; in the other years the bin size is only 1 point. Bin size affects the variance measures: if we merge bins to convert 1-point-bin forecasts into 2-point bins, the measured variance becomes larger, especially in recent low-uncertainty years.

10 Differing bin sizes We illustrate this point with 2-quarter-ahead output forecasts.

11 Differing bin sizes To make the variance measures comparable, we convert the variances from the period when the bin length was 2 points to a comparable level using an OLS regression. We regress the variances of the 1-point-bin cases on the corresponding variances calculated assuming a bin length of 2 points: Aggregate var (1-point bin) = a + b · Aggregate var (2-point bin), with R² = 0.89 (the estimated coefficients are not legible in the transcript). We will now present the variance decompositions for forecasts at different horizons.
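A minimal sketch of this correction step, with hypothetical variance series (statsmodels OLS, not the authors' code):

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical variance series computed two ways on the same quarters:
# once from the original 1-point bins, once after merging into 2-point bins.
var_1pt = np.array([0.45, 0.52, 0.61, 0.38, 0.70, 0.55])
var_2pt = np.array([0.60, 0.66, 0.78, 0.52, 0.85, 0.71])

# OLS of 1-point-bin variance on 2-point-bin variance
X = sm.add_constant(var_2pt)
fit = sm.OLS(var_1pt, X).fit()
print(fit.params, fit.rsquared)

# Project comparable 1-point-bin variances for 1981-1991,
# when only 2-point-bin variances are observable.
var_2pt_early = np.array([1.9, 2.3, 1.7])
var_1pt_hat = fit.params[0] + fit.params[1] * var_2pt_early
print(var_1pt_hat)
```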

12 Uncertainty and its components using variances
[Figure: output forecasts, showing the variance of the aggregate distribution, the average variance, and disagreement on density means.]

13 Uncertainty and its components using variances
[Figure: inflation forecasts, showing the variance of the aggregate distribution, the average variance, and disagreement on density means.]

14 Uncertainty and its components using variances
The longer the horizon, the higher the uncertainty. For output forecasts, uncertainty was modestly high in the 1980s; for inflation forecasts, both the 1970s and the 1980s witnessed very high uncertainty. For both variables, uncertainty has fallen significantly since the Great Moderation, though it increased modestly during the recent financial crisis. Disagreement is much more volatile than average variance and contributes large spikes to total variance in high-uncertainty quarters; thus, the relation between disagreement and uncertainty is far from perfect.

15 Uncertainty and its components using entropy
Recently, information theory has been utilized to analyze uncertainty. Shannon entropy measures the dispersion of a distribution, and thus the overall uncertainty of the distribution. For a density $f$, the (differential) Shannon entropy is

$$H(f) = -\int f(x)\,\log f(x)\,dx.$$

It measures how close a distribution is to an infinitely dispersed one, and in that sense it is similar to variance.
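For instance, once a generalized beta distribution has been fitted to a histogram, its differential entropy is directly computable; a sketch using scipy (the shape and support parameters below are hypothetical):

```python
from scipy import stats

# Hypothetical generalized beta fitted to one forecaster's histogram:
# shape parameters a, b on the support [loc, loc + scale]
fitted = stats.beta(a=2.5, b=3.5, loc=-1.0, scale=6.0)

print(fitted.entropy())   # differential entropy (natural log)
print(fitted.std() ** 2)  # variance of the same fitted density
```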

16 Uncertainty and its components using entropy
In the information-theory literature there are several information measures that capture disagreement between distributions. The Kullback-Leibler divergence between densities $f$ and $g$ is

$$KL(f \,\|\, g) = \int f(x)\,\log\frac{f(x)}{g(x)}\,dx.$$

The Jensen-Shannon divergence among $n$ densities $f_1, \dots, f_n$ with equal weights is

$$JS(f_1, \dots, f_n) = H(f_c) - \frac{1}{n}\sum_{i=1}^{n} H(f_i), \quad \text{where } f_c = \frac{1}{n}\sum_{i=1}^{n} f_i.$$

Information divergence is analogous to disagreement.

17 Uncertainty and its components using entropy
Analogous to the variance decomposition, the entropy of the aggregate distribution has two components, one of which is disagreement. It can be shown that

$$H(f_c) = \underbrace{\frac{1}{n}\sum_{i=1}^{n} H(f_i)}_{\text{average individual entropy}} + \underbrace{JS(f_1, \dots, f_n)}_{\text{information measure}}.$$
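The identity is easy to verify numerically. In the sketch below (a hypothetical example, not the authors' code), the Jensen-Shannon divergence is computed independently as the average Kullback-Leibler divergence of each density from the mixture, and the two sides of the identity agree:

```python
import numpy as np
from scipy import stats

# Hypothetical fitted densities for three forecasters
dens = [stats.norm(1.8, 0.6), stats.norm(2.2, 0.8), stats.norm(2.5, 0.5)]
x = np.linspace(-4, 9, 20001)
eps = 1e-300

pdfs = np.array([d.pdf(x) for d in dens])
f_c = pdfs.mean(axis=0)  # equal-weight aggregate (mixture) density

def H(p):
    # differential entropy by numerical integration: -integral of p log p
    return -np.trapz(p * np.log(np.maximum(p, eps)), x)

# Jensen-Shannon divergence as average KL of each density to the mixture
js = np.mean([np.trapz(p * np.log(np.maximum(p, eps) / np.maximum(f_c, eps)), x)
              for p in pdfs])

avg_H = np.mean([H(p) for p in pdfs])
print(H(f_c), avg_H + js)  # the two sides of the identity agree numerically
```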

18 Uncertainty and its components using entropy
The similarity of entropy and variance shows up clearly in our data: individual entropy is highly correlated with variance. For normal distributions, entropy is linearly related to the log of the variance: a normal distribution with variance $\sigma^2$ has entropy

$$H = \tfrac{1}{2}\log\!\left(2\pi e \sigma^2\right).$$

For a beta distribution with shape parameters $\alpha$ and $\beta$,

$$H = \log B(\alpha, \beta) - (\alpha - 1)\psi(\alpha) - (\beta - 1)\psi(\beta) + (\alpha + \beta - 2)\psi(\alpha + \beta),$$

where $B(\cdot,\cdot)$ is the beta function and $\psi(\cdot)$ is the digamma function.

19 Entropy vs. variance Individual entropy is linearly related to the log of variance. In the right-hand graph, the points off the straight-line frontier come from non-normal distributions.

20 Problems with discrete histograms
We also compute entropy treating the densities as discrete distributions, as in Rich et al. (2010) and Shoja et al. (2015). We find that entropy values computed this way do not match the entropy values from properly fitted distributions for many individual observations; the discrete-distribution method underestimates entropy. We conclude that fitting continuous distributions is more reasonable than treating the histograms as discrete distributions.

21 Problems with discrete histograms
With discrete distributions, entropy is affected by the number of intervals but not by the length of the bins, which is counterintuitive. In many cases, the entropy values from similar histograms deviate from each other even after multiplying by the correction term 1/log(m) (where m is the number of bins) that Shoja et al. used. The sketch below illustrates the problem.
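A toy illustration with made-up bin probabilities: the discrete entropy of the bin probabilities is unchanged when every bin is doubled in width, while the differential entropy of the matching piecewise-uniform density correctly shifts by log 2:

```python
import numpy as np

p = np.array([0.1, 0.2, 0.4, 0.2, 0.1])  # bin probabilities of a histogram

def discrete_entropy(p):
    # -sum p log p: ignores bin width entirely
    return -np.sum(p * np.log(p))

def uniform_fit_entropy(p, width):
    # differential entropy of the piecewise-uniform density over the bins
    dens = p / width
    return -np.sum(p * np.log(dens))

print(discrete_entropy(p))                # same whatever the bin width
print(uniform_fit_entropy(p, width=1.0))  # 1-point bins
print(uniform_fit_entropy(p, width=2.0))  # 2-point bins: larger by log(2)
```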

22 Entropy decomposition
Given the above problems, we stick with entropy measures calculated from continuous distributions. We then decompose the aggregate entropy into two components. The aggregate entropy is $H(f_c)$, where $f_c = \frac{1}{n}\sum_{i=1}^{n} f_i$. The first component is the average individual entropy, $\frac{1}{n}\sum_{i=1}^{n} H(f_i)$. The other component of overall uncertainty, which captures the disagreement part, is the information divergence; we use the Jensen-Shannon divergence.

23 Entropy approach - correction for bin size
Similar to the variance approach, entropy and related measures are affected by bin size as shown below.

24 Correction for bin size
A change in bin length has a large effect on aggregate entropy. We regress entropy from 1-point bins on entropy assuming 2-point bins, and use the result to project what entropy would have been with 1-point bins from 1981 to 1991, when the bin size was actually 2 points: H(1-point) = a + b · H(2-point), with R² = 0.80 (the estimated coefficients are not legible in the transcript). We use the same method to correct the average entropy series for bin size as well.

25 Entropy decomposition
[Figure: output forecasts, showing the entropy of the aggregate distribution, the average entropy, and the information measure.]

26 Entropy decomposition
[Figure: inflation forecasts, showing the entropy of the aggregate distribution, the average entropy, and the information measure.]

27 Entropy decomposition
The longer the horizon, the higher the uncertainty: aggregate entropy decreases as the horizon shortens, but the effect is small for horizons of 4 quarters or longer. For inflation forecasts, uncertainty was high in the 1970s, with high disagreement; in recent quarters there has been a significant downward movement. For both output and inflation forecasts, uncertainty was high in the 1980s due to high average entropy, not due to higher disagreement. Compared with the variance approach, disagreement seldom contributes as much as 50% of total entropy, and the spikes are less sharp.

28 Macroeconomic Uncertainty
We next net out the horizon effects and plot the Q3-equivalent uncertainty of output and inflation forecasts, measured by entropy and by variance respectively.

29 Variances vs. entropy Entropy-based uncertainty is a lot less variable than variance-based uncertainty. The temporal properties are also somewhat different.

30 Macroeconomic Uncertainty
[Figure: comparison with other work (all series standardized): Jurado, Ludvigson and Ng (2015); Baker, Bloom and Davis economic policy uncertainty; entropy of output forecasts.]

31 Macroeconomic Uncertainty
[Figure: comparison with other work (all series standardized): Jurado, Ludvigson and Ng (2015); Baker, Bloom and Davis economic policy uncertainty; entropy of inflation forecasts.]

32 Macroeconomic Uncertainty
Our output uncertainty measure and the JLN and BBD measures all capture the fall in uncertainty during the Great Moderation era, but the measures have diverged since the financial crisis: they suggest different magnitudes and persistence of uncertainty during the crisis and afterwards. Our measures point to an apparent and permanent fall in uncertainty after 1990. Our inflation uncertainty has reached historical lows in recent years, possibly because of the persistent zero-interest-rate environment since 2009.

33 Estimating common shocks
Following Lahiri and Sheng (2010), we can decompose the individual forecast error as $e_{it} = \lambda_t + \varepsilon_{it}$, where $\lambda_t$ is the common shock and $\varepsilon_{it}$ is forecaster $i$'s idiosyncratic adjustment to the common shock. It can be proved by information theory that, under some simplifying assumptions,

Common shock = average entropy − average information measure.

This is a further link missed by most studies. As in Lahiri and Sheng (2010), it shows that average uncertainty itself contains some disagreement.
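Taken literally, this identity lets one back out the common-shock component from series already computed above; a trivial sketch with made-up numbers (not the authors' code):

```python
import numpy as np

# Hypothetical per-forecaster quantities for one quarter and one horizon
individual_entropy = np.array([1.10, 1.25, 0.95, 1.30])  # H(f_i)
individual_info = np.array([0.12, 0.20, 0.08, 0.15])     # information measures

# Common shock = average entropy - average information measure
# (the slide's identity, under its simplifying assumptions)
common_shock = individual_entropy.mean() - individual_info.mean()
idiosyncratic_share = individual_info.mean() / individual_entropy.mean()
print(common_shock, idiosyncratic_share)
```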

34 Estimating common shocks
We plot the average entropy and the common shock for horizon-2 inflation forecasts. Common shocks account for the bulk of individual entropy, so disagreement is only a small component here. We also observe that common shocks display strong horizon effects: the longer the horizon, the larger the common shock.

35 Application of informatics–estimating news
We propose a further extension of the application of information measures: the Jensen-Shannon information measure can be applied to an individual's successive forecasts to estimate "news". News is calculated as the Jensen-Shannon information divergence between $f_{i,t,h}$ and $f_{i,t-1,h+1}$, i.e., between the same forecaster's current density and the previous quarter's density for the same target. We find that news is countercyclical: it is higher in recession quarters.
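A sketch of this news measure for one forecaster, using the two-distribution Jensen-Shannon divergence on a common grid (the fitted densities below are hypothetical, not the authors' code):

```python
import numpy as np
from scipy import stats

x = np.linspace(-5, 10, 20001)
eps = 1e-300

def js_divergence(p, q):
    # two-distribution Jensen-Shannon divergence with equal weights
    m = 0.5 * (p + q)
    kl = lambda a, b: np.trapz(a * np.log(np.maximum(a, eps) / np.maximum(b, eps)), x)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Hypothetical fitted densities: same forecaster, same target year,
# previous quarter (horizon h+1) vs current quarter (horizon h)
f_prev = stats.norm(2.4, 0.9).pdf(x)  # f(i, t-1, h+1)
f_curr = stats.norm(1.9, 0.7).pdf(x)  # f(i, t, h)

news = js_divergence(f_prev, f_curr)
print(news)  # larger values correspond to bigger revisions of beliefs
```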

36 Application of informatics–estimating news
Using a panel-data fixed-effects model to regress news on differences of moments at the individual level, we find that news is affected by the differences in the means, variances, and skewness of the density forecasts, but kurtosis does not seem to matter.

[Table: fixed-effects regression of news on the differences of the first four forecast moments. The coefficient estimates are not legible in the transcript; the differences in mean, variance, and skewness are significant (***), while the kurtosis difference is not.]

37 Impacts of uncertainty on macro variables
We use Vector Autoregression (VAR) models with Cholesky-decomposed shocks to study the effects. In our baseline model, we run a VAR of the following six quarterly variables: log of real GDP, log of nonfarm payroll, log of private domestic investment, the federal funds rate, log of the S&P 500 index, and the aggregate entropy of output forecasts. We draw the impulse responses of all variables to a one-standard-deviation shock to entropy. We find that an increase in output uncertainty has negative effects on GDP, employment, private domestic investment, the interest rate, and the stock index. Uncertainty itself converges back quickly after the initial shock.
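A minimal sketch of such a recursive VAR in Python, with a hypothetical data file and column names; statsmodels orthogonalizes the impulse responses with a Cholesky decomposition in the given variable order (the lag choice and ordering here are illustrative, not the authors' exact specification):

```python
import pandas as pd
from statsmodels.tsa.api import VAR

# Hypothetical quarterly DataFrame with the six baseline variables
df = pd.read_csv("quarterly_data.csv", index_col=0, parse_dates=True)
cols = ["LOGRGDP", "LOGNFP", "LOGPDE", "FFR", "LOGSP500", "ENT"]

model = VAR(df[cols])
res = model.fit(maxlags=4, ic="aic")  # lag order chosen by AIC

# Orthogonalized impulse responses (Cholesky, in column order),
# traced out 20 quarters ahead
irf = res.irf(20)
irf.plot(orth=True, impulse="ENT")    # responses to an entropy shock
```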

38 Impacts of uncertainty on macro variables
ENT = entropy of output forecasts
LOGRGDP = log of real GDP
LOGNFP = log of nonfarm payroll
LOGPDE = log of private domestic investment
FFR = federal funds rate
LOGSP500 = log of the S&P 500 stock index

39 Impacts of uncertainty on macro variables
In alternative specifications, we run VAR models with four variables: log of the GDP deflator, log of real GDP, the federal funds rate, and the aggregate entropy of inflation forecasts. We find that an increase in the federal funds rate leads to lower output and higher inflation uncertainty, consistent with most research. A shock to inflation uncertainty leads to lower interest rates and lower output. The price puzzle (a counterintuitive positive response of inflation to a monetary tightening, i.e., a rise in the interest rate) exists in the full sample, 1968Q4 to 2015Q4, but is not apparent in the 1981Q3 to 2015Q4 subsample.

40 Impacts of uncertainty on macro variables
Impose a shock to entropy

41 Impacts of uncertainty on macro variables
Impose a shock to the interest rate (two samples: 1968 to present and 1981 to present)

42 Current work We propose regressing "news" on lagged news to test forecast efficiency in the sense of Nordhaus (1987).
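A minimal sketch of such an efficiency test (hypothetical panel, not the authors' specification): under Nordhaus efficiency, current revisions should be unpredictable from past revisions, so the slope on lagged news should be zero.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel: news_{i,t} for each forecaster i and quarter t
panel = pd.DataFrame({
    "id": [1, 1, 1, 2, 2, 2],
    "news": [0.10, 0.14, 0.09, 0.20, 0.11, 0.13],
    "news_lag": [0.12, 0.10, 0.14, 0.15, 0.20, 0.11],
})

# Efficiency test: regress news on its own lag; cluster standard errors
# by forecaster. A significant slope rejects Nordhaus efficiency.
fit = smf.ols("news ~ news_lag", data=panel).fit(
    cov_type="cluster", cov_kwds={"groups": panel["id"]})
print(fit.summary())
```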

43 Conclusion We fit generalized beta and triangular distributions to the histograms. We use the fitted densities to decompose uncertainty into average uncertainty and disagreement in parallel ways for the variance and entropy approaches, with a focus on entropy computational techniques and performance. Entropy measures are less volatile than variance measures; disagreement in the entropy approach is far less volatile and seldom contributes as much as 50% of total uncertainty. After correcting for bin size and horizon, we find a permanent fall in uncertainty since the Great Moderation. In the most recent two years, output forecast uncertainty has been slowly increasing while inflation uncertainty has fallen to historical lows. Average uncertainty contains a common shock and disagreement, and the information measure can be used to measure "news".

44 Thank you!

