Search Results
Working Paper
Forecasting Economic Activity with Mixed Frequency Bayesian VARs
Mixed frequency Bayesian vector autoregressions (MF-BVARs) allow forecasters to incorporate a large number of mixed frequency indicators into forecasts of economic activity. This paper evaluates the forecast performance of MF-BVARs relative to surveys of professional forecasters and investigates the influence of certain specification choices on this performance. We leverage a novel real-time dataset to conduct an out-of-sample forecasting exercise for U.S. real gross domestic product (GDP). MF-BVARs are shown to provide an attractive alternative to surveys of professional forecasters for ...
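A minimal sketch of the frequency link such models rely on, under the common (but here assumed, not paper-specific) convention that the observed quarterly flow is the average of three latent monthly values:

    y^Q_t = \tfrac{1}{3}\,\big(y^M_t + y^M_{t-1} + y^M_{t-2}\big),

where y^Q_t is quarterly GDP observed in the final month of the quarter and the y^M are unobserved monthly counterparts treated as latent states in the VAR; higher-frequency indicators then inform the monthly states between GDP releases.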
Working Paper
Using stochastic hierarchical aggregation constraints to nowcast regional economic aggregates
Recent decades have seen advances in using econometric methods to produce more timely and higher-frequency estimates of economic activity at the national level, enabling better tracking of the economy in real time. These advances have not generally been replicated at the sub-national level, likely because of the empirical challenges that nowcasting at a regional level presents, notably the short time series of available data, changes in data frequency over time, and the hierarchical structure of the data. This paper develops a mixed-frequency Bayesian VAR model to address common ...
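The "stochastic hierarchical aggregation constraint" in the title can be illustrated, in hedged generic notation, as requiring regional series to add up to the national aggregate only approximately:

    y^{nat}_t = \sum_{r=1}^{R} y_{r,t} + \eta_t, \qquad \eta_t \sim N(0, \sigma^2_\eta),

where setting \sigma^2_\eta = 0 recovers a hard accounting identity, while a positive variance lets the model absorb measurement inconsistencies between regional and national sources.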
Working Paper
Information in the revision process of real-time datasets
Rationality of early release data is typically tested using linear regressions. Thus, failure to reject the null does not rule out the possibility of nonlinear dependence. This paper proposes two tests that instead have power against generic nonlinear alternatives. A Monte Carlo study shows that the suggested tests have good finite sample properties. Additionally, we carry out an empirical illustration using a real-time dataset for money, output, and prices. Overall, we find strong evidence against data rationality. Interestingly, for the money stock the null is not rejected by linear tests but ...
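The linear regressions referred to here are typically of the Mincer-Zarnowitz form: regress the final (revised) value on the early release and test unbiasedness,

    y^{final}_t = \alpha + \beta\, y^{initial}_t + \varepsilon_t, \qquad H_0: \alpha = 0,\ \beta = 1,

so that failing to reject H_0 is consistent with rationality of the early release but, as noted above, says nothing about nonlinear forms of predictability in the revisions.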
Working Paper
Time-varying Uncertainty of the Federal Reserve’s Output Gap Estimate
What is the output gap and when do we know it? A factor stochastic volatility model estimates the common component of the output gap forecasts produced by the staff of the Federal Reserve, its time-varying volatility, and time-varying, horizon-specific forecast uncertainty. The common factor in these forecasts is highly procyclical, and unexpected increases in the common factor are associated with persistent responses in other macroeconomic variables. However, output gap estimates are very uncertain, even well after the fact. Output gap uncertainty increases around business cycle turning ...
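A generic factor stochastic volatility specification of the kind described, written in hedged notation (the paper's exact measurement setup may differ), takes the staff's horizon-specific output gap forecasts x_{i,t} as observables:

    x_{i,t} = \lambda_i f_t + \varepsilon_{i,t}, \qquad \varepsilon_{i,t} \sim N\big(0, \exp(h_{i,t})\big),
    f_t \sim N\big(0, \exp(h_{f,t})\big),
    h_{j,t} = \mu_j + \phi_j (h_{j,t-1} - \mu_j) + \sigma_j \eta_{j,t},

where f_t is the common factor and the log-variances h_{j,t} follow AR(1) processes, which is what delivers the time-varying, horizon-specific uncertainty.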
Working Paper
The Fed's Asymmetric Forecast Errors
I show that the probability that the Board of Governors of the Federal Reserve System staff's forecasts (the "Greenbooks'") overpredicted quarterly real gross domestic product (GDP) growth depends on both the forecast horizon and also whether the forecasted quarter was above or below trend real GDP growth. For forecasted quarters that grew below trend, Greenbooks were much more likely to overpredict real GDP growth, with one-quarter ahead forecasts overpredicting real GDP growth more than 75% of the time, and this rate of overprediction was higher for further ahead forecasts. For forecasted ...
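The headline statistic is a simple conditional frequency. A hedged Python sketch of the calculation, with hypothetical column names (forecast, actual, horizon) and a user-supplied trend growth rate, none of which come from the paper itself:

import pandas as pd

def overprediction_rates(df: pd.DataFrame, trend_growth: float) -> pd.DataFrame:
    # Share of quarters in which the forecast exceeded realized real GDP
    # growth, split by forecast horizon and by whether the realized quarter
    # grew above or below trend. Column names are illustrative.
    df = df.copy()
    df["overpredicted"] = df["forecast"] > df["actual"]
    df["regime"] = df["actual"].apply(
        lambda g: "below trend" if g < trend_growth else "above trend"
    )
    return (
        df.groupby(["regime", "horizon"])["overpredicted"]
        .mean()
        .rename("overprediction_rate")
        .reset_index()
    )

Applied to a panel of Greenbook forecasts and realized growth, a value of 0.75 in the one-quarter-ahead, below-trend cell would correspond to the 75% figure cited above.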
Working Paper
Forecasting of small macroeconomic VARs in the presence of instabilities
Small-scale VARs have come to be widely used in macroeconomics, for purposes ranging from forecasting output, prices, and interest rates to modeling expectations formation in theoretical models. However, a body of recent work suggests such VAR models may be prone to instabilities. In the face of such instabilities, a variety of estimation or forecasting methods might be used to improve the accuracy of forecasts from a VAR. These methods include using different approaches to lag selection, observation windows for estimation, (over-) differencing, intercept correction, stochastically ...
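For reference, the small-scale VAR in question stacks a handful of variables (e.g., output growth, inflation, a short-term interest rate) in a vector y_t and posits

    y_t = c + A_1 y_{t-1} + \cdots + A_p y_{t-p} + \varepsilon_t,

and the instabilities at issue are shifts over time in c, the A_i, or the variance of \varepsilon_t, which the listed estimation and forecasting devices try to accommodate.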
Working Paper
Why Have Long-term Treasury Yields Fallen Since the 1980s? Expected Short Rates and Term Premiums in (Quasi-) Real Time
Treasury yields have fallen since the 1980s. Standard decompositions of Treasury yields into expected short-term interest rates and term premiums suggest that term premiums account for much of the decline. In an alternative real-time decomposition, term premiums have fluctuated in a stable range, while long-run expected short-term interest rates have fallen. For example, a real-time decomposition of the 10-year Treasury yield shows term premiums essentially equal in late 2013 and 2023, while the long-run value of expected short-term interest rates is estimated to have fallen in a manner similar to ...
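The decomposition referred to is the standard one: the n-period yield equals the average expected short rate over the bond's life plus a term premium,

    y^{(n)}_t = \frac{1}{n} \sum_{i=0}^{n-1} E_t\, r_{t+i} + TP^{(n)}_t,

so a lower 10-year yield must reflect lower expected short rates, a lower term premium, or both; the real-time decomposition described above attributes most of the decline to the first channel.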
Working Paper
Predicting Benchmarked US State Employment Data in Real Time
US payroll employment data come from a survey and are subject to revisions. While revisions are generally small at the national level, they can be large enough at the state level to alter assessments of current economic conditions. Users must therefore exercise caution in interpreting state employment data until they are “benchmarked” against administrative data 5–16 months after the reference period. This paper develops a state-space model that predicts benchmarked state employment data in real time. The model has two distinct features: 1) an explicit model of the data revision process ...
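A hedged sketch of the kind of state-space structure described, not the paper's exact specification: benchmarked employment growth x_t is the latent state, and the preliminary survey-based release loads on it with a revision error,

    x_t = \mu + \phi\,(x_{t-1} - \mu) + \eta_t,
    y^{prelim}_t = x_t + v_t, \qquad y^{bench}_t = x_t \ \text{(observed only with a 5-16 month delay)},

so that standard Kalman filtering delivers real-time predictions of the benchmarked data before the administrative figures arrive.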
Working Paper
Lessons from the latest data on U.S. productivity
Productivity growth is carefully scrutinized by macroeconomists because it plays key roles in understanding private savings behaviour, the sources of macroeconomic shocks, the evolution of international competitiveness and the solvency of public pension systems, among other things. However, estimates of recent and expected productivity growth rates suffer from two potential problems: (i) recent estimates of growth trends are imprecise, and (ii) recently published data often undergo important revisions. This paper documents the statistical (un)reliability of several measures of aggregate ...
Working Paper
Analyzing data revisions with a dynamic stochastic general equilibrium model
We use a structural dynamic stochastic general equilibrium model to investigate how initial data releases of key macroeconomic aggregates are related to final revised versions and how identified aggregate shocks influence data revisions. The analysis sheds light on how well preliminary data approximate final data and on how policy makers might condition their view of the preliminary data when formulating policy actions. The results suggest that monetary policy shocks and multifactor productivity shocks lead to predictable revisions to the initial release data on output growth and inflation.
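In hedged generic notation, the object of interest is the revision rev_t = y^{final}_t - y^{initial}_t; the finding reported above amounts to saying that projections of rev_t for output growth and inflation on identified monetary policy and multifactor productivity shocks are nonzero, i.e., part of the eventual revision is predictable at the time of the initial release.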