Working Paper

Out-of-Sample Inference with Annual Benchmark Revisions


Abstract: This paper examines the properties of out-of-sample predictability tests evaluated with real-time data subject to annual benchmark revisions. The presence of both regular and annual revisions can create time heterogeneity in the moments of the real-time forecast evaluation function, which is incompatible with the standard covariance stationarity assumption used to derive the asymptotic theory of these tests. To cover both regular and annual revisions, we replace this standard assumption with a periodic covariance stationarity assumption that allows for periodic patterns of time heterogeneity. Despite the lack of stationarity, we show that the Clark and McCracken (2009) test statistic is robust to the presence of annual benchmark revisions. A similar robustness property is shared by the bootstrap test of Goncalves, McCracken, and Yao (2025). Monte Carlo experiments indicate that both tests deliver satisfactory finite-sample size and power properties even with modest sample sizes. We conclude with an application to forecasting U.S. employment using real-time data.
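For reference, a periodic covariance stationarity assumption is typically stated along the following lines; the notation here (a period length S and season-specific moment functions) is illustrative and not taken from the paper itself:

\[
  \mathbb{E}[y_t] = \mu_{s(t)}, \qquad
  \operatorname{Cov}(y_t,\, y_{t+h}) = \gamma_{s(t)}(h), \qquad
  s(t) = t \bmod S,
\]

so the mean and autocovariances of the series y_t may vary within the year (for example, S = 12 for monthly data subject to annual benchmark revisions) but repeat identically across years, whereas ordinary covariance stationarity would require them not to depend on t at all.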

JEL Classification: C53; C12; C52

https://doi.org/10.20955/wp.2025.020



Bibliographic Information

Provider: Federal Reserve Bank of St. Louis

Part of Series: Working Papers

Publication Date: 2025-09-11

Number: 2025-020