Search Results

Showing results 1 to 10 of approximately 72.

JEL Classification: C13

Report
A Bayesian Approach to Inference on Probabilistic Surveys

We propose a nonparametric Bayesian approach for conducting inference on probabilistic surveys. We use this approach to study whether U.S. Survey of Professional Forecasters density projections for output growth and inflation are consistent with the noisy rational expectations hypothesis. We find that in contrast to theory, for horizons close to two years, there is no relationship whatsoever between subjective uncertainty and forecast accuracy for output growth density projections, both across forecasters and over time, and only a mild relationship for inflation projections. As the horizon ...
Staff Reports, Paper 1025
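
A minimal sketch of the uncertainty-accuracy relationship this abstract tests, using made-up inputs (subjective_var, sq_error) rather than actual SPF data; it is not the paper's nonparametric Bayesian procedure, only a pooled least-squares illustration of what "no relationship" versus a positive relationship would look like.

```python
# Hypothetical illustration, not the paper's method: under noisy rational
# expectations, forecasters reporting higher subjective variance should make
# larger squared forecast errors on average; a pooled slope near zero would
# match the "no relationship" finding described for output growth.
import numpy as np

rng = np.random.default_rng(0)
n_forecasters, n_periods = 40, 60

# subjective_var[i, t]: variance implied by forecaster i's density projection at t
subjective_var = rng.gamma(shape=2.0, scale=0.5, size=(n_forecasters, n_periods))
# sq_error[i, t]: squared error of the corresponding point forecast (simulated here)
sq_error = subjective_var * rng.chisquare(df=1, size=(n_forecasters, n_periods))

# Pooled least-squares slope of squared errors on subjective variances
x = subjective_var.ravel() - subjective_var.mean()
y = sq_error.ravel() - sq_error.mean()
slope = (x @ y) / (x @ x)
print(f"pooled slope of squared error on subjective variance: {slope:.2f}")
```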

Working Paper
BLP Estimation Using Laplace Transformation and Overlapping Simulation Draws

We derive the asymptotic distribution of the parameters of the Berry et al. (1995, BLP) model in a many-markets setting which takes into account simulation noise under the assumption of overlapping simulation draws. We show that, as long as the number of simulation draws R and the number of markets T approach infinity, our estimator is √m = √min(R,T) consistent and asymptotically normal. We do not impose any relationship between the rates at which R and T go to infinity, thus allowing for the case of R ...
Working Paper Series, Paper 2019-24
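
The rate claim in the abstract above, restated as a display equation; θ₀ (the true parameter vector) and V (the asymptotic variance) are assumed notation not spelled out in the excerpt.

```latex
% Restatement of the asymptotic result quoted above; \theta_0 (true parameter)
% and V (asymptotic variance) are assumed notation. Requires amsmath.
\[
  \sqrt{m}\,\bigl(\hat{\theta}_{R,T} - \theta_0\bigr) \xrightarrow{\,d\,} N(0, V),
  \qquad m = \min(R, T), \quad R, T \to \infty .
\]
```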

Working Paper
The use and abuse of "real-time" data in economic forecasting

We distinguish between three different ways of using real-time data to estimate forecasting equations and argue that the most frequently used approach should generally be avoided. The point is illustrated with a model that uses monthly observations of industrial production, employment, and retail sales to predict real GDP growth. When the model is estimated using our preferred method, its out-of-sample forecasting performance is clearly superior to that obtained using conventional estimation, and compares favorably with that of the Blue-Chip consensus.
Working Papers, Paper 0004
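
The abstract does not spell out the three estimation approaches, so the following is only a generic sketch of the underlying real-time-data issue, with simulated first-release and revised series: the same forecasting regression yields different coefficients depending on which data vintage is used. It is not the paper's preferred method.

```python
# Schematic of the real-time-data issue, not the paper's preferred method:
# the same forecasting regression gives different coefficients when estimated
# on first-release (vintage) data versus later revised data.
import numpy as np

rng = np.random.default_rng(1)
T = 200
true_indicator = rng.normal(0.2, 1.0, T)                 # latent "true" monthly indicator
first_release = true_indicator + rng.normal(0, 0.5, T)   # noisy initial vintage
revised = true_indicator + rng.normal(0, 0.1, T)         # later, cleaner revision
gdp_growth = 0.8 * true_indicator + rng.normal(0, 0.7, T)

def ols_slope(x, y):
    x, y = x - x.mean(), y - y.mean()
    return (x @ y) / (x @ x)

print("slope on first-release data:", round(ols_slope(first_release, gdp_growth), 2))
print("slope on revised data:      ", round(ols_slope(revised, gdp_growth), 2))
```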

Working Paper
Assessing Bayesian model comparison in small samples

We investigate the Bayesian approach to model comparison within a two-country framework with nominal rigidities using the workhorse New Keynesian open-economy model of Martínez-García and Wynne (2010). We discuss the trade-offs that monetary policy characterized by a Taylor-type rule faces in an interconnected world, with perfectly flexible exchange rates. We then use posterior model probabilities to evaluate the weight of evidence in support of such a model when estimated against more parsimonious specifications that either abstract from monetary frictions or assume autarky by means of ...
Globalization Institute Working Papers, Paper 189
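
The posterior model probabilities mentioned in the abstract follow the standard Bayesian formula below, where p(Y | M_i) is the marginal likelihood of model M_i and p(M_i) its prior probability; the specific priors and marginal-likelihood computations are the paper's and are not reproduced here.

```latex
% Standard posterior model probability used for Bayesian model comparison;
% p(Y | M_i) is the marginal likelihood of model M_i and p(M_i) its prior weight.
\[
  P(M_i \mid Y) = \frac{p(Y \mid M_i)\, p(M_i)}{\sum_{j} p(Y \mid M_j)\, p(M_j)} .
\]
```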

Working Paper
Quasi Maximum Likelihood Analysis of High Dimensional Constrained Factor Models

Factor models have been widely used in practice. However, an undesirable feature of a high-dimensional factor model is that the model has too many parameters. An effective way to address this issue, proposed in a seminal work by Tsai and Tsay (2010), is to decompose the loadings matrix into a high-dimensional known matrix multiplied by a low-dimensional unknown matrix, which Tsai and Tsay (2010) call constrained factor models. This paper investigates the estimation and inferential theory of constrained factor models under a large-N and large-T setup, where N denotes the number of cross ...
Supervisory Research and Analysis Working Papers, Paper RPA 18-2
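
A sketch of the constrained loading structure described in the abstract, in assumed notation: the N x r loading matrix is written as a known N x k matrix H times an unknown k x r matrix Gamma, with k much smaller than N, which is what cuts down the number of free parameters.

```latex
% Constrained factor model in assumed notation: x_t is the N x 1 panel,
% f_t the r x 1 vector of factors, e_t the idiosyncratic error.
\[
  x_t = \Lambda f_t + e_t, \qquad \Lambda = H\,\Gamma,
\]
% with H an N x k known matrix, Gamma a k x r unknown matrix, and k \ll N,
% so only k x r loading parameters remain free instead of N x r.
```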

Working Paper
Shrinkage estimation of high-dimensional factor models with structural instabilities

In high-dimensional factor models, both the factor loadings and the number of factors may change over time. This paper proposes a shrinkage estimator that detects and disentangles these instabilities. The new method simultaneously and consistently estimates the number of pre- and post-break factors, which liberates researchers from sequential testing and achieves uniform control of the family-wise model selection errors over an increasing number of variables. The shrinkage estimator only requires the calculation of principal components and the solution of a convex optimization problem, which ...
Working Papers, Paper 14-4
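
An illustrative convex shrinkage problem of the kind the abstract describes, not necessarily the authors' exact objective: after computing principal-component factor estimates, a group penalty shrinks post-break changes in each series' loadings to zero unless the data call for a break.

```latex
% Illustrative only (not necessarily the authors' objective): \hat f_t are
% principal-component factor estimates, \Delta collects post-break changes in
% the loadings, \Delta_i is its i-th row, \tau is a candidate break date, and
% \gamma is the shrinkage weight on the group penalty.
\[
  \min_{\Lambda,\,\Delta}\;
  \sum_{t=1}^{T} \bigl\| x_t - \bigl(\Lambda + \Delta\,\mathbf{1}\{t > \tau\}\bigr)\hat f_t \bigr\|^2
  \;+\; \gamma \sum_{i=1}^{N} \|\Delta_i\| .
\]
```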

Working Paper
Financial Frictions, Financial Shocks, and Aggregate Volatility

I revisit the Great Inflation and the Great Moderation. I document an immoderation in corporate balance sheet variables so that the Great Moderation is best described as a period of divergent patterns in volatilities for real, nominal and financial variables. A model with time-varying financial frictions and financial shocks allowing for structural breaks in the size of shocks and the institutional framework is estimated. The paper shows that (i) while the Great Inflation was driven by bad luck, the Great Moderation is mostly due to better institutions; (ii) the slowdown in credit spreads is ...
Finance and Economics Discussion Series, Paper 2014-084

Working Paper
Financial Frictions, Financial Shocks, and Aggregate Volatility

I revisit the Great Inflation and the Great Moderation for nominal and real variables. I document an immoderation in corporate balance sheet variables so that the Great Moderation is best described as a period of divergent patterns in volatilities for real, nominal and financial variables. A model with time-varying financial frictions and financial shocks allowing for structural breaks in the size of shocks and the institutional framework is estimated. The paper shows that (i) while the Great Inflation was driven by bad luck, the Great Moderation was mostly due to better institutions; (ii) ...
Finance and Economics Discussion Series, Paper 2014-84

Working Paper
Financial Frictions, Financial Shocks, and Aggregate Volatility

The Great Moderation in the U.S. economy was accompanied by a widespread increase in the volatility of financial variables. We explore the sources of the divergent patterns in volatilities by estimating a model with time-varying financial rigidities subject to structural breaks in the size of the exogenous processes and two institutional characteristics: the coefficients in the monetary policy rule and the severity of the financial rigidity at the steady state. To do so, we generalize the estimation methodology developed by Curdia and Finocchiaro (2013). Institutional changes are key in ...
Finance and Economics Discussion Series, Paper 2018-054

Working Paper
Decomposing the Monetary Policy Multiplier

Financial markets play an important role in generating monetary policy transmission asymmetries in the US. Credit spreads only adjust to unexpected increases in interest rates, causing output and prices to respond more to a monetary tightening than to an expansion. At a one year horizon, the ‘financial multiplier’ of monetary policy—defined as the ratio between the cumulative responses of employment and credit spreads—is zero for a monetary expansion, -2 for a monetary tightening, and -4 for a monetary tightening that takes place under strained credit market conditions. These results ...
Working Paper Series, Paper 2023-14
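
The 'financial multiplier' defined in the abstract, written out for a horizon H (one year for the figures quoted); the IRF notation, meaning the response at horizon h to the monetary policy shock, is assumed here rather than taken from the paper.

```latex
% Financial multiplier as defined in the abstract: ratio of cumulative impulse
% responses of employment and credit spreads to the monetary policy shock.
% Horizon H and the IRF notation are assumed; H corresponds to one year for
% the figures quoted (0 for an expansion, -2 and -4 for tightenings).
\[
  \mathcal{M}_H
  = \frac{\sum_{h=0}^{H} \mathrm{IRF}^{\mathrm{employment}}_{h}}
         {\sum_{h=0}^{H} \mathrm{IRF}^{\mathrm{credit\ spread}}_{h}} .
\]
```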

FILTER BY Content Type

Working Paper 60 items

Report 10 items

Journal Article 2 items

FILTER BY Author

Chudik, Alexander 9 items

Pesaran, M. Hashem 8 items

Gospodinov, Nikolay 7 items

Martinez-Garcia, Enrique 7 items

Gayle, George-Levi 5 items

Wynne, Mark A. 5 items

FILTER BY JEL Classification

C11 16 items

C12 16 items

C33 9 items

G12 9 items

C23 8 items

FILTER BY Keywords

asset pricing 5 items

Maximum likelihood estimation 4 items

Researcher bias 4 items

Monetary Policy 4 items

Autoregressive-Distributed Lag model (ARDL) 3 items

Bayesian methods 3 items
