Search Results

Showing results 1 to 10 of approximately 68.

JEL Classification: C13

Report
A Bayesian Approach to Inference on Probabilistic Surveys

We propose a nonparametric Bayesian approach for conducting inference on probabilistic surveys. We use this approach to study whether U.S. Survey of Professional Forecasters density projections for output growth and inflation are consistent with the noisy rational expectations hypothesis. We find that in contrast to theory, for horizons close to two years, there is no relationship whatsoever between subjective uncertainty and forecast accuracy for output growth density projections, both across forecasters and over time, and only a mild relationship for inflation projections. As the horizon ...
Staff Reports, Paper 1025

Working Paper
BLP Estimation Using Laplace Transformation and Overlapping Simulation Draws

We derive the asymptotic distribution of the parameters of the Berry et al. (1995, BLP) model in a many markets setting which takes into account simulation noise under the assumption of overlapping simulation draws. We show that, as long as the number of simulation draws R and the number of markets T approach infinity, our estimator is √min(R,T) consistent and asymptotically normal. We do not impose any relationship between the rates at which R and T go to infinity, thus allowing for the case of R ...
Working Paper Series, Paper 2019-24

Working Paper
The use and abuse of "real-time" data in economic forecasting

We distinguish between three different ways of using real-time data to estimate forecasting equations and argue that the most frequently used approach should generally be avoided. The point is illustrated with a model that uses monthly observations of industrial production, employment, and retail sales to predict real GDP growth. When the model is estimated using our preferred method, its out-of-sample forecasting performance is clearly superior to that obtained using conventional estimation, and compares favorably with that of the Blue-Chip consensus.
Working Papers, Paper 0004

Working Paper
Assessing Bayesian model comparison in small samples

We investigate the Bayesian approach to model comparison within a two-country framework with nominal rigidities using the workhorse New Keynesian open-economy model of Martínez-García and Wynne (2010). We discuss the trade-offs that monetary policy characterized by a Taylor-type rule faces in an interconnected world, with perfectly flexible exchange rates. We then use posterior model probabilities to evaluate the weight of evidence in support of such a model when estimated against more parsimonious specifications that either abstract from monetary frictions or assume autarky by means of ...
Globalization Institute Working Papers, Paper 189
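The posterior-model-probability comparison described in the abstract above can be sketched numerically. This is a minimal illustration, not the paper's estimation: the log marginal likelihoods below are hypothetical numbers standing in for three candidate specifications (full model, no monetary frictions, autarky).

```python
import math

def posterior_model_probs(log_marglik, priors=None):
    """Posterior model probabilities from log marginal likelihoods.

    P(M_i | y) is proportional to p(y | M_i) * P(M_i); the computation
    shifts by the maximum log marginal likelihood for numerical stability.
    """
    n = len(log_marglik)
    if priors is None:
        priors = [1.0 / n] * n  # equal prior weight on each model
    shift = max(log_marglik)
    weights = [math.exp(l - shift) * p for l, p in zip(log_marglik, priors)]
    total = sum(weights)
    return [w / total for w in weights]

# Hypothetical log marginal likelihoods for three candidate specifications
probs = posterior_model_probs([-1035.2, -1041.8, -1060.5])
```

With equal priors, the model with the highest marginal likelihood receives almost all of the posterior weight here, because a log-marginal-likelihood gap of 6.6 already corresponds to a Bayes factor of roughly 700.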

Working Paper
Quasi Maximum Likelihood Analysis of High Dimensional Constrained Factor Models

Factor models have been widely used in practice. However, an undesirable feature of a high dimensional factor model is that the model has too many parameters. An effective way to address this issue, proposed in a seminal work by Tsai and Tsay (2010), is to decompose the loadings matrix as the product of a high-dimensional known matrix and a low-dimensional unknown matrix, a specification that Tsai and Tsay (2010) call the constrained factor model. This paper investigates the estimation and inferential theory of constrained factor models under a large-N and large-T setup, where N denotes the number of cross ...
Supervisory Research and Analysis Working Papers, Paper RPA 18-2
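The constrained loadings decomposition described above (loadings = H·Γ, with H known and Γ low-dimensional) can be illustrated with a simple two-step sketch: extract unrestricted principal-components loadings, then project them onto the column space of H. This is a toy sketch on simulated data, not the paper's quasi-maximum-likelihood estimator; all dimensions and matrices below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, r, k = 60, 200, 2, 5   # series, periods, factors, constraint dimension

# Known N x k constraint matrix H and true low-dimensional loadings Gamma (k x r)
H = rng.standard_normal((N, k))
Gamma = rng.standard_normal((k, r))
F = rng.standard_normal((T, r))                 # latent factors
X = F @ (H @ Gamma).T + 0.1 * rng.standard_normal((T, N))

# Step 1: unrestricted principal-components loadings from the SVD of the panel
_, _, Vt = np.linalg.svd(X, full_matrices=False)
Lambda_pc = Vt[:r].T * np.sqrt(N)               # N x r

# Step 2: impose the constraint Lambda = H @ Gamma by least-squares projection
Gamma_hat, *_ = np.linalg.lstsq(H, Lambda_pc, rcond=None)
Lambda_hat = H @ Gamma_hat                      # constrained loadings estimate
```

The payoff of the constraint is visible in the dimensions: instead of N·r free loading parameters, only k·r entries of Γ are estimated.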

Working Paper
Shrinkage estimation of high-dimensional factor models with structural instabilities

In high-dimensional factor models, both the factor loadings and the number of factors may change over time. This paper proposes a shrinkage estimator that detects and disentangles these instabilities. The new method simultaneously and consistently estimates the number of pre- and post-break factors, which liberates researchers from sequential testing and achieves uniform control of the family-wise model selection errors over an increasing number of variables. The shrinkage estimator only requires the calculation of principal components and the solution of a convex optimization problem, which ...
Working Papers, Paper 14-4
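The shrinkage estimator itself is beyond a short sketch, but the principal-components ingredient it builds on can be illustrated with the simpler eigenvalue-based information criterion of Bai and Ng (2002), which selects the number of factors by trading fit against a penalty in N and T. This is not the paper's method, and the simulated panel below is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
N, T, r_true = 80, 300, 3
F = rng.standard_normal((T, r_true))
L = rng.standard_normal((N, r_true))
X = F @ L.T + rng.standard_normal((T, N))       # factor panel plus noise

# Eigenvalues of the N x N sample covariance X'X / T, in descending order
eigvals = np.linalg.svd(X / np.sqrt(T), compute_uv=False) ** 2

def ic_p2(k):
    """Bai-Ng IC_p2: log residual variance plus a penalty growing in k."""
    resid_var = eigvals[k:].sum() / N
    penalty = k * (N + T) / (N * T) * np.log(min(N, T))
    return np.log(resid_var) + penalty

r_hat = min(range(1, 9), key=ic_p2)
```

With three strong simulated factors, the criterion drops sharply up to k = 3 and then rises, since removing a bulk (noise) eigenvalue no longer pays for the penalty.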

Working Paper
Pooled Bewley Estimator of Long-Run Relationships in Dynamic Heterogenous Panels

This paper, using the Bewley (1979) transformation of the autoregressive distributed lag model, proposes a novel pooled Bewley (PB) estimator of long-run coefficients for dynamic panels with heterogeneous short-run dynamics, in the same setting as the widely used Pooled Mean Group (PMG) estimator. Asymptotic normality of the PB estimator is established, and Monte Carlo simulations reveal a good small sample performance of PB compared with existing estimators in the literature, namely PMG, PDOLS and FMOLS. This paper also considers application of two bias-correction methods and a bootstrapping ...
Globalization Institute Working Papers, Paper 409
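The Bewley (1979) transformation that the abstract builds on can be written out for an ARDL(1,1) model; the algebra below is a standard derivation, not a summary of the paper's asymptotic results.

```latex
% ARDL(1,1) with long-run coefficient \theta = (\beta_0 + \beta_1)/(1 - \phi):
\begin{align*}
y_t &= \phi y_{t-1} + \beta_0 x_t + \beta_1 x_{t-1} + u_t \\
(1-\phi)\, y_t &= -\phi\, \Delta y_t + (\beta_0+\beta_1)\, x_t - \beta_1\, \Delta x_t + u_t \\
y_t &= \theta\, x_t - \frac{\phi}{1-\phi}\, \Delta y_t - \frac{\beta_1}{1-\phi}\, \Delta x_t + \frac{u_t}{1-\phi}
\end{align*}
```

In the transformed equation the long-run coefficient θ appears directly as the coefficient on x_t; because Δy_t is correlated with the error, it is typically instrumented (e.g., by y_{t-1}), and pooling θ across units while leaving the short-run coefficients heterogeneous is the idea behind a pooled Bewley estimator.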

Report
The FRBNY staff underlying inflation gauge: UIG

Monetary policymakers and long-term investors would benefit greatly from a measure of underlying inflation that uses all relevant information, is available in real time, and forecasts inflation better than traditional underlying inflation measures such as core inflation measures. This paper presents the "FRBNY Staff Underlying Inflation Gauge (UIG)" for CPI and PCE. Using a dynamic factor model approach, the UIG is derived from a broad data set that extends beyond price series to include a wide range of nominal, real, and financial variables. It also considers the specific and time-varying ...
Staff Reports, Paper 672

Working Paper
Finding Needles in Haystacks: Multiple-Imputation Record Linkage Using Machine Learning

This paper considers the problem of record linkage between a household-level survey and an establishment-level frame in the absence of unique identifiers. Linkage between frames in this setting is challenging because the distribution of employment across establishments is highly skewed. To address these difficulties, this paper develops a probabilistic record linkage methodology that combines machine learning (ML) with multiple imputation (MI). This ML-MI methodology is applied to link survey respondents in the Health and Retirement Study to their workplaces in the Census Business Register. ...
Working Papers, Paper 22-11
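The ML-MI idea sketched in the abstract above, score candidate links with a machine-learning model and then propagate linkage uncertainty by drawing multiple plausible links rather than committing to the single best one, can be illustrated in a few lines. Everything below is hypothetical (the candidate table, the match probabilities, the payroll values); it is not the paper's model or data.

```python
import random
import statistics

# Hypothetical candidate establishments for one survey respondent, each with
# an ML-predicted match probability and an establishment-level payroll value.
candidates = [
    {"estab_id": "A", "match_prob": 0.60, "payroll": 1_200_000},
    {"estab_id": "B", "match_prob": 0.30, "payroll": 800_000},
    {"estab_id": "C", "match_prob": 0.10, "payroll": 2_500_000},
]

def impute_links(candidates, m=50, seed=7):
    """MI step: draw m plausible links in proportion to match probability."""
    rng = random.Random(seed)
    probs = [c["match_prob"] for c in candidates]
    return rng.choices(candidates, weights=probs, k=m)

draws = impute_links(candidates)
payrolls = [d["payroll"] for d in draws]

# Rubin-style combination: the point estimate averages over imputations, and
# the spread across imputations reflects linkage uncertainty.
point = statistics.mean(payrolls)
between_var = statistics.variance(payrolls)
```

A deterministic best-match rule would always pick establishment A and report zero linkage uncertainty; the MI draws instead carry the 40 percent chance of a different workplace through to downstream estimates.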

Working Paper
Impacts of Monetary Stimulus on Credit Allocation and Macroeconomy: Evidence from China

We develop a new empirical framework to identify and estimate the effects of monetary stimulus on the real economy. The framework is applied to the Chinese economy when monetary policy in normal times was switched to an extraordinarily expansionary regime to combat the impact of the 2008 financial crisis. We show that this unprecedented monetary stimulus accounted for an increase in the real gross domestic product (GDP) growth rate of as much as 4 percent by the end of 2009. Monetary transmission to the real economy was through bank credit allocated disproportionately to financing investment in real ...
FRB Atlanta Working Paper, Paper 2016-9

FILTER BY Content Type

Working Paper 56 items

Report 10 items

Journal Article 2 items

FILTER BY Author

Chudik, Alexander 8 items

Gospodinov, Nikolay 7 items

Martinez-Garcia, Enrique 7 items

Pesaran, M. Hashem 7 items

Gayle, George-Levi 5 items

Wynne, Mark A. 5 items

FILTER BY Jel Classification

C11 16 items

C12 15 items

C33 8 items

C23 7 items

C32 7 items

FILTER BY Keywords

asset pricing 5 items

Researcher bias 4 items

Bayesian methods 3 items

Financial frictions 3 items

Financial shocks 3 items

GMM 3 items
