Search Results

Showing results 1 to 10 of approximately 16.

JEL Classification: C18

Report
On binscatter

Binscatter, or a binned scatter plot, is a very popular tool in applied microeconomics. It provides a flexible, yet parsimonious way of visualizing and summarizing mean, quantile, and other nonparametric regression functions in large data sets. It is also often used for informal evaluation of substantive hypotheses such as linearity or monotonicity of the unknown function. This paper presents a foundational econometric analysis of binscatter, offering an array of theoretical and practical results that aid both understanding current practices (that is, their validity or lack thereof) as well ...
Staff Reports, Paper 881
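
A minimal sketch of the object under study may help fix ideas: a numpy-only binned scatter plot using quantile bins of x and within-bin means of y. This is a hypothetical illustration, not the paper's estimator or its inference procedures.

import numpy as np

rng = np.random.default_rng(0)
n = 10_000
x = rng.normal(size=n)
y = np.sin(x) + rng.normal(scale=0.5, size=n)    # unknown regression function plus noise

def binscatter(x, y, n_bins=20):
    """Bin centers and within-bin means of y, using quantile bins of x."""
    edges = np.quantile(x, np.linspace(0.0, 1.0, n_bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_bins - 1)
    centers = np.array([x[idx == b].mean() for b in range(n_bins)])
    means = np.array([y[idx == b].mean() for b in range(n_bins)])
    return centers, means

centers, means = binscatter(x, y)
# Plotting `means` against `centers` visualizes the conditional mean E[y | x].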

Working Paper
Estimating (Markov-Switching) VAR Models without Gibbs Sampling: A Sequential Monte Carlo Approach

Vector autoregressions with Markov-switching parameters (MS-VARs) fit the data better than do their constant-parameter predecessors. However, Bayesian inference for MS-VARs with existing algorithms remains challenging. For our first contribution, we show that Sequential Monte Carlo (SMC) estimators accurately estimate Bayesian MS-VAR posteriors. Relative to multi-step, model-specific MCMC routines, SMC has the advantages of generality, parallelizability, and freedom from reliance on particular analytical relationships between prior and likelihood. For our second contribution, we use SMC's ...
Finance and Economics Discussion Series, Paper 2015-116
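
The general SMC recipe the abstract refers to (reweight toward the posterior, resample, mutate) can be sketched on a toy scalar model. The tempering schedule, tuning constants, and N(theta, 1) likelihood below are hypothetical stand-ins; a real application would substitute the MS-VAR likelihood for log_lik.

import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(loc=1.0, scale=1.0, size=50)    # toy data; an MS-VAR likelihood would go here

def log_lik(theta):
    return -0.5 * np.sum((data - theta) ** 2)     # N(theta, 1) log likelihood, up to a constant

n_part = 2_000
phis = np.linspace(0.0, 1.0, 21)                  # likelihood-tempering schedule
theta = rng.normal(0.0, 2.0, size=n_part)         # particles drawn from the N(0, 4) prior
for phi_prev, phi in zip(phis[:-1], phis[1:]):
    # correction: reweight particles toward the next bridging distribution
    logw = (phi - phi_prev) * np.array([log_lik(t) for t in theta])
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # selection: resample particles in proportion to their weights
    theta = theta[rng.choice(n_part, size=n_part, p=w)]
    # mutation: one random-walk Metropolis pass targeting prior x likelihood^phi
    def log_post(t):
        return -0.5 * (t / 2.0) ** 2 + phi * np.array([log_lik(v) for v in t])
    prop = theta + rng.normal(0.0, 0.3, size=n_part)
    accept = np.log(rng.uniform(size=n_part)) < log_post(prop) - log_post(theta)
    theta = np.where(accept, prop, theta)

print("SMC posterior mean:", round(float(theta.mean()), 3))   # close to the conjugate answer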

Working Paper
Poor (Wo)man’s Bootstrap

The bootstrap is a convenient tool for calculating standard errors of the parameters of complicated econometric models. Unfortunately, the fact that these models are complicated often makes the bootstrap extremely slow or even practically infeasible. This paper proposes an alternative to the bootstrap that relies only on the estimation of one-dimensional parameters. The paper contains no new difficult math. But we believe that it can be useful.
Working Paper Series, Paper WP-2015-1
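
For context, the standard nonparametric (pairs) bootstrap that the abstract describes as slow re-estimates the full model on every resample, as in the toy sketch below, where OLS stands in for a "complicated" estimator. The paper's one-dimensional alternative is not reproduced here.

import numpy as np

rng = np.random.default_rng(2)
n = 500
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(size=n)

def estimate(x, y):
    # stand-in for a complicated, slow-to-compute estimator; here just the OLS slope
    return np.cov(x, y, bias=True)[0, 1] / np.var(x)

B = 999
draws = np.empty(B)
for b in range(B):
    idx = rng.integers(0, n, size=n)        # resample (x, y) pairs with replacement
    draws[b] = estimate(x[idx], y[idx])     # the expensive step the paper seeks to avoid
print("bootstrap standard error:", round(float(draws.std(ddof=1)), 4))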

Working Paper
Robust Bayesian Analysis for Econometrics

We review the literature on robust Bayesian analysis as a tool for global sensitivity analysis and for statistical decision-making under ambiguity. We discuss the methods proposed in the literature, including the different ways of constructing the set of priors that are the key input of the robust Bayesian analysis. We consider both a general set-up for Bayesian statistical decisions and inference and the special case of set-identified structural models. We provide new results that can be used to derive and compute the set of posterior moments for sensitivity analysis and to compute the ...
Working Paper Series, Paper WP-2021-11
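
The basic object, a set of posterior quantities traced out over a set of priors, can be illustrated with a toy Beta-Bernoulli example. The grid of Beta priors below is a hypothetical choice, not one of the paper's prior constructions.

import numpy as np

successes, trials = 27, 40                        # toy Bernoulli data
a_grid = np.linspace(0.5, 5.0, 50)
b_grid = np.linspace(0.5, 5.0, 50)
post_means = [
    (successes + a) / (trials + a + b)            # Beta(a, b) prior -> Beta posterior mean
    for a in a_grid
    for b in b_grid
]
print("set of posterior means: [%.3f, %.3f]" % (min(post_means), max(post_means)))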

Working Paper
Revisiting the Great Ratios Hypothesis

The idea that certain economic variables are roughly constant in the long run is an old one. Kaldor described them as stylized facts, whereas Klein and Kosobud labelled them great ratios. While such ratios are widely adopted in theoretical models in economics as conditions for balanced growth, arbitrage or solvency, the empirical literature has tended to find little evidence for them. We argue that this outcome could be due to episodic failure of cointegration, possible two-way causality between the variables in the ratios and cross-country error dependence due to latent factors. We propose a ...
Globalization Institute Working Papers, Paper 415
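
A great ratio implies cointegration between the log variables with a unit coefficient. The standard Engle-Granger check below, on simulated data with a stable consumption-income ratio, is the textbook baseline; it does not implement the paper's proposal, which addresses episodic failure, two-way causality, and cross-country dependence.

import numpy as np
from statsmodels.tsa.stattools import coint       # Engle-Granger residual-based test

rng = np.random.default_rng(3)
T = 400
log_income = np.cumsum(rng.normal(size=T))                       # random walk
log_consumption = log_income + rng.normal(scale=0.3, size=T)     # C/Y roughly constant

# With a stable great ratio, log C - log Y is stationary, so the pair cointegrates.
tstat, pvalue, _ = coint(log_consumption, log_income)
print(f"Engle-Granger t = {tstat:.2f}, p-value = {pvalue:.3f}")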

Working Paper
In Search of Lost Time Aggregation

In 1960, Working noted that time aggregation of a random walk induces serial correlation in the first difference that is not present in the original series. This important contribution has been overlooked in a recent literature analyzing income and consumption in panel data. I examine Blundell, Pistaferri and Preston (2008) as an important example for which time aggregation has quantitatively large effects. Using new techniques to correct for the problem, I find that the estimate of partial insurance against transitory shocks, originally 0.05, increases to 0.24. This larger ...
Finance and Economics Discussion Series, Paper 2019-075
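
Working's point is easy to reproduce by simulation: point-sampling a random walk leaves its first differences white noise, while averaging within each observation period induces first-order serial correlation approaching 0.25. A minimal sketch with hypothetical parameter choices:

import numpy as np

rng = np.random.default_rng(4)
m, T = 12, 20_000                                 # m sub-periods per observation period
walk = np.cumsum(rng.normal(size=m * T))          # high-frequency random walk

point_sampled = walk[m - 1 :: m]                  # end-of-period sampling
time_averaged = walk.reshape(T, m).mean(axis=1)   # within-period averaging

def lag1_autocorr(z):
    d = np.diff(z)
    return np.corrcoef(d[:-1], d[1:])[0, 1]

print("point-sampled:", round(lag1_autocorr(point_sampled), 3))   # ~ 0.00
print("time-averaged:", round(lag1_autocorr(time_averaged), 3))   # ~ 0.25 for large m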

Working Paper
Learning About Consumer Uncertainty from Qualitative Surveys: As Uncertain As Ever

We study diffusion indices constructed from qualitative surveys to provide real-time assessments of various aspects of economic activity. In particular, we highlight the role of diffusion indices as estimates of change in a quasi-extensive margin, and characterize their distribution, focusing on the uncertainty implied by both sampling and the polarization of participants' responses. Because qualitative tendency surveys generally cover multiple questions around a topic, a key aspect of this uncertainty concerns the coincidence of responses, or the degree to which polarization comoves, across ...
Working Paper, Paper 15-9
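
The object itself is simple: a diffusion index is the share of "up" responses minus the share of "down" responses. The sketch below computes one from simulated responses and attaches a bootstrap measure of sampling uncertainty; it is a hypothetical illustration, not the paper's full characterization of the index's distribution.

import numpy as np

rng = np.random.default_rng(5)
responses = rng.choice(["up", "same", "down"], size=800, p=[0.40, 0.35, 0.25])

def diffusion_index(r):
    return np.mean(r == "up") - np.mean(r == "down")

point = diffusion_index(responses)
boot = [diffusion_index(rng.choice(responses, size=responses.size)) for _ in range(999)]
print(f"diffusion index = {point:.3f}, bootstrap SE = {np.std(boot, ddof=1):.3f}")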

Working Paper
Latent Variables Analysis in Structural Models: A New Decomposition of the Kalman Smoother

This paper advocates chaining the decomposition of shocks into contributions from forecast errors to the shock decomposition of the latent vector to better understand model inference about latent variables. Such a double decomposition allows us to gauge the influence of data on latent variables, like the data decomposition. However, by taking into account the transmission mechanisms of each type of shock, we can highlight the economic structure underlying the relationship between the data and the latent variables. We demonstrate the usefulness of this approach by detailing the role of ...
Finance and Economics Discussion Series, Paper 2020-100
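
The data decomposition the abstract builds on exploits the fact that, with a zero prior mean, the Kalman smoother is linear in the observations, so each data point's contribution to a smoothed latent state can be computed exactly. A hand-rolled local-level sketch with hypothetical parameters follows; the paper's double decomposition through structural shocks is not reproduced.

import numpy as np

def rts_smooth(y, q=0.5, h=1.0, p1=10.0):
    """Kalman filter + RTS smoother for y_t = a_t + e_t, a_t = a_{t-1} + u_t."""
    T = y.size
    a_f = np.zeros(T)                             # filtered state means
    p_f = np.zeros(T)                             # filtered state variances
    a_pred, p_pred = 0.0, p1                      # zero prior mean keeps the map linear
    for t in range(T):
        k = p_pred / (p_pred + h)                 # Kalman gain
        a_f[t] = a_pred + k * (y[t] - a_pred)
        p_f[t] = (1.0 - k) * p_pred
        a_pred, p_pred = a_f[t], p_f[t] + q
    a_s = a_f.copy()                              # backward (RTS) pass
    for t in range(T - 2, -1, -1):
        g = p_f[t] / (p_f[t] + q)
        a_s[t] = a_f[t] + g * (a_s[t + 1] - a_f[t])
    return a_s

rng = np.random.default_rng(6)
y = np.cumsum(rng.normal(size=8)) + rng.normal(size=8)
full = rts_smooth(y)
# By linearity, observation s's contribution is the smoother applied to y with
# every other observation zeroed out, and the contributions add up exactly.
contrib = np.array([rts_smooth(np.eye(8)[s] * y[s]) for s in range(8)])
assert np.allclose(contrib.sum(axis=0), full)
print("contributions to the smoothed state at t = 4:", np.round(contrib[:, 4], 3))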

Discussion Paper
Estimating the output gap in real time

I propose a novel method of estimating the potential level of U.S. GDP in real time. The proposed wage-based measure of economic potential remains virtually unchanged when new data are released. The distance between current and potential output, the output gap, satisfies Okun's law and outperforms many other measures of slack in forecasting inflation. Thus, I provide a robust statistical tool useful for understanding current economic conditions and guiding policymaking.
Staff Papers, Issue Dec
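
The Okun's-law check mentioned in the abstract amounts to regressing an unemployment gap on the candidate output gap and looking for a stable negative coefficient. A hypothetical sketch on simulated data; the wage-based potential-output measure itself is not reproduced.

import numpy as np

rng = np.random.default_rng(7)
n = 120
output_gap = rng.normal(scale=2.0, size=n)                       # percent of potential
unemp_gap = -0.5 * output_gap + rng.normal(scale=0.4, size=n)    # Okun coefficient near -0.5

X = np.column_stack([np.ones(n), output_gap])
beta = np.linalg.lstsq(X, unemp_gap, rcond=None)[0]
print("estimated Okun coefficient:", round(float(beta[1]), 2))   # ~ -0.5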

Working Paper
Explaining Machine Learning by Bootstrapping Partial Dependence Functions and Shapley Values

Machine learning and artificial intelligence methods are often referred to as “black boxes” when compared with traditional regression-based approaches. However, both traditional and machine learning methods are concerned with modeling the joint distribution between endogenous (target) and exogenous (input) variables. Where linear models describe the fitted relationship between the target and input variables via the slope of that relationship (coefficient estimates), the same fitted relationship can be described rigorously for any machine learning model by first-differencing the partial ...
Research Working Paper, Paper RWP 21-12
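
The first-differencing idea is mechanical: compute the partial dependence function of a fitted model and difference it across the grid, yielding a slope analog of a regression coefficient. A hand-rolled, model-agnostic sketch; the prediction function below is a hypothetical stand-in, and the paper's bootstrap layer is omitted.

import numpy as np

rng = np.random.default_rng(8)
X = rng.normal(size=(1_000, 3))

def model_predict(X):
    # stand-in for any fitted black-box model's prediction function
    return 2.0 * X[:, 0] + X[:, 1] ** 2

def partial_dependence(predict, X, feature, grid):
    """Average prediction with `feature` forced to each grid value."""
    pd_vals = []
    for g in grid:
        Xg = X.copy()
        Xg[:, feature] = g
        pd_vals.append(predict(Xg).mean())
    return np.array(pd_vals)

grid = np.linspace(-2.0, 2.0, 21)
pd0 = partial_dependence(model_predict, X, feature=0, grid=grid)
slopes = np.diff(pd0) / np.diff(grid)             # first difference of the PD function
print("mean PD slope for feature 0:", round(float(slopes.mean()), 2))   # ~ 2.0, the true slope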

FILTER BY JEL Classification

C52 4 items

C10 3 items

C11 3 items

C14 3 items

C15 2 items

(28 more JEL codes not shown)

FILTER BY Keywords

bootstrap 3 items

inference 3 items

standard error 2 items

Machine learning 2 items

Achievement inequality 1 item

Artificial intelligence 1 item

(70 more keywords not shown)
