Search Results

Showing results 1 to 10 of approximately 10.

Keywords: Statistical methods

Working Paper
Assessing the Historical Role of Credit: Business Cycles, Financial Crises, and the Legacy of Charles S. Peirce
This paper provides a historical overview of financial crises and their origins. The objective is to discuss a few of the modern statistical methods that can be used to evaluate predictors of these rare events. The problem involves prediction of binary events and therefore fits modern statistical learning, signal processing theory, and classification methods. The discussion also emphasizes the need to supplement statistics and computational techniques with economics. A forecast's success in this environment hinges on the economic consequences of the actions taken as a result of the forecast, rather than on typical statistical metrics of prediction accuracy. (A short illustrative sketch follows this entry.)
AUTHORS: Jordà, Òscar
DATE: 2013-07-01
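
The abstract's last point can be made concrete with a toy example. Everything in the sketch below is an illustrative assumption, not taken from the paper: a synthetic rare binary event, a single predictor, and made-up policymaker costs. It shows how the warning threshold that minimizes an assumed economic loss (a missed crisis costing ten times a false alarm) differs sharply from the threshold that maximizes classification accuracy.

```python
# Illustrative only (assumed synthetic data, model, and costs; nothing here is
# from the paper): pick a crisis-warning threshold by minimizing an economic
# loss instead of maximizing classification accuracy.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic rare event (about a 5% base rate) driven by one predictor.
n = 5000
x = rng.normal(size=n)
prob = 1.0 / (1.0 + np.exp(-(-3.0 + 1.5 * x)))   # "fitted" event probability
y = rng.binomial(1, prob)

# Assumed policymaker costs: a missed crisis is ten times a false alarm.
cost_miss, cost_false_alarm = 10.0, 1.0

def economic_loss(threshold):
    warn = prob >= threshold
    misses = np.sum((y == 1) & ~warn)
    false_alarms = np.sum((y == 0) & warn)
    return (cost_miss * misses + cost_false_alarm * false_alarms) / n

thresholds = np.linspace(0.01, 0.99, 99)
accuracy = [np.mean((prob >= t) == y) for t in thresholds]
losses = [economic_loss(t) for t in thresholds]

# With a rare event, accuracy is maximized by warning almost never, while the
# loss-minimizing threshold is much lower (roughly cost_false_alarm /
# (cost_false_alarm + cost_miss) when probabilities are well calibrated).
print(f"threshold maximizing accuracy:      {thresholds[np.argmax(accuracy)]:.2f}")
print(f"threshold minimizing economic loss: {thresholds[np.argmin(losses)]:.2f}")
```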

Working Paper
Testing for cointegration using the Johansen methodology when variables are near-integrated
We investigate the properties of Johansen's (1988, 1991) maximum eigenvalue and trace tests for cointegration under the empirically relevant situation of near-integrated variables. Using Monte Carlo techniques, we show that in a system with near-integrated variables, the probability of reaching an erroneous conclusion regarding the cointegrating rank of the system is generally substantially higher than the nominal size. The risk of concluding that completely unrelated series are cointegrated is therefore non-negligible. The spurious rejection rate can be reduced by performing additional tests of restrictions on the cointegrating vector(s), although it is still substantially larger than the nominal size. (A short simulation sketch follows this entry.)
AUTHORS: Hjalmarsson, Erik; Österholm, Pär
DATE: 2007
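
A short simulation along the lines described in the abstract. This is an assumed, simplified design rather than the authors' exact experiment: two independent near-integrated AR(1) series with root 0.95, statsmodels' coint_johansen, and the 5% trace test of cointegrating rank zero. The output is the spurious rejection rate, which will typically exceed the nominal size.

```python
# Assumed Monte Carlo design (not the authors' exact setup): two independent
# near-integrated AR(1) series, Johansen trace test of cointegrating rank zero,
# and the resulting spurious rejection rate at a nominal 5% level.
import numpy as np
from statsmodels.tsa.vector_ar.vecm import coint_johansen

rng = np.random.default_rng(0)
T, rho, n_reps = 200, 0.95, 500
rejections = 0

for _ in range(n_reps):
    # Two unrelated near-integrated series: y_t = rho * y_{t-1} + e_t.
    e = rng.normal(size=(T, 2))
    y = np.zeros((T, 2))
    for t in range(1, T):
        y[t] = rho * y[t - 1] + e[t]

    res = coint_johansen(y, det_order=0, k_ar_diff=1)
    # res.lr1[0] is the trace statistic for H0: rank = 0, and res.cvt[0, 1]
    # is its 5% critical value.
    if res.lr1[0] > res.cvt[0, 1]:
        rejections += 1

# With truly unrelated near-integrated series, this rate tends to sit well
# above the nominal 5%, which is the paper's warning.
print(f"spurious rejection rate: {rejections / n_reps:.3f}")
```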

Working Paper
Diverging measures of capacity utilization: an explanation
In the wake of the recent recovery in manufacturing production, the capacity utilization rates published by the Federal Reserve Board (FRB) have rebounded much more slowly than those published by the Institute for Supply Management (ISM). As a result, some observers have speculated that the manufacturing sector may have considerably less slack than is indicated by the FRB measures. Our view is that the two characterizations of manufacturing slack are not as incongruent as they first appear. This paper discusses the practical and conceptual differences between these measures of capacity utilization, and concludes that the recent divergence simply reflects the character of the latest business cycle.
AUTHORS: Stevens, John J.; Morin, Norman J.
DATE: 2004

Working Paper
Incorporating judgement in fan charts
Within a decision-making group, such as the monetary-policy committee of a central bank, group members often hold differing views about the future of key economic variables. Such differences of opinion can be thought of as reflecting differing sets of judgement. This paper suggests modelling each agent's judgement as one scenario in a macroeconomic model. Each judgement set has a specific dynamic impact on the system, and accordingly, a particular predictive density - or fan chart - associated with it. A weighted linear combination of the predictive densities yields a final predictive density that correctly reflects the uncertainty perceived by the agents generating the forecast. In a model-based environment, this framework allows judgement to be incorporated into fan charts in a formalised manner. (A stylised sketch of the combination step follows this entry.)
AUTHORS: Österholm, Pär
DATE: 2006
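
A stylised sketch of the combination step: each judgement set is represented here by a hypothetical Gaussian predictive density for a forecast path, the densities are combined as a weighted mixture, and fan-chart bands are read off as quantiles of the mixture by sampling. The scenarios, weights, and Gaussian form are illustrative assumptions, not taken from the paper.

```python
# Stylised sketch (assumed scenarios, weights, and Gaussian densities; not the
# paper's model): combine scenario-specific predictive densities as a weighted
# mixture and read fan-chart bands off the mixture quantiles.
import numpy as np

rng = np.random.default_rng(0)
horizon = 8   # quarters ahead

# Hypothetical judgement sets: each implies a mean path and growing uncertainty.
means = np.array([
    np.linspace(2.0, 2.5, horizon),   # judgement set A
    np.linspace(2.0, 1.5, horizon),   # judgement set B
    np.linspace(2.0, 3.0, horizon),   # judgement set C
])
stds = np.array([0.3, 0.4, 0.5])[:, None] * np.sqrt(np.arange(1, horizon + 1))
weights = np.array([0.5, 0.3, 0.2])   # assumed weights on the judgement sets

# Sample from the weighted mixture of predictive densities.
n_draws = 20000
scenario = rng.choice(len(weights), size=n_draws, p=weights)
paths = rng.normal(means[scenario], stds[scenario])

# Fan-chart bands are quantiles of the combined predictive density.
for q in (5, 25, 50, 75, 95):
    print(f"{q:>2}th percentile:", np.round(np.percentile(paths, q, axis=0), 2))
```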

Working Paper
Estimating capacity utilization from survey data
In this paper, we review the history and concepts behind the Federal Reserve's measures of capacity and capacity utilization, summarize the methods used to construct the measures, and describe the principal source data for these measures--the Census Bureau's Survey of Plant Capacity. We show that the aggregate manufacturing utilization rate from the Survey of Plant Capacity does not exhibit the "cyclical bias" possessed by utilization rates from the less statistically rigorous utilization rate surveys previously used to estimate the Federal Reserve's measures. At the detailed industry level, utilization rates from the Survey of Plant Capacity for several industries do appear to possess a cyclical bias, but we demonstrate that this bias is removed in the construction of the Federal Reserve capacity measures. We further show that the Federal Reserve measures, by combining the Census survey utilization rates with other indicators of capacity, do not discard significant information contained in the Census rates. In fact, the Federal Reserve procedures add to the predictive content of the Census utilization rates in models of capital spending, capacity expansion, and changes in price inflation.
AUTHORS: Morin, Norman J.; Stevens, John J.
DATE: 2004

Working Paper
Integrating expenditure and income data: what to do with the statistical discrepancy?
The purpose of this paper is to build consistent, integrated datasets to investigate whether various disaggregated data can shed light on the possible sources of the statistical discrepancy. Our strategy is first to use disaggregated data to estimate consistent sets of input-output models that sum to either GDP or GDI and compare the two in order to see where the discrepancy resides. We find a few "problem" industries that appear to explain most of the statistical discrepancy. Second, we explore what combination of the expenditure data and the income data seems to produce the most sensible data according to a few economic criteria. A mixture of data that do not aggregate either to GDP or to GDI appears optimal.
AUTHORS: Beaulieu, J. Joseph; Bartelsman, Eric J.
DATE: 2004

Journal Article
Adjustment for seasonal variation
AUTHORS: Barton, H. C.
DATE: 1941

Report
Bootstrapping density-weighted average derivatives
Employing the "small-bandwidth" asymptotic framework of Cattaneo, Crump, and Jansson (2009), this paper studies the properties of several bootstrap-based inference procedures associated with a kernel-based estimator of density-weighted average derivatives proposed by Powell, Stock, and Stoker (1989). In many cases, the validity of bootstrap-based inference procedures is found to depend crucially on whether the bandwidth sequence satisfies a particular (asymptotic linearity) condition. An exception to this rule occurs for inference procedures involving a studentized estimator that employs a "robust" variance estimator derived from the "small-bandwidth" asymptotic framework. The results of a small-scale Monte Carlo experiment are found to be consistent with the theory and indicate in particular that sensitivity with respect to the bandwidth choice can be ameliorated by using the "robust" variance estimator. (A minimal illustrative sketch of the estimator and a naive bootstrap follows this entry.)
AUTHORS: Cattaneo, Matias D.; Jansson, Michael; Crump, Richard K.
DATE: 2010
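
To fix ideas, the sketch below implements a one-dimensional version of the Powell, Stock, and Stoker (1989) density-weighted average derivative with a Gaussian kernel, together with a naive pairs bootstrap of its standard error. The data-generating process, bandwidth, and bootstrap design are assumptions chosen for illustration; the paper's point is precisely that the validity of such bootstrap inference can hinge on the bandwidth sequence and on using the "robust" variance estimator.

```python
# Assumed one-dimensional setup for illustration (not the paper's design): the
# Powell-Stock-Stoker density-weighted average derivative with a Gaussian
# kernel, plus a naive pairs bootstrap of its standard error.
import numpy as np

rng = np.random.default_rng(0)

def pss_estimate(x, y, h):
    # delta_hat = -[n(n-1)h^2]^{-1} * sum_{i != j} K'((x_i - x_j)/h)(y_i - y_j),
    # where K is the Gaussian kernel, so K'(u) = -u * phi(u).
    n = len(x)
    u = (x[:, None] - x[None, :]) / h
    kprime = -u * np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)
    np.fill_diagonal(kprime, 0.0)
    dy = y[:, None] - y[None, :]
    return -np.sum(kprime * dy) / (n * (n - 1) * h**2)

# Simulated data with g(x) = x, so the target E[f(x)g'(x)] = E[f(x)] = 1/(2*sqrt(pi)).
n, h = 400, 0.3
x = rng.normal(size=n)
y = x + 0.5 * rng.normal(size=n)
point = pss_estimate(x, y, h)

# Naive pairs bootstrap (its reliability is exactly what the paper examines).
boot = [pss_estimate(x[idx], y[idx], h)
        for idx in (rng.integers(0, n, size=n) for _ in range(200))]

print(f"estimate: {point:.3f}  (target {1 / (2 * np.sqrt(np.pi)):.3f}, smoothing bias is O(h^2))")
print(f"bootstrap standard error: {np.std(boot):.3f}")
```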

Report
Parsimonious estimation with many instruments
We suggest a way to perform parsimonious instrumental variables estimation in the presence of many, and potentially weak, instruments. In contrast to standard methods, our approach yields consistent estimates when the set of instrumental variables complies with a factor structure. In this sense, our method is equivalent to instrumental variables estimation that is based on principal components. However, even if the factor structure is weak or nonexistent, our method, unlike the principal components approach, still yields consistent estimates. Indeed, simulations indicate that our approach always dominates standard instrumental variables estimation, regardless of whether the factor relationship underlying the set of instruments is strong, weak, or absent. (A sketch of the principal-components benchmark follows this entry.)
AUTHORS: Groen, Jan J. J.; Kapetanios, George
DATE: 2009
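
The principal-components benchmark mentioned in the abstract can be sketched directly. Note that this is the benchmark, not the authors' own estimator, and the data-generating process and dimensions below are illustrative assumptions: many instruments generated from a two-factor structure, reduced to their leading principal components and used as instruments in two-stage least squares.

```python
# Sketch of the principal-components IV benchmark discussed in the abstract
# (the benchmark, not the authors' own estimator; the DGP and dimensions are
# illustrative assumptions): many instruments driven by two factors, reduced
# to two principal components and used in 2SLS.
import numpy as np

rng = np.random.default_rng(0)
n, n_instruments, n_factors = 500, 50, 2

# Instruments with a factor structure.
factors = rng.normal(size=(n, n_factors))
loadings = rng.normal(size=(n_factors, n_instruments))
Z = factors @ loadings + rng.normal(size=(n, n_instruments))

# Endogenous regressor and outcome (true beta = 1, errors correlated via v).
v = rng.normal(size=n)
x = factors @ np.array([1.0, -0.5]) + v
y = 1.0 * x + 0.8 * v + rng.normal(size=n)

# Leading principal components of the standardized instrument matrix.
Zs = (Z - Z.mean(axis=0)) / Z.std(axis=0)
_, _, vt = np.linalg.svd(Zs, full_matrices=False)
pcs = Zs @ vt[:n_factors].T

# Two-stage least squares with the principal components as instruments.
X = np.column_stack([np.ones(n), x])
W = np.column_stack([np.ones(n), pcs])
X_hat = W @ np.linalg.lstsq(W, X, rcond=None)[0]
beta = np.linalg.lstsq(X_hat, y, rcond=None)[0]
print(f"PC-IV estimate of beta (true value 1.0): {beta[1]:.3f}")
```

With a strong factor structure, the leading components capture the instruments' relevant variation; the paper's contribution is an approach that remains consistent even when that structure is weak or absent.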

Working Paper
Covariates and causal effects: the problem of context
This paper is concerned with understanding how causal effects can be identified in past data and then used to predict the future in light of the problem of context, or the fact that treatment always influences the outcome variable in combination with covariates. Structuralist and experimentalist views of econometric methodology can be reconciled by adopting notation capable of distinguishing between effects independent of and dependent on context, or direct and net effects. By showing that identification of direct and net effects imposes distinct assumptions on selection into covariates (i.e., exclusion restrictions) and explicitly constructing predictions based on past effects, the paper is able to characterize the tradeoff researchers face. Relative to direct effects, net effects can be identified in the past from more general data-generating processes (DGPs), but they can predict the future of less general DGPs. Predicting the future with either type of effect requires knowledge of direct effects. To highlight implications for applied work, I discuss why Local Average Treatment Effects and Marginal Treatment Effects of educational attainment are net effects and are therefore difficult to interpret, even when identified with a perfectly randomized treatment.
AUTHORS: Aliprantis, Dionissi
DATE: 2013
