Search Results
Showing results 1 to 10 of approximately 16.
Working Paper
A New Tool for Robust Estimation and Identification of Unusual Data Points
Most consistent estimators are what Müller (2007) terms “highly fragile”: prone to total breakdown in the presence of a handful of unusual data points. This compromises inference. Robust estimation is a (seldom-used) solution, but commonly used methods have drawbacks. In this paper, building on methods that are relatively unknown in economics, we provide a new tool for robust estimates of mean and covariance, useful both for robust estimation and for detection of unusual data points. It is relatively fast and useful for large data sets. Our performance testing indicates that our baseline ...
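The truncated abstract does not specify the paper's estimator, but the general idea it builds on can be illustrated with the simplest robust location/scale pair: the median and the median absolute deviation (MAD). Unlike the sample mean and standard deviation, both resist a handful of gross outliers, which is exactly what makes them usable for flagging unusual data points. A minimal sketch (the data and the 3-sigma-style cutoff are illustrative, not from the paper):

```python
from statistics import median

def robust_location_scale(xs):
    """Median and MAD-based scale: both tolerate a few gross outliers,
    unlike the sample mean and standard deviation."""
    m = median(xs)
    mad = median(abs(x - m) for x in xs)
    # 1.4826 rescales the MAD to be consistent for Gaussian data
    return m, 1.4826 * mad

def flag_unusual(xs, threshold=3.0):
    """Flag points more than `threshold` robust scale units from the
    robust center."""
    m, s = robust_location_scale(xs)
    return [x for x in xs if abs(x - m) > threshold * s]

data = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 250.0]  # one gross outlier
print(flag_unusual(data))  # → [250.0]; the median barely moves
```

A sample mean of the same data is about 40, dragged far from the bulk of the points by a single observation; the median stays at 10.0, which is the "breakdown" contrast the abstract alludes to.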
Working Paper
FRED-MD: A Monthly Database for Macroeconomic Research
This paper describes a large, monthly frequency, macroeconomic database with the goal of establishing a convenient starting point for empirical analysis that requires "big data." The dataset mimics the coverage of those already used in the literature but has three appealing features. First, it is designed to be updated monthly using the FRED database. Second, it will be publicly accessible, facilitating comparison of related research and replication of empirical work. Third, it will relieve researchers from having to manage data changes and revisions. We show that factors extracted from our ...
Working Paper
Nowcasting Tail Risks to Economic Activity with Many Indicators
This paper focuses on tail risk nowcasts of economic activity, measured by GDP growth, with a potentially wide array of monthly and weekly information. We consider different models (Bayesian mixed frequency regressions with stochastic volatility, classical and Bayesian quantile regressions, quantile MIDAS regressions) and also different methods for data reduction (either the combination of forecasts from smaller models or forecasts from models that incorporate data reduction). The results show that classical and MIDAS quantile regressions perform very well in-sample but not out-of-sample, ...
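The out-of-sample comparison of quantile-based nowcasts mentioned above is typically scored with the quantile ("pinball") loss, the same criterion quantile regressions minimize. A minimal sketch of that loss, with illustrative numbers not taken from the paper:

```python
def pinball_loss(y_true, y_pred, tau):
    """Quantile (pinball) loss at level tau: asymmetric, so at a low
    tau, predictions above the realization are penalized much more
    heavily than predictions below it."""
    u = y_true - y_pred
    return tau * u if u >= 0 else (tau - 1) * u

# Scoring two hypothetical 5%-tail nowcasts of GDP growth:
realized = -2.0
print(pinball_loss(realized, -3.5, tau=0.05))  # → 0.075 (pessimistic miss, cheap)
print(pinball_loss(realized, -0.5, tau=0.05))  # → 1.425 (optimistic miss, costly)
```

The asymmetry is the point: a 5% tail nowcast that sits above the realized outcome pays roughly 19 times more per unit of error than one that sits below it, which is why averaging this loss over an evaluation sample ranks tail-risk models sensibly.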
Working Paper
Important Factors Determining Fintech Loan Default: Evidence from the LendingClub Consumer Platform
This study examines key default determinants of fintech loans, using loan-level data from the LendingClub consumer platform during 2007–2018. We identify a robust set of contractual loan characteristics, borrower characteristics, and macroeconomic variables that are important in determining default. We find an important role of alternative data in determining loan default, even after controlling for the obvious risk characteristics and the local economic factors. The results are robust to different empirical approaches. We also find that homeownership and occupation are important factors in ...
Working Paper
Technological Innovation and Discrimination in Household Finance
Technology has changed how discrimination manifests itself in financial services. Replacing human discretion with algorithms in decision-making roles reduces taste-based discrimination, and new modeling techniques have expanded access to financial services to households who were previously excluded from these markets. However, algorithms can exhibit bias from human involvement in the development process, and their opacity and complexity can facilitate statistical discrimination inconsistent with antidiscrimination laws in several aspects of financial services provision, including advertising, ...
Working Paper
FRED-QD: A Quarterly Database for Macroeconomic Research
In this paper we present and describe a large quarterly frequency, macroeconomic database. The data provided are closely modeled on those used in Stock and Watson (2012a). As in our previous work on FRED-MD, our goal is simply to provide a publicly available source of macroeconomic “big data” that is updated in real time using the FRED database. We show that factors extracted from this data set exhibit similar behavior to those extracted from the original Stock and Watson data set. The dominant factors are shown to be insensitive to outliers, but outliers do affect the relative influence ...
Working Paper
Nowcasting Tail Risks to Economic Activity with Many Indicators
This paper focuses on nowcasts of tail risk to GDP growth, with a potentially wide array of monthly and weekly information. We consider different models (Bayesian mixed frequency regressions with stochastic volatility, classical and Bayesian quantile regressions, quantile MIDAS regressions) and also different methods for data reduction (either forecasts from models that incorporate data reduction or the combination of forecasts from smaller models). Our results show that, within some limits, more information helps the accuracy of nowcasts of tail risk to GDP growth. Accuracy typically ...
Working Paper
Improving the Accuracy of Economic Measurement with Multiple Data Sources: The Case of Payroll Employment Data
This paper combines information from two sources of U.S. private payroll employment to increase the accuracy of real-time measurement of the labor market. The sources are the Current Employment Statistics (CES) from BLS and microdata from the payroll processing firm ADP. We briefly describe the ADP-derived data series, compare it to the BLS data, and describe an exercise that benchmarks the data series to an employment census. The CES and the ADP employment data are each derived from roughly equal-sized samples. We argue that combining CES and ADP data series reduces the measurement error ...
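The paper's actual combination method is not shown in this excerpt, but the textbook intuition for why merging two independent noisy measurements reduces error is inverse-variance (precision) weighting: the combined estimate always has lower variance than either input. A hedged sketch with made-up payroll numbers (not the CES or ADP figures):

```python
def combine(est1, var1, est2, var2):
    """Precision-weighted average of two independent noisy measurements.
    The combined variance 1/(1/var1 + 1/var2) is below both inputs'."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    est = (w1 * est1 + w2 * est2) / (w1 + w2)
    var = 1.0 / (w1 + w2)
    return est, var

# Hypothetical monthly payroll-gain estimates (thousands of jobs), with
# equal sampling variances, mirroring the roughly equal-sized samples:
est, var = combine(200.0, 60.0**2, 180.0, 60.0**2)
print(est, var)  # → 190.0 1800.0 (variance halved)
```

With equal variances the weights are equal and the combined variance is exactly half of each input's; with unequal variances the more precise source gets proportionally more weight.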
Working Paper
From Transactions Data to Economic Statistics: Constructing Real-time, High-frequency, Geographic Measures of Consumer Spending
Access to timely information on consumer spending is important to economic policymakers. The Census Bureau's monthly retail trade survey is a primary source for monitoring consumer spending nationally, but it is not well suited to study localized or short-lived economic shocks. Moreover, lags in the publication of the Census estimates and subsequent, sometimes large, revisions diminish its usefulness for real-time analysis. Expanding the Census survey to include higher frequencies and subnational detail would be costly and would add substantially to respondent burden. We take an alternative ...
Working Paper
The perils of working with Big Data and a SMALL framework you can use to avoid them
The use of “Big Data” to explain fluctuations in the broader economy or guide the business decisions of a firm is now so commonplace that in some instances it has even begun to rival more traditional government statistics and business analytics. Big data sources can very often provide advantages when compared to these more traditional data sources, but with these advantages also comes the potential for pitfalls. We lay out a framework called SMALL that we have developed in order to help interested parties as they navigate the big data minefield. Based on a set of five questions, the SMALL ...