Risk bearing, implicit financial services, and specialization in the financial industry
What is the output of financial institutions? And how can we measure their nominal and, more importantly, real value, especially since many financial services are provided without explicit charges? This paper summarizes the theoretical result that, to correctly impute the nominal value of implicit financial service output, the "user cost of money" framework needs to be extended to take account of the systematic risk in financial instruments. This extension is easy to implement in principle: one can continue using the current imputation procedure, and the only change needed is to adjust the reference rates of interest for risk. The paper clarifies why the risk-related income is not part of the output of financial institutions, or, equivalently, why risk bearing is not a service they provide. The paper next argues that, to measure real output, one must first explicitly specify and define the economic services produced by financial firms, a step that is absent from the "user cost of money" theory. Once it is established that only financial services, and not instruments, should be counted as the value added of financial firms, it follows that the quantity of services provided by these institutions is not necessarily in fixed proportion to the volume of instruments. The corollary is that the implicit price of financial services bears no definitive relationship to any reference rate. Instead, price deflators for financial services should be constructed using methods similar to those used for other services.
AUTHORS: Basu, Susanto; Wang, J. Christina
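The imputation this abstract describes can be illustrated with a small numeric sketch. All balances, rates, and the helper function below are hypothetical, chosen only to show the mechanics: adjusting the loan reference rate for systematic risk removes the risk premium from measured service output.

```python
# Hypothetical illustration of the "user cost of money" imputation discussed above.
# Implicit service output is measured from interest margins relative to a reference
# rate; risk-adjusting the reference rate excludes compensation for risk bearing.

def imputed_services(balance, instrument_rate, reference_rate, is_asset):
    """Nominal implicit services on one instrument under the user-cost approach."""
    spread = (instrument_rate - reference_rate) if is_asset else (reference_rate - instrument_rate)
    return spread * balance

loans = deposits = 100.0               # hypothetical balances
loan_rate, deposit_rate = 0.08, 0.02   # hypothetical contractual interest rates
risk_free = 0.04                       # risk-free reference rate
risk_premium = 0.02                    # premium for the loans' systematic risk

# Unadjusted: the loan margin over the risk-free rate counts the risk premium as output.
loan_services_unadjusted = imputed_services(loans, loan_rate, risk_free, True)

# Risk-adjusted: the reference rate includes the risk premium, so only the margin
# compensating genuine services (screening, monitoring, etc.) is counted as output.
loan_services_adjusted = imputed_services(loans, loan_rate, risk_free + risk_premium, True)

deposit_services = imputed_services(deposits, deposit_rate, risk_free, False)

print(round(loan_services_unadjusted, 6), round(loan_services_adjusted, 6), round(deposit_services, 6))
```

With these made-up numbers, the unadjusted margin imputes 4.0 of loan services per 100 of loans while the risk-adjusted reference rate imputes 2.0; the difference is compensation for risk bearing, which the paper argues is not value added.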
A general-equilibrium asset-pricing approach to the measurement of nominal and real bank output
This paper addresses the proper measurement of financial service output that is not priced explicitly. It shows how to impute nominal service output from financial intermediaries' interest income and how to construct price indices for those financial services. We present an optimizing model with financial intermediaries that provide financial services to resolve asymmetric information between borrowers and lenders. We embed these intermediaries in a dynamic, stochastic, general-equilibrium model where assets are priced competitively according to their systematic risk, as in the standard consumption capital-asset-pricing model. In this environment, we show that it is critical to take risk into account in order to measure financial output accurately. We also show that even using a risk-adjusted reference rate does not solve all the problems associated with measuring nominal financial service output. Our model allows us to address important outstanding questions in output and productivity measurement for financial firms, such as: (1) What are the correct "reference rates" to use in calculating bank output? In particular, should they take account of risk? (2) If reference rates need to be risk-adjusted, does it mean that they must be ex ante rates of return? (3) What is the right price deflator for the output of financial firms? Is it just the general price index? (4) When, if ever, should we count capital gains of financial firms as part of financial service output?
AUTHORS: Wang, J. Christina; Fernald, John G.; Basu, Susanto
Technological progress, the "user cost of money," and the real output of banks
Financial institutions provide their customers a variety of unpriced services and cover their costs through interest margins: the interest rates they receive on assets are generally higher than the rates they pay on liabilities. In particular, banks pay below-public-market interest rates on deposits while charging above-public-market rates on loans. Various authors have suggested that this situation allows one to measure the real quantity of financial services provided without explicit prices as proportional to the real stocks of financial assets held by households. We present a general-equilibrium Baumol-Tobin model where households need bank services to purchase consumption goods. Bank deposits are the single medium of exchange in the economy. The model shows that financial services are proportional to the stocks of assets only under restrictive conditions, including the assumption that either all technologies are constant or banks' technology grows at the same rate as technology in the nonfinancial economy while relative technologies of other financial institutions possibly decline. In contrast, measuring real financial output by directly counting the flow of actual services is a robust method unaffected by unbalanced technological change.
AUTHORS: Basu, Susanto; Wang, J. Christina
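A stylized calculation (with hypothetical growth rates, not the paper's full model) shows why deflated asset stocks can be an unreliable proxy for real service flows under unbalanced technological change:

```python
# Stylized, hypothetical illustration of the abstract's point: if bank technology and
# nonfinancial technology grow at different rates, the real flow of bank services and
# the real stock of deposits grow at different rates, so a deflated asset stock is a
# biased proxy for real service output.

g_bank, g_nonfin = 0.03, 0.01   # hypothetical annual technology growth rates
T = 10                          # horizon in years

true_services = (1 + g_bank) ** T    # true real service flow grows with bank technology
asset_proxy = (1 + g_nonfin) ** T    # deposit stocks track the nonfinancial economy

understatement = true_services / asset_proxy - 1
print(f"After {T} years the asset-stock proxy understates real services by {understatement:.1%}")
```

In this example the proxy understates real services by roughly a fifth after a decade; only in the knife-edge case of equal growth rates (balanced technological change) do the two measures coincide, which is the restrictive condition the abstract identifies.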
Uncertainty shocks in a model of effective demand
This paper examines the role of uncertainty shocks in a one-sector, representative-agent dynamic stochastic general equilibrium model. When prices are flexible, uncertainty shocks are not capable of producing business cycle comovements among key macro variables. With countercyclical markups through sticky prices, however, uncertainty shocks can generate fluctuations that are consistent with business cycles. Monetary policy usually plays a key role in offsetting the negative impact of uncertainty shocks. If the central bank is constrained by the zero lower bound, then monetary policy can no longer perform its usual stabilizing function and higher uncertainty has even more negative effects on the economy. Calibrating the size of uncertainty shocks using fluctuations in the VIX, the authors find that increased uncertainty about the future may indeed have played a significant role in worsening the Great Recession, which is consistent with statements by policymakers, economists, and the financial press.
AUTHORS: Bundick, Brent; Basu, Susanto
Productivity, welfare, and reallocation: theory and firm-level evidence
We prove that the change in welfare of a representative consumer is summarized by the current and expected future values of the standard Solow productivity residual. The equivalence holds if the representative household maximizes utility while taking prices parametrically. This result justifies total factor productivity (TFP) as the right summary measure of welfare (even in situations where it does not properly measure technology) and makes it possible to calculate the contributions of disaggregated units (industries or firms) to aggregate welfare using readily available TFP data. Based on this finding, we compute firm and industry contributions to welfare for a set of European OECD countries (Belgium, France, Great Britain, Italy, and Spain), using industry-level (EU-KLEMS) and firm-level (Amadeus) data. After adding further assumptions about technology and market structure (firms minimize costs and face common factor prices), we show that changes in welfare can be decomposed into three components that reflect, respectively, technological change, aggregate distortions, and allocative efficiency. Then, using appropriate firm-level data, we assess the importance of each of these components as sources of welfare improvement in the same set of European countries.
AUTHORS: Pascali, Luigi; Basu, Susanto; Serven, Luis; Schiantarelli, Fabio
Some evidence on the importance of sticky wages
Nominal wage stickiness is an important component of recent medium-scale macroeconomic models, but to date there has been little microeconomic evidence supporting the assumption of sluggish nominal wage adjustment. We present evidence on the frequency of nominal wage adjustment using data from the Survey of Income and Program Participation (SIPP) for the period 1996–1999. The SIPP provides high-frequency information on wages, employment, and demographic characteristics for a large and representative sample of the U.S. population. The main results of the analysis are as follows: (1) After correcting for measurement error, wages appear to be very sticky. In the average quarter, the probability that an individual will experience a nominal wage change is between 5 and 18 percent, depending on the samples and assumptions used. (2) The frequency of wage adjustment does not display significant seasonal patterns. (3) There is little heterogeneity in the frequency of wage adjustment across industries and occupations. (4) The hazard of a nominal wage change first increases and then decreases, with a peak at 12 months. (5) The probability of a wage change is positively correlated with the unemployment rate and with the consumer price inflation rate.
AUTHORS: Barattieri, Alessandro; Gottschalk, Peter; Basu, Susanto
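A back-of-the-envelope conversion relates the reported quarterly adjustment probabilities to implied wage-spell durations. It assumes a constant per-quarter hazard, which is a deliberate simplification: finding (4) above says the actual hazard is hump-shaped with a peak at 12 months.

```python
# Back-of-the-envelope sketch, assuming a constant per-quarter hazard (a simplification;
# the paper finds the hazard is hump-shaped): a quarterly probability p of a nominal
# wage change implies a geometric waiting time with mean spell length 1/p quarters.

def mean_spell_quarters(p):
    """Expected quarters between nominal wage changes for a constant hazard p."""
    return 1.0 / p

for p in (0.05, 0.18):  # the 5-18 percent range reported in the abstract
    q = mean_spell_quarters(p)
    print(f"p = {p:.2f} per quarter -> about {q:.1f} quarters ({q / 4:.1f} years)")
```

Under this simplification, the reported range implies wage spells of roughly 5.5 to 20 quarters, i.e., well over a year on average, which is what "very sticky" means quantitatively here.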
The value of risk: measuring the service output of U.S. commercial banks
Rather than charging direct fees, banks often charge implicitly for their services via interest spreads. As a result, much of bank output has to be estimated indirectly. In contrast to current statistical practice, dynamic optimizing models of banks argue that compensation for bearing systematic risk is not part of bank output. We apply these models and find that between 1997 and 2007, in the U.S. National Accounts, on average, bank output is overestimated by 21 percent and GDP is overestimated by 0.3 percent. Moreover, compared with current methods, our new estimates imply more plausible estimates of the share of capital in income and the return on fixed capital.
AUTHORS: Wang, J. Christina; Inklaar, Robert; Basu, Susanto
Technology and business cycles: how well do standard models explain the facts?
AUTHORS: Basu, Susanto
Information and communications technology as a general-purpose technology: evidence from U.S. industry data
Many people point to information and communications technology (ICT) as the key for understanding the acceleration in productivity in the United States since the mid-1990s. Stories of ICT as a "general purpose technology" suggest that measured TFP should rise in ICT-using sectors (reflecting either unobserved accumulation of intangible organizational capital, spillovers, or both), but with a long lag. Contemporaneously, however, investments in ICT may be associated with lower TFP as resources are diverted to reorganization and learning. We find that U.S. industry results are consistent with GPT stories: the acceleration after the mid-1990s was broad-based, located primarily in ICT-using industries rather than ICT-producing industries. Furthermore, industry TFP accelerations in the 2000s are positively correlated with (appropriately weighted) industry ICT capital growth in the 1990s. Indeed, as GPT stories would suggest, after controlling for past ICT investment, industry TFP accelerations are negatively correlated with increases in ICT usage in the 2000s.
AUTHORS: Basu, Susanto; Fernald, John G.