We compare the performance of unsecured personal installment loans made by traditional bank lenders with that of LendingClub, using a stochastic frontier estimation technique to decompose the observed ratio of nonperforming loans into three components. The first is the best-practice minimum ratio that a lender could achieve if it were fully efficient at credit-risk evaluation and loan management. The second is the difference between the observed ratio (adjusted for noise) and the minimum ratio; it gauges the lender’s relative proficiency at credit analysis and loan monitoring. The third is statistical noise. In 2013 and 2016, the largest bank lenders experienced the highest ratio of nonperformance, the highest inherent credit risk, and the highest lending efficiency, indicating that their high ratio of nonperformance is driven by inherent credit risk rather than by lending inefficiency. LendingClub’s performance was similar to that of small bank lenders as of 2013. As of 2016, LendingClub’s performance resembled that of the largest bank lenders, with the highest ratio of nonperforming loans, inherent credit risk, and lending efficiency, although its loan volume was smaller. Our findings are consistent with a previous study suggesting that LendingClub became more effective at risk identification and pricing starting in 2015. Caveat: this conclusion may not apply to fintech lenders in general, and the results may not hold under different economic conditions, such as a downturn.
(770.0 KB, 37 pages)
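The three-way decomposition in the abstract above follows the standard stochastic frontier form; the sketch below uses our own illustrative notation, not the paper's:

```latex
% Observed nonperforming-loan (NPL) ratio for lender i, decomposed as in
% standard stochastic frontier analysis (notation is illustrative):
\[
\mathit{NPL}_i
  = \underbrace{m(x_i;\beta)}_{\text{best-practice minimum ratio}}
  + \underbrace{u_i}_{\text{inefficiency},\; u_i \ge 0}
  + \underbrace{v_i}_{\text{statistical noise}}
\]
```

Here \(m(x_i;\beta)\) is the frontier (the minimum ratio attainable by a fully efficient lender given characteristics \(x_i\)), the one-sided term \(u_i\) captures the lender's shortfall from best practice, and \(v_i\) is symmetric noise.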
We examine the role of demographics and changing industrial policies in accounting for the rapid rise in household savings and in per capita output growth in China since the mid-1970s. The demographic changes come from reductions in the fertility rate and increases in life expectancy, while the industrial policies take many forms. These policies cause important structural changes: first benefiting private labor-intensive firms by incentivizing them to increase their share of employment, and later benefiting capital-intensive firms, resulting in an increasing share of capital devoted to heavy industries. We conduct our analysis in a general equilibrium economy that also features endogenous human capital investment. We calibrate the model to match key economic variables of the Chinese economy and show that demographic changes and industrial policies both contributed to increases in savings and output growth, but with differing intensities and at different horizons. We further demonstrate the importance of endogenous human capital investment in accounting for economic growth in China.
(825.0 KB, 45 pages)
This paper shows that the capitalization of local amenities is effectively priced into land via a two-part pricing formula: a “ticket” price paid regardless of the amount of housing services consumed and a “slope” price paid per unit of services. We first show theoretically how tickets arise as an extensive-margin price when there are binding constraints on the number of households admitted to a neighborhood. We use a large national dataset of housing transactions, property characteristics, and neighborhood attributes to measure the extent to which local amenities are capitalized in ticket prices vis-à-vis slopes. We find that in most U.S. cities, the majority of neighborhood variation in pricing occurs via tickets, although the importance of tickets rises sharply with the stringency of land development regulations, as predicted by theory. We discuss implications of two-part pricing for efficiency and equity in neighborhood sorting equilibria and for empirical estimates of willingness to pay for nonmarketed amenities, which generally assume proportional pricing only.
(1.0 MB, 88 pages)
Mortgage loss-given-default (LGD) increased significantly when house prices plummeted and delinquencies rose during the financial crisis, but it has remained over 40 percent in recent years despite a strong housing recovery. Our results indicate that the sustained high LGDs post-crisis are due to a combination of an overhang of crisis-era foreclosures and prolonged foreclosure timelines, which have offset higher sales recoveries. Simulations show that cutting foreclosure timelines by one year would decrease LGD by 5-8 percentage points, depending on the trade-off between lower liquidation expenses and lower sales recoveries. Using difference-in-differences tests, we also find that recent consumer protection programs have extended foreclosure timelines and increased loss severities, despite their benefits of increasing loan modifications and enhancing consumer protections.
Supersedes Working Paper 17-08.
(838.0 KB, 49 pages)
Larger firms (by sales or employment) have higher leverage. This pattern is explained using a model in which firms produce multiple varieties and borrow, with the option to default, against their future cash flow. A variety can die with constant probability, implying that bigger firms (those with more varieties) have a lower coefficient of variation of sales and higher leverage. A lower risk-free rate benefits bigger firms more, as they are able to lever more, and existing firms buy more of the new varieties arriving in the economy. This leads to lower startup rates and greater concentration of sales.
(577.0 KB, 37 pages)
This paper examines a novel mechanism of credit-history building as a way of aggregating information across multiple lenders. We build a dynamic model with multiple competing lenders, who have heterogeneous private information about a consumer's creditworthiness, and extend credit over multiple stages. Acquiring a loan at an early stage serves as a positive signal: it allows the borrower to convey to other lenders the existence of a positively informed lender (the one advancing that early loan), thereby convincing other lenders to extend further credit in future stages. This signaling may be costly to the least risky borrowers for two reasons. First, taking on an early loan may involve cross-subsidization from the least risky borrowers to more risky borrowers. Second, the least risky borrowers may take inefficiently large loans relative to the symmetric-information benchmark. We demonstrate that, despite these two possible costs, the least risky borrowers often prefer these equilibria to those without information aggregation. Our analysis offers an interesting and novel insight into debt dilution. Contrary to the conventional wisdom, repayment of the early loan is more likely when a borrower subsequently takes on a larger rather than a smaller additional loan. This result hinges on a selection effect: larger subsequent loans are only given to the least risky borrowers.
(506.0 KB, 45 pages)
Modern urban economic theory and policymakers are coming to see the provision of consumer-leisure amenities as a way to attract population, especially the highly skilled and their employers. However, past studies have arguably only provided indirect evidence of the importance of leisure amenities for urban development. In this paper, we propose and validate the number of tourist trips and the number of crowdsourced picturesque locations as measures of consumer revealed preferences for local lifestyle amenities. Urban population growth in the 1990-2010 period was about 10 percentage points (about one standard deviation) higher in a metro area that was perceived as twice as picturesque. This measure ties with low taxes as the most important predictor of urban population growth. “Beautiful cities” disproportionately attracted highly educated individuals and experienced faster housing price appreciation, especially in supply-inelastic markets. In contrast to the generally declining trend of the American central city, neighborhoods close to central recreational districts have experienced economic growth, albeit at the cost of minority displacement.
Supersedes Working Paper 08-22.
(1.0 MB, 57 pages)
Banking regulation routinely designates some assets as safe and thus does not require banks to hold any additional capital to protect against losses from these assets. A typical such safe asset is domestic government debt. There are numerous examples of banking regulation treating domestic government bonds as “safe,” even when there is clear risk of default on these bonds. We show, in a parsimonious model, that this failure to recognize the riskiness of government debt allows (and induces) domestic banks to “gamble” with depositors’ funds by purchasing risky government bonds (and assets closely correlated with them). A sovereign default in this environment then results in a banking crisis. Critically, we show that permitting banks to gamble this way lowers the cost of borrowing for the government. Thus, if the borrower and the regulator are the same entity (the government), that entity has an incentive to ignore the riskiness of the sovereign bonds. We present empirical evidence in support of the key mechanism we are highlighting, drawing on the experience of Russia in the run-up to its 1998 default and on the recent Eurozone debt crisis.
(505.0 KB, 42 pages)
Can a behavioral sufficient statistic empirically capture cross-consumer variation in behavioral tendencies and help identify whether behavioral biases, taken together, are linked to material consumer welfare losses? Our answer is yes. We construct simple consumer-level behavioral sufficient statistics — “B-counts” — by eliciting seventeen potential sources of behavioral biases per person, in a nationally representative panel, in two separate rounds nearly three years apart. B-counts aggregate information on behavioral biases within-person. Nearly all consumers exhibit multiple biases, in patterns assumed by behavioral sufficient statistic models (a la Chetty), and with substantial variation across people. B-counts are stable within-consumer over time, and that stability helps to address measurement error when using B-counts to model the relationship between biases, decision utility, and experienced utility. Conditional on classical inputs — risk aversion and patience, life-cycle factors and other demographics, cognitive and non-cognitive skills, and financial resources — B-counts strongly negatively correlate with both objective and subjective aspects of experienced utility. The results hold in much lower-dimensional models employing “Sparsity B-counts” based on bias subsets (a la Gabaix) and/or fewer covariates, illuminating lower-cost ways to use behavioral sufficient statistics to help capture the combined influence of multiple behavioral biases for a wide range of research questions and applications.
(1.0 MB, 98 pages)
The Great Recession led to widespread mortgage defaults, with borrowers resorting to both foreclosures and short sales to resolve their defaults. I first quantify the economic impact of foreclosures relative to short sales by comparing the home price implications of both. After accounting for omitted variable bias, I find that homes selling as short sales transact at 9.2% to 10.5% higher prices on average than those that sell after foreclosure. Short sales also exert smaller negative externalities than foreclosures, with one short sale decreasing nearby property values by 1 percentage point less than a foreclosure. So why weren’t short sales more prevalent? These home price benefits did not increase the prevalence of short sales because free rent during foreclosure led more borrowers to select foreclosures, even though higher advances led servicers to prefer short sales. In states with longer foreclosure timelines, the benefits of foreclosure to borrowers increased, so short sales were used less. I find that a one-standard-deviation increase in the average length of the foreclosure process decreased the short sale share by 0.35 to 0.45 standard deviations. My results suggest that policies that increase the relative attractiveness of short sales could help stabilize distressed housing markets.
(1.0 MB, 64 pages)
We construct a model of consumer credit with payment frictions, such as spatial separation and unsynchronized trading patterns, to study optimal monetary policy across different interbank market structures. In our framework, intermediaries play an essential role in the functioning of the payment system, and monetary policy influences the equilibrium allocation through the interest rate on reserves. If interbank credit markets are incomplete, then monetary policy plays a crucial role in the smooth operation of the payment system. Specifically, an equilibrium in which privately issued debt claims are not discounted is shown to exist provided the initial wealth in the intermediary sector is sufficiently large relative to the size of the retail sector. Such an equilibrium with an efficient payment system requires setting the interest rate on reserves sufficiently close to the rate of time preference.
(293.0 KB, 35 pages)
What is meant by economic progress, and how should it be measured? The conventional answer is growth in real GDP over time or compared across countries, a monetary measure adjusted for the general rate of increase in prices. However, there is increasing interest in developing an alternative understanding of economic progress, particularly in the context of digitalization of the economy and the consequent significant changes Internet use is bringing about in production and household activity. This paper discusses one alternative approach, combining an extended utility framework considering time allocation over paid work, household work, leisure, and consumption with measures of objective or subjective well-being while engaging in different activities. Developing this wider economic welfare measure would require the collection of time use statistics as well as well-being data and direct survey evidence, such as the willingness to pay for leisure time. We advocate an experimental set of time and well-being accounts, with a particular focus on the digitally driven shifts in behavior.
(492.0 KB, 26 pages)
We extend Duffie, Gârleanu, and Pedersen’s (2005) search-theoretic model of over-the-counter (OTC) asset markets, allowing for a decentralized inter-dealer market with arbitrary heterogeneity in dealers’ valuations or inventory costs. We develop a solution technique that makes the model fully tractable and allows us to derive, in closed form, theoretical formulas for key statistics analyzed in empirical studies of the intermediation process in OTC markets. A calibration to the market for municipal securities reveals that the model can generate trading patterns and prices that are quantitatively consistent with the data. We use the calibrated model to compare the gains from trade that are realized in this frictional market with those from a hypothetical, frictionless environment, and to distinguish between the quantitative implications of various types of heterogeneity across dealers.
Supersedes Working Paper 15-22.
(549.0 KB, 80 pages)
In recent years, there has been an abundance of empirical work examining price setting behavior at the micro level. First-generation models with price setting rigidities were generally at odds with much of the micro price data. A second generation of models, with fixed costs of price adjustment and idiosyncratic shocks, has attempted to rectify this shortcoming. Using a model that matches a large set of microeconomic facts, we find significant nonneutrality. We decompose the nonneutrality and find that state dependence plays an important part in the responses of output and inflation to a monetary shock. We also examine how aggregating firm behavior can generate flat hazards. Finally, we find that the steady-state statistic developed by Alvarez, Le Bihan, and Lippi (2016) is an imperfect guide to characterizing nonneutrality in our model.
(596.0 KB, 41 pages)
The Current Expected Credit Loss (CECL) framework represents a new approach for calculating the allowance for credit losses. Credit cards are the most common form of revolving consumer credit and are likely to present conceptual and modeling challenges during CECL implementation. We look back at nine years of account-level credit card data, starting with 2008, over a time period encompassing the bulk of the Great Recession as well as several years of economic recovery. We analyze the performance of the CECL framework under plausible assumptions about allocations of future payments to existing credit card loans, a key implementation element. Our analysis focuses on three major themes: defaults, balances, and credit loss. Our analysis indicates that allowances are significantly impacted by specific payment allocation assumptions as well as downturn economic conditions. We also compare projected allowances with realized credit losses and observe a significant divergence resulting from the revolving nature of credit card portfolios. We extend our analysis across segments of the portfolio with different risk profiles. Interestingly, less risky segments of the portfolio are proportionally more impacted by specific payment assumptions and downturn economic conditions. Our findings suggest that the effect of the new allowance framework on a specific credit card portfolio will depend critically on its risk profile. Thus, our findings should be interpreted qualitatively rather than quantitatively. Finally, our goal is to gain a better understanding of the sensitivity of allowances to plausible variations in assumptions about the allocation of future payments to existing credit card loans; thus, we do not offer specific best-practice guidance.
(1.0 MB, 41 pages)
We document that postwar U.S. elections show a strong pattern of “incumbency disadvantage”: if a party has held the presidency of the country or the governorship of a state for some time, that party tends to lose popularity in the subsequent election. To explain this fact, we employ Alesina and Tabellini's (1990) model of partisan politics, extended to have elections with prospective voting. We show that inertia in policies, combined with sufficient uncertainty in election outcomes, implies incumbency disadvantage. We find that inertia can cause parties to target policies that are more extreme than the policies they would support in the absence of inertia and that such extremism can be welfare reducing.
Supersedes Working Paper 17-43.
(698.0 KB, 64 pages)
We investigate the effect of declining house prices on household consumption behavior during 2006-2009. We use an individual-level dataset with detailed information on borrower characteristics, mortgages, and credit risk. Proxying consumption by individual-level auto loan originations, we decompose the effect of declining house prices on consumption into three main channels: the wealth effect, household financial constraints, and bank health. We find a negligible wealth effect. Tightening household-level financial constraints can explain 40-45 percent of the response of consumption to declining house prices. Deteriorating bank health leads to reduced credit supply to both households and firms. Our dataset allows us to estimate the household portion of this effect at 20-25 percent of the consumption response. The remaining 35 percent is a general equilibrium effect that works through a decline in employment, resulting from either lower credit supply to firms or feedback from lower consumer demand. Our estimate of a negligible wealth effect is robust to accounting for the endogeneity of house prices and unemployment. Once we account for endogeneity, the contribution of tightening household financial constraints falls to 35 percent, while declining bank credit supply to households captures about half of the overall consumption response.
(552.0 KB, 36 pages)
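The channel decomposition reported in the abstract above (using the baseline, pre-endogeneity-correction shares) can be summarized as an accounting identity; the symbols below are ours, introduced only for illustration:

```latex
% Consumption response to declining house prices, split into the four
% channels named in the abstract (baseline shares, before the
% endogeneity correction); notation is illustrative:
\[
\Delta C
  = \underbrace{\Delta C_{\text{wealth}}}_{\approx\,0\%}
  + \underbrace{\Delta C_{\text{constraints}}}_{40\text{--}45\%}
  + \underbrace{\Delta C_{\text{bank health}}}_{20\text{--}25\%}
  + \underbrace{\Delta C_{\text{gen.\ equil.}}}_{\approx\,35\%}
\]
```

The shares attach to the channels exactly as stated in the abstract: a negligible wealth effect, household financial constraints, bank credit supply to households, and the general equilibrium employment channel.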
This paper studies a labor market with directed search, where multi-worker firms follow a firm wage policy: they pay equally productive workers the same. The policy reduces wages, owing to the influence of firms’ existing workers on the wage-setting problem, thereby increasing the profitability of hiring. It also introduces a time inconsistency into the dynamic firm problem, because firms face a less elastic labor supply in the short run. To consider outcomes when firms reoptimize each period, I study Markov perfect equilibria, proposing a tractable solution approach based on standard Euler equations. In two applications, I first show that firm wages dampen wage variation over the business cycle and amplify variation in unemployment, with quantitatively significant effects. Second, I show that firms with a firm wage policy may find it profitable to fix wages for a period of time, and that an equilibrium with fixed wages can be good for worker welfare, despite added volatility in the labor market.
(671.0 KB, 61 pages)
We analyze the comparative advantages and disadvantages of small and large banks in improving household sentiment regarding financial conditions. We match sentiment data from the University of Michigan Surveys of Consumers with local banking market data from 2000 to 2014. Surprisingly, the evidence suggests that large rather than small banks have significant comparative advantages in boosting household sentiment. The findings are robust to instrumental variables and other econometric methods. Additional analyses are consistent with both scale economies and the superior safety of large banks as channels behind the main findings. These channels appear to more than offset stronger relationships with, and greater trust in, small banks.
(1.0 MB, 69 pages)
Using a representative-household search and matching model with endogenous labor force participation, we study the interactions between extensive-margin labor supply elasticities and the cyclicality of labor force participation flows. Our model successfully replicates salient business-cycle features of all transition rates between three labor market states, the unemployment rate, and the labor force participation rate, while using values of elasticities consistent with micro evidence. Our results underscore the importance of the procyclical opportunity cost of employment, together with wage rigidity, in understanding the cyclicality of labor market flows and stocks.
(6.0 MB, 71 pages)
This paper examines how a negative shock to the security of personal finances due to severe identity theft changes consumer credit behavior. Using a unique data set of consumer credit records and alerts indicating identity theft and the exogenous timing of victimization, we show that the immediate effects of fraud on credit files are typically negative, small, and transitory. After those immediate effects fade, identity theft victims experience persistent, positive changes in credit characteristics, including improved Risk Scores. Consumers also exhibit caution with credit by having fewer open revolving accounts while maintaining total balances and credit limits. Our results are consistent with consumer inattention to credit reports prior to identity theft and reduced trust in credit card markets after identity theft.
Supersedes Working Paper 16-27.
(1.0 MB, 46 pages)
Ten years after the mortgage crisis, the U.S. housing market has rebounded significantly, with house prices now near the peak achieved during the boom. Homeownership rates, on the other hand, have continued to decline. We reconcile the two phenomena by documenting the rising presence of institutional investors in this market. Our analysis makes use of housing transaction data. By exploiting heterogeneity in zip codes' exposure to the First Look program instituted by Fannie Mae and Freddie Mac, which affected investors' access to foreclosed properties, we establish a causal relationship between the increasing presence of institutions in the housing market and the subsequent recovery in house prices and decline in homeownership rates between 2007 and 2014. We further demonstrate that institutional investors contributed to the improvement in the local labor market by reducing the overall unemployment rate and increasing total employment, construction employment in particular. Local housing rents also rose.
(414.0 KB, 33 pages)