Are efficiency considerations important for understanding differences in the development of institutions? The authors model institutional quality as the degree to which obligations associated with exchanging capital can be enforced. Establishing a positive level of enforcement requires an aggregate investment of capital that is no longer available for production. When capital endowments are more unequally distributed, the greater dispersion in marginal products makes it optimal to invest more resources in enforcement. The optimal allocation of the institutional cost across agents is not monotonic and entails a redistribution of endowments before production begins. Investing in enforcement primarily benefits agents at the bottom of the endowment distribution and leads to a reduction in consumption and income inequality. Efficiency, redistribution, and the quality of institutions are thus intricately linked and should be studied jointly.
(285 KB, 29 pages)
Many policymakers and some behavioral models hold that restricting access to expensive credit helps consumers by preventing overborrowing. The author examines some short-run effects of restricting access, using household panel survey data on payday loan users collected around the imposition of binding restrictions on payday loan terms in Oregon. The results suggest that borrowing fell in Oregon relative to Washington, with former payday loan users shifting partially into plausibly inferior substitutes. Additional evidence suggests that restricting access caused deterioration in the overall financial condition of the Oregon households. The results suggest that restricting access to expensive credit harms consumers on average.
(145 KB, 24 pages)
This paper studies the cost of business cycles within a real business cycle model with search and matching frictions in the labor market. The authors endogenously link both the cyclical fluctuations and the mean level of unemployment to the aggregate business cycle risk. The key result of the paper is that business cycles are costly: Fluctuations over the cycle induce a higher average unemployment rate, since employment is non-linear in the job-finding rate and the past unemployment rate. The authors show this analytically for a special case of the model. They then calibrate the model to U.S. data. For the calibrated model, too, business cycles cause higher average unemployment; the welfare cost of business cycles can easily be an order of magnitude larger than Lucas' (1987) estimate. The cost of business cycles is higher the lower the value of non-employment or, equivalently, the lower the disutility of work. The ensuing cost of business cycles rises further when workers' skills depreciate during unemployment. Supersedes Working Paper 08-31.
(365 KB, 34 pages)
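The nonlinearity mechanism in the abstract above can be illustrated with a minimal sketch (illustrative parameter values, not the authors' calibrated model): with separation rate s and job-finding rate f_t, unemployment evolves as u' = (1 - f_t)u + s(1 - u), and because steady-state unemployment s/(s + f) is convex in f, persistent fluctuations in f raise average unemployment relative to holding f constant at its mean.

```python
import numpy as np

def mean_unemployment(f_path, s=0.03):
    """Average unemployment under u' = (1 - f) u + s (1 - u)."""
    u = s / (s + f_path.mean())  # start at the steady state for the mean f
    us = []
    for f in f_path:
        u = (1.0 - f) * u + s * (1.0 - u)
        us.append(u)
    return float(np.mean(us))

T, block = 10_000, 50
# Persistent regimes: f stays low (0.3) or high (0.6) for `block` periods at a time.
f_cycle = np.repeat(np.tile([0.3, 0.6], T // (2 * block)), block)
f_const = np.full(T, 0.45)  # constant job-finding rate at the same mean

u_cycle = mean_unemployment(f_cycle)
u_const = mean_unemployment(f_const)
print(f"mean u, fluctuating f: {u_cycle:.4f}")  # higher than with constant f
print(f"mean u, constant f:    {u_const:.4f}")
```

The gap widens with the persistence of the regimes, since unemployment then spends more time near the convex steady-state curve.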
A growing literature explores the features of optimal monetary policy in New Keynesian models under both commitment and discretion. This literature usually solves for the optimal allocations that are consistent with a rational expectations market equilibrium, but it does not study how the policy can be implemented given the available policy instruments. Recently, however, King and Wolman (2004) have shown that a time-consistent policy cannot be implemented through the control of nominal money balances. In particular, they find that equilibria are not unique under a money stock regime. The authors of this paper find that King and Wolman's conclusion of non-uniqueness of Markov-perfect equilibria is sensitive to the instrument of choice. Surprisingly, if the monetary authority instead chooses the nominal interest rate, there exists a unique Markov-perfect equilibrium. The authors then investigate under what conditions a time-consistent planner can implement the optimal allocation by just announcing his policy rule in a decentralized setting.
(363 KB, 33 pages)
In this paper, the authors aim to design a monetary policy for the euro area that is robust to the high degree of model uncertainty at the start of monetary union and allows for learning about model probabilities. To this end, they compare and ultimately combine Bayesian and worst-case analysis using four reference models estimated with pre-EMU synthetic data. The authors start by computing the cost of insurance against model uncertainty, that is, the relative performance of worst-case or minimax policy versus Bayesian policy. While maximum insurance comes at moderate costs, they highlight three shortcomings of this worst-case insurance policy: (i) prior beliefs that would rationalize it from a Bayesian perspective indicate that such insurance is strongly oriented toward the model with highest baseline losses; (ii) the minimax policy is not as tolerant of small perturbations of policy parameters as the Bayesian policy; and (iii) the minimax policy offers no avenue for incorporating posterior model probabilities derived from data available since monetary union. Thus, the authors propose preferences for robust policy design that reflect a mixture of the Bayesian and minimax approaches. They show how the incoming EMU data may then be used to update model probabilities, and investigate the implications for policy.
(578 KB, 48 pages)
Until the end of 1977, the U.S. consumer price index for rents tended to omit rent increases when units had a change of tenants or were vacant, biasing inflation estimates downward. Beginning in 1978, the Bureau of Labor Statistics (BLS) implemented a series of methodological changes that reduced this nonresponse bias, but substantial bias remained until 1985. The authors set up a model of nonresponse bias, parameterize it, and test it using a BLS microdata set for rents. From 1940 to 1985, the official BLS CPI-W price index for tenant rents rose 3.6 percent annually; the authors argue that it should have risen 5.0 percent annually. Properly measured, the relative price of rent in 1940 was thus only about half its official level; this has important consequences for historical measures of rent-house-price ratios and for the growth of real consumption. (Revision forthcoming in Review of Economics and Statistics.)
(374 KB, 42 pages)
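The magnitude of the revision reported above is easy to verify by compounding: 5.0 percent rather than 3.6 percent annually over 1940-1985 implies a cumulative rent increase roughly 1.8 times the official one, which is what puts the 1940 relative price of rent at about half its official level (a back-of-the-envelope check, not the authors' computation):

```python
# Back-of-the-envelope check of the CPI tenant-rent bias, 1940-1985.
years = 1985 - 1940
official = 1.036 ** years   # official CPI-W tenant-rent growth, 3.6%/yr
corrected = 1.050 ** years  # authors' corrected growth, 5.0%/yr
ratio = corrected / official
print(f"corrected/official cumulative growth: {ratio:.2f}")  # ~1.8
```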
Rationality of early release data is typically tested using linear regressions. Thus, failure to reject the null does not rule out the possibility of nonlinear dependence. This paper proposes two tests which instead have power against generic nonlinear alternatives. A Monte Carlo study shows that the suggested tests have good finite sample properties. Additionally, the authors carry out an empirical illustration using a real-time dataset for money, output, and prices. Overall, they find strong evidence against data rationality. Interestingly, for money stock the null is not rejected by linear tests but is rejected by the authors' tests.
(303 KB, 28 pages)
The authors present a theory of spinoffs in which the key ingredient is the originator's private information concerning the quality of his new idea. Because quality is privately observed, by the standard adverse-selection logic, the market can at best offer a price that reflects the average quality of ideas sold. This gives the holders of above-average-quality ideas the incentive to spin off. The authors show that only workers with very good ideas decide to spin off, while workers with mediocre ideas sell them. Entrepreneurs of existing firms pay a price for the ideas sold in the market that implies zero expected profits for them. Hence, firms' project selection is independent of firm size, which, under some additional assumptions, leads to scale-independent growth. The entry and growth process of firms leads to invariant firm-size distributions that resemble the ones for the U.S. economy and most of its individual industries.
(949 KB, 54 pages)
In economics, common factors are often assumed to underlie the co-movements of a set of macroeconomic variables. For this reason, many authors have used estimated factors in the construction of prediction models. In this paper, the authors begin by surveying the extant literature on diffusion indexes. They then outline a number of approaches to the selection of factor proxies (observed variables that proxy unobserved estimated factors) using the statistics developed in Bai and Ng (2006a,b). The authors' approach to factor proxy selection is examined via a small Monte Carlo experiment, where evidence supporting their proposed methodology is presented, and via a large set of prediction experiments using the panel dataset of Stock and Watson (2005). One of their main empirical findings is that their "smoothed" approaches to factor proxy selection appear to yield predictions that are often superior not only to a benchmark factor model, but also to simple linear time series models, which are generally difficult to beat in forecasting competitions. In some sense, by using the authors' approach to predictive factor proxy selection, one is able to open up the "black box" often associated with factor analysis, and to identify actual variables that can serve as primitive building blocks for (prediction) models of a host of macroeconomic variables, and that can also serve as policy instruments, for example. The authors' findings suggest that important observable variables include various S&P500 variables, including stock price indices and dividend series; a 1-year Treasury bond rate; various housing activity variables; industrial production; and exchange rates.
(442 KB, 38 pages)
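The factor-proxy idea described above can be sketched in a few lines: estimate factors by principal components from a panel, then pick, for each estimated factor, an observed series that tracks it closely. This is a simplified stand-in using plain correlations on synthetic data, not the Bai and Ng (2006a,b) statistics the authors actually employ; all data and dimensions below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
T, N, r = 120, 40, 2
# Synthetic panel driven by r common factors (stand-in for a macro dataset).
factors = rng.standard_normal((T, r))
loadings = rng.standard_normal((N, r))
panel = factors @ loadings.T + 0.5 * rng.standard_normal((T, N))

# Estimate factors by principal components on the standardized panel.
Z = (panel - panel.mean(0)) / panel.std(0)
_, _, vt = np.linalg.svd(Z, full_matrices=False)
f_hat = Z @ vt[:r].T  # estimated factors (T x r)

# Proxy selection: for each estimated factor, pick the observed series
# most correlated with it (a simple stand-in for the Bai-Ng criteria).
corr = np.corrcoef(np.column_stack([f_hat, panel]), rowvar=False)[:r, r:]
proxies = np.argmax(np.abs(corr), axis=1)
print("selected proxy series:", proxies)
```

In the paper, the selected proxies then replace the estimated factors in the forecasting regressions, which is what opens up the "black box" of factor analysis.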
Regulators express growing concern over predatory loans, which the authors take to mean loans that borrowers should decline. Using a model of consumer credit in which such lending is possible, they identify the circumstances in which it arises both with and without competition. The authors find that predatory lending is associated with highly collateralized loans, inefficient refinancing of subprime loans, lending without due regard to ability to pay, prepayment penalties, balloon payments, and poorly informed borrowers. Under most circumstances competition among lenders attenuates predatory lending. They use their model to analyze the effects of legislative interventions.
(308 KB, 39 pages)
Superseded by Working Paper 09-11.
(272 KB, 36 pages)
Superseded by Working Paper 19-16.
(1.66 MB, 61 pages)
This paper studies the relation between macroeconomic fluctuations and corporate defaults while conditioning on industry affiliation and an extensive set of firm-specific factors. Using a logit approach on a panel data set for all incorporated Swedish businesses over 1990-2002, the authors find strong evidence for a substantial and stable impact of aggregate fluctuations. Macro effects differ across industries in an economically intuitive way. Out-of-sample evaluations show their approach is superior to both models that exclude macro information and best-fitting naive forecasting models. While firm-specific factors are useful in ranking firms' relative riskiness, macroeconomic factors capture fluctuations in the absolute risk level.
(669 KB, 57 pages)
This study shows that during Paul Volcker's drastic monetary tightening in the early 1980s, local banks operating in only one county reduced loan supply much more sharply than local subsidiaries of multi-county bank holding companies in similar markets, after controlling for bank (and holding company) size, liquidity, capital conditions, and, most important, local credit demand. The study achieves cleaner identification by examining 18 U.S. "county-banking states," where a bank's local lending volume at the county level was observable because no bank was allowed to branch across county borders. The local nature of lending allows the author to approximate and control for the exogenous component of local loan demand, using the prediction that counties with a higher share of manufacturing employment exhibit weaker loan demand during tightening (which is consistent with the interest rate channel and the balance-sheet channel of monetary policy transmission). The study sheds light on the working of the bank lending channel of monetary policy transmission.
(474 KB, 41 pages)
The authors construct a framework for measuring economic activity at high frequency, potentially in real time. They use a variety of stock and flow data observed at mixed frequencies (including very high frequencies), and they use a dynamic factor model that permits exact filtering. They illustrate the framework in a prototype empirical example and a simulation study calibrated to the example.
(701 KB, 30 pages)
Does borrowing at 400 percent APR do more harm than good? The Pentagon asserts that payday loans harm military readiness and successfully lobbied for a binding 36 percent APR cap on loans to military members and their families (effective October 1, 2007). But existing evidence on how access to high-interest debt affects borrower behavior is inconclusive. The authors use within-state variation in state lending laws and exogenous variation in the assignment of Air Force personnel to bases in different states to estimate the effect of payday loan access on personnel outcomes. They find significant average declines in overall job performance and retention and significant increases in severely poor readiness. These results provide some ammunition for the private optimality of the Pentagon's position. The welfare implications for military members are less clear-cut, but the authors' results are consistent with the interpretation that payday loan access causes financial distress and severe misbehavior for relatively young, inexperienced, and financially unsophisticated airmen. Overall job performance declines are also concentrated in these groups, and several pieces of evidence suggest that these declines are welfare-reducing (and not the result of airmen optimally reducing effort given an expanded opportunity set); e.g., performance declines are larger in high unemployment areas with payday lending.
(229 KB, 37 pages)
This paper develops and illustrates a simple method to generate a DSGE model-based forecast for variables that do not explicitly appear in the model (non-core variables). The authors use auxiliary regressions that resemble measurement equations in a dynamic factor model to link the non-core variables to the state variables of the DSGE model. Predictions for the non-core variables are obtained by applying their measurement equations to DSGE model-generated forecasts of the state variables. Using a medium-scale New Keynesian DSGE model, the authors apply their approach to generate and evaluate recursive forecasts for PCE inflation, core PCE inflation, and the unemployment rate along with predictions for the seven variables that have been used to estimate the DSGE model.
(413 KB, 43 pages)
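The two-step procedure described above can be sketched generically: (1) regress an observed non-core series on the model's state variables, then (2) apply the fitted measurement equation to forecasts of those states. The sketch below uses synthetic states and a hypothetical coefficient vector in place of an actual DSGE model's output; it illustrates only the mechanics, not the authors' medium-scale model.

```python
import numpy as np

rng = np.random.default_rng(0)
T, h, k = 200, 8, 3
# Synthetic "DSGE state" series (stand-ins for model-based smoothed states).
states = rng.standard_normal((T, k)).cumsum(axis=0) * 0.1
beta_true = np.array([0.5, -0.3, 0.2])  # hypothetical true loadings
noncore = states @ beta_true + 0.05 * rng.standard_normal(T)

# Step 1: auxiliary measurement regression of the non-core variable on the states.
X = np.column_stack([np.ones(T), states])
coef, *_ = np.linalg.lstsq(X, noncore, rcond=None)

# Step 2: apply the fitted measurement equation to (here, made-up) state forecasts.
state_fcst = states[-1] + 0.1 * rng.standard_normal((h, k)).cumsum(axis=0)
noncore_fcst = np.column_stack([np.ones(h), state_fcst]) @ coef
print("non-core forecast path:", np.round(noncore_fcst, 3))
```

In the paper the state forecasts come from the estimated DSGE model itself rather than the placeholder random walk used here.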
The authors provide a simple and intuitive measure of interdependence of asset returns and/or volatilities. In particular, they formulate and examine precise and separate measures of return spillovers and volatility spillovers. The authors' framework facilitates study of both noncrisis and crisis episodes, including trends and bursts in spillovers, and both turn out to be empirically important. In particular, in an analysis of 19 global equity markets from the early 1990s to the present, they find striking evidence of divergent behavior in the dynamics of return spillovers vs. volatility spillovers: Return spillovers display a gently increasing trend but no bursts, whereas volatility spillovers display no trend but clear bursts.
(206 KB, 21 pages)
If the Mortensen and Pissarides model with efficient bargaining is calibrated to replicate the fluctuations of unemployment over the business cycle, it implies far too strong a rise in the unemployment rate when unemployment benefits rise. This paper explores an alternative, right-to-manage bargaining scheme. This scheme also generates the right degree of unemployment fluctuations but at the same time implies a reasonable elasticity of unemployment with respect to benefits.
Final version forthcoming in Economics Letters
(220 KB, 11 pages)
This paper provides the first in-depth analysis of the homeownership experience of households in bankruptcy. The authors consider households who are homeowners at the time of filing. These households are typically seriously delinquent on their mortgages at the time of filing. The authors measure how often they end up losing their houses in foreclosure, the time between bankruptcy filing and foreclosure sale, and the foreclosure sale price. In particular, they follow homeowners who filed for Chapter 13 bankruptcy between 2001 and 2002 in New Castle County, Delaware, through October 2007. They present three main findings. First, close to 30 percent of the filers lost their houses in foreclosure despite filing for bankruptcy. The rate rose to over 40 percent for those who were 12 months or more behind on their mortgage payment, about the same fraction as among those who entered into foreclosure directly. Second, filing for bankruptcy allowed those who eventually lost their houses to foreclosure to remain in their houses for, on average, an additional year. Third, although the average final sale price exceeded borrowers’ own estimates at the time of filing, the majority of the lenders suffered losses. These findings are pertinent to the recent debate over whether the bankruptcy code should allow mortgage modification. Finally, the paper also reports circumstances related to the loan, borrower, and lender that make each of these outcomes more or less likely.
(241 KB, 31 pages)
The authors use establishment data from the Job Openings and Labor Turnover Survey (JOLTS) to study the micro-level behavior of worker quits and their relation to recruitment and establishment growth. They find that quits decline with establishment growth, playing the most important role at slowly contracting firms. They also find a robust, positive relationship between an establishment's reported hires and vacancies and the incidence of a quit. This relationship occurs despite the finding that quits decline, and hires and vacancies increase, with establishment growth. The authors characterize these dynamics within a labor-market search model with on-the-job search, a convex cost of creating new positions, and multi-worker establishments. The model distinguishes between recruiting to replace a quitting worker and recruiting for a new position, and relates this distinction to firm performance. Beyond giving rise to a varying quit propensity, the model generates endogenously determined thresholds for firm contraction (through both layoffs and attrition), worker replacement, and firm expansion. The continuum of decision rules derived from these thresholds produces rich firm-level dynamics and quit behavior that are broadly consistent with the empirical evidence of the JOLTS data.
(524 KB, 50 pages)
A large empirical literature finds that there is too little international trade, and too much intra-national trade to be rationalized by observed international trade costs such as tariffs and transport costs. The literature uses frameworks in which the nature of production is assumed to be unaffected by trade costs. This paper investigates whether a model in which the nature of production can change in response to trade costs — a framework with multi-stage production — can better explain the home bias in trade. The author finds that the model can explain about 2/5 of the Canada border effect; this is about two-and-one-half times what a model with one stage of production can explain. The model also explains a significant fraction of a key dimension of Canada-U.S. trade, the high degree of "back-and-forth" trade or vertical specialization.
(362 KB, 42 pages)
This paper uses new data on job creation and job destruction to find evidence of a link between the jobless recoveries of the last two recessions and the recent decline in aggregate volatility known as the Great Moderation. The author finds that the last two recessions are characterized by jobless recoveries that came about through contrasting margins of employment adjustment: a relatively slow decline in job destruction in 1991-92 and persistently low job creation in 2002-03. In manufacturing, he finds that these patterns followed a secular decline in the magnitude of job flows and an abrupt decline in their volatility. A structural VAR analysis suggests that these patterns are driven by a decline in the volatilities of the underlying structural shocks in addition to a shift in the response of job flows to these shocks. The shift in structural responses is broadly consistent with the change in job flow patterns observed during the jobless recoveries.
(331 KB, 47 pages)
A decade after the State Street decision, more than 1,000 business method patents are granted each year. Yet only one in ten is obtained by a financial institution. Most business method patents are also software patents.
Have these patents increased innovation in financial services? To address this question the author constructs new indicators of R&D intensity based on the occupational composition of financial industries. The financial sector appears more research intensive than official statistics would suggest but less than the private economy taken as a whole. There is considerable variation across industries but little apparent trend. There does not appear to be an obvious effect from business method patents on the sector's research intensity.
Looking ahead, three factors suggest that the patent system may affect financial services as it has affected electronics: (1) the sector's heavy reliance on information technology; (2) the importance of standard setting; and (3) the strong network effects exhibited in many areas of finance. Even today litigation is not uncommon; the author sketches a number of significant examples affecting financial exchanges and consumer payments.
The legal environment is changing quickly. The author reviews a number of important federal court decisions that will affect how business method patents are obtained and enforced. He also reviews a number of proposals under consideration in the U.S. Congress.
(308 KB, 46 pages)
Superseded by Working Paper 11-24.
This paper examines the characteristics of the revisions to the inflation rate as measured by the personal consumption expenditures price index both including and excluding food and energy prices. These data series play a major role in the Federal Reserve's analysis of inflation.
The author examines the magnitude and patterns of revisions to both PCE inflation rates. The first question he poses is: What do data revisions look like? The author runs a variety of tests to see if the data revisions have desirable or exploitable properties. The second question he poses is related to the first: Can we forecast data revisions in real time? The answer is that it is possible to forecast revisions from the initial release to August of the following year. Generally, the initial release of inflation is too low and is likely to be revised up. Policymakers should account for this predictability in setting monetary policy.
(271 KB, 43 pages)
Channel systems for conducting monetary policy are becoming increasingly popular. Despite this popularity, the consequences of implementing policy with a channel system are not well understood. The authors develop a general equilibrium framework of a channel system and study the optimal policy. A novel aspect of the channel system is that a central bank can "tighten" or "loosen" its policy without changing its policy rate. This policy instrument has so far been overlooked by a large body of the literature on the optimal design of interest-rate rules.
(371 KB, 43 pages)
In a reasonably calibrated Mortensen and Pissarides matching model, shocks to average labor productivity can account for only a small portion of the fluctuations in unemployment and vacancies (Shimer (2005a)). In this paper, the author argues that if vintage-specific shocks rather than aggregate productivity shocks are the driving force of fluctuations, the model does a better job of accounting for the data. She adds heterogeneity in jobs (matches) with respect to the time the job is created, in the form of different embodied technology levels. The author also introduces specific capital that, once adapted for a match, has less value in another match. In the quantitative analysis, she shows that shocks to different vintages of entrants are able to account for fluctuations in unemployment and vacancies and that, in this environment, specific capital is important in decreasing the volatility of the destruction rate of existing matches.
(253 KB, 33 pages)
Over the last decade, the legal and institutional frameworks governing central banks and financial market regulatory authorities throughout the world have undergone significant changes. This has created new interest in better understanding the roles played by organizational structures, accountability, and transparency in increasing the efficiency and effectiveness of central banks in achieving their objectives and ultimately yielding better economic outcomes. Although much has been written pointing out the potential role institutional form can play in central bank performance, little empirical work has been done to investigate the hypothesis that institutional form is related to performance. This paper attempts to help fill this void.
(405 KB, 36 pages)
This paper describes the existing research (as of February 2008) on real-time data analysis, divided into five areas: (1) data revisions; (2) forecasting; (3) monetary policy analysis; (4) macroeconomic research; and (5) current analysis of business and financial conditions. In each area, substantial progress has been made in recent years, with researchers gaining insight into the impact of data revisions. In addition, substantial progress has been made in developing better real-time data sets around the world. Still, additional research is needed in key areas, and research to date has uncovered even more fruitful areas worth exploring.
(265 KB, 38 pages)
The authors document that economies of scale in transportation and delivery lags are important features of international trade. These costs lead firms to import infrequently and hold substantially larger inventories of imported goods. They study a model economy in which international trade is subject to these frictions. When the authors calibrate their theory to the inventory levels and lumpiness of imports observed in the data, they find a large (20 percent) tariff equivalent of these frictions, mostly due to inventory carrying costs. These frictions have important consequences not only for the level of trade, but also for the dynamic response of imports and prices in the aftermath of large shocks. The authors focus on large devaluation episodes in six developing economies. The model predicts, consistent with the data, that desired inventory adjustment in response to a terms-of-trade and interest rate shock generates a short-term trade implosion, an immediate, temporary drop in the value and number of distinct varieties imported, as well as a slow increase in the retail price of imported goods.
(478 KB, 68 pages)
This paper discusses the research agenda on optimal bank productive efficiency and industrial structure. One goal of this agenda is to answer some fundamental questions in financial industry restructuring, such as what motivates bank managers to engage in mergers and acquisitions, and to evaluate the costs and benefits of consolidation, an essentially empirical question. The paper reviews the recent literature, including techniques for modeling bank production and the empirical results on scale economies, scope economies, and efficiency in banking.
(333 KB, 43 pages)
Great strides have been made in the theory of bank technology in terms of explaining banks' comparative advantage in producing informationally intensive assets and financial services and in diversifying or offsetting a variety of risks. Great strides have also been made in explaining sub-par managerial performance in terms of agency theory and in applying these theories to analyze the particular environment of banking. In recent years, the empirical modeling of bank technology and the measurement of bank performance have begun to incorporate these theoretical developments and yield interesting insights that reflect the unique nature and role of banking in modern economies. This paper gives an overview of two general empirical approaches to measuring bank performance and discusses some of the applications of these approaches found in the literature.
(303 KB, 32 pages)