The views expressed in these papers are solely those of the authors and should not be interpreted as reflecting the views of the Federal Reserve Bank of Philadelphia or the Federal Reserve System.

08-33: Efficient Institutions by Thorsten Koeppl, Cyril Monnet, and Erwan Quintin

Are efficiency considerations important for understanding differences in the development of institutions? The authors model institutional quality as the degree to which obligations associated with exchanging capital can be enforced. Establishing a positive level of enforcement requires an aggregate investment of capital that is no longer available for production. When capital endowments are more unequally distributed, the greater dispersion in marginal products makes it optimal to invest more resources in enforcement. The optimal allocation of the institutional cost across agents is not monotonic and entails a redistribution of endowments before production begins. Investing in enforcement primarily benefits agents at the bottom of the endowment distribution and leads to a reduction in consumption and income inequality. Efficiency, redistribution, and the quality of institutions are thus intricately linked and should be studied jointly.

08-32: Restricting Consumer Credit Access: Household Survey Evidence on Effects Around the Oregon Rate Cap by Jonathan Zinman

Many policymakers and some behavioral models hold that restricting access to expensive credit helps consumers by preventing overborrowing. The author examines some short-run effects of restricting access, using household panel survey data on payday loan users collected around the imposition of binding restrictions on payday loan terms in Oregon. The results suggest that borrowing fell in Oregon relative to Washington, with former payday loan users shifting partially into plausibly inferior substitutes. Additional evidence suggests that restricting access caused deterioration in the overall financial condition of the Oregon households. The results suggest that restricting access to expensive credit harms consumers on average.
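The Oregon-versus-Washington comparison lends itself to a difference-in-differences layout; the following is a minimal sketch with hypothetical column names and data, not the author's actual survey extract or specification.

    # Minimal difference-in-differences sketch; file and column names are illustrative.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("household_panel.csv")           # hypothetical two-wave household panel
    df["oregon"] = (df["state"] == "OR").astype(int)   # treated state (rate cap)
    df["post"] = (df["wave"] == 2).astype(int)         # wave collected after the cap took effect

    # The interaction term estimates the change in payday borrowing in Oregon
    # relative to Washington after the restriction.
    fit = smf.ols("payday_use ~ oregon * post", data=df).fit(
        cov_type="cluster", cov_kwds={"groups": df["household_id"]}
    )
    print(fit.summary())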

08-31/R: The (Un)importance of Unemployment Fluctuations for the Welfare Cost of Business Cycles by Philip Jung and Keith Kuester

This paper studies the cost of business cycles within a real business cycle model with search and matching frictions in the labor market. The authors endogenously link both the cyclical fluctuations and the mean level of unemployment to the aggregate business cycle risk. The key result of the paper is that business cycles are costly: Fluctuations over the cycle induce a higher average unemployment rate since employment is non-linear in the job-finding rate and the past unemployment rate. The authors show this analytically for a special case of the model. They then calibrate the model to U.S. data. For the calibrated model, too, business cycles cause higher average unemployment; the welfare cost of business cycles can easily be an order of magnitude larger than Lucas' (1987) estimate. The cost of business cycles is higher the lower the value of non-employment or, equivalently, the lower the disutility of work. The cost of business cycles rises further when workers' skills depreciate during unemployment.
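The mechanism can be seen in a textbook law of motion for unemployment (a simplified sketch with a constant separation rate s and job-finding rate f_t, not necessarily the paper's exact specification):

    u_{t+1} = (1 - f_t)\, u_t + s\, (1 - u_t), \qquad \bar{u} = \frac{s}{s + f}.

Because s/(s+f) is convex in f, Jensen's inequality gives \mathbb{E}[s/(s+f_t)] > s/(s+\mathbb{E}[f_t]): fluctuations in the job-finding rate raise average unemployment even when its mean is unchanged.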

08-30: On the Implementation of Markov-Perfect Interest Rate and Money Supply Rules: Global and Local Uniqueness by Michael Dotsey and Andreas Hornstein

There is a growing literature exploring the features of optimal monetary policy in New Keynesian models under both commitment and discretion. This literature usually solves for the optimal allocations that are consistent with a rational expectations market equilibrium, but it does not study how the policy can be implemented given the available policy instruments. Recently, however, King and Wolman (2004) have shown that a time-consistent policy cannot be implemented through the control of nominal money balances. In particular, they find that equilibria are not unique under a money stock regime. The authors of this paper find that King and Wolman's conclusion of non-uniqueness of Markov-perfect equilibria is sensitive to the instrument of choice. Surprisingly, if, instead, the monetary authority chooses the nominal interest rate, there exists a unique Markov-perfect equilibrium. The authors then investigate under what conditions a time-consistent planner can implement the optimal allocation by just announcing his policy rule in a decentralized setting.

08-29: Insurance Policies for Monetary Policy in the Euro Area by Keith Kuester and Volker Wieland

In this paper, the authors aim to design a monetary policy for the euro area that is robust to the high degree of model uncertainty at the start of monetary union and allows for learning about model probabilities. To this end, they compare and ultimately combine Bayesian and worst-case analysis using four reference models estimated with pre-EMU synthetic data. The authors start by computing the cost of insurance against model uncertainty, that is, the relative performance of worst-case or minimax policy versus Bayesian policy. While maximum insurance comes at moderate costs, they highlight three shortcomings of this worst-case insurance policy: (i) prior beliefs that would rationalize it from a Bayesian perspective indicate that such insurance is strongly oriented toward the model with highest baseline losses; (ii) the minimax policy is not as tolerant of small perturbations of policy parameters as the Bayesian policy; and (iii) the minimax policy offers no avenue for incorporating posterior model probabilities derived from data available since monetary union. Thus, the authors propose preferences for robust policy design that reflect a mixture of the Bayesian and minimax approaches. They show how the incoming EMU data may then be used to update model probabilities, and investigate the implications for policy.
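The trade-off between the Bayesian and worst-case criteria can be illustrated with a small numerical sketch; the loss matrix, prior weights, and candidate rules below are hypothetical, not the authors' estimated models or policies.

    import numpy as np

    # Rows: candidate policy rules; columns: reference models.
    # Entries: expected loss of rule i when model j is true (made-up numbers).
    losses = np.array([
        [1.0, 2.5, 1.8, 3.0],   # rule A
        [1.4, 1.9, 1.7, 2.2],   # rule B
        [2.0, 1.6, 1.5, 2.1],   # rule C
    ])
    priors = np.array([0.4, 0.3, 0.2, 0.1])       # assumed prior model probabilities

    bayes_rule = np.argmin(losses @ priors)       # minimize prior-weighted expected loss
    minimax_rule = np.argmin(losses.max(axis=1))  # minimize worst-case loss across models

    # "Cost of insurance": extra Bayesian-expected loss incurred by the minimax rule.
    cost_of_insurance = (losses @ priors)[minimax_rule] - (losses @ priors)[bayes_rule]
    print(bayes_rule, minimax_rule, cost_of_insurance)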

08-28: Rents Have Been Rising, Not Falling, in the Postwar Period by Theodore Crone, Leonard I. Nakamura, and Richard Voith

Until the end of 1977, the U.S. consumer price index for rents tended to omit rent increases when units had a change of tenants or were vacant, biasing inflation estimates downward. Beginning in 1978, the Bureau of Labor Statistics (BLS) implemented a series of methodological changes that reduced this nonresponse bias, but substantial bias remained until 1985. The authors set up a model of nonresponse bias, parameterize it, and test it using a BLS microdata set for rents. From 1940 to 1985, the official BLS CPI-W price index for tenant rents rose 3.6 percent annually; the authors argue that it should have risen 5.0 percent annually. This implies that the relative price of rent in 1940 was only about half its official value, which has important consequences for historical measures of rent-to-house-price ratios and for the growth of real consumption. (Revision forthcoming in Review of Economics and Statistics.)
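A back-of-the-envelope check of the 1940 implication, assuming the two quoted growth rates apply uniformly over 1940-1985 (an illustration only, not the authors' index construction):

    # Compound the official and corrected rent inflation rates over 45 years.
    official = 1.036 ** 45      # 3.6 percent per year, 1940-1985
    corrected = 1.050 ** 45     # 5.0 percent per year, 1940-1985
    print(corrected / official) # roughly 1.8: the corrected index implies the 1940
                                # relative price of rent was only about half its
                                # official value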

08-27: Information in the Revision Process of Real-Time Datasets by Valentina Corradi, Andres Fernandez, and Norman Swanson 

Rationality of early release data is typically tested using linear regressions. Thus, failure to reject the null does not rule out the possibility of nonlinear dependence. This paper proposes two tests which instead have power against generic nonlinear alternatives. A Monte Carlo study shows that the suggested tests have good finite sample properties. Additionally, the authors carry out an empirical illustration using a real-time dataset for money, output, and prices. Overall, they find strong evidence against data rationality. Interestingly, for money stock the null is not rejected by linear tests but is rejected by the authors' tests.
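For reference, the linear benchmark the authors improve upon looks roughly like the sketch below (data file and column names are hypothetical); such a regression has power only against linear departures from rationality.

    # Standard linear rationality test: regress the final (heavily revised) release
    # on the early release and test intercept = 0, slope = 1.
    import pandas as pd
    import statsmodels.formula.api as smf

    vintages = pd.read_csv("real_time_money.csv")   # assumed columns: early, final
    fit = smf.ols("final ~ early", data=vintages).fit()
    print(fit.f_test("Intercept = 0, early = 1"))   # joint test of unbiasedness/efficiency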

08-26: Spinoffs and the Market for Ideas by Satyajit Chatterjee and Esteban Rossi-Hansberg

The authors present a theory of spinoffs in which the key ingredient is the originator's private information concerning the quality of his new idea. Because quality is privately observed, by the standard adverse-selection logic, the market can at best offer a price that reflects the average quality of ideas sold. This gives the holders of above-average-quality ideas the incentive to spin off. The authors show that only workers with very good ideas decide to spin off, while workers with mediocre ideas sell them. Entrepreneurs of existing firms pay a price for the ideas sold in the market that implies zero expected profits for them. Hence, firms' project selection is independent of firm size, which, under some additional assumptions, leads to scale-independent growth. The entry and growth process of firms leads to invariant firm-size distributions that resemble the ones for the U.S. economy and most of its individual industries.

08-25: Seeing Inside the Black Box: Using Diffusion Index Methodology to Construct Factor Proxies in Large Scale Macroeconomic Time Series Environments by Nii Ayi Armah and Norman R. Swanson

In economics, common factors are often assumed to underlie the co-movements of a set of macroeconomic variables. For this reason, many authors have used estimated factors in the construction of prediction models. In this paper, the authors begin by surveying the extant literature on diffusion indexes. They then outline a number of approaches to the selection of factor proxies (observed variables that proxy unobserved estimated factors) using the statistics developed in Bai and Ng (2006a,b). The authors' approach to factor proxy selection is examined via a small Monte Carlo experiment, where evidence supporting their proposed methodology is presented, and via a large set of prediction experiments using the panel dataset of Stock and Watson (2005). One of their main empirical findings is that their "smoothed" approaches to factor proxy selection appear to yield predictions that are often superior not only to a benchmark factor model, but also to simple linear time series models, which are generally difficult to beat in forecasting competitions. In some sense, by using the authors' approach to predictive factor proxy selection, one is able to open up the "black box" often associated with factor analysis and to identify actual variables that can serve as primitive building blocks for (prediction) models of a host of macroeconomic variables, and that can also serve, for example, as policy instruments. The authors' findings suggest that important observable variables include several S&P 500 series (stock price indexes and dividends), a 1-year Treasury bond rate, various housing activity variables, industrial production, and exchange rates.
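A stylized sketch of the proxy-selection idea: estimate factors by principal components and flag the observed series most highly correlated with each estimated factor. This is only a crude stand-in for the Bai and Ng (2006a,b) selection statistics the authors use, and the panel below is random placeholder data.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 50))        # placeholder T x N macro panel
    X = (X - X.mean(0)) / X.std(0)            # standardize each series

    k = 3                                     # number of factors (assumed)
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    factors = U[:, :k] * S[:k]                # principal-component factor estimates

    for j in range(k):
        corr = np.array([np.corrcoef(factors[:, j], X[:, i])[0, 1]
                         for i in range(X.shape[1])])
        print(f"factor {j}: best observed proxy = series {np.abs(corr).argmax()}")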

08-24: Predatory Mortgage Lending by Philip Bond, David K. Musto, and Bilge Yilmaz

Regulators express growing concern over predatory loans, which the authors take to mean loans that borrowers should decline. Using a model of consumer credit in which such lending is possible, they identify the circumstances in which it arises both with and without competition. The authors find that predatory lending is associated with highly collateralized loans, inefficient refinancing of subprime loans, lending without due regard to ability to pay, prepayment penalties, balloon payments, and poorly informed borrowers. Under most circumstances competition among lenders attenuates predatory lending. They use their model to analyze the effects of legislative interventions.

08-23: Intangible Assets and National Income Accounting by Leonard I. Nakamura

Superseded by Working Paper 09-11.

08-22: City Beautiful by Gerald A. Carlino and Albert Saiz

Superseded by Working Paper 19-16.

08-21: Firm Default and Aggregate Fluctuations by Tor Jacobson, Rikard Kindell, Jesper Linde, and Kasper Roszbach

This paper studies the relation between macroeconomic fluctuations and corporate defaults while conditioning on industry affiliation and an extensive set of firm-specific factors. Using a logit approach on a panel data set for all incorporated Swedish businesses over 1990-2002, the authors find strong evidence for a substantial and stable impact of aggregate fluctuations. Macro effects differ across industries in an economically intuitive way. Out-of-sample evaluations show their approach is superior both to models that exclude macro information and to the best-fitting naive forecasting models. While firm-specific factors are useful in ranking firms' relative riskiness, macroeconomic factors capture fluctuations in the absolute risk level.
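A minimal sketch of the kind of default regression described above: a logit for firm default on firm-specific ratios plus aggregate variables, with industry dummies. The data file and column names are hypothetical, not the Swedish registry data.

    import pandas as pd
    import statsmodels.formula.api as smf

    panel = pd.read_csv("firm_panel.csv")     # hypothetical firm-year panel
    logit = smf.logit(
        "default ~ leverage + earnings_ratio + liquidity"
        " + gdp_growth + interest_rate + C(industry)",
        data=panel,
    ).fit()
    print(logit.summary())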

08-20: The Effect of Monetary Tightening on Local Banks by Rocco Huang

This study shows that during Paul Volcker's drastic monetary tightening in the early 1980s, local banks operating in only one county reduced loan supply much more sharply than local subsidiaries of multi-county bank holding companies in similar markets, after controlling for bank (and holding company) size, liquidity, capital conditions, and, most important, local credit demand. The study allows cleaner identification by examining 18 U.S. "county-banking states" where a bank's local lending volume at the county level was observable because banks were not allowed to branch across county borders. The local nature of lending makes it possible to approximate and control for the exogenous component of local loan demand using the prediction that counties with a higher share of manufacturing employment exhibit weaker loan demand during tightening (which is consistent with the interest rate channel and the balance-sheet channel of monetary policy transmission). The study sheds light on the working of the bank lending channel of monetary policy transmission.

08-19: Real-Time Measurement of Business Conditions by S. Borağan Aruoba, Francis X. Diebold, and Chiara Scotti 

The authors construct a framework for measuring economic activity at high frequency, potentially in real time. They use a variety of stock and flow data observed at mixed frequencies (including very high frequencies), and they use a dynamic factor model that permits exact filtering. They illustrate the framework in a prototype empirical example and a simulation study calibrated to the example.
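A minimal sketch of the mixed-frequency idea: stack indicators observed at different frequencies on a common (here, business-day) grid, leave unobserved periods as missing, and let the Kalman filter of a one-factor state-space model handle the gaps. The series and specification below are generic placeholders, not the authors' framework.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    dates = pd.date_range("2000-01-03", periods=500, freq="B")
    panel = pd.DataFrame(np.random.randn(500, 3), index=dates,
                         columns=["stock_return", "claims", "employment"])
    panel.loc[panel.index.dayofweek != 4, "claims"] = np.nan      # pretend weekly (Fridays only)
    panel.loc[~panel.index.is_month_end, "employment"] = np.nan   # pretend monthly

    model = sm.tsa.DynamicFactor(panel, k_factors=1, factor_order=1)
    res = model.fit(disp=False)
    activity = res.factors.filtered[0]   # filtered estimate of the latent activity factor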

08-18: In Harm's Way? Payday Loan Access and Military Personnel Performance by Scott Carrell and Jonathan Zinman 

Does borrowing at 400 percent APR do more harm than good? The Pentagon asserts that payday loans harm military readiness and successfully lobbied for a binding 36 percent APR cap on loans to military members and their families (effective October 1, 2007). But existing evidence on how access to high-interest debt affects borrower behavior is inconclusive. The authors use within-state variation in state lending laws and exogenous variation in the assignment of Air Force personnel to bases in different states to estimate the effect of payday loan access on personnel outcomes. They find significant average declines in overall job performance and retention and significant increases in severely poor readiness. These results provide some ammunition for the private optimality of the Pentagon's position. The welfare implications for military members are less clear-cut, but the authors' results are consistent with the interpretation that payday loan access causes financial distress and severe misbehavior for relatively young, inexperienced, and financially unsophisticated airmen. Overall job performance declines are also concentrated in these groups, and several pieces of evidence suggest that these declines are welfare-reducing (and not the result of airmen optimally reducing effort given an expanded opportunity set); e.g., performance declines are larger in high unemployment areas with payday lending.

08-17: DSGE Model-Based Forecasting of Non-Modelled Variables by Frank Schorfheide, Keith Sill, and Maxym Kryshko

This paper develops and illustrates a simple method to generate a DSGE model-based forecast for variables that do not explicitly appear in the model (non-core variables). The authors use auxiliary regressions that resemble measurement equations in a dynamic factor model to link the non-core variables to the state variables of the DSGE model. Predictions for the non-core variables are obtained by applying their measurement equations to DSGE model-generated forecasts of the state variables. Using a medium-scale New Keynesian DSGE model, the authors apply their approach to generate and evaluate recursive forecasts for PCE inflation, core PCE inflation, and the unemployment rate along with predictions for the seven variables that have been used to estimate the DSGE model.
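A schematic of the two-step procedure, with made-up arrays standing in for DSGE output: (1) regress the non-core series on the model's estimated state variables, as in a factor-model measurement equation; (2) apply the fitted loadings to the DSGE forecasts of those states. Dimensions and names are purely illustrative.

    import numpy as np

    states_hist = np.random.randn(120, 4)    # smoothed DSGE states over the estimation sample
    noncore_hist = np.random.randn(120)      # e.g., the unemployment rate
    states_fcst = np.random.randn(8, 4)      # DSGE forecasts of the states, 8 quarters ahead

    Z = np.column_stack([np.ones(len(states_hist)), states_hist])
    beta, *_ = np.linalg.lstsq(Z, noncore_hist, rcond=None)     # auxiliary regression

    Zf = np.column_stack([np.ones(len(states_fcst)), states_fcst])
    noncore_fcst = Zf @ beta                 # model-based forecast of the non-core variable
    print(noncore_fcst)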

08-16: Measuring Financial Asset Return and Volatility Spillovers, with Application to Global Equity Markets by Francis X. Diebold and Kamil Yilmaz

The authors provide a simple and intuitive measure of interdependence of asset returns and/or volatilities. In particular, they formulate and examine precise and separate measures of return spillovers and volatility spillovers. The authors' framework facilitates study of both noncrisis and crisis episodes, including trends and bursts in spillovers, and both turn out to be empirically important. In particular, in an analysis of 19 global equity markets from the early 1990s to the present, they find striking evidence of divergent behavior in the dynamics of return spillovers vs. volatility spillovers: Return spillovers display a gently increasing trend but no bursts, whereas volatility spillovers display no trend but clear bursts.
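A compact sketch of a spillover-type calculation in the spirit of the measure described above: estimate a VAR, compute the forecast-error variance decomposition, and report the average share of each market's forecast-error variance attributable to shocks from other markets. The data file, lag length, horizon, and shock ordering below are placeholder choices, not the authors'.

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.api import VAR

    returns = pd.read_csv("equity_returns.csv", index_col=0, parse_dates=True)  # hypothetical
    res = VAR(returns).fit(2)

    # fevd(...).decomp is indexed (equation, horizon, shock); take the 10-step horizon.
    shares = res.fevd(10).decomp[:, -1, :]
    spillover = 100 * (shares.sum() - np.trace(shares)) / shares.shape[0]
    print(f"total spillover index: {spillover:.1f} percent")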

08-15: The Elasticity of the Unemployment Rate with Respect to Benefits by Kai Christoffel and Keith Kuester

If the Mortensen and Pissarides model with efficient bargaining is calibrated to replicate the fluctuations of unemployment over the business cycle, it implies far too strong a rise in the unemployment rate when unemployment benefits rise. This paper explores an alternative, right-to-manage bargaining scheme. This scheme also generates the right degree of unemployment fluctuations but at the same time implies a reasonable elasticity of unemployment with respect to benefits.
(Final version forthcoming in Economics Letters.)
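The mechanical link at issue can be written out from the textbook steady-state condition u = s/(s + f), with separation rate s, job-finding rate f, and benefit level b (a sketch in standard notation, not the paper's full model):

    \frac{\partial \ln u}{\partial \ln b} \;=\; \frac{\partial \ln u}{\partial \ln f}\,\frac{\partial \ln f}{\partial \ln b} \;=\; -(1 - u)\,\frac{\partial \ln f}{\partial \ln b}.

The elasticity of unemployment with respect to benefits is therefore governed by how strongly the job-finding rate responds to benefits, which is where the choice of bargaining scheme matters.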

08-14: The Homeownership Experience of Households in Bankruptcy by Sarah W. Carroll and Wenli Li

This paper provides the first in-depth analysis of the homeownership experience of households in bankruptcy. The authors consider households who are homeowners at the time of filing. These households are typically seriously delinquent on their mortgages at the time of filing. The authors measure how often they end up losing their houses in foreclosure, the time between bankruptcy filing and foreclosure sale, and the foreclosure sale price. In particular, they follow homeowners who filed for Chapter 13 bankruptcy between 2001 and 2002 in New Castle County, Delaware, through October 2007. They present three main findings. First, close to 30 percent of the filers lost their houses in foreclosure despite filing for bankruptcy. The rate rose to over 40 percent for those who were 12 months or more behind on their mortgage payment, about the same fraction as among those who entered into foreclosure directly. Second, filing for bankruptcy allowed those who eventually lost their houses to foreclosure to remain in their houses for, on average, an additional year. Third, although the average final sale price exceeded borrowers' own estimates at the time of filing, the majority of the lenders suffered losses. These findings are pertinent to the recent debate over allowing mortgage modification in bankruptcy. Finally, the paper reports characteristics of the loan, borrower, and lender that make these outcomes more or less likely.

08-13: Quits, Worker Recruitment, and Firm Growth: Theory and Evidence by R. Jason Faberman and Eva Nagypal

The authors use establishment data from the Job Openings and Labor Turnover Survey (JOLTS) to study the micro-level behavior of worker quits and their relation to recruitment and establishment growth. They find that quits decline with establishment growth, playing the most important role at slowly contracting firms. They also find a robust, positive relationship between an establishment's reported hires and vacancies and the incidence of a quit. This relationship occurs despite the finding that quits decline, and hires and vacancies increase, with establishment growth. The authors characterize these dynamics within a labor-market search model with on-the-job search, a convex cost of creating new positions, and multi-worker establishments. The model distinguishes between recruiting to replace a quitting worker and recruiting for a new position, and relates this distinction to firm performance. Beyond giving rise to a varying quit propensity, the model generates endogenously determined thresholds for firm contraction (through both layoffs and attrition), worker replacement, and firm expansion. The continuum of decision rules derived from these thresholds produces rich firm-level dynamics and quit behavior that are broadly consistent with the empirical evidence of the JOLTS data.

08-12/R: Can Multi-Stage Production Explain the Home Bias in Trade? by Kei-Mu Yi

A large empirical literature finds that there is too little international trade, and too much intra-national trade, to be rationalized by observed international trade costs such as tariffs and transport costs. The literature uses frameworks in which the nature of production is assumed to be unaffected by trade costs. This paper investigates whether a model in which the nature of production can change in response to trade costs — a framework with multi-stage production — can better explain the home bias in trade. The author finds that the model can explain about two-fifths of the Canada border effect; this is about two-and-one-half times what a model with one stage of production can explain. The model also explains a significant fraction of a key dimension of Canada-U.S. trade, the high degree of "back-and-forth" trade or vertical specialization.

08-11: Job Flows, Jobless Recoveries, and the Great Moderation by R. Jason Faberman

This paper uses new data on job creation and job destruction to find evidence of a link between the jobless recoveries of the last two recessions and the recent decline in aggregate volatility known as the Great Moderation. The author finds that the last two recessions are characterized by jobless recoveries that came about through contrasting margins of employment adjustment: a relatively slow decline in job destruction in 1991-92 and persistently low job creation in 2002-03. In manufacturing, he finds that these patterns followed a secular decline in the magnitude of job flows and an abrupt decline in their volatility. A structural VAR analysis suggests that these patterns are driven by a decline in the volatilities of the underlying structural shocks in addition to a shift in the response of job flows to these shocks. The shift in structural responses is broadly consistent with the change in job flow patterns observed during the jobless recoveries.
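One piece of such an exercise can be sketched as a subsample comparison: estimate the same VAR on job creation and destruction before and after the mid-1980s and compare the implied shock volatilities (and, with an identification scheme, the responses). The data file, sample split, and lag length are placeholders.

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.api import VAR

    flows = pd.read_csv("job_flows.csv", index_col=0, parse_dates=True)  # creation, destruction
    for name, sample in [("pre-1984", flows[:"1983"]), ("post-1984", flows["1984":])]:
        res = VAR(sample).fit(4)
        print(name, np.sqrt(np.diag(res.sigma_u)))   # reduced-form shock volatilities
        # res.irf(12).orth_irfs would give Cholesky-identified impulse responses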

08-10/R: Business Method Patents and U.S. Financial Services by Robert M. Hunt

A decade after the State Street decision, more than 1,000 business method patents are granted each year. Yet only one in ten is obtained by a financial institution. Most business method patents are also software patents.

Have these patents increased innovation in financial services? To address this question, the author constructs new indicators of R&D intensity based on the occupational composition of financial industries. The financial sector appears more research intensive than official statistics would suggest but less so than the private economy taken as a whole. There is considerable variation across industries but little apparent trend. There does not appear to be an obvious effect of business method patents on the sector's research intensity.

Looking ahead, three factors suggest that the patent system may affect financial services as it has affected electronics: (1) the sector's heavy reliance on information technology; (2) the importance of standard setting; and (3) the strong network effects exhibited in many areas of finance. Even today, litigation is not uncommon; the author sketches a number of significant examples affecting financial exchanges and consumer payments.

The legal environment is changing quickly. The author reviews a number of important federal court decisions that will affect how business method patents are obtained and enforced. He also reviews a number of proposals under consideration in the U.S. Congress.

08-9: Core Measures of Inflation as Predictors of Total Inflation by Theodore M. Crone, N. Neil K. Khettry, Loretta J. Mester, and Jason A. Novak

Superseded by Working Paper 11-24.

08-8: Revisions to PCE Inflation Measures: Implications for Monetary Policy by Dean Croushore

This paper examines the characteristics of the revisions to the inflation rate as measured by the personal consumption expenditures price index both including and excluding food and energy prices. These data series play a major role in the Federal Reserve's analysis of inflation.

The author examines the magnitude and patterns of revisions to both PCE inflation rates. The first question he poses is: What do data revisions look like? The author runs a variety of tests to see if the data revisions have desirable or exploitable properties. The second question he poses is related to the first: Can we forecast data revisions in real time? The answer is that it is possible to forecast revisions from the initial release to August of the following year. Generally, the initial release of inflation is too low and is likely to be revised up. Policymakers should account for this predictability in setting monetary policy.
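A sketch of the predictability check described above: compute the revision from the initial release to the August-of-the-following-year vintage and ask whether it is systematically positive and related to information available at the initial release. The data file and column names are placeholders, not the actual real-time dataset.

    import pandas as pd
    import statsmodels.formula.api as smf

    pce = pd.read_csv("pce_vintages.csv")        # assumed columns: initial, august_next_year
    pce["revision"] = pce["august_next_year"] - pce["initial"]

    print(pce["revision"].mean())                # positive mean: initial releases too low
    print(smf.ols("revision ~ initial", data=pce).fit().summary())  # forecastable in real time?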

08-7: Monetary Policy in a Channel System by Aleksander Berentsen and Cyril Monnet

Channel systems for conducting monetary policy are becoming increasingly popular. Despite their popularity, the consequences of implementing policy with a channel system are not well understood. The authors develop a general equilibrium framework of a channel system and study the optimal policy. A novel aspect of the channel system is that a central bank can "tighten" or "loosen" its policy without changing its policy rate. This policy instrument has so far been overlooked by a large body of the literature on the optimal design of interest-rate rules.

08-6: Specific Capital and Vintage Effects on the Dynamics of Unemployment and Vacancies by Burcu Eyigungor

In a reasonably calibrated Mortensen and Pissarides matching model, shocks to average labor productivity can account for only a small portion of the fluctuations in unemployment and vacancies (Shimer (2005a)). In this paper, the author argues that if vintage-specific shocks rather than aggregate productivity shocks are the driving force of fluctuations, the model does a better job of accounting for the data. She adds heterogeneity in jobs (matches) with respect to the time the job is created, in the form of different embodied technology levels. The author also introduces specific capital that, once adapted for a match, has less value in another match. In the quantitative analysis, she shows that shocks to different vintages of entrants are able to account for fluctuations in unemployment and vacancies and that, in this environment, specific capital is important in reducing the volatility of the destruction rate of existing matches.

08-5: Central Bank Institutional Structure and Effective Central Banking: Cross-Country Empirical Evidence by Iftekhar Hasan and Loretta J. Mester

Over the last decade, the legal and institutional frameworks governing central banks and financial market regulatory authorities throughout the world have undergone significant changes. This has created new interest in better understanding the roles played by organizational structures, accountability, and transparency in increasing the efficiency and effectiveness of central banks in achieving their objectives and ultimately yielding better economic outcomes. Although much has been written pointing out the potential role institutional form can play in central bank performance, little empirical work has been done to investigate the hypothesis that institutional form is related to performance. This paper attempts to help fill this void.

08-4: Frontiers of Real-Time Data Analysis by Dean Croushore

This paper describes the existing research (as of February 2008) on real-time data analysis, divided into five areas: (1) data revisions; (2) forecasting; (3) monetary policy analysis; (4) macroeconomic research; and (5) current analysis of business and financial conditions. In each area, substantial progress has been made in recent years, with researchers gaining insight into the impact of data revisions. In addition, substantial progress has been made in developing better real-time data sets around the world. Still, additional research is needed in key areas, and research to date has uncovered even more fruitful areas worth exploring.

08-3/R: Inventories, Lumpy Trade, and Large Devaluations by George Alessandria, Joseph Kaboski, and Virgiliu Midrigan

The authors document that economies of scale in transportation and delivery lags are important features of international trade. These costs lead firms to import infrequently and hold substantially larger inventories of imported goods. They study a model economy in which international trade is subject to these frictions. When the authors calibrate their theory to the inventory levels and lumpiness of imports observed in the data, they find a large (20 percent) tariff equivalent of these frictions, mostly due to inventory carrying costs. These frictions have important consequences not only for the level of trade, but also for the dynamic response of imports and prices in the aftermath of large shocks. The authors focus on large devaluation episodes in six developing economies. The model predicts, consistent with the data, that desired inventory adjustment in response to a terms-of-trade and interest rate shock generates a short-term trade implosion, an immediate, temporary drop in the value and number of distinct varieties imported, as well as a slow increase in the retail price of imported goods.

08-2: Optimal Industrial Structure in Banking by Loretta J. Mester

This paper discusses the research agenda on optimal bank productive efficiency and industrial structure. One goal of this agenda is to answer some fundamental questions in financial industry restructuring, such as what motivates bank managers to engage in mergers and acquisitions, and to evaluate the costs and benefits of consolidation, which is essentially an empirical question. The paper reviews the recent literature, including techniques for modeling bank production and the empirical results on scale economies, scope economies, and efficiency in banking.

08-1: Efficiency in Banking: Theory, Practice, and Evidence by Joseph P. Hughes and Loretta J. Mester

Great strides have been made in the theory of bank technology in terms of explaining banks' comparative advantage in producing informationally intensive assets and financial services and in diversifying or offsetting a variety of risks. Great strides have also been made in explaining sub-par managerial performance in terms of agency theory and in applying these theories to analyze the particular environment of banking. In recent years, the empirical modeling of bank technology and the measurement of bank performance have begun to incorporate these theoretical developments and yield interesting insights that reflect the unique nature and role of banking in modern economies. This paper gives an overview of two general empirical approaches to measuring bank performance and discusses some of the applications of these approaches found in the literature.