The views expressed in these papers are solely those of the authors and should not be interpreted as reflecting the views of the Federal Reserve Bank of Philadelphia or the Federal Reserve System.

09-34: How Much Did Banks Pay to Become Too-Big-To-Fail and to Become Systemically Important? by Elijah Brewer III and Julapa Jagtiani

Superseded by Working Paper 11-37.

09-33: Worker Flows and Job Flows: A Quantitative Investigation by Shigeru Fujita and Makoto Nakajima

Worker flows and job flows behave differently over the business cycle. The authors investigate the sources of the differences by studying the quantitative properties of a multiple-worker version of the search/matching model that features endogenous job separation and intra-firm wage bargaining. Their calibration incorporates micro- and macro-level evidence on worker and job flows. The authors show that the dynamic stochastic equilibrium of the model replicates important cyclical features of worker flows and job flows simultaneously. In particular, the model correctly predicts that hires from unemployment move countercyclically while the job creation rate moves procyclically. The key to this result is to allow for a large hiring flow that does not go through unemployment but is part of job creation, for which the procyclicality of the job finding rate dominates its cyclicality. The authors also show that the model generates large volatilities of unemployment and vacancies when a worker's outside option is set at 83 percent of aggregate labor productivity.
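
As background, models in this class are typically built around a constant-returns matching function; a standard Cobb-Douglas illustration, drawn from the broader search/matching literature rather than from the paper's own specification, is:

```latex
\[
m(u_t, v_t) = \mu\, u_t^{\alpha} v_t^{1-\alpha},
\qquad
f_t \equiv \frac{m(u_t, v_t)}{u_t} = \mu\, \theta_t^{1-\alpha},
\qquad
\theta_t \equiv \frac{v_t}{u_t},
\]
```

where $u_t$ is unemployment, $v_t$ is vacancies, $f_t$ is the job finding rate, and $\theta_t$ is labor market tightness.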

09-32: How Important Is the Currency Denomination of Exports in Open-Economy Models? by Michael Dotsey and Margarida Duarte

The authors show that standard alternative assumptions about the currency in which firms price export goods are virtually inconsequential for the properties of aggregate variables, other than the terms of trade, in a quantitative open-economy model. This result is in contrast to a large literature that emphasizes the importance of the currency denomination of exports for the properties of open-economy models.

09-31: Internal Capital Markets and Corporate Politics in a Banking Group by Martijn Cremers, Rocco Huang, and Zacharias Sautner

This study looks inside a large retail-banking group to understand how influence within the group affects internal capital allocations and lending behavior at the member bank level. The group consists of 181 member banks that jointly own a headquarters. Influence is measured by the divergence from one-share-one-vote. The authors find that more influential member banks are allocated more capital from headquarters. Their lending is also less sensitive to their own deposit growth: they are less likely to decrease lending after negative deposit growth and less likely to increase lending following positive deposit growth. These effects are stronger in situations in which information asymmetry between member banks and the headquarters seems greater. The evidence suggests that influence can be useful in overcoming information asymmetry.

09-30: Creditor Control of Free Cash Flow by Rocco Huang

With free cash flows, borrowers can accumulate cash or voluntarily pay down debts. However, creditors sometimes impose a mandatory repayment covenant called an "excess cash flow sweep" in loan contracts to force borrowers to repay debts ahead of schedule. About 17 percent of borrowers in the author's 1995-2006 sample have this covenant attached to at least one of their loans. The author finds that the sweep covenant is more likely to be imposed on borrowers with higher leverage (i.e., where risk shifting by equity holders is more likely). The results are robust to including borrower fixed effects or using industry median leverage as a proxy. The covenant is also more common among borrowers whose equity holders appear to have firmer control, e.g., when more shares are controlled by institutional block holders, when firms are incorporated in states with laws more favorable to hostile takeovers, or when equity holders place higher valuation on excess cash holdings. These determinants suggest that the sweep covenant may be motivated by creditor-shareholder conflicts. Finally, the author shows that the covenant has real effects: Borrowers subject to the sweep covenant indeed repay more debts using excess cash flows, and they spend less on capital investment and pay less in dividends to shareholders.

09-29: Predictive Density Construction and Accuracy Testing with Multiple Possibly Misspecified Diffusion Models by Valentina Corradi and Norman R. Swanson 

This paper develops tests for comparing the accuracy of predictive densities derived from (possibly misspecified) diffusion models. In particular, the authors first outline a simple simulation-based framework for constructing predictive densities for one-factor and stochastic volatility models. Then, they construct accuracy assessment tests that are in the spirit of Diebold and Mariano (1995) and White (2000). In order to establish the asymptotic properties of their tests, the authors also develop a recursive variant of the nonparametric simulated maximum likelihood estimator of Fermanian and Salanié (2004). In an empirical illustration, the predictive densities from several models of the one-month federal funds rate are compared.
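
As background, here is a minimal sketch of the Diebold and Mariano (1995) idea that such accuracy tests build on, assuming squared-error loss and a Bartlett-kernel long-run variance; the function name, lag choice, and synthetic data below are illustrative, not taken from the paper:

```python
import numpy as np

def diebold_mariano(e1, e2, lags=4):
    """Test H0: equal mean squared forecast error for two competing models.

    e1, e2 : arrays of forecast errors from the two models.
    Returns the DM statistic, asymptotically N(0, 1) under H0.
    """
    d = e1**2 - e2**2          # loss differential under squared-error loss
    T = d.size
    d_bar = d.mean()
    # Newey-West (Bartlett kernel) long-run variance of the differential
    lrv = np.mean((d - d_bar)**2)
    for k in range(1, lags + 1):
        cov = np.mean((d[k:] - d_bar) * (d[:-k] - d_bar))
        lrv += 2 * (1 - k / (lags + 1)) * cov
    return d_bar / np.sqrt(lrv / T)

# Example with synthetic errors: model 1 is noisier, so the statistic
# should be positive, favoring model 2.
rng = np.random.default_rng(0)
print(diebold_mariano(1.5 * rng.standard_normal(200), rng.standard_normal(200)))
```

The paper's contribution extends this pairwise logic to predictive densities from multiple, possibly misspecified diffusion models, in the spirit of White's (2000) reality check.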

09-28: Real-Time Datasets Really Do Make a Difference: Definitional Change, Data Release, and Forecasting by Andres Fernandez and Norman R. Swanson

In this paper, the authors empirically assess the extent to which early release inefficiency and definitional change affect prediction precision. In particular, they carry out a series of ex-ante prediction experiments in order to examine: the marginal predictive content of the revision process, the trade-offs associated with predicting different releases of a variable, the importance of particular forms of definitional change, which the authors call "definitional breaks," and the rationality of early releases of economic variables. An important feature of their rationality tests is that they are based solely on the examination of ex-ante predictions, rather than on in-sample regression analysis, as are many tests in the extant literature. Their findings point to the importance of making real-time datasets available to forecasters, as the revision process has marginal predictive content, and predictive accuracy increases when multiple releases of data are used in specifying and estimating prediction models. The authors also present new evidence that early releases of money are rational, whereas early releases of prices and output are not. Moreover, they find that regardless of which release of their price variable is specified as the "target" variable to be predicted, using only "first release" data in model estimation and prediction construction yields mean square forecast error (MSFE) "best" predictions. On the other hand, models estimated and implemented using "latest available release" data are MSFE-best for predicting all releases of money. The authors argue that these contradictory findings are due to the relevance of definitional breaks in the data generating processes of the variables they examine. In an empirical analysis, they examine the real-time predictive content of money for income, and they find that vector autoregressions with money do not perform significantly worse than autoregressions when predicting output during the last 20 years.
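
For concreteness, the MSFE criterion used to rank models above is the standard one; in our notation (not necessarily the paper's), a model producing ex-ante predictions $\hat{y}_{t+h|t}$ over $P$ forecast periods is evaluated by:

```latex
\[
\mathrm{MSFE} = \frac{1}{P} \sum_{t=R}^{R+P-1} \left( y_{t+h} - \hat{y}_{t+h \mid t} \right)^2,
\]
```

where $R$ marks the end of the initial estimation sample, $h$ is the forecast horizon, and $y_{t+h}$ is whichever data release is designated as the target.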

09-27: Monetary Policy Implementation Frameworks: A Comparative Analysis by Antoine Martin and Cyril Monnet

The authors compare two stylized frameworks for the implementation of monetary policy. The first framework relies only on standing facilities, while the second framework relies only on open market operations. They show that the Friedman rule cannot be implemented when the central bank uses standing facilities, while it can be implemented with open market operations. For a given rate of inflation, the authors show that standing facilities unambiguously achieve higher welfare than just conducting open market operations. They conclude that elements of both frameworks should be combined. Also, their results suggest that any monetary policy implementation framework should remunerate both required and excess reserves.

09-26: Banking: A Mechanism Design Approach by Fabrizio Mattesini, Cyril Monnet, and Randall Wright

The authors study banking using the tools of mechanism design, without a priori assumptions about what banks are, who they are, or what they do. Given preferences, technologies, and certain frictions — including limited commitment and imperfect monitoring — they describe the set of incentive-feasible allocations and interpret the outcomes in terms of institutions that resemble banks. The bankers in the authors' model endogenously accept deposits, and their liabilities help others make payments. This activity is essential: If it were ruled out, the set of feasible allocations would be inferior. The authors discuss how many and which agents play the role of bankers. For example, they show that agents who are more connected to the market are better suited for this role, since they have more to lose by reneging on obligations. The authors discuss some banking history and compare it with the predictions of their theory.

09-25/R: Rising Indebtedness and Hyperbolic Discounting: A Welfare Analysis by Makoto Nakajima

Is the observed large increase in consumer indebtedness since the 1980s beneficial for U.S. consumers? This paper quantitatively studies the macroeconomic and welfare implications of relaxing borrowing constraints when consumers exhibit hyperbolic discounting preferences. The model can capture two contrasting views: the positive view, which links increased indebtedness to financial innovation and thus better insurance, and the negative view, which associates it with consumers' over-borrowing. The author finds that the negative effect is sizable: the calibrated model implies that the relaxation of the borrowing constraint consistent with the observed increase in indebtedness generates a social welfare loss equivalent to a 0.2 percent decrease in per-period consumption. The welfare implication is strikingly different from the model with standard exponential discounting preferences, which implies a welfare gain of 0.6 percent, even though the two models are observationally similar. Naturally, according to the hyperbolic discounting model, there is a welfare gain from restricting consumer borrowing in the current U.S. economy.
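
For reference, hyperbolic discounting in this literature is usually operationalized as quasi-hyperbolic ($\beta$-$\delta$) preferences; a schematic lifetime utility, in standard notation rather than the paper's exact specification, is:

```latex
\[
U_t = u(c_t) + \beta \sum_{s=1}^{\infty} \delta^{s}\, u(c_{t+s}),
\qquad 0 < \beta \le 1,
\]
```

where $\beta < 1$ generates the short-run impatience, and the self-control problem, that the exponential benchmark ($\beta = 1$) lacks.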

09-24: Why Do Markets Freeze? by Philip Bond and Yaron Leitner

Superseded by Working Paper 13-14.

09-23/R: Procyclicality of Capital Requirements in a General Equilibrium Model of Liquidity Dependence by Francisco Covas and Shigeru Fujita

This paper quantifies the procyclical effects of bank capital requirements in a general equilibrium model where financing of capital goods production is subject to an agency problem. At the center of this problem is the interaction between entrepreneurs' moral hazard and liquidity provision by banks, as analyzed by Holmstrom and Tirole (1998). The authors impose capital requirements under the assumption that raising funds through bank equity is more costly than through deposits. They consider the time-varying capital requirement (as in Basel II) as well as the constant requirement (as in Basel I). Importantly, under both regimes, the cost of issuing equity is higher during downturns. Comparing output fluctuations under the Basel I and Basel II economies with those in the no-requirement economy, the authors show that capital requirements significantly contribute to magnifying output fluctuations. The procyclicality is most pronounced around business cycle peaks and troughs.

09-22: Foreclosures and House Price Dynamics: A Quantitative Analysis of the Mortgage Crisis and the Foreclosure Prevention Policy by Satyajit Chatterjee and Burcu Eyigungor

The authors construct a quantitative equilibrium model of the housing market in which an unanticipated increase in the supply of housing triggers mortgage defaults via its effect on house prices. The decline in house prices creates an incentive to increase the consumption of housing space, but leverage makes it costly for homeowners to sell their homes and buy bigger ones (they must absorb large capital losses). Instead, leveraged households find it advantageous to default and rent housing space. Since renters demand less housing space than homeowners, foreclosures are a negative force affecting house prices. The authors explore the possible effects of the government's foreclosure prevention policy in their model. They find that the policy can temporarily reduce foreclosures and shore up house prices.

09-21/R: Securitization and Mortgage Default by Ronel Elul

Superseded by Working Paper 15-15.

09-20: Do Uncertainty and Technology Drive Exchange Rates? by Pablo A. Guerron-Quintana

This paper investigates the extent to which technology and uncertainty contribute to fluctuations in real exchange rates. Using a structural VAR and bilateral exchange rates, the author finds that neutral technology shocks are important contributors to the dynamics of real exchange rates. Investment-specific and uncertainty shocks have a more restricted effect on international prices. All three disturbances cause short-run deviations from uncovered interest rate parity.

09-19: How Much of South Korea's Growth Miracle Can Be Explained by Trade Policy? by Michelle Connolly and Kei-Mu Yi

South Korea's growth miracle has been well documented. A large set of institutional and policy reforms in the early 1960s is thought to have contributed to the country's extraordinary performance. In this paper, the authors assess the importance of one key set of policies: Korea's trade policy reforms, as well as the concurrent GATT tariff reductions. They develop a model of neoclassical growth and trade that highlights two forces by which lower trade barriers can lead to increased per worker GDP: comparative advantage and specialization, and capital accumulation. The authors calibrate the model and simulate the effects of three sets of tariff reductions that occurred between the early 1960s and 1995. Their main finding is that the model can explain up to 32 percent of South Korea's catch-up to the G7 countries in output per worker in the manufacturing sector. The authors find that the effects of the tariff reductions taken together are about twice as large as the sum of each reduction applied individually.

09-18: Money Talks by Marie Hoerova, Cyril Monnet, and Ted Temzelides

The authors study credible information transmission by a benevolent central bank. They consider two possibilities: direct revelation through an announcement versus indirect information transmission through monetary policy. These two ways of transmitting information have very different consequences. Since the objectives of the central bank and those of individual investors are not always aligned, private investors might rationally ignore announcements by the central bank. In contrast, information transmission through changes in the interest rate is costly for the central bank because it creates a distortion, and it is this cost that lends the signal credibility. This induces private investors to rationally take into account information revealed through monetary policy.

09-17: Technological Adaptation, Cities, and New Work by Jeffrey Lin

Where does adaptation to innovation take place? The author presents evidence on the role of agglomeration economies in the application of new knowledge to production. All else equal, workers are more likely to be observed in new work in locations that are initially dense in both college graduates and industry variety. This pattern is consistent with economies of density from the geographic concentration of factors and markets related to technological adaptation. A main contribution is to use a new measure, based on revisions to occupation classifications, to closely characterize cross-sectional differences across U.S. cities in adaptation to technological change. Worker-level results also provide new evidence on the skill bias of recent innovations.

09-16: The Geography of Research and Development Activity in the U.S. by Kristy Buzard and Gerald Carlino

This study details the location patterns of R&D labs in the U.S., but it differs from past studies in a number of ways. First, rather than looking at the geographic concentration of manufacturing firms (e.g., Ellison and Glaeser, 1997; Rosenthal and Strange, 2001; and Duranton and Overman, 2005), the authors consider the spatial concentration of private R&D activity. Second, rather than focusing on the concentration of employment in a given industry, the authors look at the clustering of individual R&D labs by industry. Third, following Duranton and Overman (2005), the authors look for geographic clusters of labs that represent statistically significant departures from spatial randomness using simulation techniques. The authors find that R&D activity for most industries tends to be concentrated in the Northeast corridor, around the Great Lakes, in California's Bay Area, and in southern California. They argue that the high spatial concentration of R&D activity facilitates the exchange of ideas among firms and aids in the creation of new goods and new ways of producing existing goods. They run a regression of an Ellison and Glaeser (1997) style index measuring the spatial concentration of R&D labs on geographic proxies for knowledge spillovers and other characteristics and find evidence that localized knowledge spillovers are important for innovative activity.
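
For reference, the Ellison and Glaeser (1997) style concentration index underlying that regression has the following general form in the standard notation of that literature (the authors adapt it from industry employment to R&D labs):

```latex
\[
\gamma = \frac{G - \left(1 - \sum_i x_i^2\right) H}{\left(1 - \sum_i x_i^2\right)\left(1 - H\right)},
\qquad
G = \sum_i \left(s_i - x_i\right)^2,
\]
```

where $s_i$ is the industry's share of activity in area $i$, $x_i$ is area $i$'s share of aggregate activity, and $H$ is the plant-level Herfindahl index; $\gamma > 0$ indicates concentration beyond what the industry's plant-size structure alone would imply.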

09-15: Economies of Scale and the Size of Exporters by Roc Armenter and Miklos Koren

Exporters are few (less than one-fifth of U.S. manufacturing firms) and are larger than non-exporting firms, with about 4-5 times the total sales per firm. These facts are often cited as support for models with economies of scale and firm heterogeneity, as in Melitz (2003). The authors find that the basic Melitz model cannot simultaneously match the size and share of exporters given the observed distribution of total sales: instead, it predicts exporters to be between 90 and 100 times larger than non-exporters. The model is easy to reconcile with the data, but doing so requires a lot of variation in export status that is independent of firm size. This suggests that economies of scale play only a minor role in determining a firm's export status. The authors show that the augmented model also has markedly different implications in the event of a trade liberalization: most of the adjustment occurs through the intensive margin, and productivity gains due to reallocation are halved.

09-14: The Establishment-Level Behavior of Vacancies and Hiring by Steven J. Davis, R. Jason Faberman, and John C. Haltiwanger

The authors study vacancies, hires, and vacancy yields (the success rate of vacancies in generating hires) in the Job Openings and Labor Turnover Survey (JOLTS), a large representative sample of U.S. employers. They also develop a simple framework that identifies the monthly flow of new vacancies and the job-filling rate for vacant positions, the employer counterpart to the job-finding rate for unemployed workers. The job-filling rate moves counter to employment at the aggregate level but rises steeply with employer growth rates in the cross section. It falls with employer size, rises with the worker turnover rate, and varies by a factor of four across major industry groups. The authors' analysis also indicates that more than 1 in 6 hires occur without the benefit of a vacancy, as defined by JOLTS. These findings provide useful inputs for assessing, developing, and calibrating theoretical models of search, matching, and hiring in the labor market.
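
In the notation we adopt here (the paper's may differ), the vacancy yield for month $t$ is simply hires per reported vacancy:

```latex
\[
\text{yield}_t = \frac{H_t}{V_t},
\]
```

where $H_t$ is hires during the month and $V_t$ is the stock of vacancies; the authors' job-filling rate refines this ratio into a daily rate that also accounts for vacancies posted and filled within the same month.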

09-13: Frequentist Inference in Weakly Identified DSGE Models by Pablo Guerron-Quintana, Atsushi Inoue, and Lutz Kilian

The authors show that in weakly identified models (1) the posterior mode will not be a consistent estimator of the true parameter vector, (2) the posterior distribution will not be Gaussian even asymptotically, and (3) Bayesian credible sets and frequentist confidence sets will not coincide asymptotically. This means that Bayesian DSGE estimation should not be interpreted merely as a convenient device for obtaining asymptotically valid point estimates and confidence sets from the posterior distribution. As an alternative, the authors develop a new class of frequentist confidence sets for structural DSGE model parameters that remains asymptotically valid regardless of the strength of the identification. The proposed set correctly reflects the uncertainty about the structural parameters even when the likelihood is flat, it protects the researcher from spurious inference, and it is asymptotically invariant to the prior in the case of weak identification.

09-12: What Explains the Quantity and Quality of Local Inventive Activity? by Gerald Carlino and Robert Hunt

The authors geocode a data set of patents and their citation counts, including citations from abroad. This allows them to examine both the quantity and quality of local inventions. They also refine their data on local academic R&D to explore effects from different fields of science and sources of R&D funding. Finally, they incorporate data on congressional earmarks of funds for academic R&D.

With one important exception, results using citation-weighted patents are similar to those using unweighted patents. For example, estimates of the returns to density (jobs per square mile) are only slightly changed when using citation-weighted patents as the dependent variable. But estimates of returns to city size (urbanization effects) are quite sensitive to the choice of dependent variable.

Local human capital is the most important determinant of per capita rates of patenting. A 1 percent increase in the adult population with a college degree increases the local patenting rate by about 1 percent.

With few exceptions, there is little variation across fields of science in the contribution of academic R&D to patenting rates. The exceptions are computer and life sciences, where the effects are smaller. There is greater variation in the contribution of R&D funded by different sources — academic R&D funded by the federal government generates smaller increases in patenting rates than R&D funded by the university itself. This effect is somewhat stronger for federally funded applied R&D than for basic R&D. The authors also find small negative effects for cities with greater exposure to academic R&D allocated by congressional earmarks.

They discuss the implications of these results for policy and future research.

09-11: Intangible Assets and National Income Accounting: Measuring a Scientific Revolution by Leonard I. Nakamura

In this paper, the author relates the measurement of intangibles to the project of measuring the sources of growth. He focuses on three related and difficult areas of the measurement of national income: the measurement of new goods, the deflation of intangible investment, and the divergence between the social and private valuations of intangible assets. The author argues that the economic theory and practice underlying measurement of these items is currently controversial and incomplete, and he points toward how concretely to move forward.

09-10: Inducing Agents to Report Hidden Trades: A Theory of an Intermediary by Yaron Leitner

Superseded by Working Paper 10-28/R.

09-9: The Long and Large Decline in State Employment Growth Volatility by Gerald Carlino, Robert DeFina, and Keith Sill

This study documents a general decline in the volatility of employment growth during the period 1956 to 2005 and examines its possible sources. Estimates from a state-level pooled cross-section/time-series model indicate that aggregate and state-level factors each account for an important share of the total explained variation in state-level volatility. Specifically, state-level factors have contributed as much as 16 percent, while aggregate factors are found to account for up to 46 percent of the variation. With regard to state-level factors, the share of state total employment in manufacturing and state banking deregulation each contributed significantly to fluctuations in volatility. Aggregate factors that are quantitatively important in accounting for volatility include monetary policy, the state of the national business cycle, and oil-price shocks.

09-8: Sticky Prices Versus Monetary Frictions: An Estimation of Policy Trade-offs by S. Boragan Aruoba and Frank Schorfheide

The authors develop a two-sector monetary model with a centralized and a decentralized market. Activities in the centralized market resemble those in a standard New Keynesian economy with price rigidities. In the decentralized market, agents engage in bilateral exchanges for which money is essential. The model is estimated and evaluated based on postwar U.S. data. The authors document its money demand properties and determine the optimal long-run inflation rate that trades off the New Keynesian distortion against the distortion caused by taxing money and hence transactions in the decentralized market. They find that target rates of -1 percent or less are desirable, which contrasts with policy recommendations derived from a cashless New Keynesian model.

09-7: Housing Over Time and Over the Life Cycle: A Structural Estimation by Wenli Li, Haiyong Liu, and Rui Yao

Superseded by Working Paper 15-04.

09-6: Inflation Dynamics with Labour Market Matching: Assessing Alternative Specifications by Kai Christoffel, James Costain, Gregory de Walque, Keith Kuester, Tobias Linzert, Stephen Millard, and Olivier Pierrard

This paper reviews recent approaches to modeling the labour market and assesses their implications for inflation dynamics through both their effect on marginal cost and their effect on price-setting behavior. In a search and matching environment, the authors consider the following modeling setups: right-to-manage bargaining vs. efficient bargaining, wage stickiness in new and existing matches, interactions at the firm level between price and wage setting, alternative forms of hiring frictions, on-the-job search, and endogenous job separation. They find that most specifications imply too little real rigidity and, consequently, inflation that is too volatile. Models with wage stickiness and right-to-manage bargaining or with firm-specific labour emerge as the most promising candidates.

09-5: Introduction to Price and Productivity Measurement for Housing by Bert M. Balk, W. Erwin Diewert, and Alice O. Nakamura

This paper provides a brief introduction to a proposed new opportunity cost treatment of owner-occupied housing in measures of inflation for the United States. In addition, the paper introduces, and provides links to, a collection of nine other papers that discuss various aspects of the treatment of owner-occupied housing in measures of inflation for a number of nations, including Canada, Germany, Iceland, and the United States.

09-4: Accounting for Housing in a CPI by W. Erwin Diewert and Alice O. Nakamura

In this paper, the authors take stock of how statistical agencies in different nations currently account for housing in their consumer price indexes (CPIs). The rental equivalence and user cost approaches have been favorites of economists; both can be derived from the fundamental equation of capital theory. Concerns about these approaches are taken up. The authors go on to argue that an opportunity cost approach is the correct theoretical framework for accounting for owner-occupied housing (OOH) in a CPI. This approach, first mentioned in a 2006 OECD paper by Diewert, is developed more fully here. The authors explore the relationship of this new approach to the usual rental equivalence and user cost approaches. The new approach leads to an owner-occupied housing opportunity cost (OOHOC) index that is a weighted average of the rental and the financial opportunity costs.
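
Schematically, the index described in the last sentence can be written as (our notation, not the authors'):

```latex
\[
\mathrm{OOHOC}_t = w\, R_t + (1 - w)\, F_t,
\]
```

where $R_t$ is the rental opportunity cost of the dwelling, $F_t$ is the financial (user cost) opportunity cost, and $w$ is the weight attached to the rental alternative.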

The authors call attention to the need for more direct measures of inflation for owner-occupied housing services. In a 2007 paper, Mishkin argues that central banks with supervisory authority can reduce the likelihood of bubbles forming through prudential supervision of the financial system. However, the official mandates of central banks typically focus on managing measured inflation. Barack Obama has pledged to give the Federal Reserve greater oversight of a broader array of financial institutions. The authors believe that an important addition to this pledge would be to give the BLS, BEA, and Census Bureau the funds and the mandate to aggressively develop improved measures of inflation for owner-occupied housing services. Central banks and national governments have many policy instruments at their disposal that they could use, in the future, to control inflation in housing markets. What they lack are appropriate measures of inflation in the market for owner-occupied housing services. The proposed new opportunity cost measure for accounting for OOH in a CPI will not be simple or cheap to implement. However, the current financial crisis makes it clear that the costs of not having an adequate measure of inflation in the cost of owner-occupied housing services can be far greater.

09-3/R: The Dark Side of Bank Wholesale Funding by Rocco Huang and Lev Ratnovski

Banks increasingly use short-term wholesale funds to supplement traditional retail deposits. The existing literature mainly points to the "bright side" of wholesale funding: Sophisticated financiers can monitor banks, disciplining bad ones while refinancing good ones. This paper models a "dark side" of wholesale funding. In an environment with a costless but noisy public signal on bank project quality, short-term wholesale financiers have lower incentives to conduct costly monitoring and instead may withdraw based on negative public signals, triggering inefficient liquidations. Comparative statics suggest that such distortions of incentives are smaller when public signals are less relevant and project liquidation costs are higher, e.g., when banks hold mostly relationship-based small business loans.

09-2: Maturity, Indebtedness, and Default Risk by Satyajit Chatterjee and Burcu Eyigungor

Superseded by Working Paper 10-12.

09-1: The Role of Labor Markets for Euro Area Monetary Policy by Kai Christoffel, Keith Kuester, and Tobias Linzert

In this paper, the authors explore the role of labor markets for monetary policy in the euro area in a New Keynesian model in which labor markets are characterized by search and matching frictions. They first investigate to what extent a more flexible labor market would alter the business cycle behavior and the transmission of monetary policy. They find that while a lower degree of wage rigidity makes monetary policy more effective, i.e., a monetary policy shock transmits faster onto inflation, the importance of other labor market rigidities for the transmission of shocks is rather limited. Second, having estimated the model by Bayesian techniques, they analyze to what extent labor market shocks, such as disturbances in the vacancy posting process, shocks to the separation rate, and variations in bargaining power, are important determinants of business cycle fluctuations. The authors' results point primarily toward disturbances in the bargaining process as a significant contributor to inflation and output fluctuations. In sum, the paper supports current central bank practice, which appears to put considerable effort into monitoring euro area wage dynamics and to treat some of the other labor market information as less important for monetary policy.