There are two ways for a venture capital (VC) firm to enter a new market: initiate a new deal or form a syndicate with an incumbent. Both types of entry are extensively observed in the data. In this paper, the author examines (i) the causes of syndication between entrant and incumbent VC firms, (ii) the impact of entry on VC contract terms and the survival rates of VC-backed start-up companies, and (iii) the effect of syndication between entrant and incumbent VC firms on competition in the VC market and the outcomes of incumbent-backed ventures. Developing a theoretical model featuring endogenous matching and coalition formation in the VC market, the author shows that an incumbent VC firm may strategically form syndicates with entrants to maintain its bargaining power, and that an incumbent is less likely to syndicate with entrants as its expertise increases. Entry increases the likelihood of survival for incumbent-backed start-up companies, while syndication between entrants and incumbents dampens this competitive effect. Using a data set of VC-backed investments in the U.S. between 1990 and 2006, the author finds empirical evidence consistent with the theoretical predictions. The estimation results remain robust after controlling for the endogeneity of entry and syndication.
(719 KB, 56 pages)
The authors present theory and evidence highlighting the role of natural amenities in neighborhood dynamics, suburbanization, and variation across cities in the persistence of the spatial distribution of income. The authors' model generates three predictions that they confirm using a novel database of consistent-boundary neighborhoods in U.S. metropolitan areas, 1880–2010, and spatial data for natural features such as coastlines and hills. First, persistent natural amenities anchor neighborhoods to high incomes over time. Second, downtown neighborhoods in coastal cities were less susceptible to the suburbanization of income in the mid-20th century. Third, naturally heterogeneous cities exhibit spatial distributions of income that are dynamically persistent.
(3 MB, 54 pages)
The authors develop a new class of nonlinear time-series models to identify nonlinearities in the data and to evaluate nonlinear DSGE models. U.S. output growth and the federal funds rate display nonlinear conditional mean dynamics, while inflation and nominal wage growth feature conditional heteroskedasticity. The authors estimate a DSGE model with asymmetric wage/price adjustment costs and use predictive checks to assess its ability to account for these nonlinearities. While the model matches the nonlinear inflation and wage dynamics, thanks to the estimated downward wage/price rigidities, these nonlinearities do not spill over to output growth or the interest rate.
(935.6 KB, 115 pages)
The authors quantify the fiscal multipliers in response to the American Recovery and Reinvestment Act (ARRA) of 2009. They extend the benchmark Smets-Wouters (2007) New Keynesian model, allowing for credit-constrained households, the zero lower bound, government capital, and distortionary taxation. The posterior yields modestly positive short-run multipliers around 0.53 and modestly negative long-run multipliers around -0.36. The authors explain the central empirical findings with the help of a simple three equation New Keynesian model with sticky wages and credit-constrained households.
(935.6 KB, 115 pages)
New businesses are important for job creation and contributed more than proportionally to the employment expansion of the 1990s and to the decline of employment after the 2007 recession. This paper provides a framework for analyzing the determinants of business creation in a world where new business owners are exposed to idiosyncratic risk because of initial imperfect diversification, and uses this framework to analyze how entrepreneurial risk has changed over time and how this change has affected employment in the U.S. Conditions are provided under which entrepreneurial risk can be identified using micro data on the size distribution of new businesses and their exit rates. The baseline model considers both upside and downside risk. Applied to U.S. time-series data, the structural estimates suggest that higher upside risk explains much of the high job creation in the late 1990s. Time variation in risk explains around 40% of the variation in employment at new businesses. Reduced-form results show that this relationship is strongest in IT-related industries. When the model is restricted to a single risk factor, its explanatory power for employment drops by 25% to 50% compared with the baseline estimates.
(1.7 MB, 97 pages)
The authors are motivated by four stylized facts computed for emerging and developed economies: (i) business cycle movements are wider in emerging countries; (ii) economies in emerging countries experience greater economic policy uncertainty; (iii) emerging economies are more polarized and less politically stable; and (iv) economic policy uncertainty is positively related to political polarization. The authors show that a standard real business cycle (RBC) model augmented to incorporate political polarization, a 'polarized business cycle' (PBC) model, is consistent with these facts. The authors' main hypothesis is that fluctuations in economic variables are not only caused by innovations to productivity, as traditionally assumed in macroeconomic models, but also by shifts in political ideology. Switches between left-wing and right-wing governments generate uncertainty about the returns to private investment, and this affects real economic outcomes. Since emerging economies are more polarized than developed ones, the effects of political turnover are more pronounced. This translates into higher economic policy uncertainty and amplifies business cycles. The authors derive their results analytically by fully characterizing the long-run distribution of economic and fiscal variables. They then analyze the effect of a permanent increase in polarization on PBCs.
(1.25 MB, 44 pages)
This paper studies the effects of asymmetries in re-election probabilities across parties on public policy and their subsequent propagation to the economy. The struggle between groups that disagree on targeted public spending (e.g., pork) results in governments being endogenously short-sighted: Systematic underinvestment in infrastructure and overspending on targeted goods arise, above and beyond what is observed in symmetric environments. Because the party enjoying an electoral advantage is less short-sighted, it devotes a larger proportion of revenues to productive investment. Hence, political turnover induces economic fluctuations in an otherwise deterministic environment. The author characterizes analytically the long-run distribution of allocations and shows that output increases with electoral advantage, despite the fact that governments expand. Volatility is non-monotonic in electoral advantage and is an additional source of inefficiency. Using panel data from U.S. states, the author confirms these findings.
(1.1 MB, 45 pages)
This paper studies information aggregation in financial markets with recurrent investor exit and entry. A dynamic general equilibrium model of asset trading with private information and collateral constraints is considered. Investors differ in their aversion to Knightian uncertainty: When uncertainty is high, some investors exit the market. Since exiting investors' information is not fully revealed by prices, conditional return volatility and risk premia both increase. Data on institutional investors' holdings of individual stocks show that investor exits indeed move negatively with price informativeness. The model also implies that exit is more likely when wealth is more concentrated in the hands of less uncertainty-averse investors. The model thus predicts less informative prices toward the end of a long boom, as seen in the data. Moreover, economies with looser collateral constraints should see more volatility due to exit and partial revelation. Higher capital requirements can improve welfare by inducing more information revelation by prices.
(792 KB, 46 pages)
American politics has become increasingly polarized in recent decades. To the extent that political polarization introduces uncertainty about economic policy, this pattern may have adversely affected the economy. According to existing theories, a rise in the volatility of fiscal shocks faced by individuals should result in a decline in economic activity. Moreover, if polarization is high around election dates, businesses and households may be induced to delay decisions that involve high reversibility costs (such as investment or hiring under search costs). Testing these theories has been challenging given the low frequency at which existing polarization measures have been computed (in most studies, the series is available only once every two years). In this paper, the author provides a novel high-frequency measure of polarization, the political polarization index (PPI). The measure is constructed monthly for the period 1981–2013 using a search-based approach. The author documents that while the PPI fluctuates around a constant mean for most of the sample period prior to 2007, it has exhibited a steep increasing trend since the Great Recession. Evaluating the effects of this increase using a simple VAR, the author finds that an innovation to polarization significantly discourages investment, output, and employment. Moreover, these declines are persistent, which may help explain the slow recovery observed since the 2007 recession ended.
(2.0 MB, 24 pages)
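The VAR exercise described in the abstract above can be illustrated with a minimal impulse-response computation. The coefficient matrix below is purely hypothetical, not the author's estimates; it only shows how a persistent polarization innovation propagates to a second variable such as investment in a VAR(1).

```python
def var1_irf(A, shock, horizons=12):
    """Impulse responses of a VAR(1), x_t = A x_{t-1} + u_t:
       the response at horizon h to an impulse u_0 = shock is A^h @ shock."""
    resp = [list(shock)]
    cur = list(shock)
    for _ in range(horizons):
        # one-step propagation: cur <- A @ cur
        cur = [sum(a_ij * c for a_ij, c in zip(row, cur)) for row in A]
        resp.append(list(cur))
    return resp

# Stylized 2-variable system ordered [polarization, investment]; the
# negative off-diagonal entry makes a polarization shock depress
# investment, with the effect decaying only gradually.
A = [[0.9, 0.0],
     [-0.3, 0.8]]
irf = var1_irf(A, [1.0, 0.0])
```

The sign and persistence of `irf[h][1]` across horizons is the kind of object the author reads off the estimated VAR.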
Superseded by Working Paper 14-20.
The authors develop a nonlinear state-space model that captures the joint dynamics of consumption, dividend growth, and asset returns. Building on Bansal and Yaron (2004), their model consists of an economy containing a common predictable component for consumption and dividend growth and multiple stochastic volatility processes. The estimation is based on annual consumption data from 1929 to 1959, monthly consumption data after 1959, and monthly asset return data throughout. The authors maximize the span of the sample to recover the predictable component and use high-frequency data, whenever available, to efficiently identify the volatility processes. Their Bayesian estimation provides strong evidence for a small predictable component in consumption growth (even if asset return data are omitted from the estimation). Three independent volatility processes capture different frequency dynamics; their measurement error specification implies that consumption is measured much more precisely at an annual than monthly frequency; and the estimated model is able to capture key asset-pricing facts of the data.
(975 KB, 53 pages)
Superseded by Working Paper 15-23.
Restricting the supply of new urban land is commonly thought to raise the value of existing urban land. This paper questions that view. The authors develop a tractable production-externality-based circular city model in which firms and workers choose locations and the intensity of land use. Consistent with evidence, the model implies exponentially decaying density and price gradients. For plausible parameter values, an increase in the demand for urban land can lead to a smaller increase in urban rents in cities that cannot expand physically, because such cities are less able to exploit the positive external effect of greater employment density.
Supersedes Working Paper 12-25.
(800 KB, 50 pages)
Rapid house-price depreciation and rising unemployment were the main drivers of the huge increase in mortgage default during the downturn years of 2007 to 2010. However, mortgage default was also partly driven by an increased reliance on alternative mortgage products such as pay-option ARMs and interest-only mortgages, which allow the borrower to defer principal amortization. The goal of this paper is to better understand the forces that spurred use of alternative mortgages during the housing boom and the resulting impact on default patterns, relying on a unifying conceptual framework to guide the empirical work.
The conceptual framework allows borrowers to choose the extent of mortgage "backloading," the postponement of loan repayment through various mechanisms that constitutes a main feature of alternative mortgages. The model shows that, when future house-price expectations become more favorable, reducing default concerns, mortgage choices shift toward alternative contracts. This prediction is confirmed by empirical evidence showing that an increase in past house-price appreciation, which captures more favorable expectations for the future, raises the market share of alternative mortgages. In addition, using a proportional-hazard default model, the paper tests the fundamental presumption that backloaded mortgages are more likely to default, finding support for this view.
(874 KB, 33 pages)
In the last ten years there has been an explosion of empirical work examining price setting behavior at the micro level. The work has in turn challenged existing macro models that attempt to explain monetary nonneutrality, because these models are generally at odds with much of the micro price data. In response, economists have developed a second generation of sticky-price models that are state dependent and that include both fixed costs of price adjustment and idiosyncratic shocks. Nonetheless, some ambiguity remains about the extent of monetary nonneutrality that can be attributed to costly price adjustment. The authors' paper takes a step toward eliminating that ambiguity.
(997 KB, 46 pages)
The authors provide a new way to filter U.S. inflation into trend and cycle components, based on extracting long-run forecasts from the Survey of Professional Forecasters. They operate the Kalman filter in reverse: beginning with observed forecasts, they estimate parameters and then extract the stochastic trend in inflation. The trend-cycle model with unobserved components is consistent with numerous studies of U.S. inflation history and is of interest partly because the trend may be viewed as the Fed's evolving inflation target or long-horizon expected inflation. The sluggish reporting attributed to forecasters is consistent with evidence on mean forecast errors. There is considerable evidence of inflation-gap persistence and some evidence of implicit sticky information. But statistical tests show that these two widely used perspectives on U.S. inflation forecasts, the unobserved-components model and the sticky-information model, cannot be reconciled.
(497 KB, 43 pages)
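The unobserved-components setup in the abstract above can be sketched with a one-dimensional Kalman filter for a local-level model of inflation. This is a generic textbook filter, not the authors' reverse-engineering procedure for survey forecasts; the noise variances `q` and `r` are free illustrative parameters.

```python
def local_level_filter(y, q, r, a0=0.0, p0=10.0):
    """Kalman filter for the local-level (trend-plus-noise) model:
         y_t   = tau_t + e_t,        e_t ~ N(0, r)   (inflation gap/noise)
         tau_t = tau_{t-1} + u_t,    u_t ~ N(0, q)   (stochastic trend)
       Returns the filtered trend estimates tau_{t|t}."""
    a, p = a0, p0                   # prior mean and variance of the trend
    trend = []
    for obs in y:
        p_pred = p + q              # predict: trend variance grows by q
        k = p_pred / (p_pred + r)   # Kalman gain
        a = a + k * (obs - a)       # update with the forecast error
        p = (1.0 - k) * p_pred
        trend.append(a)
    return trend
```

With small `q` relative to `r`, the filtered trend moves sluggishly, which is the sense in which the extracted series behaves like long-horizon expected inflation.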
The authors study the source and consequences of sluggish export dynamics in emerging markets following large devaluations. They document two main features of exports that are puzzling for standard trade models. First, given the change in relative prices, exports tend to grow gradually following a devaluation. Second, high interest rates tend to suppress exports. To address these features of export dynamics, the authors embed a model of endogenous export participation due to sunk and per period export costs into an otherwise standard small open economy. In response to shocks to productivity, the interest rate, and the discount factor, the authors find the model can capture the salient features of export dynamics documented. At the aggregate level, the features giving rise to sluggish exports lead to more gradual net export reversals, sharper contractions and recoveries in output, and endogenous stagnation in labor productivity.
(534 KB, 53 pages)
Superseded by Working Paper 15-20.
The unique capital structure of commercial banking, in which production is funded with demandable debt that participates in the economy's payments system, affects various aspects of banking. It shapes banks' comparative advantage in providing financial products and services to informationally opaque customers, their ability to diversify credit and liquidity risk, and how they are regulated, including the need to obtain a charter to operate and the explicit and implicit federal guarantees of bank liabilities that reduce the probability of bank runs. These aspects of banking affect a bank's choice of risk versus expected return, which, in turn, affects bank performance. Banks have an incentive to reduce risk to protect the valuable charter from episodes of financial distress, but also an incentive to increase risk to exploit the cost-of-funds subsidy of mispriced deposit insurance; the strength of these contrasting incentives varies with bank size. Measuring the performance of banks and its relationship to size requires untangling cost and profit from decisions about risk versus expected return, because both cost and profit are functions of endogenous risk-taking. This chapter gives an overview of two general empirical approaches to measuring bank performance and discusses some applications of these approaches found in the literature. One application explains how the better diversification available at a larger scale of operations generates scale economies that are obscured by higher levels of risk-taking. Studies of banking cost that ignore endogenous risk-taking find little evidence of scale economies at the largest banks, while those that control for this risk-taking find large scale economies at the largest banks, evidence with important implications for regulation.
(445 KB, 36 pages)
An important inefficiency in sovereign debt markets is debt dilution, wherein sovereigns ignore the adverse impact of new debt on the value of existing debt and, consequently, borrow too much and default too frequently. A widely proposed remedy is the inclusion of a seniority clause in sovereign debt contracts: Creditors who lent first have priority in any restructuring proceedings. The authors incorporate seniority in a quantitatively realistic model of sovereign debt and find that seniority is quite effective in mitigating the dilution problem. The authors also show theoretically that seniority cannot be fully effective unless the costs of debt restructuring are zero.
(653 KB, 46 pages)
This paper studies the dynamics of a New Keynesian dynamic stochastic general equilibrium (DSGE) model near the zero lower bound (ZLB) on nominal interest rates. In addition to the standard targeted-inflation equilibrium, the authors consider a deflation equilibrium as well as a Markov sunspot equilibrium that switches between a targeted-inflation and a deflation regime. The authors use the particle filter to estimate the state of the U.S. economy during and after the 2008–09 recession under the assumptions that the U.S. economy has been in either the targeted-inflation or the sunspot equilibrium. The authors consider a combination of fiscal policy (calibrated to the American Recovery and Reinvestment Act) and monetary policy (that tries to keep interest rates near zero) and compute government spending multipliers. Ex-ante multipliers (cumulative over one year) under the targeted-inflation regime are around 0.9. A monetary policy that keeps interest rates at zero can raise the multiplier to 1.7. The ex-post (conditioning on the realized shocks in 2009–11) multiplier is estimated to be 1.3. Conditional on the sunspot equilibrium, the multipliers are generally smaller and the scope for conventional expansionary monetary policy is severely limited.
(715 KB, 66 pages)
The author studies the effectiveness of bank coalition formation in response to an external aggregate shock that may disrupt the payment mechanism and trading activity. He shows that a specific type of bank coalition, a joint-liability arrangement, allows member banks to build a capital buffer that absorbs the effects of an external shock. In particular, it allows society to completely prevent any disruption to trading activity caused by a temporary drop in the aggregate value of banking assets, at least when the shock is not too big. If the shock is relatively large, a bank coalition will be unable to completely prevent a disruption in trading activity, although it will be able to substantially mitigate the effects of the shock. Thus, a private bank coalition of the kind considered in this paper can be an effective means of preventing significant contractions in trading activity.
(411 KB, 41 pages)
Superseded by Working Paper 14-27.
Superseded by Working Paper 15-10.
Congestion pricing has long been held up by economists as a panacea for the problems associated with ever-increasing traffic congestion in urban areas. In addition, the concept has gained traction as a viable solution among planners, policymakers, and the general public. While congestion costs in urban areas are significant and clearly represent a negative externality, economists also recognize the advantages of density in the form of positive agglomeration externalities. The long-run equilibrium outcomes in economies with multiple correlated, but offsetting, externalities have yet to be fully explored in the literature. To this end, the author develops a spatial equilibrium model of urban structure that includes both congestion costs and agglomeration externalities. The author then estimates the structural parameters of the model using a computational solution algorithm, matching the spatial distribution of employment, population, land use, land rents, and commute times in the data. Policy simulations based on the estimates suggest that naive optimal congestion pricing (i.e., pricing that ignores the offsetting agglomeration benefits of density) can lead to net negative economic outcomes.
(1.35 MB, 55 pages)
Superseded by Working Paper 14-25.
The authors examine investors' reactions to announcements of large capital infusions by U.S. financial institutions (FIs) from 2000 to 2009. These infusions include private market infusions (seasoned equity offerings (SEOs)) as well as injections of government capital under the Troubled Asset Relief Program (TARP). The sample period covers both business cycle expansions and contractions, as well as the recent financial crisis. The authors present evidence on the factors affecting FIs' decisions to raise capital, the determinants of investor reactions, and the post-infusion risk-taking of the recipients, along with a sample of matching FIs. Investors reacted negatively to the news of private market SEOs by FIs, both in the immediate term (e.g., the two days surrounding the announcement) and over the subsequent year, but positively to TARP injections. Reactions differed depending on the characteristics of the FIs and the stage of the business cycle. More financially constrained institutions were more likely to have raised capital through private market offerings during the period prior to TARP, and firms receiving a TARP injection tended to be riskier and more levered. TARP recipients appeared to finance an increase in lending (as a share of assets) with more stable financing sources such as core deposits, which lowered their liquidity risk. However, the authors find no evidence that banks' capital adequacy increased after the capital injections.
Supersedes Working Paper 11-46.
(549 KB, 44 pages)
A well-documented property of the Beveridge-Nelson trend-cycle decomposition is the perfect negative correlation between trend and cycle innovations. The authors show how this property may be consistent with a structural model in which trend shocks enter the cycle, or cyclical shocks enter the trend, and that identification restrictions are necessary to make this structural distinction. A reduced-form unrestricted version, such as that of Morley, Nelson, and Zivot (2003), is compatible with either option but cannot distinguish which is relevant. The authors discuss economic interpretations and implications using U.S. real GDP data.
(563 KB, 34 pages)
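The Beveridge-Nelson mechanics behind the abstract above are easiest to see in the simplest case, an AR(1) in growth rates; this is a textbook illustration, not the authors' structural model. Note that the trend and cycle innovations are proportional to the same forecast error with opposite signs, which is exactly the perfect negative correlation the abstract discusses.

```python
def bn_decompose(y, phi, mu):
    """Beveridge-Nelson decomposition when the growth rate of y follows
       an AR(1):  dy_t = mu + phi*(dy_{t-1} - mu) + eps_t.
       The BN trend is the long-horizon forecast net of deterministic
       drift:  tau_t = y_t + phi/(1-phi) * (dy_t - mu),
       and the cycle is c_t = y_t - tau_t.  Returns (trend, cycle)
       aligned with y[1:]."""
    trend, cycle = [], []
    for t in range(1, len(y)):
        dy = y[t] - y[t - 1]
        adj = phi / (1.0 - phi) * (dy - mu)   # expected remaining growth
        trend.append(y[t] + adj)
        cycle.append(-adj)                    # cycle = y_t - trend_t
    return trend, cycle
```

When `phi = 0`, growth is unpredictable, the trend equals the series itself, and the cycle vanishes, which is the random-walk benchmark.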
In this paper the authors use credit rating data from two large Swedish banks to elicit evidence on banks’ loan monitoring ability. For these banks, their tests reveal that banks’ credit ratings indeed include valuable private information from monitoring, as theory suggests. However, their tests also reveal that publicly available information from a credit bureau is not efficiently impounded in the bank ratings: The credit bureau ratings not only predict future movements in the bank ratings but also improve forecasts of bankruptcy and loan default. The authors investigate possible explanations for these findings. Their results are consistent with bank loan officers placing too much weight on their private information, a form of overconfidence. To the extent that overconfidence results in placing too much weight on private information, risk analyses of the bank loan portfolios in the authors' data could be improved by combining the bank credit ratings and public credit bureau ratings.
The methods the authors use represent a new basket of straightforward techniques that enable both financial institutions and regulators to assess the performance of credit rating systems.
Supersedes Working Paper 10-21.
(635 KB, 60 pages)
When markets freeze, not only are gains from trade left unrealized, but the process of information production through prices, or price discovery, is disrupted as well. Though this latter effect has received much less attention than the former, it constitutes an important source of inefficiency during times of crisis. The authors provide a formal model of price discovery and use it to study a government program designed explicitly to restore the process of information production in frozen markets. This program, which provided buyers with partial insurance against acquiring low-quality assets, reveals a fundamental trade-off for policymakers: while some insurance encourages buyers to bid for assets when they otherwise would not, thus promoting price discovery, too much insurance erodes the informational content of these bids, which hurts price discovery.
(550 KB, 43 pages)
The authors propose a novel method to estimate dynamic equilibrium models with stochastic volatility. First, they characterize the properties of the solution to this class of models. Second, the authors take advantage of the results about the structure of the solution to build a sequential Monte Carlo algorithm to evaluate the likelihood function of the model. The approach, which exploits the profusion of shocks in stochastic volatility models, is versatile and computationally tractable even in large-scale models, such as those often employed by policy-making institutions. As an application, the authors use their algorithm and Bayesian methods to estimate a business cycle model of the U.S. economy with both stochastic volatility and parameter drifting in monetary policy. Their application shows the importance of stochastic volatility in accounting for the dynamics of the data.
(707 KB, 72 pages)
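The sequential Monte Carlo machinery described above can be illustrated with a bootstrap particle filter for a toy linear state-space model. The real application involves a full DSGE model with stochastic volatility; this stripped-down sketch only shows how the likelihood is accumulated from particle weights.

```python
import math
import random

def bootstrap_pf_loglik(y, sigma_x, sigma_y, n_particles=1000, seed=0):
    """Bootstrap particle filter log-likelihood for the toy model
         x_t = x_{t-1} + sigma_x * eta_t,   y_t = x_t + sigma_y * eps_t,
       with eta, eps standard normal.  Each step: propagate particles,
       weight by the measurement density, accumulate the log of the
       average weight as the likelihood increment, then resample."""
    rng = random.Random(seed)
    parts = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    loglik = 0.0
    norm = -0.5 * math.log(2 * math.pi * sigma_y ** 2)
    for obs in y:
        # propagate through the state transition
        parts = [x + rng.gauss(0.0, sigma_x) for x in parts]
        # incremental log-weights: measurement density p(y_t | x_t)
        logw = [norm - 0.5 * ((obs - x) / sigma_y) ** 2 for x in parts]
        m = max(logw)                       # log-sum-exp for stability
        w = [math.exp(lw - m) for lw in logw]
        loglik += m + math.log(sum(w) / n_particles)
        # multinomial resampling proportional to the weights
        parts = rng.choices(parts, weights=w, k=n_particles)
    return loglik
```

The resulting log-likelihood is an unbiased simulation estimate and can be embedded in a Bayesian sampler, which is the spirit of the authors' algorithm.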
How does physical capital accumulation affect the decision to default in developing small open economies? The authors find that, conditional on a level of foreign indebtedness, more capital improves the sovereign’s ability to meet its obligations, reducing the likelihood of default and the risk premium. This effect, however, is diminishing in the stock of capital because capital also tames the severity of the contraction following default, making autarky more appealing. Access to long-term debt and costly capital adjustment are crucial for matching business cycles. Their quantitative model delivers default episodes that mimic those observed in the data.
(524 KB, 36 pages)
Banks supply payment services that underpin the smooth operation of the economy. To ensure an efficient payment system, it is important to maintain competition among payment service providers, but the data available to gauge the degree of competition are quite limited. The authors propose and implement a frontier-based method to assess relative competition in bank-provided payment services. Banks with more than $1 billion in assets account for around 90 percent of U.S. banking assets, and among them it is banks with around $4 billion to $7 billion in assets, not the very largest, that turn out to be both the most and the least competitive in payment services.
(340 KB, 28 pages)
The authors provide a new and superior measure of U.S. GDP, obtained by applying optimal signal-extraction techniques to the (noisy) expenditure-side and income-side estimates. Its properties — particularly as regards serial correlation — differ markedly from those of the standard expenditure-side measure and lead to substantially revised views regarding the properties of GDP.
(655 KB, 36 pages)
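The signal-extraction idea in the abstract above can be conveyed in its simplest static form: a precision-weighted average of the two noisy measures. The paper's actual method uses optimal filtering in a dynamic model; the variances below are hypothetical and serve only to show the weighting intuition.

```python
def combine_measures(gdp_e, gdp_i, var_e, var_i):
    """Precision-weighted combination of two noisy measurements of the
       same latent series (expenditure- and income-side GDP estimates):
       the weight on each measure is proportional to the inverse of its
       measurement-error variance."""
    w_e = (1.0 / var_e) / (1.0 / var_e + 1.0 / var_i)
    return [w_e * e + (1.0 - w_e) * i for e, i in zip(gdp_e, gdp_i)]
```

Equal error variances yield a simple average; as one measure becomes much more precise, the combined estimate collapses toward it.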
In this study, the authors make use of a massive database of mortgage defaults to estimate REO liquidation timelines and time-related costs resulting from the recent post-crisis interventions in the mortgage market and the freezing of foreclosures due to "robo-signing" revelations. The cost of delay, estimated by comparing today's time-related costs to those before the start of the financial crisis, is eight percentage points, with enormous variation among states. While costs are estimated to be four percentage points higher in statutory foreclosure states, they are estimated to be 13 percentage points higher in judicial foreclosure states and 19 percentage points higher in the highest-cost state, New York. They discuss the policy implications of these extraordinary increases in time-related costs, including recent actions by the GSEs to raise their guarantee fees 15-30 basis points in five high-cost judicial states. Combined with evidence that foreclosure delays do not improve outcomes for borrowers and that increased delays can have large negative externalities in neighborhoods, the weight of the evidence is that current foreclosure practices merit the urgent attention of policymakers.
(666 KB, 29 pages)
The authors study trade between an informed seller and an uninformed buyer who have existing inventories of assets similar to those being traded. They show that these inventories may induce the buyer to increase the price (a “run-up”) but may also make trade impossible (a “freeze”) and hamper information dissemination. Competition may amplify the run-up by inducing buyers to purchase assets at a loss to prevent competitors from purchasing at lower prices and releasing bad news about inventory values. In a dynamic extension, the authors show that a market freeze may be preceded by high prices. Finally, they discuss empirical and policy implications.
Supersedes Working Paper 12-8.
(638 KB, 64 pages)
The Great Recession focused attention on large financial institutions and systemic risk. The authors investigate whether large size provides any cost advantages to the economy and, if so, whether these cost advantages are due to technological scale economies or too-big-to-fail subsidies. Estimating scale economies is made more complex by risk-taking. Better diversification resulting from larger scale generates scale economies but also incentives to take more risk. When this additional risk-taking adds to cost, it can obscure the underlying scale economies and engender misleading econometric estimates of them. Using data pre- and post-crisis, they estimate scale economies using two production models. The standard model ignores endogenous risk-taking and finds little evidence of scale economies. The model accounting for managerial risk preferences and endogenous risk-taking finds large scale economies, which are not driven by too-big-to-fail considerations. The authors evaluate the costs and competitive implications of breaking up the largest banks into smaller banks.
(500 KB, 50 pages)
In the data, most consumer defaults on unsecured credit are informal and the lending industry devotes significant resources to debt collection. The authors develop a new theory of credit card lending that takes these two features into account. The two key elements of their model are moral hazard and costly state verification that relies on the use of information technology. They show that the model gives rise to a novel channel through which IT progress can affect outcomes in the credit markets, and argue that this channel can be critical to understand the trends associated with the rapid expansion of credit card borrowing in the 1980s and over the 1990s. Independently, the mechanism of the model helps reconcile high levels of defaults and indebtedness observed in the US data.
(566 KB, 50 pages)
Using a sample of the 48 mainland U.S. states for the period 1973-2009, the authors study the ability of U.S. states to expand their own state employment through the use of state deficit policies. The analysis allows for the facts that U.S. states are part of a wider monetary and economic union with free factor mobility across all states and that state residents and firms may purchase goods from "neighboring" states. Those purchases may generate economic spillovers across neighbors. Estimates suggest that states can increase their own state employment by increasing their own deficits. There is evidence of spillovers to employment in neighboring states defined by common cyclical patterns among state economies. For large states, aggregate spillovers to their economic neighbors are approximately two-thirds of the large state's job growth. Because of significant spillovers and possible incentives to free-ride, there is a potential case for actively coordinating (i.e., centralizing) the management of stabilization policies. Finally, when these deficits are scheduled for repayment, the job effects of a temporary increase in states' own deficits persist for at most one to two years, and there is evidence of a negative impact on state jobs.
(395 KB, 42 pages)
This paper documents a strong association between total factor productivity (TFP) growth and the value of U.S. corporations (measured as the value of equities and net debt for the U.S. corporate sector) throughout the postwar period. Persistent fluctuations in the first two moments of TFP growth predict two-thirds of the medium-term variation in the value of U.S. corporations relative to gross domestic product (henceforth, the value-output ratio). An increase in the conditional mean of TFP growth by 1 percent is associated with a 21 percent increase in the value-output ratio, while this indicator declines by 12 percent following a 1 percent increase in the standard deviation of TFP growth. A possible explanation for these findings is that movements in the first two moments of aggregate productivity affect both the expectations that investors hold regarding future corporate payouts and their perceived risk. The authors develop a dynamic stochastic general equilibrium model to assess the plausibility of this interpretation. The model features recursive preferences for households, Markov-switching regimes in the first two moments of TFP growth, incomplete information, and monopolistic rents. Under a plausible calibration that includes all these features, the model can account for a sizable fraction of the elasticity of the value-output ratio with respect to the first two moments of TFP growth.
(712 KB, 48 pages)
Superseded by Working Paper 16-03.
(1 MB, 45 pages)
The Agency CMO market, an often overlooked corner of mortgage finance, has experienced tremendous growth over the past decade. This paper explains the rationale behind the construction of Agency CMOs, quantifies the risks embedded in Agency CMOs using both a traditional and a novel approach, and offers valuable lessons learned in interpreting these risk measures. Among these lessons are that fully understanding the risks in Agency CMOs requires a complete bond-by-bond analysis and that interest rate risk is not the only risk that needs to be considered when conducting risk management with CMOs.
(481 KB, 28 pages)
In many markets, sellers advertise their good with an asking price. This is a price at which the seller is willing to take his good off the market and trade immediately, though it is understood that a buyer can submit an offer below the asking price and that this offer may be accepted if the seller receives no better offers. Despite their prevalence in a variety of real-world markets, asking prices have received little attention in the academic literature. The authors construct an environment with a few simple, realistic ingredients and demonstrate that using an asking price is optimal: it is the pricing mechanism that maximizes sellers’ revenues, and it implements the efficient outcome in equilibrium. They provide a complete characterization of this equilibrium and use it to explore the positive implications of this pricing mechanism for transaction prices and allocations.
(678 KB, 44 pages)
This paper examines the different effects of macroprudential policy and monetary policy on credit and inflation using a simple New Keynesian model with credit. In this model, macroprudential policy is effective in stabilizing credit but has a limited effect on inflation. Monetary policy with an interest rate rule stabilizes inflation, but this rule is ‘too blunt’ an instrument to stabilize credit. The determinacy of the model requires the interest rate to respond more than one-for-one to inflation, independent of macroprudential policy. That is, the ‘Taylor principle’ applies to monetary policy. This dichotomy between macroprudential policy and monetary policy arises because each policy is designed to affect the saving and borrowing decisions of households differently.
(733 KB, 25 pages)
Using a segmented market model that includes state-dependent asset market decisions along with access to credit, the authors analyze the impact that transactions credit has on interest rates and prices. They find that the availability of credit substantially changes the dynamics in the model, allowing agents to significantly smooth consumption and reduce the movements in velocity. As a result, prices become quite flexible and liquidity effects are dampened. Thus, adding another medium of exchange whose use is calibrated to U.S. data has important implications for economic behavior in a segmented markets model.
(394 KB, 43 pages)
This paper argues that there is a normative case for delaying policy reform. Policy design in dynamic economies typically faces a trade-off between the policy's effects in the short and long term, and possibly across future states of nature. When the economy is in an atypical state or the available policies are less flexible than ideal, this trade-off can be steep enough that retaining the status-quo policy in the short term and taking on the reform at a later date is welfare-improving. In a simple New Keynesian economy, the author considers monetary policy reform from discretion to the optimal targeting rule. The author finds that the policy reform should be postponed if a sharp drop in output drives the nominal interest rate to the zero lower bound but only modest deflation pressures are observed under the status-quo policy.
(534 KB, 37 pages)
In most states, the law grants seniority to the oldest mortgage on a house, unless that mortgagee subordinates its claim. The authors show that this practice significantly impedes the refinancing of first mortgages by imparting blocking power to junior mortgagees. They identify the effect by building a database showing all mortgages of a large panel of homeowners, identifying those whose combined loan-to-value ratio makes them candidates for refinancing their first mortgages, and contrasting the incidence of refinancing between states following this standard and states following an alternate standard under which a mortgage inherits the seniority of the mortgage it replaces.
(719 KB, 42 pages)
This paper sets forth a discussion framework for the information requirements of systemic financial regulation. It specifically describes a potential large macro-micro database for the U.S. based on an extended version of the Flow of Funds. The author argues that such a database would have been of material value to U.S. regulators in ameliorating the recent financial crisis and could be of aid in understanding the potential vulnerabilities of an innovative financial system in the future. The author also suggests that making these data available to the academic research community, under strict confidentiality restrictions, would enhance the detection and measurement of systemic risk.
(225 KB, 23 pages)
In this paper, the author studies long-run population changes across U.S. metropolitan areas. First, the author argues that changes over a long period of time in the geographic distribution of population can be informative about the so-called “resilience” of regions. Using the censuses of population from 1790 to 2010, the author finds that persistent declines, lasting two decades or more, are somewhat rare among metropolitan areas in U.S. history, though more common recently. Incorporating data on historical factors, the author finds that metropolitan areas that have experienced extended periods of weak population growth tend to be smaller in population, less industrially diverse, and less educated. These historical correlations inform the construction of a regional resilience index.
(515 KB, 31 pages)