Over the past several years, substantial research effort has gone into measuring the
efficiency of financial institutions. Many studies have found that
inefficiencies are quite large, on the order of 20 percent or more of total
banking industry costs and about half of the industry's potential profits.
There is no consensus on the sources of the differences in measured efficiency.
This paper examines several possible sources, including differences in
efficiency concepts, measurement method, and a number of bank, market, and
regulatory characteristics. We review the existing literature and provide new
evidence using data on U.S. banks over the period 1990-95.
(192 KB, 59 pages)
Optimal monetary policy for an economy with
seasonal fluctuations and a cash-in-advance requirement on the purchase of
consumption goods is studied. It is shown that the short delay in the
availability of newly acquired funds for consumption purchases (the hallmark of
cash-in-advance models) typically makes the seasonal steady state inefficient.
It is also shown that monetary policy can overcome this inefficiency by keeping
the nominal interest rate constant over the seasons. An analytical model is
also presented to explore the effects of seasonal smoothing of nominal interest
rates on the seasonal amplitude of other closely related variables.
(202 KB, 29 pages)
The authors provide some preliminary evidence on
the costs and profitability of relationship lending by commercial banks.
Drawing on recent research that has identified loan rate smoothing as a
significant element in lending relationships between banks and firms, the
authors carry out a two-stage procedure. In the first stage, the authors derive
bank-specific measures of the extent to which the banks in their sample engage
in loan rate smoothing for small business borrowers in response to exogenous
shocks to their credit risk. In the second stage, the authors estimate cost and
(alternative) profit functions to examine how loan rate smoothing affects a
bank's costs and profits. On the whole, the authors' evidence indicates that loan
rate smoothing is associated with lower costs and lower profits.
These results do not support the hypothesis that loan rate smoothing arises as
part of an optimal long-term contract between a bank and its borrower. However,
the authors do find some limited support for smoothing as part of an optimal contract
for small banks early in our sample period.
(87 KB, 27 pages)
The computerization of retailing has made price dispersion the norm
in the United States, so that any given list price or transactions price is an
increasingly imperfect measure of a product's resource cost. As a consequence,
measuring the real output of retailers has become increasingly difficult. Food
retailing is used as a case study to examine data problems in retail
productivity measurement. Crude direct measures of grocery store output suggest
that the CPI for food-at-home may have been overstated by 1.4 percentage points
annually from 1978 to 1996.
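As a back-of-the-envelope illustration (simple compounding arithmetic, not a calculation taken from the paper), a 1.4-percentage-point annual overstatement compounds to roughly 28 percent over 1978-96:

```python
# Illustrative arithmetic only: compound a 1.4-percentage-point annual
# CPI overstatement over the 18 years from 1978 to 1996.
years = 1996 - 1978          # 18 years
annual_bias = 0.014          # 1.4 percentage points, as a decimal
cumulative = (1 + annual_bias) ** years - 1
print(f"Cumulative overstatement: {cumulative:.1%}")  # about 28.4%
```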
(78 KB, 35 pages)
The authors investigate
efficiency and productivity growth of the U.S. banking industry over the latter
part of the 1980s and first part of the 1990s using comprehensive data on U.S.
commercial banks. Cost efficiency decreased slightly between the 1980s and
1990s, and large banks showed a sizable decline in profit efficiency. Total
predicted production costs increased over both the 1980s and 1990s, reflecting
cost productivity declines. Changes in business conditions led to cost declines
over both periods. Total predicted profits increased in the 1980s and 1990s,
with the entire change reflecting increased profit productivity. Changing
business conditions led to small declines in profits.
(120 KB, 33 pages)
The authors propose methods for evaluating and
improving density forecasts. They focus primarily on methods that are
applicable regardless of the particular user's loss function, though they take
explicit account of the relationships between density forecasts, action
choices, and the corresponding expected loss throughout. They illustrate the
methods with a detailed series of examples, and they discuss extensions to
improving and combining suboptimal density forecasts, multistep-ahead density
forecast evaluation, multivariate density forecast evaluation, monitoring for
structural change and its relationship to density forecasting, and density
forecast evaluation with known loss function.
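One widely used loss-function-free diagnostic in this literature is the probability integral transform (PIT): if a sequence of density forecasts is correct, the PITs of the realized outcomes are i.i.d. uniform on [0,1]. A minimal sketch with simulated data and a Kolmogorov-Smirnov check (an illustration of the idea, not the authors' code):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated outcomes: the true data-generating process is N(0, 1).
y = rng.normal(loc=0.0, scale=1.0, size=2000)

# PIT = forecast CDF evaluated at the realized outcome.
pit_good = stats.norm.cdf(y, loc=0.0, scale=1.0)  # correctly specified density
pit_bad = stats.norm.cdf(y, loc=0.0, scale=0.5)   # variance too small

# Under a correct density forecast the PITs are i.i.d. U(0,1), so a
# Kolmogorov-Smirnov test against U(0,1) should not reject.
print(stats.kstest(pit_good, "uniform").pvalue)  # does not reject
print(stats.kstest(pit_bad, "uniform").pvalue)   # rejects decisively
```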
(191 KB, 31 pages)
The authors propose a constructive, multivariate framework for assessing agreement between
(generally misspecified) dynamic equilibrium models and data, which enables a
complete second-order comparison of the dynamic properties of models and data.
They use bootstrap algorithms to evaluate the significance of deviations
between models and data, and they use goodness-of-fit criteria to produce
estimators that optimize economically relevant loss functions. The authors
provide a detailed illustrative application to modeling the U.S. cattle cycle.
(263 KB, 66 pages)
Using modern duality theory to recover technologies from data can be complicated by the risk characteristics of production. In many industries, risk influences cost and revenue and can create the potential for costly episodes of financial distress. When risk is an important consideration in production, the standard cost and profit functions may not adequately describe the firm's technology and choice of production plan. In general, standard models fail to account for risk and its endogeneity. The authors distinguish between exogenous risk, which varies over the firm's choice sets, and endogenous risk, which is chosen by the firm in conjunction with its production decision. They show that, when risk matters in production decisions, it is important to account for risk's endogeneity.
For example, better risk diversification resulting from an increase in scale improves the reward to risk-taking and may, under certain conditions, induce the firm to take on more risk to increase its value. A choice of higher risk at a larger scale could add to costs and mask scale economies that may result from better diversification.
This paper introduces risk into the dual model of production by constructing a utility-maximizing model in which managers choose their most preferred production plan. The authors show that the utility function that ranks production plans is equivalent to a ranking of subjective probability distributions of profit that are conditional on the production plan. The most preferred production plan results from the firm's choice of an optimal profit distribution. The model is sufficiently general to incorporate risk aversion as well as risk neutrality. Hence, it can account for the case where the potential for costly financial distress makes trading profit for reduced risk a value-maximizing strategy.
The authors implement the model using the Almost Ideal Demand System to derive utility-maximizing share equations for profit and inputs, given the output vector and given sources of risk to control for choices that would affect endogenous risk. The most preferred cost function is obtained from the profit share equation, and the authors show that, if risk neutrality is imposed, this system is identical to the standard translog cost system except that it controls for sources of risk.
The authors apply the model to the U.S. banking industry
using 1989-90 data on banks with over $1 billion in assets. The authors find
evidence that managers trade return for reduced risk, which is consistent with
the significant regulatory and financial costs of bank distress. In addition,
the authors find evidence of significant scale economies that help explain the
recent wave of large bank mergers. Using these same data, the authors also
estimate the standard cost function, which does not explicitly account for
risk, and they obtain the usual results of essentially constant returns to
scale, which contradicts the often-stated rationale for bank mergers.
(281 KB, 36 pages)
97-9 James J. McAndrews, "Banking and Payment System Stability in an Electronic Money World"
No abstract available
(75 KB, 32 pages)
Do forecasters distort their reported forecasts in a way that compromises
accuracy? New research in the theory of forecasting suggests such a
possibility. In a recent paper, Owen Lamont finds that forecasters in the
Business Week survey make more radical forecasts as they gain
experience. In this paper, the author uses forecasts from the Federal Reserve
Bank of Philadelphia's Survey of Professional Forecasters to test the
robustness of Lamont's results. The author's results contradict Lamont's.
However, careful examination of a methodological difference in the two surveys
suggests a more general theory of forecasting that accounts for both sets of results.
(92 KB, 41 pages)
Prediction problems involving asymmetric loss
functions arise routinely in many fields, yet the theory of optimal prediction
under asymmetric loss is not well developed. We study the optimal prediction
problem under general loss structures and characterize the optimal predictor.
We compute it numerically in less tractable cases. A key theme is that the
conditionally optimal forecast is biased under asymmetric loss and that the
conditionally optimal amount of bias is time-varying in general and depends on
higher-order conditional moments. Thus, for example, volatility dynamics (e.g.,
GARCH effects) are relevant for optimal point prediction under asymmetric loss.
More generally, even for models with linear conditional-mean structure, the
optimal point predictor is in general nonlinear under asymmetric loss,
which provides a link with the broader nonlinear time series literature.
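To illustrate the flavor of this result (a sketch under stated assumptions, not the paper's own derivation): under linex loss L(e) = exp(ae) - ae - 1 with conditionally Gaussian errors, the optimal forecast is the conditional mean plus (a/2) times the conditional variance, so time-varying volatility makes the optimal bias time-varying:

```python
import numpy as np

rng = np.random.default_rng(1)
a = 1.0  # linex asymmetry: positive errors are penalized more than negative

def linex(e):
    # Linex loss: exp(a*e) - a*e - 1, asymmetric around zero for a != 0.
    return np.exp(a * e) - a * e - 1

# Conditionally Gaussian target with time-varying variance
# (a crude stand-in for a GARCH variance path).
T = 100_000
sigma2 = 0.5 + 1.5 * rng.uniform(size=T)
y = rng.normal(loc=0.0, scale=np.sqrt(sigma2))  # conditional mean is zero

naive = np.zeros(T)         # conditional-mean forecast (unbiased)
optimal = (a / 2) * sigma2  # mean + (a/2)*variance: linex-optimal forecast

print(linex(y - naive).mean())    # higher average linex loss
print(linex(y - optimal).mean())  # lower average linex loss
```

Note that the optimal forecast is deliberately biased upward, and the bias moves with the conditional variance, exactly the point made in the abstract.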
(93 KB, 19 pages)
This paper uses
time-series techniques to examine whether monetary policy has similar effects
across U.S. states during the 1958-92 period. Impulse response functions from
estimated structural vector autoregression models reveal differences in state
policy responses, which in some cases are substantial. The paper also provides
evidence on the reasons for the measured cross-state differential policy
responses. The size of a state's response is significantly related to its
industry mix, evidence of an interest rate channel for monetary policy. The
state-level data offer no support for recently advanced credit-channel theories
of the monetary policy transmission mechanism.
(154 KB, 33 pages)
This paper examines the role of U.S. housing-related tax expenditures in creating
incentives for decentralization and encouraging residential sorting by income
and central city decline. Tax expenditures associated with the deductibility of
mortgage interest and property taxes make housing less expensive relative to
other goods and, hence, increase the quantity of housing and residential land
purchased and lower the density of urban areas. Because the tax expenditures
increase with income and the consumption of housing services, they lower the
cost of geographic sorting by income typically associated with exclusionary
zoning and other land-market imperfections. A direct consequence of this
sorting process is that housing-related tax expenditures are concentrated in
communities with the highest incomes and house values. These effects do not
arise from housing-tax policies alone, but rather from the
interaction of these policies and other factors that affect local real estate
markets, such as zoning or fixed housing capital stocks. Three models are
developed to formally analyze these issues. In the authors' base case model in
which there are no land-use constraints and local amenities are fixed, tax
deductions related to home ownership result in population decentralization
within the metropolitan area and a less dense central city, but do not induce
sorting by income. Moreover, land prices in the city increase because the
subsidy increases the aggregate demand for housing in all communities. Thus,
the mere presence of the federal housing tax expenditures increases
decentralization, but cannot generate America's patterns of income sorting and
central city decline. These conclusions change in an important way in the
authors' second model in which a land-use constraint, such as the type of
minimum lot-size zoning prevalent in the suburbs, is introduced. In this case,
the housing subsidies foster the separation of the rich from the poor. Income
sorting results, and consequently, there is an increasing concentration of the
poor in the central city. However, there still is no weakening of prices in
city land markets in this model. The third and final model endogenizes the
production of local amenities in the sense that they are made an increasing
function of community income. In this case, three characteristics common to
American urban form result: population decentralization within the metropolitan
area, increased concentration of the poor in the urban core, and weak city land
markets. These results indicate that America's current urban form reflects, at
least in part, incentives arising from the interaction of the national tax and
local zoning systems, rather than unique American tastes for low-density living.
(93 KB, 39 pages)
It is widely believed that imposing
cointegration on a forecasting system, if cointegration is, in fact, present,
will improve long-horizon forecasts. The authors show that, contrary to this
belief, at long horizons nothing is lost by ignoring cointegration when the
forecasts are evaluated using standard multivariate forecast accuracy measures.
In fact, simple univariate Box-Jenkins forecasts are just as accurate. The
authors' results highlight a potentially important deficiency of standard
forecast accuracy measures — they fail to value the maintenance of cointegrating
relationships among variables — and the authors suggest alternatives that
explicitly do so.
(137 KB, 36 pages)
The authors study the impact of a minimum consumption requirement on the rate of
economic growth and the evolution of wealth distribution. The requirement
introduces a positive dependence between the intertemporal elasticity of
substitution and household wealth. This dependence implies a transition phase
during which the growth rates of per-capita quantities rise toward their
steady-state values and the distributions of wealth, consumption, and permanent
income become more unequal. The authors calibrate the minimum consumption
requirement to match estimates available for a sample of Indian villagers and
find that these transitional effects are quantitatively significant and depend
importantly on the economy's steady-state growth rate. NOTE: This paper refers
to figures not currently available with this electronic version. For a hard
copy of the figures, call the Research Department's Publications Desk at
215-574-6428 and ask for Working Paper 97-15.
(211 KB, 28 pages)
The authors construct and simulate a model of check exchange to examine the
incentives a bank (or a bank clearinghouse) has to engage in practices that
limit access to its payment facilities, in particular delaying the availability
of check payment. The potentially disadvantaged bank has the option of directly
presenting checks to the first bank. The authors find that if the retail
banking market is highly competitive, the first bank will not engage in such
practices, but if the retail banking market is imperfectly competitive, it will
find it advantageous to restrict access to its facilities. Lower costs of
direct presentment can reduce (but not eliminate) the range over which these
practices are employed. The practice of delayed presentment can either reduce
or increase welfare, again depending on the degree of competition in the
market. The model suggests that, were the Federal Reserve System to exit the
business of check processing, practices such as delayed presentment would be more widespread.
(77 KB, 27 pages)
This paper views financial intermediaries as
vertically integrated firms. The authors explore how competitive conditions in
retail and wholesale funding markets affect the incentive for (upstream)
originators and (downstream) fund managers to integrate. The underlying
tradeoff in their model is driven by the choice between the production of an
illiquid but high yielding loan and a liquid but relatively low yielding bond.
The authors find that greater homogeneity among savers has two effects, both of
which tend to increase the incentive to form integrated intermediaries. Greater
homogeneity both increases competition between independent fund managers and
reduces the likelihood of inefficient underinvestment by integrated
intermediaries. The authors also find that the incentive to integrate is
greater when fund managers have more power in the market for firms' securities.
(528 KB, 34 pages)
97-18 Antulio N. Bomfim and Francis X. Diebold, "Bounded Rationality and Strategic Complementarity in a Macroeconomic Model: Policy Effects, Persistence, and Multipliers"
Motivated by recent developments in the bounded rationality and strategic complementarity literatures, we examine an intentionally simple and stylized aggregative economic model when the assumptions of fully rational expectations and no strategic interactions are relaxed. We show that small deviations from rational expectations, taken alone, lead only to small deviations from classical policy-ineffectiveness, but that the situation can change dramatically when strategic complementarity is introduced. Strategic complementarity magnifies the effects of even small departures from rational expectations, producing equilibria with policy effectiveness, output persistence, and multiplier effects.
This paper explores the effect on costs when firms within an industry must
interact with each other in the normal course of business. Such interaction
will generally cause the socially optimal scale of each firm to deviate from
its minimum average cost scale. In addition, the socially optimal industry
structure may be more concentrated than conventional firm-level cost studies
would suggest and may also differ from the unregulated (free-entry) equilibrium
structure. These concepts, while potentially applicable to several industries,
are here made more precise for the banking industry, both theoretically and empirically.
(109 KB, 41 pages)
Broadly defined, macroeconomic forecasting
is alive and well. Nonstructural forecasting, which is based largely on
reduced-form correlations, has always been well and continues to improve.
Structural forecasting, which aligns itself with economic theory and, hence,
rises and falls with theory, receded following the decline of Keynesian theory.
In recent years, however, powerful new dynamic stochastic general equilibrium
theory has been developed, and structural macroeconomic forecasting is poised for resurgence.
(82 KB, 35 pages)
This paper considers network externalities from currency
acceptability as a determinant of observed persistence of dollarization in
Latin American countries. A model with efficiencies from establishing a network
of currency users is constructed. Model implications are then tested using a
unique data set of daily loan records from an informal Bolivian credit market.
Empirical results are consistent with dollarization hysteresis being driven by
network externalities from currency adoption. The results also imply that
credible exchange rate stabilization policy alone is not sufficient to achieve de-dollarization.
(88 KB, 32 pages)
This paper reports the first stage of a project to recover Argentine stock market
data for the entire 20th century. The authors find that real rates of return on
Argentine stocks and bonds after 1920 were above those in the Belle
Époque, and that they were consistent with the view that in the postwar
period Argentina remained firmly integrated with international financial markets.
(102 KB, 32 pages)
The authors propose a measure of
predictability based on the ratio of the expected loss of a short-run forecast
to the expected loss of a long-run forecast. This predictability measure can be
tailored to the forecast horizons of interest, and it allows for general loss
functions, univariate or multivariate information sets, and stationary or
nonstationary data. The authors propose a simple estimator and suggest
resampling methods for inference. They then provide several macroeconomic
applications. First, on the basis of fitted parametric models, the authors
assess the predictability of a variety of macroeconomic series. Second, they
analyze the internal propagation mechanism of a standard dynamic macroeconomic
model by comparing predictability of model inputs and model outputs. Third,
they use predictability as a metric for assessing the similarity of data
simulated from the model and actual data. Finally, the authors sketch several
promising directions for future research.
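As an illustration of such a ratio-of-expected-losses measure (a sketch of the idea, not the authors' estimator), consider a Gaussian AR(1) under squared-error loss, where the measure has a simple closed form:

```python
# Illustrative predictability measure of the form
#   1 - E[loss of j-step forecast] / E[loss of k-step forecast],  j < k,
# evaluated in closed form for a Gaussian AR(1) under squared-error loss.
def ar1_predictability(rho, j, k):
    # The j-step forecast-error variance of an AR(1) with coefficient rho
    # is proportional to 1 - rho^(2j), so the loss ratio reduces to a
    # ratio of these terms.
    assert 0 < j < k
    return 1 - (1 - rho ** (2 * j)) / (1 - rho ** (2 * k))

# A persistent series is far more predictable at short horizons.
print(ar1_predictability(0.95, j=1, k=40))  # about 0.90
print(ar1_predictability(0.30, j=1, k=40))  # about 0.09
```

The measure lies between 0 and 1 here, approaching 0 as the short-horizon forecast becomes as inaccurate as the long-horizon one.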
(98 KB, 33 pages)
The authors introduce an element of centralization in a random matching model of money that
allows for private liabilities to circulate as media of exchange. Some agents,
which the authors identify as banks, are endowed with the technology to issue
notes and to record-keep reserves with a central clearinghouse, which they call
the treasury. The liabilities are redeemed according to a stochastic process
that depends on the endogenous trades. The treasury removes the banking
technology from banks that are not able to meet the redemptions in a given
period. This, together with the market incompleteness, gives rise to a reserve
management problem for the issuing banks. The authors demonstrate that
"sufficiently patient" banks will concentrate on improving their reserve
position instead of pursuing additional issue. The model provides a first
attempt to reconcile limited note issue with optimizing behavior by banks
during the National Banking Era.
(270 KB, 43 pages)
Previous studies have noted that loan applicants rejected by one bank can apply at
another bank, systematically worsening the pool of applicants faced by all
banks. This paper presents the first empirical evidence of this effect and
explores some additional ramifications, including the role of common filters,
such as commercially available credit scoring models, in mitigating this
adverse selection, implications for de novo banks, implications for banks'
incentives to comply with fair lending laws, and macroeconomic effects.
(93 KB, 34 pages)
In this paper, the
authors document a pronounced trend toward deconcentration of metropolitan
employment during the postwar period in the United States. The employment share
of initially more dense metro areas declined and those of initially less dense
metro areas rose. Motivated by this finding, the authors develop a
system-of-cities model in which an increase in aggregate metropolitan employment
causes employment to shift in favor of less dense metro areas because
congestion costs increase more rapidly for the initially more dense metro
areas. A calibrated version of the model shows that the more-than-twofold
increase in employment experienced by MSAs during the postwar period was indeed
a powerful force favoring deconcentration.
(268 KB, 34 pages)
This paper proposes evaluating the assumptions
of the RBC model rather than merely the ability of model-constrained data to
match moments of official data counterparts. Reduced-form relationships can be
used to create model-consistent derivations of capital and labor input. Since
several relationships exist for each input, comparison of their properties
highlights weaknesses and strengths in the model assumptions. Applied to the
RBC model with factor hoarding and depreciation through use, the approach
highlights weaknesses in the standard utility function and casts doubt upon use
of the model to improve official capital stock measures or utilization rates.
(102 KB, 26 pages)
There is a
widespread belief that different geographic regions of the U.S. respond
differently to economic shocks, perhaps because of factors such as differences
in the composition of regional output, adjustment costs, or other frictions.
The author investigates the comovement of regional employment series using a
common features framework. Little evidence is found to suggest that regions
move synchronously; rather, it takes about three quarters before regions
respond in a similar fashion to a common shock. The author identifies leading
and lagging regions. None of the regional employment series appears to share a
common, synchronous cycle with aggregate U.S. employment.
(221 KB, 29 pages)