Tuesday, June 12, 2012

Bringing Africa Back to Life: The Legacy of George W. Bush

Bringing Africa Back to Life: The Legacy of George W. Bush. By Jim Landers
Dallas Morning News
June 08, 2012


LUSAKA, Zambia — On a beautiful Saturday morning, Delfi Nyankombe stood among her bracelets and necklaces at a churchyard bazaar and pondered a question: What do you think of George W. Bush?
“George Bush is a great man,” she answered. “He tried to help poor countries like Zambia when we were really hurting from AIDS. He empowered us, especially women, when the number of people dying was frightening. Now we are able to live.”

Nyankombe, 38, is a mother of three girls. She also admires the former president because of his current campaign to corral cervical cancer. Few are screened for the disease, and it now kills more Zambian women than any other cancer.

“By the time a woman knows, she may need radiation or chemotherapy that can have awful side effects, like fistula,” she said. “This is a big problem in Zambia, and he’s still helping us.”

The debate over a president’s legacy lasts many years longer than his term of office. At home, there’s still no consensus about the 2001-09 record of George W. Bush, with its wars and economic turmoil.
In Africa, he’s a hero.

“No American president has done more for Africa,” said Festus Mogae, who served as president of Botswana from 1998 to 2008. “It’s not only me saying that. All of my colleagues agree.”
AIDS was an inferno burning through sub-Saharan Africa. The American people, led by Bush, checked that fire and saved millions of lives.

People with immune systems badly weakened by HIV were given anti-retroviral drugs that stopped the progression of the disease. Mothers and newborns were given drugs that stopped the transmission of the virus from one generation to the next. Clinics were built. Doctors and nurses and lay workers were trained. A wrenching cultural conversation about sexual practices broadened, fueled by American money promoting abstinence, fidelity and the use of condoms.

“We kept this country from falling off the edge of a cliff,” said Mark Storella, the U.S. ambassador to Zambia. “We’ve saved hundreds of thousands of lives. We’ve assisted over a million orphans. We’ve created a partnership with Zambia that gives us the possibility of walking the path to an AIDS-free generation. This is an enormous achievement.”

Bush remains active in African health. Last September, he launched a new program — dubbed Pink Ribbon Red Ribbon — to tackle cervical and breast cancer among African women. The program has 14 co-sponsors, including the Obama administration.


Read the rest here: http://www.bushcenter.com/blog/2012/06/11/icymi-bringing-africa-back-to-life-the-legacy-of-george-w-bush

Systemic Risk and Asymmetric Responses in the Financial Industry

Systemic Risk and Asymmetric Responses in the Financial Industry. By López-Espinosa, Germán; Moreno, Antonio; Rubia, Antonio; and Valderrama, Laura
IMF Working Paper No. 12/152
June, 2012
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25991.0

Summary: To date, no operational measure of systemic risk has captured non-linear tail comovement between system-wide and individual bank returns. This paper proposes an extension of the so-called CoVaR measure that captures the asymmetric response of the banking system to positive and negative shocks to the market-valued balance sheets of individual banks. For the median of our sample of U.S. banks, the relative impact on the system of a fall in individual market value is sevenfold that of an increase. Moreover, the downward bias in systemic risk from ignoring this asymmetric pattern increases with bank size. The conditional tail comovement between the banking system and a top-decile bank that is losing market value is 5.4 times larger than the unconditional tail comovement, versus only 2.2 times for banks in the bottom decile. The asymmetric model also produces much more accurate estimates and a better fit, and thus improves the capacity to monitor systemic risk. Our results suggest that ignoring asymmetries in tail interdependence may lead to a severe underestimation of systemic risk in a downward market.

Excerpts:

In this paper, we discuss the suitability of the general modeling strategy implemented in Adrian and Brunnermeier (2011) and propose a direct extension that accounts for nonlinear tail comovements between individual bank returns and financial system returns. Like most VaR models, the CoVaR approach builds on semi-parametric assumptions that characterize the dynamics of the time series of returns. Among other things, the procedure requires specifying the functional form that relates the conditional quantile of the whole financial system to the returns of the individual firm. The model proposed by Adrian and Brunnermeier (2011) assumes that system returns depend linearly on individual returns, so changes in the latter feed proportionally into the former. This assumption is simple and convenient, and it greatly facilitates both the estimation of the parameters involved and the generation of downside-risk comovement estimates. On the other hand, this structure imposes certain limitations, as it neglects nonlinear patterns in the propagation of volatility shocks and of perturbations to the risk factors affecting banks' exposures, both of which feature distinctively in downside-risk dynamics.

There are strong economic arguments suggesting that the financial system may respond nonlinearly to shocks initiated in a single institution. A sizeable positive shock in an individual bank is unlikely to generate the same response (i.e., comovement with the system) in absolute terms as a negative shock of the same magnitude, particularly for large-scale financial institutions. The disruption to the banking system caused by the failure of a financial institution may occur through direct exposures to the failing institution, through the contraction of financial services provided by the weakening institution (clearing, settlement, custodial, or collateral management services), or from a shock to investor confidence that spreads to sound institutions under adverse selection imperfections (Nier, 2011). Indeed, an extreme idiosyncratic shock in the banking industry will not only reduce the market value of the stocks affected, but may also spread uncertainty through the system, prompting depositors and lending counterparties to withdraw their holdings from performing institutions and across unrelated asset classes, precipitating widespread insolvency. Historical experience suggests that a loss of confidence in the soundness of the banking sector takes time to dissipate and may have devastating effects on the real economy. Bernanke (1983) concludes that bank runs were largely responsible for the systemic collapse of the financial industry and the subsequent contagion to the real sectors during the Great Depression. Another channel of contagion in a downward market is the fire sale of assets initiated by the stricken institution to restore its capital adequacy, causing valuation losses in firms holding equivalent securities.
This mechanism, induced by the generalized collateral lending practices prevalent in the wholesale interbank market, can exacerbate price volatility in a crisis situation, as discussed by Brunnermeier and Pedersen (2009). The increased complexity and connectedness of financial institutions can generate "Black Swan" effects, morphing small perturbations in one part of the financial system into large negative shocks in seemingly unrelated parts of the system. These arguments suggest that the financial system is more sensitive to downside losses than to upside gains. In such a case, the linear assumption in Adrian and Brunnermeier (2011) would neglect a key aspect of downside-risk modeling and lead to underestimation of an individual bank's contribution to systemic risk.

We propose a simple extension of this procedure that encompasses the linear functional form as a special case and, more generally, allows us to capture asymmetric patterns in systemic spillovers. We shall refer to this specification as asymmetric CoVaR in the sequel. This approach retains the tractability of the linear model, which ensures that parameters can readily be identified by appropriate techniques, and produces CoVaR estimates that are expected to be more accurate. Furthermore, given the resultant estimates, the existence of nonlinear patterns that motivate the asymmetric model can be addressed formally through a standard Wald test statistic. In this paper, we analyze the suitability of the asymmetric CoVaR in a comprehensive sample of U.S. banks over the period 1990-2010. We find strong statistical evidence of asymmetric patterns in the marginal contribution of these banks to systemic risk. Neglecting these nonlinearities gives rise to estimates that systematically understate the marginal contribution to systemic risk. Remarkably, the magnitude of the bias is tied to the size of the firm: the bigger the company, the greater the underestimation bias. This result is consistent with the too-big-to-fail hypothesis, which stresses the need to maintain continuity of the vital economic functions of a large financial institution whose disorderly failure would cause significant disruption to the wider financial system. Ignoring the existence of asymmetries would thus lead to understated estimates of risk contributions, more so for large firms, which are more likely to be systemic. Accounting for asymmetries in a simple extension of the model removes that bias.
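The flavor of this asymmetry can be sketched with a small simulation, kept deliberately crude: the loadings, the 5% quantile convention, and the conditioning rule below are illustrative assumptions, not the paper's actual estimator (which works with quantile regressions on the growth of market-valued balance sheets).

```python
import random

random.seed(0)

# Hypothetical data-generating process: the system return loads more
# heavily on a bank's negative returns than on its positive returns.
B_POS, B_NEG = 0.2, 1.4          # assumed (illustrative) loadings
N = 50_000
bank = [random.gauss(0.0, 1.0) for _ in range(N)]
system = [B_POS * max(x, 0.0) + B_NEG * min(x, 0.0) + random.gauss(0.0, 0.5)
          for x in bank]

def var5(sample):
    """Empirical 5% quantile, standing in for a 5% Value-at-Risk."""
    ordered = sorted(sample)
    return ordered[int(0.05 * len(ordered))]

# Unconditional tail risk of the system vs. tail risk conditional on the
# bank sitting in its own lower 5% tail -- a crude CoVaR analogue.
uncond = var5(system)
cutoff = var5(bank)
cond = var5([y for x, y in zip(bank, system) if x <= cutoff])
print(f"unconditional 5% VaR of the system: {uncond:.2f}")
print(f"CoVaR given bank distress:          {cond:.2f}")
print(f"tail-comovement ratio:              {cond / uncond:.1f}x")
```

A symmetric (linear) fit to such data would average the two loadings and understate the conditional tail measure, which is the kind of bias the asymmetric specification and its Wald test are designed to detect.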

Concluding Remarks

In this paper, we have discussed the suitability of the CoVaR procedure recently proposed by Adrian and Brunnermeier (2011). This valuable approach helps us understand the drivers of systemic risk in the banking industry. Implementing the procedure in practice requires specifying the unobservable functional form that relates the dynamics of the conditional tail of the system's returns to the returns of an individual bank. Adrian and Brunnermeier (2011) build on a model that assumes a simple linear representation, such that system returns move proportionally with individual returns.

We show that this approach may provide a reasonable approximation for small banks. In more general terms, however, and particularly for large-scale banks, the linear assumption leads to a severe underestimation of conditional comovement in a downward market; hence, these banks' systemic importance may be perceived to be lower than their actual contribution to systemic risk. Yet how to measure and effectively buttress the resilience of the financial system to losses crystallizing in a stress scenario is the main concern of policy makers, regulatory authorities, and financial markets alike. Witness the rally in U.S. equities and the dollar on March 14, 2012, after the regulator announced favorable stress test results for the nineteen largest U.S. bank holding companies.

The reason is that the symmetric model implicitly assumes that positive and negative individual returns are equally informative about downside-risk comovement. Our empirical results, however, provide robust evidence that negative shocks to individual returns generate a much larger impact on the financial system than positive disturbances. For a median-sized bank, the relative impact ratio is sevenfold. We contend that this non-linear pattern should be acknowledged in the econometric modeling of systemic risk to avoid a serious misappraisal of risk. Moreover, our analysis suggests that the symmetric specification introduces systematic biases in risk assessment as a function of bank size. Specifically, the distortion caused by a linear model misspecification is more pronounced for larger banks, which are also more systemic on average. Our results show that tail interdependence between the financial system and a bottom-size-decile bank that is contracting its balance sheet is 2.2 times larger than its average comovement. More strikingly, this ratio reaches 5.4 for the top-size-decile bank. This result is in line with the too-big-to-fail hypothesis and lends support to recent recommendations put forth by the Financial Stability Board to require higher loss-absorbency capacity of large banks. Likewise, it is consistent with the resolution plan required by the Dodd-Frank Act for bank holding companies and non-bank financial institutions with $50 billion or more in total assets. Periodically submitting a plan for rapid and orderly resolution in the event of financial distress or failure will enable the FDIC to evaluate potential loss severity and minimize the disruption that a failure may cause in the rest of the system, thus performing its resolution functions more efficiently. This measure will also help alleviate moral hazard concerns associated with systemic institutions and strengthen the stability of the overall financial system.

To capture the asymmetric pattern in tail comovement, we propose a simple yet effective extension of the original CoVaR model. This extension preserves the tractability of the original model, and its suitability can be tested formally through a simple Wald-type test, given the estimates of the model. We show that this simple extension is robust to more general specifications capturing non-linear patterns in returns, though those come at the expense of tractability.

The refinement of the CoVaR statistical measure presented in the paper aims at quantifying asymmetric spillover effects when strains in banks' balance sheets are elevated, and thus contributes a step toward strengthening systemic risk monitoring in stress scenarios. Furthermore, its focus on tail comovement originating from negative perturbations to the growth rate of banks' market-valued balance sheets may yield insights into the impact on the financial system of large-scale deleveraging by banks seeking to rebuild their capital cushions. This particular concern has recently been rekindled by the continued spikes in volatility in euro area financial markets. It has been estimated that, under pressure from sovereign stress, European banks may shrink their combined balance sheet significantly, with the potential to unleash shockwaves that hurt the financial stability of emerging economies (IMF, 2012). The estimation of the impact on the real economy of aggregate weakness in the financial sector, and the design of optimal macroprudential policies to arrest systemic risk when tail interdependencies feed non-linearly into the financial system, are left for future research.

Friday, June 8, 2012

Policy Analysis and Forecasting in the World Economy: A Panel Unobserved Components Approach

Policy Analysis and Forecasting in the World Economy: A Panel Unobserved Components Approach. By Francis Vitek
IMF Working Paper No. 12/149
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25974.0

Summary: This paper develops a structural macroeconometric model of the world economy, disaggregated into thirty five national economies. This panel unobserved components model features a monetary transmission mechanism, a fiscal transmission mechanism, and extensive macrofinancial linkages, both within and across economies. A variety of monetary policy analysis, fiscal policy analysis, spillover analysis, and forecasting applications of the estimated model are demonstrated, based on a Bayesian framework for conditioning on judgment.

Thursday, June 7, 2012

Policies for Macrofinancial Stability: How to Deal with the Credit Booms

Policies for Macrofinancial Stability: How to Deal with the Credit Booms. By Dell'Ariccia, Giovanni; Igan, Deniz; Laeven, Luc; Tong, Hui; Bakker, Bas B.; Vandenbussche, Jérôme
IMF Staff Discussion Notes No. 12/06
June 07, 2012
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25935.0

Excerpts

Executive summary

Credit booms buttress investment and consumption and can contribute to long-term financial deepening. But they often end up in costly balance sheet dislocations and, more often than is acceptable, in devastating financial crises whose costs can exceed the benefits associated with the boom. These risks have long been recognized. But, until the global financial crisis in 2008, policy paid limited attention to the problem. The crisis—preceded by booms in many of the hardest-hit countries—has led to a more activist stance. Yet there is little consensus about how and when policy should intervene. This note explores past credit booms with the objective of assessing the effectiveness of macroeconomic and macroprudential policies in reducing the risk of a crisis or, at least, limiting its consequences.

It should be recognized at the outset that a more interventionist policy will inevitably imply some trade-offs. No policy tool is a panacea for the ills stemming from credit booms, and any form of intervention will entail costs and distortions, the relevance of which will depend on the characteristics and institutions of individual countries. With these caveats in mind, the analysis in this note brings the following insights.

First, credit booms are often triggered by financial reform, capital inflow surges associated with capital account liberalizations, and periods of strong economic growth. They tend to be more frequent in fixed exchange rate regimes, when banking supervision is weak, and when macroeconomic policies are loose.

Second, not all booms are bad. About a third of boom cases end up in financial crises. Others do not lead to busts but are followed by extended periods of below-trend economic growth. Yet many result in permanent financial deepening and benefit long-term economic growth.

Third, it is difficult to tell “bad” from “good” booms in real time. But there are useful telltale signs. Bad booms tend to be larger and last longer (roughly half of the booms lasting longer than six years end up in a crisis).

Fourth, monetary policy is in principle the natural lever to contain a credit boom. In practice, however, capital flows (and related concerns about exchange rate volatility) and currency substitution limit its effectiveness in small open economies. In addition, since booms can occur in low-inflation environments, a conflict may emerge with monetary policy's primary objective.

Fifth, given its time lags, fiscal policy is ill-equipped to stop a boom in time. But consolidation during the boom years can help create the fiscal room to support the financial sector or stimulate the economy if and when a bust arrives.

Finally, macroprudential tools have at times proven effective in containing booms, and more often in limiting the consequences of busts, thanks to the buffers they helped build. Their more targeted nature limits their costs, although the distortions they cause can be severe if these tools are abused. Moreover, circumvention has often been a major issue, underscoring the importance of careful design, coordination with other policies (including across borders), and close supervision to ensure the efficacy of these tools.


Conclusions

Prolonged credit booms are a harbinger of financial crises and have real costs. Our analysis shows that, while only a minority of booms end up in crises, those that do can have long-lasting and devastating real effects if left unaddressed. Yet it appears difficult to identify bad booms as they emerge, so the cost of intervening too early and running the risk of stopping a good boom has to be weighed against the desire to prevent financial crises.

While the analysis offers some insights into the origins and dynamics of credit booms, from a policy perspective a number of questions remain unaddressed. In part this reflects the limited experience to date with macroprudential policies and the simultaneous use of multiple policy tools, making it hard to disentangle specific policy measures’ effectiveness.

First, while monetary policy tightening seems the natural response to rapid credit growth, we find only weak empirical evidence that it contains booms and their fallout on the economy.  This may be partly the result of a statistical bias. But there are several “legitimate” factors that limit the use and effectiveness of monetary policy in dealing with credit booms, especially in small open economies. In contrast, there is more consistent evidence that macroprudential policy is up to this task, although it is more exposed to circumvention.

All of the above raise important questions about the optimal policy response to credit booms.  Our view is that when credit booms coincide with periods of general overheating in the economy, monetary policy should act first and foremost. If the boom lasts and is likely to end up badly or if it occurs in the absence of overheating, then macroprudential policy should come into play. Preferably, this should be in combination and coordination with macroeconomic policy, especially when macroeconomic policy is already being used to address overheating of the economy.

Second, questions remain about the optimal mix and modality of macroprudential policies, also in light of political economy considerations and the type of supervisory arrangements in the country. Political economy considerations call for a more rules-based approach to setting macroprudential policy to avoid pressure from interest groups to relax regulation during a crisis. But such considerations have to be weighed against the practical problems and unintended effects of a rules-based approach, such as the calibration of rules with rather demanding data requirements and the risk of circumvention in the presence of active earnings management. The design of a macroprudential framework should also consider the capacity and ability of supervisors to enforce such rules so that unintended and potentially dangerous side effects can be avoided.

Third, the optimal macroprudential policy response to credit booms, as well as the optimal policy mix, will likely have to depend on the type of credit boom. Because of data limitations, our analysis has focused on aggregate credit. While it seems natural that policy response should adapt to and be targeted to the type of credit, additional analysis is needed to assess the effectiveness of policies to curtail booms that differ in the type of credit.

Fourth, policy coordination, across different authorities and across borders, may increase the effectiveness of monetary tightening and macroprudential policies. Cooperation and a continuous flow of information among national supervisors, especially regarding the activities of institutions that are active across borders, are crucial. Equally important is the coordination of regulations and actions among supervisors of different types of financial institutions. Whether and how national policymakers take into account the effects of their actions on the financial and macroeconomic stability of other countries is a vital issue, calling for further regional and global cooperation in the setup of macroprudential policy frameworks and the conduct of macroeconomic policies.

IMF Staff Notes: Externalities and Macro-Prudential Policy

Externalities and Macro-Prudential Policy. By De Nicoló, Gianni; Favara, Giovanni; Ratnovski, Lev
IMF Staff Discussion Notes No. 12/05
June 07, 2012
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25936.0

Excerpts

Executive Summary

The recent financial crisis has led to a reexamination of policies for macroeconomic and financial stability. Part of the current debate involves the adoption of a macroprudential approach to financial regulation, with an aim toward mitigating boom-bust patterns and systemic risks in financial markets.

The fundamental rationale behind macroprudential policies, however, is not always clearly articulated. The contribution of this paper is to lay out the key sources of market failures that can justify macroprudential regulation. It explains how externalities associated with the activity of financial intermediaries can lead to systemic risk, and thus require specific policies to mitigate such risk.

The paper classifies externalities that can lead to systemic risk as:

1. Externalities related to strategic complementarities, which arise from the strategic interaction of banks (and other financial institutions) and cause the build-up of vulnerabilities during the expansionary phase of a financial cycle;
2. Externalities related to fire sales, which arise from a generalized sell-off of financial assets, causing a decline in asset prices and a deterioration of intermediaries' balance sheets, especially during the contractionary phase of a financial cycle; and
3. Externalities related to interconnectedness, caused by the propagation of shocks from systemic institutions or through financial networks.

The correction of these externalities can be seen as intermediate targets for macroprudential policy, since policies that control externalities mitigate market failures that create systemic risk.

This paper discusses how the main proposed macroprudential policy tools—capital requirements, liquidity requirements, restrictions on activities, and taxes—address the identified externalities. It is argued that each externality can be corrected by different tools that can complement each other. Capital surcharges, however, are likely to play an important role in the design of macroprudential regulation.

This paper’s analysis of macroprudential policy complements the more traditional one that builds on the distinction between time-series and cross-sectional dimensions of systemic risk.


Conclusions

This paper has argued that the first step in the economic analysis of macroprudential policy is the identification of market failures that contribute to systemic risk. Externalities are an important source of such market failures, and macroprudential policy should be thought of as an attempt to correct these externalities.

Building on the discussion in the academic literature, the paper has identified externalities that lead to systemic risk: externalities due to strategic complementarities, which contribute to the accumulation of vulnerabilities during the expansionary phase of a financial cycle; and externalities due to fire sales and interconnectedness, which tend to exacerbate negative shocks especially during a contractionary phase.

The correction of these externalities can be seen as intermediate targets for macroprudential policy, since policies that control externalities mitigate the market failures that create systemic risk. This paper has studied how the identified externalities can be corrected by the main macroprudential policy proposals: capital requirements, liquidity requirements, restrictions on bank activities, and taxation. The main finding is that even though some of these policies can complement each other in correcting the same externality, capital requirements are likely to play an important role in the design of any macroprudential framework.

It has also been argued that although externalities can be proxied through a variety of risk measurements, the accumulation of evidence on the effectiveness of alternative policy tools remains the most pressing concern for the design of macroprudential policy.

Wednesday, May 30, 2012

Financial Intermediation Costs in Low-Income Countries: The Role of Regulatory, Institutional, and Macroeconomic Factors

Financial Intermediation Costs in Low-Income Countries: The Role of Regulatory, Institutional, and Macroeconomic Factors. By Tigran Poghosyan
IMF Working Paper No. 12/140
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25949.0

Summary: We analyze the factors driving persistently higher financial intermediation costs in low-income countries (LICs) relative to emerging market (EM) comparators. Using the net interest margin as a proxy for financial intermediation costs at the bank level, we find that within LICs a substantial part of the variation in interest margins can be explained by bank-specific factors: margins tend to increase with the riskiness of the credit portfolio, lower bank capitalization, and smaller bank size. Overall, we find that concentrated market structures, lack of competition in LICs' banking systems, and institutional weaknesses are the key impediments preventing financial intermediation costs from declining. Our results provide strong evidence that policies aimed at fostering banking competition and strengthening institutional frameworks can reduce intermediation costs in LICs.

Excerpts

Introduction

The net interest margin, measured as the difference between lending and deposit rates, is a commonly accepted measure of how costly bank intermediation services are for a society. Research shows that the cost of financial intermediation has important repercussions for economic performance (Jayaratne and Strahan, 1996; Rajan and Zingales, 1998; Beck, et al., 2000). The importance of the bank interest margin as a measure of financial intermediation costs is particularly pertinent for low-income countries (LICs), where, in the absence of developed stock markets, firms largely depend on bank financing as a source of external funding.
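As a minimal numeric illustration of the measure just defined (the rates below are hypothetical, not drawn from the paper's data):

```python
# Net interest margin as defined above: lending rate minus deposit rate.
# Both rates are hypothetical, chosen only to illustrate the computation.
banks = {
    "lic_bank": {"lending_rate": 0.18, "deposit_rate": 0.05},
    "em_bank":  {"lending_rate": 0.09, "deposit_rate": 0.04},
}
margins = {name: r["lending_rate"] - r["deposit_rate"]
           for name, r in banks.items()}
for name, margin in margins.items():
    print(f"{name}: net interest margin = {margin:.1%}")
```

On these stylized numbers the LIC bank's margin is more than twice the EM bank's, the kind of persistent gap the paper sets out to explain.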

High financial intermediation costs may constitute an important impediment for financial deepening in LICs. The persistence of high margins might be symptomatic of a number of systemic problems, such as: lack of competition, perceived market and credit risks, bank unsoundness, scale diseconomies constrained by small markets, high operating costs due to low efficiency, unfavorable institutional environment, and existence of various regulatory constraints distorting financial market activity.

The main objective of this paper is to examine the influence of market concentration, bank regulation, institutional development, and the macroeconomic environment on bank margins across a broad cross-section of LICs and emerging markets (EMs), while controlling for bank-specific factors. We use bank-level data on 359 commercial banks in 48 LICs and 2,535 commercial banks in 67 EMs for the period 1996-2010. For both groups of countries, the sample includes great diversity in terms of financial intermediation costs, bank characteristics, and regulatory, institutional, and macroeconomic environments. Comparing results across the two groups of countries helps identify the key environmental factors that put upward pressure on financial intermediation costs. Based on the results of the analysis, we provide policy recommendations for reducing financial intermediation costs in LICs and contributing to further financial deepening.

Estimation results suggest that concentrated market structures and lack of competition in LICs' banking systems remain the key impediments preventing financial intermediation costs from declining. In this respect, relaxing restrictions on bank entry could help reduce the cost of financial intermediation in LICs. Low institutional capacity also plays a prominent role in boosting margins. Within LICs, bank-specific characteristics explain a substantial part of the variation in interest margins: margins tend to increase with the riskiness of the credit portfolio, lower bank capitalization, and smaller bank size.


Conclusions

This paper examined the determinants of interest margins in LICs and EMs, given the importance of high financial intermediation costs as an impediment to financial deepening in LICs. The paper adopts two complementary approaches to exploring interest margin determinants: (i) an accounting framework that decomposes the interest margin into its cost and profit components; and (ii) an econometric specification based on the behavioral (dealership) model of profit-maximizing banks.
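The accounting framework in (i) can be sketched as a simple adding-up identity: per unit of earning assets, the margin must cover operating costs, loan-loss provisions, and taxes, with the remainder accruing as profit. The component values below are purely hypothetical, chosen only to make the identity concrete.

```python
# Stylized decomposition of the net interest margin into cost and profit
# components, each expressed as a share of earning assets (hypothetical).
components = {
    "operating_costs":      0.050,
    "loan_loss_provisions": 0.030,
    "taxes":                0.010,
    "after_tax_profit":     0.040,
}
margin = sum(components.values())
print(f"implied net interest margin: {margin:.1%}")
for name, share in components.items():
    print(f"  {name:<22s}{share / margin:>5.0%} of the margin")
```

Comparing such decompositions across country groups is what lets the authors attribute the higher LIC margins mainly to profitability and credit risk rather than to operating costs alone.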

The analysis provides evidence of significantly higher interest margins in LICs compared to EMs. Decomposition of the margins into their cost and profit components indicates that the higher median margin in LICs is mainly explained by the greater profitability of banks (which may be due to a less competitive environment) and higher credit risk (which may reflect a weaker institutional environment).
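The accounting decomposition described above can be illustrated with a minimal sketch; the component names and all figures below are hypothetical, not taken from the paper:

```python
# Hypothetical decomposition of a bank's net interest margin into cost
# and profit components, expressed as shares of total earning assets.
# All figures are illustrative, not estimates from the paper.

def decompose_margin(overhead, loan_loss_provisions, reserve_cost, taxes, profit):
    """Return the margin implied by its cost and profit components."""
    components = {
        "overhead": overhead,
        "loan_loss_provisions": loan_loss_provisions,
        "reserve_cost": reserve_cost,
        "taxes": taxes,
        "profit": profit,
    }
    return sum(components.values()), components

# Stylized LIC bank: higher profit and loan-loss provisioning push the margin up.
lic_margin, _ = decompose_margin(0.035, 0.010, 0.003, 0.005, 0.025)
# Stylized EM bank: thinner profit and lower credit risk, lower margin.
em_margin, _ = decompose_margin(0.025, 0.005, 0.002, 0.004, 0.010)

print(f"LIC margin: {lic_margin:.3f}, EM margin: {em_margin:.3f}")
```

In this stylized comparison the LIC margin exceeds the EM margin mainly through the profit and provisioning terms, mirroring the profitability and credit-risk channels identified in the decomposition.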

The econometric analysis further highlights differences between LIC and EM banks. Compared to EMs, margins in LIC banks appear to be more responsive to market structure, suggesting that promoting banking competition can be an important tool to reduce interest margins in LICs. Furthermore, the positive impact of market structure on LIC margins is robust to different measures of market concentration. On the other hand, margins in LIC banks are less responsive to the macroeconomic and regulatory environment than those of their EM peers. In both groups of countries, the institutional setting appears to have a dominant impact on margins, overshadowing that of market concentration and many bank-specific controls. Finally, bank characteristics explain a substantial part of within-country variation in interest margins once country-specific determinants are accounted for. Specifically, margins in both LICs and EMs increase with a riskier credit portfolio, lower bank capitalization, and smaller bank size.

Taken together, these findings provide strong evidence that there is large scope to reduce interest margins in LICs through policies aimed at fostering banking competition. On the regulatory side, relaxing restrictions on bank entry could help lower intermediation costs. More importantly, improvements in the informational, contractual, and enforcement frameworks could play a key role in lowering interest margins in LICs.

Tuesday, May 29, 2012

CPSS: Innovations in retail payments

Innovations in retail payments

CPSS Publications No 102
May 29, 2012
http://www.bis.org/publ/cpss102.htm
 
Over the past decade, a number of innovative developments in retail payments have emerged. Many central banks take an interest in retail payments as part of their role in maintaining the stability and efficiency of the financial system and preserving confidence in their currencies. Although most retail payment systems are not considered systemically important, their potential weaknesses with regard to security and reliability could nonetheless affect the financial system and the economy. Innovations in retail payments can therefore raise policy issues for central banks.

In June 2010, the Committee on Payment and Settlement Systems (CPSS) set up a working group to investigate developments in retail payments, focusing especially on innovations. This report, produced by that group, first provides an overview of innovative retail payment activities in the CPSS and other selected countries from a fact-finding exercise, which attempted to cover influential developments in retail payment instruments and schemes over the past decade. Based on the trends observed and the economics of retail payments, the report identifies a number of exogenous and endogenous factors that could serve as drivers for retail payment innovations or as barriers to them. The analysis was also used to suggest some pointers as to what can be expected over the next five years. Finally, the report identifies a number of issues for central banks concerning their various responsibilities and tasks as catalysts, overseers and/or operators of payment systems. 

Friday, May 25, 2012

Walking Hand in Hand: Fiscal Policy and Growth in Advanced Economies. By Carlo Cottarelli & Laura Jaramillo

Walking Hand in Hand: Fiscal Policy and Growth in Advanced Economies. By Carlo Cottarelli & Laura Jaramillo
May 01, 2012
IMF Working Paper No. 12/137

Summary: Implementation of fiscal consolidation by advanced economies in coming years needs to take into account the short- and long-run interactions between economic growth and fiscal policy. Many countries must reduce high public debt to GDP ratios that penalize long-term growth. However, fiscal adjustment is likely to hurt growth in the short run, delaying improvements in fiscal indicators, including deficits, debt, and financing costs. Revenue and expenditure policies are also critical in affecting productivity and employment growth. This paper discusses the complex relationships between fiscal policy and growth both in the short and in the long run.

Order a print copy: http://www.imfbookstore.org/IMFORG/WPIEA2012137

Appendix. Short-run Determinants of CDS Spreads in Advanced Economies

Introduction

Since the global financial crisis and ensuing sovereign crisis in Europe, financial markets’ assessment of credit risk for advanced economies has changed significantly. Before the crisis, the valuation of advanced economy sovereign debt treated default as a very low probability event, and therefore liquidity risk rather than default risk was seen as the dominant driver of financing costs in advanced economies. However, as the recent crisis in Europe unfolded, assessment of credit risk came to the forefront, taking into account country-specific fundamentals. In response, several countries have made progress in adopting fiscal consolidation plans, although this has not always been met with a reduction in sovereign spreads. The current crisis has shown that while markets are concerned with large debt and fiscal deficits, they also worry about low growth and its effect on debt dynamics, as well as the feasibility of fiscal adjustment in an environment of very weak economic activity.

While a few studies have looked at sovereign spreads in advanced economies since the onset of the global crisis, we focus here on credit default swap (CDS) spreads in advanced economies during 2011, the height of the euro area crisis. Under the assumption that underlying global factors (such as global risk aversion) are behind general co-movements of CDS spreads, our analysis seeks to identify the set of country-specific factors that explain the divergence of spreads across countries during the most recent phase of the global crisis. The results highlight the current short-termism of markets, which makes fiscal policy management more difficult. In particular, they show that lower debt and deficit to GDP ratios lead to lower CDS spreads, but so too does faster short-term growth. There is further evidence of a nonlinear relationship between growth and sovereign bond spreads: spreads are more likely to increase if growth declines from an already low rate and the fiscal tightening is large. If growth deteriorates enough as a result of a fiscal tightening, spreads could actually rise even as the deficit falls.


Background

As fiscal fundamentals have become a growing concern of financial markets, the sovereign CDS market for advanced economies has become increasingly large. Several European advanced economies are now among the ten largest single name CDS exposures by net notional position (Figure A1), and since September 2009 investors can trade index products on a basket of Western European sovereign CDS.[For further information on Markit iTraxx SovX Western European Index see http://www.markit.com/assets/en/docs/commentary/credit-wrap/SovX.pdf.] With rising size and liquidity, sovereign CDS spreads now provide more reliable signaling of sovereign credit risk.

The literature on the determinants of CDS spreads in advanced economies since the onset of the financial crisis— typically focusing on a narrow set of countries and using data only through 2010—has highlighted the importance of global factors with an increasingly prominent role of country-specific factors as the crisis progressed.  Longstaff et al. (2011) show that sovereign credit risk can be linked to global factors, based on a dataset of 28 advanced and emerging economies over the period October 2000-January 2010.  Similarly, Fontana and Scheicher (2010) find that the recent repricing of sovereign credit risk in the CDS market is mostly due to common factors. Dieckmann and Plank (2011) find—for a group of 18 advanced economies between January 2007 and April 2010—that the state of the world financial system as well as a country’s domestic financial system has strong explanatory power for the behavior of CDS spreads, with euro area countries especially sensitive. Forni et al. (2012) find that domestic financial and global factors explain movements in CDS spreads, using a panel dataset of 21 advanced economies over the period January 2008-October 2010.

As the crisis has progressed, differentiation across sovereign CDS spreads has increased significantly (Figure A2), underscoring that markets are reassessing the effect of country-specific factors on default risk. This implies that looking at historical correlations can obscure some of the important relationships that have emerged as the crisis has evolved. In this context, this appendix attempts to shed light on the particular state of the markets in 2011, at the height of the euro area crisis.

Empirical Model Estimation

Sovereign CDS spreads in several European countries reached historical highs during 2011.  High deficits and debt to GDP ratios have typically been a precondition for such a surge, as countries that saw their overall deficit to GDP ratio rise into the double digits or a sizeable increase in their stock of debt faced increasing market pressure (mostly in the euro area).  However, there are several indications that other elements beyond fiscal fundamentals were at play. First, countries that announced sizeable fiscal adjustment plans in 2011 were not necessarily greeted with a reduction in spreads (Figure A3). Second, countries with weak fiscal accounts (such as Japan, the United Kingdom, and the United States) did not pay high spreads in 2011, which could in part be attributed to the effects of quantitative easing strategies by central banks in these countries (Figure A4).

With this in mind, this appendix assesses in a more consistent way why 5-year CDS spreads differ across a sample of 31 advanced economies, by looking at a set of macroeconomic and fiscal fundamentals, based on a simple OLS cross-section regression. A cross-section analysis is preferred to a panel regression given the desired focus on market behavior in the latest phase of the crisis. In particular, a cross-section allows a larger number of countries to be included, which adds greater variation to the dataset than does the time dimension (as this analysis covers only one crisis episode). CDS spreads (average for 2011) are drawn from Markit and transformed into logs in line with Edwards (1984). Fiscal variables used as regressors are drawn from the September 2011 Fiscal Monitor (IMF 2011a), while macroeconomic variables are drawn from the September 2011 World Economic Outlook (IMF 2011b). Regressors include:

  • Macroeconomic variables: real GDP growth rate and growth squared; projected real GDP in 2014; projected potential real GDP growth, averaged over 2011-16; inflation rate for 2011.
  • Near-term fiscal variables: General government primary balance and general government debt as a ratio to GDP. For Australia, Canada, and Japan, net debt to GDP is used, in view of the sizeable amount of their assets.
  • Long-term fiscal variables: Net present value of the increase in public pension spending during 2010-50 as a ratio to GDP (from IMF, 2010c); net present value of the increase in public health care spending during 2010-50 as a ratio to GDP (from IMF, 2010d); projected primary balance to GDP in 2014; projected debt to GDP in 2014.
  • Investor base: General government debt of the country in question held by its national central bank (from the IMF International Financial Statistics) and, in the case of Japan, the U.K. and the U.S., by foreign central banks, based on the latest available data.
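A minimal sketch of this type of log-linear cross-section regression, on synthetic data (all variables, coefficient values, and magnitudes below are invented for illustration; the paper's actual estimates use the Markit, WEO, and Fiscal Monitor data listed above):

```python
# Sketch of an OLS cross-section regression of log CDS spreads on
# macro-fiscal fundamentals, estimated on synthetic data. The "true"
# coefficients are illustrative assumptions, not the paper's estimates.
import numpy as np

rng = np.random.default_rng(0)
n = 31                                   # number of advanced economies
growth = rng.normal(1.5, 2.0, n)         # real GDP growth, percent
debt = rng.normal(80.0, 30.0, n)         # general government debt, % of GDP
primary_bal = rng.normal(-4.0, 3.0, n)   # primary balance, % of GDP

# Log-linear specification in the spirit of Edwards (1984):
# log(CDS) = b0 + b1*growth + b2*growth^2 + b3*debt + b4*primary_bal + e
true_b = np.array([4.5, -0.10, 0.01, 0.015, -0.05])
X = np.column_stack([np.ones(n), growth, growth**2, debt, primary_bal])
log_cds = X @ true_b + rng.normal(0, 0.2, n)

# OLS via least squares recovers the coefficients up to sampling noise.
beta_hat, *_ = np.linalg.lstsq(X, log_cds, rcond=None)
print("estimated coefficients:", np.round(beta_hat, 3))
```

Because the dependent variable is in logs, each coefficient is a semi-elasticity: the basis-point impact of a regressor grows with the level of the spread.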

Estimation Results

Table A1 provides the results of the model. Column 1 reports a general specification in which all variables are included. The following columns illustrate the specification search, with insignificant variables dropped one by one. Column 5, the preferred specification, provides a relatively good fit with an adjusted R-squared of 0.76. The results illustrate the current short-termism of markets:
  • Fiscal variables are important, with markets focusing primarily on short-term developments (the projected primary deficit and debt in 2011). The primary balance is only significant for euro area countries. The coefficients on deficits and debt are broadly in line with what has been found by previous econometric work, though at the lower end of the range. For a country with CDS spreads of 200 basis points, a 1 percentage point increase in the debt ratio raises the spread by about 3 basis points and a 1 percentage point increase in the deficit raises the spread by 35 basis points. Given the log-linear specification, the larger the initial level of the CDS spread, the larger the impact on spreads, in basis points, of an increase in deficit and debt ratios; correspondingly, a weakening of fiscal variables has a more negative impact in countries with higher initial deficit and debt ratios.
  • Long-term fiscal variables are not found to be significant. The coefficients on future debt and deficits and on public pension and health spending were not found to be significant. This suggests that reforms to entitlement spending or measures that would only have a long-term impact would not necessarily be rewarded by markets in the short run. This result underscores the difficulty of providing credible information to markets in this area and the need for more effective communication of the effect of such reforms on the soundness of public finances.
  • Short-term growth is important (higher growth leading to lower spreads), while potential growth and future growth are not significant. This relationship is found to be nonlinear—with a positive coefficient on the squared growth term—as spreads are more likely to increase when growth is already low and fiscal tightening is larger. Based on these results, if the fiscal multiplier is sufficiently large (higher than 0.7 based on the estimated coefficients), the improvement in spreads from a lower deficit could be offset by the negative impact of adjustment on short-term growth, which also acts through the short-term rise in the debt to GDP ratio (see Figure 5 in the main text).
  • Central bank financing (either from national central banks or from foreign central banks) is important in lowering spreads, as long as it is not inflationary. This coefficient is higher than the one on the debt ratio, implying that the effect of purchases by national central banks (and by foreign central banks for reserve currencies) goes beyond the effect of reducing the overall supply of government bonds sold to the public. This probably reflects confidence effects provided by the presence of the central bank in the market. Note, however, that the central bank holdings variable does not include purchases by the ECB through the Securities Market Program. The coefficient for inflation is highly significant, and thus implies that central bank purchases are effective in moderating spreads only if they are not inflationary. Given the large accumulation of excess reserves by banks, inflation pressures currently remain at bay in most countries. The respite in sovereign bond markets following the long-term refinancing operations (LTRO) of the European Central Bank (ECB) is a further example of the confidence effects of central bank intervention. These results suggest that the availability of financing from an entity with sufficiently large resources could help reduce spreads in the current environment.
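The spread arithmetic in the first bullet can be sketched as follows. The semi-elasticities are simply backed out from the 3bp and 35bp impacts quoted at a 200bp spread, so this is an illustration of the log-linear logic, not the paper's estimated coefficients:

```python
# Semi-elasticities implied by the quoted impacts at a 200bp spread:
# +1pp debt ratio -> ~3bp, +1pp deficit ratio -> ~35bp. Backed out for
# illustration; not the coefficients estimated in the paper.
DEBT_COEF = 3 / 200       # per percentage point of debt-to-GDP
DEFICIT_COEF = 35 / 200   # per percentage point of deficit-to-GDP

def spread_impact_bp(spread_bp, d_debt_pp=0.0, d_deficit_pp=0.0):
    """Approximate change in a CDS spread (bp) under the log-linear model."""
    return spread_bp * (DEBT_COEF * d_debt_pp + DEFICIT_COEF * d_deficit_pp)

# The bp impact scales with the initial spread, so a weakening of fiscal
# variables hurts high-spread countries more.
impact_200 = spread_impact_bp(200, d_debt_pp=1)   # ~3bp at a 200bp spread
impact_400 = spread_impact_bp(400, d_debt_pp=1)   # ~6bp at a 400bp spread
print(impact_200, impact_400)
```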

Conclusions

The cross-section estimates point to the current short-term vision of markets, with special concern for near-term growth prospects. This could possibly reflect strong risk aversion after four years of market turmoil. These results imply that tighter fiscal policy could actually lead to wider, rather than narrower, spreads in the short term. It is important to note, however, that the euro area crisis is still not fully resolved and financial markets remain unsettled; these results may therefore reflect the particular state of markets in 2011 rather than more permanent features, something that a cross-section cannot shed light on. Moreover, it would be important to assess the direct effect on spreads of other variables beyond fiscal fundamentals, such as exposure to contingent liabilities from the banking sector. Potential simultaneity issues (e.g., between spreads and growth) also deserve additional attention.

Optimal Liquidity and Economic Stability. By Linghui Han & Il Houng Lee

Optimal Liquidity and Economic Stability. By Linghui Han & Il Houng Lee
IMF Working Paper No. 12/135
May 2012

Summary: Monetary aggregates are now much less used as policy instruments, as identifying the right measure has become difficult and interest rate transmission has worked well in an increasingly complex financial system. In this process, little attention was paid to the potential spillover of excess liquidity. This paper suggests a notional level of “optimal” liquidity beyond which asset prices will start to rise faster than the GDP deflator, thereby creating a gap between the face value and the real purchasing value of financial assets and widening the wedge in income between those with capital stock and those living on salaries. Such divergence will eventually lead to an abrupt and disorderly adjustment of the asset value, with repercussions on the real sector.

Order a print copy: http://www.imfbookstore.org/IMFORG/WPIEA2012135

Excerpts

Introduction

The definition of money has evolved but is still anchored on the notion that money provides ready access to current and future goods and services, i.e., cash value. Liquidity is often defined as assets that can be easily converted into cash, and now includes most financial assets as financial innovations and financial deepening have enabled them to be readily converted into money. In this regard, the definition of money can be broadened to equal liquidity.

The traditional conceptual framework of money and price dynamics, however, has not kept up with the expanding concept of money. The formalization of the role of money M probably started with Irving Fisher’s famous “equation of exchange,” MV = PT, where M is money, V is velocity, P the price level, and T the level of transactions. Since it assumes that V and T are fixed and M is exogenous, an increase in M leads to an exactly proportional increase in the price level. The Cambridge school highlights money’s role as a store of wealth (including for the precautionary motive) and defines M/P = kY, where k is the Cambridge constant capturing the opportunity cost of money (interest). Thus, k is not institutionally fixed but changing. This is equivalent to Fisher’s equation if one recognizes that real income (Y) and transactions (T) are identical and k = 1/V.

Keynes further enriched the Cambridge equation by providing three motives for holding money: transactions, precautionary, and speculative. Money demand is affected by income and interest rates, so that Md = L(r, Y), where r is the average rate of return on illiquid assets. The basic propositions are L’(r) < 0, due to the opportunity cost, and L’(Y) > 0. These motives provide the basis for holding a larger amount of money within the economy. Milton Friedman’s general form of money demand introduces the generalized portfolio constraint (Md - Ms) + (Bd - Bs) + (Yd - Ys) = 0, which connects the goods market with the money and bond markets. A monetary expansion (an increase in Ms) can be offset by an excess demand for goods: output Ys rises and money demand Md rises with it, bringing the goods and money markets back into equilibrium.
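A quick numeric check of the equivalence noted above, with purely illustrative numbers: setting k = 1/V and treating transactions T as real income Y, the Cambridge equation M/P = kY delivers the same money stock as Fisher's MV = PY.

```python
# Numeric check that the Cambridge equation M/P = kY reduces to Fisher's
# MV = PY when k = 1/V (and transactions T are identified with real
# income Y). All numbers are illustrative.
V = 5.0          # velocity of money
k = 1 / V        # Cambridge constant
P = 1.2          # price level
Y = 1000.0       # real income (= transactions T)

M_cambridge = k * P * Y                       # money demand from M/P = kY
assert abs(M_cambridge * V - P * Y) < 1e-9    # Fisher: MV = PY holds
print(f"M = {M_cambridge}, MV = {M_cambridge * V}, PY = {P * Y}")
```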

Increasingly less attention is paid to the interconnectedness between money and the real sector, and thus on a mechanism for correction if money exceeds a notional optimal level. In large part, this is because the relationship between money in the classical sense and the real economy has weakened with the expansion of financial market instruments. Money M as used in Fisher’s equation is now only a fraction of instruments of transaction and as a store of value. Similarly, Friedman’s generalized portfolio constraint no longer captures the complexity of the current financial system. Indeed, M (narrowly defined money) is only relevant in influencing short term liquidity condition, and hence the short term interest rate.

Accordingly, in several countries monetary aggregates now play a relatively minor role in monetary policy formulation. Former Federal Reserve Governor Laurence Meyer noted that “money plays no role in today’s consensus macro model.” Consistent with this view, the Federal Open Market Committee does not specify a monetary aggregate as a target. Indeed, Bernanke (2006) stated that targeting monetary aggregates has not been effective in “constraining policy or in reducing inflation.” He attributes this to the recurrent instability in money demand associated with deregulation and financial innovation. While the Federal Reserve continues to monitor and analyze monetary developments, he argued against heavy reliance on monetary aggregates in policy formulation.

These views are supported by Woodford (2007), who reviewed inflation models with no roles for money and suggested that these models are not inconsistent with elementary economic principles. Using a basic new Keynesian model, he showed (implicitly) that central banks’ inflation target credibility and their reaction function (policy rate) are adequate in setting a path for the price level without explicitly modeling a role for money.

In Europe, on the other hand, monetary aggregates are not fully dismissed in policy formulation. As noted by Kahn and Benolkin (2007), the European Central Bank continues to regard money as one of the factors determining inflation outlook over the medium term.  Even then, its focus is more on identifying an appropriate money demand framework and less on redefining money that better captures the growing complexity of the financial market.

That said, studies on the role of monetary aggregates have evolved, focusing on their relation with asset prices, especially in light of the disruptive effect of asset price boom and bust cycles on growth. Borio and Lowe (2002) identified gaps in credit, asset prices, and investment as periods when the actual value deviates from trend by a sizable amount. They found that the credit gap is the best indicator of an impending financial crisis. The importance of credit to equity and property boom/bust episodes is also supported by Helbling and Terrones (2003), who found monetary aggregates to be more relevant for equity prices than for housing prices.

Gerdesmeier and others (2009) found asset price booms to follow rapid growth in monetary aggregates (money and credit) and eventually lead to asset price busts. They do so by constructing an asset price indicator composed of stock price and house price markets, similar to the work by Borio and others (1994), where the index was compiled using residential property, commercial property, and share prices. Gerdesmeier and others found that changes in their composite index were consistent with the rapid increase in credit growth that followed the relaxation of constraints in the wake of financial liberalization during the 1980s.

Against these developments, this paper suggests an expanded definition of money, i.e., liquidity, which includes all financial assets held by the nonfinancial private sector. Then a notional level of “optimal” liquidity is proposed beyond which asset prices will start to rise faster than the GDP deflator, thereby creating a “Gap” between the face value and the real purchasing value of financial assets. Such a divergence will eventually lead to an abrupt and disorderly adjustment of the asset value, with repercussions on the real sector. This work provides value added by identifying a monetary aggregate the optimal value of which can be targeted at a level consistent with real sector fundamentals. These in turn are defined as the economy’s capacity to produce goods and services. When the Gap widens, it will not only lead to a boom/bust cycle, but also worsen income disparity between those holding capital stock and those who rely on income flows.


Summary and conclusion

Although money in the narrow sense matters less in an increasingly complex financial system, the quantity of a broader monetary aggregate is still very relevant to the real economy. We find that liquidity, defined as the total financial assets held by the nonfinancial private sector, is an important determinant of the value of physical capital. This is because those who issue a financial asset must have corresponding earnings, including valuation gains, on the liability side that match the value of the issuance. The value of the earnings of a physical asset, in turn, is the real net present value of the return on the capital stock, which depreciates over time, multiplied by the price of the capital stock.

The optimal amount of liquidity is attained at the level where it equals real earnings times the GDP deflator. This is because the nominal earnings (income flows) of capital are by default measured in proportion to nominal GDP (i.e., relevant because of purchasing power). A Gap is created if the amount of liquidity exceeds this optimal level, which will be reflected in a fall in the ratio of the GDP deflator to the price of capital. In other words, if the Gap arises due to a rapid expansion in liquidity, this will push up the price level of the capital stock at a much faster pace than the GDP deflator. As a result, this Gap will (i) lead to a boom and bust cycle if left unchecked, which is disruptive to the economy, and (ii) worsen income inequality by rewarding those with capital stock more than those who depend on a flow of income.
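The Gap just described can be written as a small sketch, using the definition above of optimal liquidity as real earnings times the GDP deflator; all numbers are hypothetical:

```python
# Gap between actual liquidity and its notional optimal level, where the
# optimal level equals real earnings of capital times the GDP deflator.
# All numbers below are hypothetical.

def liquidity_gap(liquidity, real_earnings, gdp_deflator):
    """Liquidity in excess of the notional optimal level."""
    optimal = real_earnings * gdp_deflator
    return liquidity - optimal

# Liquidity in line with real fundamentals: no Gap.
gap_balanced = liquidity_gap(liquidity=1200.0, real_earnings=1000.0, gdp_deflator=1.2)
# Rapid liquidity expansion with unchanged fundamentals: a positive Gap,
# which the paper associates with asset prices outpacing the deflator.
gap_excess = liquidity_gap(liquidity=1500.0, real_earnings=1000.0, gdp_deflator=1.2)

print(gap_balanced, gap_excess)
```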

While it is true that the interest rate transmission mechanism has become an effective monetary policy instrument for controlling inflation, monetary aggregates are still relevant to economic stability. By broadening the definition of money to include all financial assets held by the nonfinancial private sector, and then targeting the total at a level consistent with the optimal level of liquidity as discussed in this paper, economic and price stability can be achieved. To achieve this desired outcome, monetary policy will have to use a combination of the interest rate and a monetary aggregate as intermediate targets.

Saturday, May 19, 2012

Quantifying Structural Subsidy Values for Systemically Important Financial Institutions

Quantifying Structural Subsidy Values for Systemically Important Financial Institutions. By Kenichi Ueda and Beatrice Weder
IMF Working Paper No. 12/128

Summary: Claimants to SIFIs receive transfers when governments are forced into bailouts. Ex ante, the bailout expectation lowers daily funding costs. This funding cost differential reflects both the structural level of government support and the time-varying market valuation of such support. Using a large worldwide sample of banks, we estimate the structural subsidy values by exploiting expectations of state support embedded in credit ratings and by using the long-run average value of the rating bonus. The subsidy was already sizable, 60 basis points, as of end-2007, before the crisis. It increased to 80 basis points by end-2009.

http://www.imf.org/external/pubs/cat/longres.aspx?sk=25928.0

Excerpts

Introduction

One of the most troubling legacies of the financial crisis is the problem of “too-systemically-important-to-fail” financial institutions. Public policy had long recognized the dangers that systemically relevant institutions pose for the financial system and for public sector balance sheets, but in practice this problem was not deemed to be extremely pressing. It was mainly dealt with by creating some uncertainty (constructive ambiguity) about governments’ willingness to intervene in a crisis.

The crisis that began in 2008 provided a real-life test of the willingness to intervene. After governments proved their willingness to extend large-scale support, constructive ambiguity gave way to near certainty that sufficiently large or complex institutions will not be allowed to fail. Thus, countries have emerged from the financial crisis with an even larger problem: many banks are larger than before, and so are implicit government guarantees. In addition, it has also become clear that these guarantees are not limited to large institutions. In Europe, smaller institutions with a high degree of interconnectedness, complexity, or political importance were also considered too important to fail.

The international community is addressing the problem of SIFIs with a two-pronged approach. On the one hand, the probability of SIFI failure is to be reduced through higher capital buffers and tighter supervision. On the other hand, SIFIs are to be made more “resolvable” by subjecting them to special resolution regimes (e.g., living wills and CoCos). A number of countries have already adopted special regimes at the national level or are in the process of doing so. However, it remains highly doubtful whether these regimes would be operable across borders. This regulatory coordination failure implies that creditors of SIFIs continue to enjoy implicit guarantees.

Subsidies arising from size and complexity create incentives for banks to become even larger and more complex. Hence, eliminating the value of the implicit structural subsidy to SIFIs should contribute to reducing both the probability and magnitude of (future) financial crises. Market participants tend to dismiss these concerns by stating that these effects may be there in theory but are very small in practice. Therefore, it requires an empirical study to quantify the value of state subsidies to SIFIs. This is the aim of this paper. 

How can we estimate the value of structural state guarantees? As institutions with state backing are safer, investors ask for a lower risk premium, taking into account the expected future transfers from the government. Therefore, before a crisis, the expected value of state guarantees is the difference in funding costs between a privileged bank and a non-privileged bank. A caveat to this reasoning is that this distortion might affect the competitive behavior and market shares of both the subsidized and the non-subsidized financial institutions. Therefore, the difference in observed funding costs may include indirect effects in addition to the direct subsidy for SIFIs.

We estimate the value of the structural subsidy using expectations of government support embedded in credit ratings. Overall ratings (and funding costs) of financial institutions have two constituent parts: their own financial strength and the expected amount of external support.  External support can be provided by a parent company or by the government. Some rating agencies (e.g., Fitch) provide regular quantitative estimates of the probability that a particular financial institution would receive external support in case of crisis. We isolate the government support component and provide estimates of the value of this subsidy as of end-2007 and end-2009.

We find that the structural subsidy value was already sizable as of end-2007 and increased substantially by end-2009, after key governments confirmed bailout expectations. On average, banks in major countries enjoyed credit rating bonuses of 1.8-3.4 notches at end-2007 and 2.5-4.2 notches at end-2009. This translates into funding cost advantages of roughly 60 and 80 basis points, respectively.
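The translation from rating bonuses to funding costs can be sketched as below. The basis-points-per-notch figure is an assumption chosen so that an average bonus of about 2.6 notches maps to the reported ~60bp advantage; the paper's actual mapping may differ.

```python
# Illustrative mapping from a credit rating bonus (in notches) to a
# funding cost advantage. BP_PER_NOTCH is an assumed value, not a
# parameter estimated in the paper.
BP_PER_NOTCH = 23.0   # assumed funding cost saving per rating notch

def funding_advantage_bp(rating_bonus_notches, bp_per_notch=BP_PER_NOTCH):
    """Funding cost advantage, in basis points, implied by a rating bonus."""
    return rating_bonus_notches * bp_per_notch

adv_2007 = funding_advantage_bp(2.6)  # middle of the 1.8-3.4 end-2007 range
adv_2009 = funding_advantage_bp(3.4)  # near the middle of the end-2009 range
print(round(adv_2007), round(adv_2009))
```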

The use of ratings might be considered problematic because rating agencies have been known to make mistakes in their judgments. For instance, they have been under heavy criticism for overrating structured products in the wake of the financial crisis. However, whether rating agencies assess default risks correctly is not important for the question at hand. All that matters is that markets use ratings in pricing debt instruments and those ratings influence funding costs.  This has been the case.6 Therefore, we can use the difference in overall credit ratings of banks as a proxy for the difference in their structural funding costs. Our empirical approach is to extract the value of structural subsidy from support ratings, while taking into account bank-specific factors that determine banks’ own financial strength as well as country-specific factors that determine governments’ fiscal ability to offer support.

A related study by Baker and McArthur (2009) obtains a somewhat lower value of the subsidy, ranging from 9 bp to 49 bp. The difference in results can be explained by different empirical strategies: Baker and McArthur use the change in the difference in funding costs between small and large U.S. banks before and after TARP. With this technique, they identify the change in the value of the SIFI subsidy, which is assumed to be created by the government's bailout intervention. However, they cannot account for the level of bailout expectations that may have been embedded in prices long before the financial crisis. This limitation is shared by all studies that use bailout events to quantify the value of the subsidy: they can be quite precise in estimating the change in the subsidy due to a particular intervention, but they will underestimate its total level if support is positive even in tranquil times. In other words, they cannot establish the value of funding cost advantages accruing from expected state support even before the crisis.

This is the distinct advantage of the rating approach: it allows us to estimate not only the change in the subsidy during the crisis but also its total value before the crisis. As far as we are aware, only a few previous papers use ratings. Soussa (2000), Rime (2005), and Morgan and Stiroh (2005) used similar approaches to back out the value of the subsidy. Our study is more comprehensive, however, covering a larger set of banks and countries as well as the 2008 financial crisis.

Assuming that bailouts affect debt values but leave equity values largely unchanged, time-varying estimates of the value of government guarantees can be calculated using standard option pricing theory.9 However, the funding cost advantage in a crisis reflects two components: first, the structural government support and, second, a larger risk premium due to market turmoil. If we calculated the value of one rating bonus only in crisis times, the value would be larger because of the latter effect. When designing a corrective levy, however, the value of the government support should not be affected by such short-run market movements. For this reason, the long-run average value of one rating bonus—used here to calculate the total value of the structural government support—is more suitable as a basis for a corrective levy than real-time estimates of the market value of the government guarantees.
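The option-pricing idea mentioned above can be sketched in a few lines. Under a Merton-style view, a blanket debt guarantee is worth a put option on the bank's assets with strike equal to the face value of debt; the parameter values below are purely illustrative and not taken from the paper.

```python
import math


def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))


def guarantee_value(assets: float, debt_face: float, vol: float,
                    rate: float, horizon: float) -> float:
    """Black-Scholes put value: the shortfall the guarantor covers if
    bank assets end up below the face value of debt at the horizon."""
    d1 = (math.log(assets / debt_face)
          + (rate + 0.5 * vol ** 2) * horizon) / (vol * math.sqrt(horizon))
    d2 = d1 - vol * math.sqrt(horizon)
    return (debt_face * math.exp(-rate * horizon) * norm_cdf(-d2)
            - assets * norm_cdf(-d1))


# Hypothetical bank: assets 100, debt face value 95, 5% asset volatility,
# 2% risk-free rate, one-year horizon
print(round(guarantee_value(100.0, 95.0, 0.05, 0.02, 1.0), 3))
```

Note that in crisis times asset volatility spikes, which mechanically raises this market value of the guarantee; that is exactly the short-run effect the authors argue a corrective levy should not be based on.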


Interpretation and conclusion

Section III has provided estimates of the value of the subsidy to SIFIs in terms of overall ratings. Using the range of our estimates, a one-unit increase in government support for banks in advanced economies had an impact equivalent to 0.55 to 0.9 notches on the overall long-term credit rating at end-2007, and this effect increased to 0.8 to 1.23 notches by end-2009 (Summary Table 8). At end-2009, the effect of government support is almost identical between the advanced-country and developing-country groups; before the crisis, governments in advanced economies played a smaller role in boosting banks' long-term ratings. These results are robust to a number of sample selection tests, such as testing for differential effects across developing and advanced countries and across listed and non-listed banks, as well as correcting for bank parental support and alternative estimations of individual banks' strength.

In interpreting these results, it is important to check whether the averages mask large differences across countries. In fact, the overall rating bonuses in a selection of large countries seem remarkably similar (Summary Table 9). For instance, the mean support rating of Japanese banks was unchanged at 3.9 in 2007 and 2009. Based on regressions that do not distinguish between advanced and developing countries, this implies that overall ratings of systemically relevant banks profited by 2.9-3.5 notches from expected government support in 2007, with the value of this support increasing to 3.4-4.2 notches in 2009. For the top 45 U.S. banks, the mean support rating increased from 3.2 in 2007 to 4.1 in 2009. This translates into a 2.4-2.9 notch overall rating bonus for supported banks in 2007 and a much higher 3.6-4.5 notch impact in 2009. In Germany, government support started high at 4.4 in 2007 and increased slightly to 4.6 in 2009, suggesting a 3.3-4.0 notch overall rating advantage for supported banks in 2007 and a 4.1-5.1 notch rating bonus in 2009.

For selected countries that have large banking centers and/or were affected by the financial crisis, average government support ratings are about 3.6 in 2007 and 3.8 in 2009 (see Table 2, based on the top 45 U.S. banks). Thus the overall rating bonuses for supported banks in this sample of countries are 2.7-3.2 in 2007 and 3.4-4.2 in 2009.

Our three-notch impact, on average, for advanced countries in 2007 is comparable to the results found by Soussa (2000) and Rime (2005), although their studies are less rigorous and based on smaller samples. In addition, Soussa (2000) reports structural annualized interest rate differentials among credit ratings based on average cumulative default rates (in percent) for 1920-1999, calculated by Moody's.17 According to his conversion table, when issuing a five-year bond, a three-notch rating increase translates into a funding advantage of 5 bp to 128 bp, depending on the riskiness of the institution.18 At the mid-point, this is 66.5 bp for a three-notch improvement, or about 22bp per notch. Using this and the overall rating bonuses described in the previous paragraph, we can evaluate the overall funding cost advantage of SIFIs as around 60bp in 2007 and 80bp in 2009.
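As a quick sanity check on this conversion, the arithmetic can be reproduced directly. The 22bp-per-notch midpoint and the rating-bonus ranges are the figures quoted above; the linear per-notch interpolation is our own simplification.

```python
# Midpoint conversion from Soussa's table: 66.5 bp for a three-notch improvement
BP_PER_NOTCH = 66.5 / 3.0


def funding_advantage_bp(rating_bonus_notches: float) -> float:
    """Approximate funding cost advantage (bp) implied by an overall
    rating bonus, using a linear per-notch conversion."""
    return rating_bonus_notches * BP_PER_NOTCH


# Midpoints of the overall rating-bonus ranges reported for 2007 and 2009
bonus_2007 = (2.7 + 3.2) / 2.0
bonus_2009 = (3.4 + 4.2) / 2.0

print(round(funding_advantage_bp(bonus_2007)))  # 65 bp, i.e. "around 60bp"
print(round(funding_advantage_bp(bonus_2009)))  # 84 bp, i.e. "around 80bp"
```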

This is helpful information if, for example, one would like to design a corrective levy on banks that extracts the value of the subsidy. The funding cost advantage can be decomposed into the level of government support and a time-varying risk premium. A corrective levy should not be affected by short-run market movements but should reflect only the long-run average value of rating bonuses, used here to calculate the total value of the structural government support. As discussed above, we find that the level of structural government support increased in most countries between 2007 and 2009. Still, we note that our estimate of the value of government support is lower than its real-time market value during the crisis.

Our estimate may also overstate the tax rate required to neutralize the (implicit) SIFI subsidy, since the competitive advantage of a guaranteed firm over a non-guaranteed firm can be magnified (the former gains market share and the latter loses it). Suppose the advantages and disadvantages are equally distributed between the two firms; then the levy rate that would eliminate the competitive distortion is smaller than the estimated difference in funding costs. In this simple example, it would be half of the values given above. Nevertheless, the corrective tax required to offset the distortion of government support would remain sizable.

Wednesday, May 16, 2012

BCBS: Models and tools for macroprudential analysis

Models and tools for macroprudential analysis
BCBS Working Papers No 21
May 2012

The Basel Committee's Research Task Force Transmission Channel project aimed at generating new research on various aspects of the credit channel linkages in the monetary transmission mechanism. Under the credit channel view, financial intermediaries play a critical role in the allocation of credit in the economy. They are the primary source of credit for consumers and businesses that do not have direct access to capital markets. Among more traditional macroeconomic modelling approaches, the credit view is unique in its emphasis on the health of the financial sector as a critically important determinant of the efficacy of monetary policy.

The final products of the project are two working papers that summarise the findings of the many individual research projects that were undertaken and discussed in the course of the project. The first working paper, Basel Committee Working Paper No 20, "The policy implications of transmission channels between the financial system and the real economy", analyses the link between the real economy and the financial sector, and channels through which the financial system may transmit instability to the real economy. The second working paper, Basel Committee Working Paper No 21, "Models and tools for macroprudential analysis", focuses on the methodological progress and modelling advancements aimed at improving financial stability monitoring and the identification of systemic risk potential. Because both working papers are summaries, they touch only briefly on the results and methods of the individual research papers that were developed during the course of the project. Each working paper includes comprehensive references with information that will allow the interested reader to contact any of the individual authors and acquire the most up-to-date version of the research that was summarised in each of these working papers.

http://www.bis.org/publ/bcbs_wp21.htm

Tuesday, May 15, 2012

Changes in U.S. water use and implications for the future

It is interesting to see some data in Water Reuse: Expanding the Nation's Water Supply Through Reuse of Municipal Wastewater (http://www.nap.edu/catalog.php?record_id=13303), a National Research Council publication.

See for example figure 1-6, p 17, changes in U.S. water use and implications for the future:



Monday, May 14, 2012

Do Dynamic Provisions Enhance Bank Solvency and Reduce Credit Procyclicality? A Study of the Chilean Banking System

Do Dynamic Provisions Enhance Bank Solvency and Reduce Credit Procyclicality? A Study of the Chilean Banking System. By Jorge A. Chan-Lau
IMF Working Paper No. 12/124
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25912.0

Summary: Dynamic provisions could help to enhance the solvency of individual banks and reduce procyclicality. Accomplishing these objectives depends on country-specific features of the banking system, business practices, and the calibration of the dynamic provisions scheme. In the case of Chile, a simulation analysis suggests Spanish dynamic provisions would improve banks' resilience to adverse shocks but would not reduce procyclicality. To address the latter, other countercyclical measures should be considered.


Excerpts

Introduction

It has long been acknowledged that procyclicality can pose risks to financial stability, as the academic and policy discussions centered on Basel II, accounting practices, and financial globalization have noted. Recently, much attention has been focused on regulatory dynamic provisions (also called statistical provisions). Under dynamic provisioning, banks set aside provisions against future losses as they build up their loan portfolios during an economic expansion.

The use of dynamic provisions raises two questions bearing on financial stability. First, do dynamic provisions reduce insolvency risk? Second, do they reduce procyclicality? In theory the answer to both is yes. Provided loss estimates are roughly accurate, bank solvency is enhanced because buffers are built up ahead of the realization of large losses. Regulatory dynamic provisions could also discourage overly rapid credit growth during the expansionary phase of the cycle, as they help prevent a relaxation of provisioning practices.

However, when real data are brought to bear on these questions, the answers can diverge from what theory implies. This paper attempts to answer them for the specific case of Chile. It finds that the adoption of dynamic provisions could help to enhance bank solvency but would not help to reduce procyclicality. Successful implementation of dynamic provisions requires a careful calibration to match or exceed current provisioning practices, and it is worth noting that reliance on past data could lead to a false sense of security, since loan losses are fat-tail events. Finally, since dynamic provisions may not be sufficient to counter procyclicality, alternative measures should be considered, such as the countercyclical capital buffers proposed in Basel III and the countercyclical provision rule Peru implemented in 2008.
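A stylised version of the Spanish-type rule the paper simulates can be sketched in a few lines: the dynamic (generic) provision flow has a component proportional to new lending plus a countercyclical component that builds a buffer when realised losses are low and draws it down when they are high. The formulation follows the standard textbook version of the rule; the parameter values are purely illustrative, not the actual Spanish or Chilean calibration.

```python
def dynamic_provision_flow(alpha: float, beta: float, new_credit: float,
                           credit_stock: float, specific_provisions: float) -> float:
    """Dynamic (generic) provision flow under a stylised Spanish-type rule:
    a charge proportional to new lending plus a countercyclical component
    that tops up the buffer when specific provisions fall below their
    through-the-cycle average beta * credit_stock (and draws it down when
    they exceed it)."""
    return alpha * new_credit + (beta * credit_stock - specific_provisions)


# Illustrative parameters (hypothetical, not a supervisory calibration)
alpha, beta = 0.01, 0.004

# Expansion: rapid lending, low realised losses -> buffer builds up
print(dynamic_provision_flow(alpha, beta, 100.0, 1000.0, 1.0))  # 1.0 + 4.0 - 1.0 = 4.0

# Downturn: no new lending, high realised losses -> buffer is drawn down
print(dynamic_provision_flow(alpha, beta, 0.0, 1000.0, 9.0))    # 0.0 + 4.0 - 9.0 = -5.0
```

The sign flip between the two calls is the smoothing mechanism: provisioning costs are pulled forward into good times so that the stock of provisions, not current income, absorbs losses in bad times.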


Conclusions

At the policy level, the case for regulatory dynamic provisions has been advanced on the grounds that they help reduce the risk of bank insolvency and dampen credit procyclicality. In the case of Chile, the data appear to only partly validate these claims.

A simulation analysis suggests that under the Spanish dynamic provisions rule provision buffers against losses would be higher compared to those accumulated under current practices. The analysis also suggests that calibration based on historical data may not be adequate to deal with the presence of fat-tails in realized loan losses. Implementing dynamic provisions, therefore, requires a careful calibration of the regulatory model and stress testing loan-loss internal models.

Dynamic provision rules appear not to dampen procyclicality in Chile. Results from a VECM analysis indicate that the credit cycle does not respond to the level of, or changes in, aggregate provisions. In light of this result, it may be worth exploring other measures to address procyclicality, such as countercyclical capital requirements, as proposed by the Basel Committee on Banking Supervision (2010a and b), or the countercyclical provision rule introduced in Peru in 2008. The Basel countercyclical capital requirements condition the build-up and release of additional capital buffers on deviations of the credit-to-GDP ratio from its long-run trend. The Peruvian rule, in contrast to standard dynamic provision rules, requires banks to accumulate countercyclical provisions when GDP growth exceeds potential. Both measures, by tying capital or provision accumulation to cyclical indicators, could be more effective at reducing procyclicality.

Sunday, May 13, 2012

What the Tokyo Governor's supporters think

A Japanese correspondent wrote about what the Tokyo governor's supporters think (edited):

I'll answer one of your questions about Tokyo's governor. He is a famous writer in Japan. It used to be said in Japan that "money moves politics." The governor, once a lawmaker himself, was a leading politician who provided private funds to friends in politics.

He earned his money through his writing, has made radical statements since he was young, and has always been clearly different from other influential politicians. He was independent. He has always striven to influence politicians with his great ability.

Thus, he was on the side of the populace.

He has become arrogant now, but the Japanese people still expect great things from him, particularly those living in the capital, Tokyo. He always talks about "changing Japan from Tokyo."

We think that he can do it.

Yours,

Nakaki

Friday, May 11, 2012

IMF Policy Papers: Enhancing Financial Sector Surveillance in Low-Income Countries Series

IMF Policy Paper: Enhancing Financial Sector Surveillance in Low-Income Countries - Background Paper

Summary: This note provides an overview of the literature on the challenges posed by shallow financial systems for macroeconomic policy implementation. Countries with shallow markets are more likely to choose fixed exchange rates, less likely to use indirect instruments of monetary policy, and less able to implement effective counter-cyclical fiscal policies. But causation appears to work in both directions, as policy stances can themselves affect financial development. Drawing on recent FSAP reports, the note also shows that shallow financial markets tend to increase foreign exchange, liquidity management, and concentration risks, posing risks to financial stability.

http://www.imf.org/external/pp/longres.aspx?id=4650



---
IMF Policy Paper: Enhancing Financial Sector Surveillance in Low-Income Countries - Financial Deepening and Macro-Stability

Summary: This paper aims to widen the lens through which surveillance is conducted in LICs, to better account for the interplay between financial deepening and macro-financial stability as called for in the 2011 Triennial Surveillance Review. Reflecting the inherent risk-return tradeoffs associated with financial deepening, the paper seeks to shed light on the policy and institutional impediments in LICs that have a bearing on the effectiveness of macroeconomic policies, macro-financial stability, and growth. The paper focuses attention on the role of enabling policies in facilitating sustainable financial deepening. In framing the discussion, the paper draws on a range of conceptual and analytical tools, empirical analyses, and case studies.

http://www.imf.org/external/pp/longres.aspx?id=4649



---
IMF Policy Paper: Enhancing Financial Sector Surveillance in Low-Income Countries - Case Studies

Summary: This supplement presents ten case studies, which highlight the roles of targeted policies to facilitate sustainable financial deepening in a variety of country circumstances, reflecting historical experiences that parallel a range of markets in LICs. The case studies were selected to broadly capture efforts by countries to increase reach (e.g., financial inclusion), depth (e.g., financial intermediation), and breadth of financial systems (e.g., capital market, cross-border development). The analysis in the case studies highlights the importance of a balanced approach to financial deepening. A stable macroeconomic environment is vital to instill consumer, institutional, and investor confidence necessary to encourage financial market activity. Targeted public policy initiatives (e.g., collateral, payment systems development) can be helpful in removing impediments and creating infrastructure for improved market operations, while ensuring appropriate oversight and regulation of financial markets, to address potential sources of instability and market failures.

http://www.imf.org/external/pp/longres.aspx?id=4651

Tuesday, May 8, 2012

Some scholars argue that top rates can be raised drastically with no loss of revenue

Of Course 70% Tax Rates Are Counterproductive. By Alan Reynolds
Some scholars argue that top rates can be raised drastically with no loss of revenue. Their arguments are flawed.
WSJ, May 7, 2012
http://online.wsj.com/article/SB10001424052702303916904577376041258476020.html


President Obama and others are demanding that we raise taxes on the "rich," and two recent academic papers that have gotten a lot of attention claim to show that there will be no ill effects if we do.

The first paper, by Peter Diamond of MIT and Emmanuel Saez of the University of California, Berkeley, appeared in the Journal of Economic Perspectives last August. The second, by Mr. Saez, along with Thomas Piketty of the Paris School of Economics and Stefanie Stantcheva of MIT, was published by the National Bureau of Economic Research three months later. Both suggested that federal tax revenues would not decline even if the rate on the top 1% of earners were raised to 73%-83%.

Can the apex of the Laffer Curve—which shows that the revenue-maximizing tax rate is not the highest possible tax rate—really be that high?

The authors arrive at their conclusion through an unusual calculation of the "elasticity" (responsiveness) of taxable income to changes in marginal tax rates. According to a formula devised by Mr. Saez, if the elasticity is 1.0, the revenue-maximizing top tax rate would be 40% including state and Medicare taxes. That means the elasticity of taxable income (ETI) would have to be an unbelievably low 0.2 to 0.25 if the revenue-maximizing top tax rates were 73%-83% for the top 1%. The authors of both papers reach this conclusion with creative, if wholly unpersuasive, statistical arguments.
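The formula behind these numbers is commonly written t* = 1/(1 + a·e), where a is the Pareto parameter of the top-income distribution and e is the ETI. Taking a = 1.5, a value often used for the U.S. top tail (our assumption here, not stated in the article), reproduces the figures in the debate:

```python
def revenue_maximizing_top_rate(pareto_a: float, eti: float) -> float:
    """Saez's revenue-maximizing top marginal tax rate: t* = 1/(1 + a*e)."""
    return 1.0 / (1.0 + pareto_a * eti)


A = 1.5  # assumed Pareto parameter for the U.S. top income tail

print(round(revenue_maximizing_top_rate(A, 1.0), 2))   # 0.4   -> the 40% cited above
print(round(revenue_maximizing_top_rate(A, 0.25), 2))  # 0.73  -> Diamond-Saez's 73%
print(round(revenue_maximizing_top_rate(A, 1.3), 3))   # 0.339 -> ~34% with an ETI of 1.3
```

The dispute in the article is thus entirely about the value of e: with the higher elasticities estimated for the top 1%, the same formula yields a far lower revenue-maximizing rate.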

Most of the older elasticity estimates are for all taxpayers, regardless of income. Thus a recent survey of 30 studies by the Canadian Department of Finance found that "The central ETI estimate in the international empirical literature is about 0.40."

But the ETI for all taxpayers is going to be lower than for higher-income earners, simply because people with modest incomes and modest taxes are not willing or able to vary their income much in response to small tax changes. So the real question is the ETI of the top 1%.

Harvard's Raj Chetty observed in 2009 that "The empirical literature on the taxable income elasticity has generally found that elasticities are large (0.5 to 1.5) for individuals in the top percentile of the income distribution." In that same year, Treasury Department economist Bradley Heim estimated that the ETI is 1.2 for incomes above $500,000 (the top 1% today starts around $350,000).

A 2010 study by Anthony Atkinson (Oxford) and Andrew Leigh (Australian National University) about changes in tax rates on the top 1% in five Anglo-Saxon countries came up with an ETI of 1.2 to 1.6. In a 2000 book edited by University of Michigan economist Joel Slemrod ("Does Atlas Shrug?"), Robert A. Moffitt (Johns Hopkins) and Mark Wilhelm (Indiana) estimated an elasticity of 1.76 to 1.99 for gross income. And at the bottom of the range, Mr. Saez in 2004 estimated an elasticity of 0.62 for gross income for the top 1%.

A midpoint between the estimates would be an elasticity for gross income of 1.3 for the top 1%, and presumably an even higher elasticity for taxable income (since taxpayers can claim larger deductions if tax rates go up).

But let's stick with an ETI of 1.3 for the top 1%. This implies that the revenue-maximizing top marginal rate would be 33.9% for all taxes, and below 27% for the federal income tax.

To avoid reaching that conclusion, Messrs. Diamond and Saez's 2011 paper ignores all studies of elasticity among the top 1%, and instead chooses a midpoint of 0.25 between one uniquely low estimate of 0.12 for gross income among all taxpayers (from a 2004 study by Mr. Saez and Jonathan Gruber of MIT) and the 0.40 ETI norm from 30 other studies.

That made-up estimate of 0.25 is the sole basis for the claim by Messrs. Diamond and Saez in their 2011 paper that tax rates could reach 73% without losing revenue.

The Saez-Piketty-Stantcheva paper does not confound a lowball estimate for all taxpayers with a midpoint estimate for the top 1%. On the contrary, the authors say that "the long-run total elasticity of top incomes with respect to the net-of-tax rate is large."

Nevertheless, to cut this "large" elasticity down, the authors begin by combining the U.S. with 17 other affluent economies, telling us that elasticity estimates for top incomes are lower for Europe and Japan. The resulting mélange—an 18-country "overall elasticity of around 0.5"—has zero relevance to U.S. tax policy.

Still, it is twice as large as the ETI of Messrs. Diamond and Saez, so the three authors appear compelled to further pare their 0.5 estimate down to 0.2 in order to predict a "socially optimal" top tax rate of 83%. Using "admittedly only suggestive" evidence, they assert that only 0.2 of their 0.5 ETI can be attributed to real supply-side responses to changes in tax rates.

The other three-fifths of ETI can just be ignored, according to Messrs. Saez and Piketty, and Ms. Stantcheva, because it is the result of, among other factors, easily-plugged tax loopholes resulting from lower rates on corporations and capital gains.

Plugging these so-called loopholes, they say, requires "aligning the tax rates on realized capital gains with those on ordinary income" and enacting "neutrality in the effective tax rates across organizational forms." In plain English: Tax rates on U.S. corporate profits, dividends and capital gains must also be 83%.

This raises another question: At that level, would there be any profits, capital gains or top incomes left to tax?

"The optimal top tax," the three authors also say, "actually goes to 100% if the real supply-side elasticity is very small." If anyone still imagines the proposed "socially optimal" tax rates of 73%-83% on the top 1% would raise revenues and have no effect on economic growth, what about that 100% rate?

Mr. Reynolds is a senior fellow with the Cato Institute and the author of "Income and Wealth" (Greenwood Press, 2006).

Bank Capitalization as a Signal. By Daniel C. Hardy

Bank Capitalization as a Signal. By Daniel C. Hardy
IMF Working Paper No. 12/114
May 2012
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25894.0

Summary: The level of a bank's capitalization can effectively transmit information about its riskiness and therefore support market discipline, but asymmetric information may induce exaggerated or distortionary behavior: banks may vie with one another to signal confidence in their prospects by keeping capitalization low, and banks' creditors often cannot distinguish among them - tendencies that can be seen across banks and across time. Prudential policy is warranted to help offset these tendencies.

Friday, May 4, 2012

Women, Welch Clash at Forum - "Great women get upset about getting into the victim's unit"

Women, Welch Clash at Forum. By John Bussey
Wall Street Journal, May 4, 2012, page B1
http://online.wsj.com/article/SB10001424052702303877604577382321364803912.html

Is Jack Welch a timeless seer or an out-of-touch warhorse?

The former Master and Commander of General Electric still writes widely on business strategy. He's also influential on the speaking circuit.

On Wednesday, Mr. Welch and his wife and writing partner, Suzy Welch, told a gathering of women executives from a range of industries that, in matters of career track, it is results and performance that chart the way. Programs promoting diversity, mentorships and affinity groups may or may not be good, but they are not how women get ahead. "Over deliver," Mr. Welch advised. "Performance is it!"

Angry murmurs ran through the crowd. The speakers asked: Were there any questions?

"We're regaining our consciousness," one woman executive shot back.

Mr. Welch had walked into a spinning turbine fan blade.

"Of course women need to perform to advance," Alison Quirk, an executive vice president at the investment firm State Street Corp., said later. "But we can all do more to help people understand their unconscious biases."

"He showed no recognition that the culture shapes the performance metrics, and the culture is that of white men," another executive said.

Dee Dee Myers, a former White House press secretary who is now with Glover Park Group, a communications firm, added: "While he seemed to acknowledge the value of a diverse workforce, he didn't seem to think it was necessary to develop strategies for getting there—and especially for taking a cold, hard look at some of the subtle barriers to women's advancement that still exist. If objective performance measures were enough, more than a handful of Fortune 500 senior executives would already be women."

"This meritocracy fiction may be the single biggest obstacle to women's advancement," added Lisa Levey, a consultant who heard Mr. Welch speak.

Mr. Welch has sparked controversy in the past with his view of the workplace. In 2009, he told a group of human-resources managers: "There's no such thing as work-life balance." Instead, "there are work-life choices, and you make them, and they have consequences." Step out of the arena to raise kids, and don't be surprised if the promotion passes you by.

Of the Fortune 500 companies, only 3% have a female CEO today. Female board membership is similarly sparse. A survey of 60 major companies by McKinsey shows women occupying 53% of entry-level positions, 40% of manager positions, and only 19% of C-suite jobs.

The reasons for this are complex and aren't always about child rearing. A separate McKinsey survey showed that among women who have already reached the status of successful executive, 59% don't aspire to one of the top jobs. The majority of these women have already had children.

"Their work ethic—these people are doing it all," said Dominic Barton of McKinsey. "They say, 'I'm the person turning off the lights'" at the end of the day.

Instead, Mr. Barton said, it's "the soft stuff, the culture" that's shaping their career decisions.

The group of women executives who wrestled with Mr. Welch were at a conference on Women in the Economy held by The Wall Street Journal this week. Among other things, they tackled the culture questions—devising strategies to get more high-performing women to the top, keep women on track during childbearing years, address bias, and make the goals of diversity motivating to employees. They also discussed the sexual harassment some women still experience in the workplace. (A report on the group's findings will be published in the Journal Monday.)

The realm of the "soft stuff" may not be Mr. Welch's favored zone. During his remarks, he referred to human resources as "the H.R. teams that are out there, most of them for birthdays and picnics." He mentioned a women's forum inside GE that he says attracted 500 participants. "The best of the women would come to me and say, 'I don't want to be in a special group. I'm not in the victim's unit. I'm a star. I want to be compared with the best of your best.'"

And then he addressed the audience: "Stop lying about it. It's true. Great women get upset about getting into the victim's unit."

Individual mentoring programs, meanwhile, are "one of the worst ideas that ever came along," he said. "You should see everyone as a mentor."

He had this advice for women who want to get ahead: Grab tough assignments to prove yourself, get line experience, and embrace serious performance reviews and the coaching inherent in them.

"Without a rigorous appraisal system, without you knowing where you stand...and how you can improve, none of these 'help' programs that were up there are going to be worth much to you," he said. Mr. Welch said later that the appraisal "is the best way to attack bias" because the facts go into the document, which both parties have to sign.

Mr. Welch championed the business philosophy of "Six Sigma" at GE, a strategy that seeks to expunge defects from production through constant review and improvement. It appears to work with machines and business processes.

But applying that clinical procedure to the human character, as Mr. Welch seems to want to do, is a stickier proposition.

"His advice was not tailored to how women can attain parity in today's male-dominated workplace," said one female board member of a Fortune 500 company. Indeed, a couple of women walked out in frustration during his presentation.

Wednesday, May 2, 2012

Dynamic Loan Loss Provisioning: Simulations on Effectiveness and Guide to Implementation

Dynamic Loan Loss Provisioning: Simulations on Effectiveness and Guide to Implementation. By Torsten Wezel, Jorge A. Chan Lau, and Francesco Columba
IMF Working Paper No. 12/110
May 01, 2012
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25885.0

Summary: This simulation-based paper investigates the impact of different methods of dynamic provisioning on bank soundness and shows that this increasingly popular macroprudential tool can smooth provisioning costs over the credit cycle and lower banks’ probability of default. In addition, the paper offers an in-depth guide to implementation that addresses pertinent issues related to data requirements, calibration and safeguards as well as accounting, disclosure and tax treatment. It also discusses the interaction of dynamic provisioning with other macroprudential instruments such as countercyclical capital.

Excerpts:

Introduction

Reducing the procyclicality of the banking sector by way of macroprudential policy instruments has become a policy priority. The recent crisis has illustrated how excessive procyclicality of the banking system may activate powerful macro-financial linkages that amplify the business cycle and how increased financial instability can have large negative spillover effects onto the real sector. Moreover, research has shown that crises that included banking turmoil are among the longest and most severe of all crises.

Although there is no consensus yet on the very definition of macroprudential policy, an array of such tools, especially those of countercyclical nature, has been applied in many countries for years. But it was only during the financial crisis that powerful macro-financial linkages played out on a global scale, conveying a sense of urgency.

In the wake of the crisis, policymakers therefore intensified their efforts to gear the macroprudential approach to financial stability towards improving banks’ capacity to absorb shocks—a consultative process that culminated in the development of the Basel III framework in December 2010 to be phased in over the coming years. In addition to improving the quality of bank capital and liquidity as well as imposing a minimum leverage ratio, this new regulatory standard introduces countercyclical capital buffers and lends support to forward-looking loan loss provisioning, which comprises dynamic provisioning (DP).

The new capital standard promotes the build-up of capital buffers in good times that can be drawn down in periods of stress, in the form of a capital conservation requirement to increase the banking sector’s resilience entering into a downturn. Part of this conservation buffer would be a countercyclical buffer that is to be activated only when there is excess credit growth so that the sector is not destabilized in the downturn. Such countercyclical capital has also been characterized as potentially cushioning the economy’s real output during a crisis (IMF, 2011). Similarly, dynamic provisioning requires banks to build a cushion of generic provisions during an upswing that can be used to cover rising specific provisions linked to loan delinquencies during the subsequent downturn.
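The build-in-good-times, draw-down-in-bad-times mechanics of DP can be illustrated with a stylized Spanish-type formula, in which the dynamic provision charged each period equals α times new credit plus β times the stock of credit, minus specific provisions. The sketch below is illustrative only — the parameter values and numbers are assumptions for exposition, not the paper's calibration:

```python
def dynamic_provision(alpha, beta, credit, credit_growth, specific_prov):
    """Stylized Spanish-type dynamic provisioning charge for one period.

    alpha : estimate of latent losses embedded in newly granted credit
    beta  : historical average ratio of specific provisions to total credit

    The charge is positive when specific provisions run below their
    cyclical average (the buffer builds) and negative when they exceed
    it (the buffer is drawn down).
    """
    return alpha * credit_growth + beta * credit - specific_prov

# Boom year: strong credit growth, few delinquencies -> buffer builds
boom = dynamic_provision(alpha=0.01, beta=0.02,
                         credit=100.0, credit_growth=10.0, specific_prov=0.5)

# Bust year: shrinking credit, high delinquencies -> buffer is drawn down
bust = dynamic_provision(alpha=0.01, beta=0.02,
                         credit=100.0, credit_growth=-5.0, specific_prov=3.5)

# Total provisioning cost (specific + dynamic) is far smoother across the
# cycle than specific provisions alone (0.5 in the boom vs 3.5 in the bust).
boom_total = 0.5 + boom
bust_total = 3.5 + bust
```

In this toy calibration, total provisioning costs in boom and bust end up nearly equal, which is exactly the smoothing of the profit and loss statement the paper attributes to DP.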

Both countercyclical capital and DP have been applied in practice. Some countries have adjusted capital regulations in different phases of the cycle to give them a more potent countercyclical impact: Brazil has used a formula to smooth capital requirements for interest rate risk in times of extreme volatility, China introduced a countercyclical capital requirement similar to the countercyclical buffer under Basel III, and India has made countercyclical adjustments in risk weights and in provisioning. DP was first introduced by Spain in 2000 and subsequently adopted in Uruguay, Colombia, Peru, and Bolivia, while other countries such as Mexico and Chile switched to provisioning based on expected loan loss. Peru is the only country to explicitly use both countercyclical instruments in combination.

The concept of DP examined in this paper is intriguing. By gradually building a countercyclical loan loss reserve in good times and then using it to cover losses as they arise in bad times, DP is able to greatly smooth provisioning costs over the cycle and thus insulate banks’ profit and loss statements in this regard. Therefore, DP may usefully complement other policies targeted more at macroeconomic aggregates. The implementation of DP can, however, be a delicate balancing exercise. The calibration is typically challenging because it requires specific data, and even if these are available, it may still be inaccurate if the subsequent credit cycle differs substantially from the previous one(s) on which the model is necessarily predicated. Over-provisioning may ensue in particular instances. This said, a careful calibration that tries to incorporate as many of the stylized facts of past credit developments as possible goes a long way in providing a sizeable cushion for banks to withstand periodic downswings.

This paper provides strong support for DP as a tool for countercyclical banking policies. Our contribution to this strand of the literature is threefold. We first recreate a hypothetical path of provisions under different DP systems based on historical data of an emerging banking market and compare the outcome to the actual situation without DP. These counterfactual simulations suggest that a well-calibrated system of DP mitigates procyclicality in provisioning costs and thus earnings and capital. Second, using Monte-Carlo simulations we show that the countercyclical buffer that DP builds typically lowers a bank’s probability of default. Finally, we offer a guide to implementation of the DP concept that seeks to clarify issues related to data requirements, choice of formula, parametrization, accounting treatment, and recalibration.
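The second exercise — Monte-Carlo evidence that the DP buffer lowers a bank's probability of default — can be mimicked in miniature. The sketch below is purely illustrative: the normal loss distribution, capital level, and buffer size are assumptions chosen for exposition, not the paper's model. A bank is counted as defaulting when simulated loan losses exhaust its capital plus any available DP reserve:

```python
import random

def simulated_pd(capital, dp_buffer, loss_mean, loss_sd, n=100_000, seed=1):
    """Monte-Carlo estimate of a bank's one-period probability of default.

    Default occurs when randomly drawn loan losses exceed capital plus
    the dynamic provisioning reserve available to absorb them.
    """
    rng = random.Random(seed)
    defaults = sum(rng.gauss(loss_mean, loss_sd) > capital + dp_buffer
                   for _ in range(n))
    return defaults / n

# Same bank, same loss distribution, with and without a DP reserve
pd_without = simulated_pd(capital=8.0, dp_buffer=0.0, loss_mean=5.0, loss_sd=2.0)
pd_with = simulated_pd(capital=8.0, dp_buffer=1.5, loss_mean=5.0, loss_sd=2.0)
```

Under these assumptions the reserve pushes the default threshold further into the tail of the loss distribution, so `pd_with` comes out well below `pd_without` — the qualitative result the paper establishes with a far richer simulation design.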

Other studies that have used counterfactual simulations based on historical data to assess the hypothetical performance under DP include Balla and McKenna (2009), Fillat and Montoriol-Garriga (2010), both using U.S. bank data, and Wezel (2010), using data for Uruguay. All studies find support for the notion that DP, when properly calibrated, can help absorb rising loan losses in a downturn and thus be a useful macroprudential tool in this regard. Some other studies (Lim et al., 2011; Peydró-Alcalde et al., 2011) even find that DP is effective in mitigating swings in credit growth, although this should not be expected of DP in general.



Conclusion

This paper has provided a thorough analysis of the merits and challenges associated with dynamic provisioning—a macroprudential tool that deserves attention from policymakers and regulators for its capacity to distribute the burden of loan impairment evenly over the credit cycle and so quench an important source of procyclicality in banking. Our simulations that apply the Spanish and Peruvian DP formulas to a full cycle of banking data of an advanced emerging market leave little doubt that the countercyclical buffer built under DP not only smoothes costs but actually bolsters financial stability by lowering banks’ PD in severe downturn conditions. We also show that for best countercyclical results DP should be tailored to the different risk exposures of individual banks and the specific circumstances of banking sectors, and we present measures such as bank-specific rates or hybrid systems combining the virtues of different formulas.

While the simple concept of providing in good times for lean years is intuitive, it has its operational challenges. When calibrating a DP system great care must be taken to keep countercyclical reserves in line with expected loan losses and so avoid insufficient buffers or excessive coverage. As many of the features and needed restrictions are not easily understood or operationalized, we offer a comprehensive primer for regulators eager to implement one of the variants of DP analyzed in the paper. The discussion of practical challenges also includes thorny issues like compliance with accounting standards. In fact, policymakers have long tended to dismiss DP on grounds that it is not legitimate from an accounting perspective and have therefore focused on other tools such as countercyclical capital. To remedy this problem, we propose ways to recalibrate the formula periodically and so keep it in line with expected loan loss. Further, while recognizing that countercyclical capital has its definite place in the macroprudential toolkit, we argue that DP acts as a first line of defense by directly shielding bank profits, thereby lowering the degree to which other countercyclical instruments are needed. However, given DP's limited impact in restraining excessive credit growth, there should be no doubt that supervisory complacency induced by DP buffers must be avoided and that DP needs to be accompanied by other macroprudential tools aimed at mitigating particular systemic risks.

Clearly, further research is needed on the interaction between DP and countercyclical capital as well as other macroprudential tools to answer the question of how they can complement one another in providing an integrated countercyclical buffer. As an early example, Saurina (2011) analyzes DP and countercyclical capital side-by-side but not their possible interaction. Another area of needed research is the impact of DP on credit cycles and other macroeconomic aggregates. Newer studies (e.g., Peydró-Alcalde et al., 2011; Chan-Lau, 2012) evaluate the implications of DP for credit availability, yet broader-based results are certainly warranted. The ongoing efforts by a number of countries towards adopting DP systems and other forms of forward-looking provisioning will provide a fertile ground for such future research.