Tuesday, May 8, 2012

Some scholars argue that top rates can be raised drastically with no loss of revenue

Of Course 70% Tax Rates Are Counterproductive. By Alan Reynolds
Some scholars argue that top rates can be raised drastically with no loss of revenue. Their arguments are flawed.
WSJ, May 7, 2012
http://online.wsj.com/article/SB10001424052702303916904577376041258476020.html


President Obama and others are demanding that we raise taxes on the "rich," and two recent academic papers that have gotten a lot of attention claim to show that there will be no ill effects if we do.

The first paper, by Peter Diamond of MIT and Emmanuel Saez of the University of California, Berkeley, appeared in the Journal of Economic Perspectives last August. The second, by Mr. Saez, along with Thomas Piketty of the Paris School of Economics and Stefanie Stantcheva of MIT, was published by the National Bureau of Economic Research three months later. Both suggested that federal tax revenues would not decline even if the rate on the top 1% of earners were raised to 73%-83%.

Can the apex of the Laffer Curve—which shows that the revenue-maximizing tax rate is not the highest possible tax rate—really be that high?

The authors arrive at their conclusion through an unusual calculation of the "elasticity" (responsiveness) of taxable income to changes in marginal tax rates. According to a formula devised by Mr. Saez, if the elasticity is 1.0, the revenue-maximizing top tax rate would be 40% including state and Medicare taxes. That means the elasticity of taxable income (ETI) would have to be an unbelievably low 0.2 to 0.25 if the revenue-maximizing top tax rates were 73%-83% for the top 1%. The authors of both papers reach this conclusion with creative, if wholly unpersuasive, statistical arguments.
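
For reference, the Saez formula at issue can be stated compactly. The revenue-maximizing top marginal rate is

$$\tau^{*} = \frac{1}{1 + a\,e},$$

where e is the elasticity of taxable income and a is the Pareto parameter of the top-income distribution. The value a ≈ 1.5 for the U.S. is assumed here because it reproduces the figures in the text: with e = 1.0, the formula gives 1/2.5 = 40%, and a revenue-maximizing rate of 73% requires e = (1/0.73 − 1)/1.5 ≈ 0.25.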

Most of the older elasticity estimates are for all taxpayers, regardless of income. Thus a recent survey of 30 studies by the Canadian Department of Finance found that "The central ETI estimate in the international empirical literature is about 0.40."

But the ETI for all taxpayers is going to be lower than for higher-income earners, simply because people with modest incomes and modest taxes are not willing or able to vary their income much in response to small tax changes. So the real question is the ETI of the top 1%.

Harvard's Raj Chetty observed in 2009 that "The empirical literature on the taxable income elasticity has generally found that elasticities are large (0.5 to 1.5) for individuals in the top percentile of the income distribution." In that same year, Treasury Department economist Bradley Heim estimated that the ETI is 1.2 for incomes above $500,000 (the top 1% today starts around $350,000).

A 2010 study by Anthony Atkinson (Oxford) and Andrew Leigh (Australian National University) about changes in tax rates on the top 1% in five Anglo-Saxon countries came up with an ETI of 1.2 to 1.6. In a 2000 book edited by University of Michigan economist Joel Slemrod ("Does Atlas Shrug?"), Robert A. Moffitt (Johns Hopkins) and Mark Wilhelm (Indiana) estimated an elasticity of 1.76 to 1.99 for gross income. And at the bottom of the range, Mr. Saez in 2004 estimated an elasticity of 0.62 for gross income for the top 1%.

A midpoint between the estimates would be an elasticity for gross income of 1.3 for the top 1%, and presumably an even higher elasticity for taxable income (since taxpayers can claim larger deductions if tax rates go up).

But let's stick with an ETI of 1.3 for the top 1%. This implies that the revenue-maximizing top marginal rate would be 33.9% for all taxes, and below 27% for the federal income tax.
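
These figures can be checked directly. Here is a minimal sketch in Python, again assuming the Pareto parameter a = 1.5, a value consistent with the article's numbers but not stated in it:

```python
# Sketch: revenue-maximizing top tax rate via the Saez formula
# tau* = 1 / (1 + a * e). The Pareto parameter a = 1.5 is an assumed
# value consistent with the article's figures, not taken from it.

def revenue_maximizing_rate(eti: float, pareto_a: float = 1.5) -> float:
    """Revenue-maximizing top marginal rate, all taxes combined."""
    return 1.0 / (1.0 + pareto_a * eti)

for eti in (0.25, 1.0, 1.3):
    print(f"ETI = {eti:4}: top rate = {revenue_maximizing_rate(eti):.1%}")
# ETI = 0.25: top rate = 72.7%   (the ~73% claim)
# ETI =  1.0: top rate = 40.0%
# ETI =  1.3: top rate = 33.9%   (the article's counter-estimate)
```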

To avoid reaching that conclusion, Messrs. Diamond and Saez's 2011 paper ignores all studies of elasticity among the top 1%, and instead chooses a midpoint of 0.25 between one uniquely low estimate of 0.12 for gross income among all taxpayers (from a 2004 study by Mr. Saez and Jonathan Gruber of MIT) and the 0.40 ETI norm from 30 other studies.

That made-up estimate of 0.25 is the sole basis for the claim by Messrs. Diamond and Saez in their 2011 paper that tax rates could reach 73% without losing revenue.

The Saez-Piketty-Stantcheva paper does not confound a lowball estimate for all taxpayers with a midpoint estimate for the top 1%. On the contrary, the authors say that "the long-run total elasticity of top incomes with respect to the net-of-tax rate is large."

Nevertheless, to cut this "large" elasticity down, the authors begin by combining the U.S. with 17 other affluent economies, telling us that elasticity estimates for top incomes are lower for Europe and Japan. The resulting mélange—an 18-country "overall elasticity of around 0.5"—has zero relevance to U.S. tax policy.

Still, it is twice as large as the ETI of Messrs. Diamond and Saez, so the three authors appear compelled to further pare their 0.5 estimate down to 0.2 in order to predict a "socially optimal" top tax rate of 83%. Using "admittedly only suggestive" evidence, they assert that only 0.2 of their 0.5 ETI can be attributed to real supply-side responses to changes in tax rates.

The other three-fifths of the ETI can just be ignored, according to Messrs. Saez and Piketty, and Ms. Stantcheva, because it is the result of, among other factors, easily plugged tax loopholes resulting from lower rates on corporations and capital gains.

Plugging these so-called loopholes, they say, requires "aligning the tax rates on realized capital gains with those on ordinary income" and enacting "neutrality in the effective tax rates across organizational forms." In plain English: Tax rates on U.S. corporate profits, dividends and capital gains must also be 83%.

This raises another question: At that level, would there be any profits, capital gains or top incomes left to tax?

"The optimal top tax," the three authors also say, "actually goes to 100% if the real supply-side elasticity is very small." If anyone still imagines the proposed "socially optimal" tax rates of 73%-83% on the top 1% would raise revenues and have no effect on economic growth, what about that 100% rate?

Mr. Reynolds is a senior fellow with the Cato Institute and the author of "Income and Wealth" (Greenwood Press, 2006).

Bank Capitalization as a Signal. By Daniel C. Hardy

Bank Capitalization as a Signal. By Daniel C. Hardy
IMF Working Paper No. 12/114
May 2012
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25894.0

Summary: The level of a bank's capitalization can effectively transmit information about its riskiness and therefore support market discipline, but asymmetric information may induce exaggerated or distortionary behavior: banks may vie with one another to signal confidence in their prospects by keeping capitalization low, and banks' creditors often cannot distinguish among them - tendencies that can be seen across banks and across time. Prudential policy is warranted to help offset these tendencies.

Friday, May 4, 2012

Women, Welch Clash at Forum - "Great women get upset about getting into the victim's unit"

Women, Welch Clash at Forum. By John Bussey
Wall Street Journal, May 4, 2012, page B1
http://online.wsj.com/article/SB10001424052702303877604577382321364803912.html

Is Jack Welch a timeless seer or an out-of-touch warhorse?

The former Master and Commander of General Electric still writes widely on business strategy. He's also influential on the speaking circuit.

On Wednesday, Mr. Welch and his wife and writing partner, Suzy Welch, told a gathering of women executives from a range of industries that, in matters of career track, it is results and performance that chart the way. Programs promoting diversity, mentorships and affinity groups may or may not be good, but they are not how women get ahead. "Over deliver," Mr. Welch advised. "Performance is it!"

Angry murmurs ran through the crowd. The speakers asked: Were there any questions?

"We're regaining our consciousness," one woman executive shot back.

Mr. Welch had walked into a spinning turbine fan blade.

"Of course women need to perform to advance," Alison Quirk, an executive vice president at the investment firm State Street Corp., said later. "But we can all do more to help people understand their unconscious biases."

"He showed no recognition that the culture shapes the performance metrics, and the culture is that of white men," another executive said.

Dee Dee Myers, a former White House press secretary who is now with Glover Park Group, a communications firm, added: "While he seemed to acknowledge the value of a diverse workforce, he didn't seem to think it was necessary to develop strategies for getting there—and especially for taking a cold, hard look at some of the subtle barriers to women's advancement that still exist. If objective performance measures were enough, more than a handful of Fortune 500 senior executives would already be women."

"This meritocracy fiction may be the single biggest obstacle to women's advancement," added Lisa Levey, a consultant who heard Mr. Welch speak.

Mr. Welch has sparked controversy in the past with his view of the workplace. In 2009, he told a group of human-resources managers: "There's no such thing as work-life balance." Instead, "there are work-life choices, and you make them, and they have consequences." Step out of the arena to raise kids, and don't be surprised if the promotion passes you by.

Of the Fortune 500 companies, only 3% have a female CEO today. Female board membership is similarly spare. A survey of 60 major companies by McKinsey shows women occupying 53% of entry-level positions, 40% of manager positions, and only 19% of C-suite jobs.

The reasons for this are complex and aren't always about child rearing. A separate McKinsey survey showed that among women who have already reached the status of successful executive, 59% don't aspire to one of the top jobs. The majority of these women have already had children.

"Their work ethic—these people are doing it all," said Dominic Barton of McKinsey. "They say, 'I'm the person turning off the lights'" at the end of the day.

Instead, Mr. Barton said, it's "the soft stuff, the culture" that's shaping their career decisions.

The group of women executives who wrestled with Mr. Welch were at a conference on Women in the Economy held by The Wall Street Journal this week. Among other things, they tackled the culture questions—devising strategies to get more high-performing women to the top, keep women on track during childbearing years, address bias, and make the goals of diversity motivating to employees. They also discussed the sexual harassment some women still experience in the workplace. (A report on the group's findings will be published in the Journal Monday.)

The realm of the "soft stuff" may not be Mr. Welch's favored zone. During his remarks, he referred to human resources as "the H.R. teams that are out there, most of them for birthdays and picnics." He mentioned a women's forum inside GE that he says attracted 500 participants. "The best of the women would come to me and say, 'I don't want to be in a special group. I'm not in the victim's unit. I'm a star. I want to be compared with the best of your best.'"

And then he addressed the audience: "Stop lying about it. It's true. Great women get upset about getting into the victim's unit."

Individual mentoring programs, meanwhile, are "one of the worst ideas that ever came along," he said. "You should see everyone as a mentor."

He had this advice for women who want to get ahead: Grab tough assignments to prove yourself, get line experience, and embrace serious performance reviews and the coaching inherent in them.

"Without a rigorous appraisal system, without you knowing where you stand...and how you can improve, none of these 'help' programs that were up there are going to be worth much to you," he said. Mr. Welch said later that the appraisal "is the best way to attack bias" because the facts go into the document, which both parties have to sign.

Mr. Welch championed the business philosophy of "Six Sigma" at GE, a strategy that seeks to expunge defects from production through constant review and improvement. It appears to work with machines and business processes.

But applying that clinical procedure to the human character, as Mr. Welch seems to want to do, is a stickier proposition.

"His advice was not tailored to how women can attain parity in today's male-dominated workplace," said one female board member of a Fortune 500 company. Indeed, a couple of women walked out in frustration during his presentation.

Wednesday, May 2, 2012

Dynamic Loan Loss Provisioning: Simulations on Effectiveness and Guide to Implementation

Dynamic Loan Loss Provisioning: Simulations on Effectiveness and Guide to Implementation. By Torsten Wezel, Jorge A. Chan Lau, and Francesco Columba
IMF Working Paper No. 12/110
May 01, 2012
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25885.0

Summary: This simulation-based paper investigates the impact of different methods of dynamic provisioning on bank soundness and shows that this increasingly popular macroprudential tool can smooth provisioning costs over the credit cycle and lower banks’ probability of default. In addition, the paper offers an in-depth guide to implementation that addresses pertinent issues related to data requirements, calibration and safeguards as well as accounting, disclosure and tax treatment. It also discusses the interaction of dynamic provisioning with other macroprudential instruments such as countercyclical capital.

Excerpts:

Introduction

Reducing the procyclicality of the banking sector by way of macroprudential policy instruments has become a policy priority. The recent crisis has illustrated how excessive procyclicality of the banking system may activate powerful macro-financial linkages that amplify the business cycle and how increased financial instability can have large negative spillover effects onto the real sector. Moreover, research has shown that crises that included banking turmoil are among the longest and most severe of all crises.

Although there is no consensus yet on the very definition of macroprudential policy, an array of such tools, especially those of countercyclical nature, has been applied in many countries for years. But it was only during the financial crisis that powerful macro-financial linkages played out on a global scale, conveying a sense of urgency.

In the wake of the crisis, policymakers therefore intensified their efforts to gear the macroprudential approach to financial stability towards improving banks’ capacity to absorb shocks—a consultative process that culminated in the development of the Basel III framework in December 2010 to be phased in over the coming years. In addition to improving the quality of bank capital and liquidity as well as imposing a minimum leverage ratio, this new regulatory standard introduces countercyclical capital buffers and lends support to forward-looking loan loss provisioning, which comprises dynamic provisioning (DP).

The new capital standard promotes the build-up of capital buffers in good times that can be drawn down in periods of stress, in the form of a capital conservation requirement to increase the banking sector’s resilience entering into a downturn. Part of this conservation buffer would be a countercyclical buffer that is to be activated only when there is excess credit growth so that the sector is not destabilized in the downturn. Such countercyclical capital has also been characterized as potentially cushioning the economy’s real output during a crisis (IMF, 2011). Similarly, dynamic provisioning requires banks to build a cushion of generic provisions during an upswing that can be used to cover rising specific provisions linked to loan delinquencies during the subsequent downturn.

Both countercyclical capital and DP have been applied in practice. Some countries have adjusted capital regulations in different phases of the cycle to give them a more potent countercyclical impact: Brazil has used a formula to smooth capital requirements for interest rate risk in times of extreme volatility, China introduced a countercyclical capital requirement similar to the countercyclical buffer under Basel III, and India has made countercyclical adjustments in risk weights and in provisioning. DP was first introduced by Spain in 2000 and subsequently adopted in Uruguay, Colombia, Peru, and Bolivia, while other countries such as Mexico and Chile switched to provisioning based on expected loan loss. Peru is the only country to explicitly use both countercyclical instruments in combination.

The concept of DP examined in this paper is intriguing. By gradually building a countercyclical loan loss reserve in good times and then using it to cover losses as they arise in bad times, DP is able to greatly smooth provisioning costs over the cycle and thus insulate banks’ profit and loss statements in this regard. Therefore, DP may usefully complement other policies targeted more at macroeconomic aggregates. The implementation of DP can, however, be a delicate balancing exercise. The calibration is typically challenging because it requires specific data, and even if these are available, it may still be inaccurate if the subsequent credit cycle differs substantially from the previous one(s) on which the model is necessarily predicated. Over-provisioning may ensue in particular instances. This said, a careful calibration that tries to incorporate as many of the stylized facts of past credit developments as possible goes a long way in providing a sizeable cushion for banks to withstand periodic downswings.
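
To make the smoothing mechanism concrete, here is a minimal stylized sketch of a Spanish-type DP rule of the form used in this literature, under which the generic (countercyclical) provision flow is α·ΔC + β·C − SP, where C is the credit stock and SP the specific provisions of the period. The parameters and the credit path below are illustrative assumptions, not the paper's calibration:

```python
# Stylized Spanish-type dynamic provisioning: generic provision flow
# = alpha * dC + beta * C - SP, accumulated into a countercyclical fund.
# While the fund lasts, the total P&L charge equals alpha*dC + beta*C,
# i.e. it is smooth and independent of the loss cycle. All parameters
# and paths are illustrative, not the paper's calibration.

alpha, beta = 0.02, 0.008
credit = [100, 115, 132, 145, 148, 140, 134]              # stylized cycle
sp_rate = [0.002, 0.002, 0.003, 0.004, 0.012, 0.020, 0.015]

fund = 0.0
for t in range(1, len(credit)):
    dC = credit[t] - credit[t - 1]
    sp = sp_rate[t] * credit[t]                 # specific provisions
    generic = alpha * dC + beta * credit[t] - sp
    generic = max(generic, -fund)               # fund cannot go negative
    fund += generic
    charge = sp + generic                       # smoothed P&L charge
    print(f"t={t}: SP={sp:5.2f}  generic={generic:+5.2f}  "
          f"fund={fund:5.2f}  total charge={charge:5.2f}")
```

In the boom years the generic flow is positive and the fund builds; in the bust it turns negative and the fund absorbs part of the rising specific provisions, smoothing the total charge until the buffer is exhausted.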

This paper provides strong support for DP as a tool for countercyclical banking policies. Our contribution to this strand of the literature is threefold. We first recreate a hypothetical path of provisions under different DP systems based on historical data of an emerging banking market and compare the outcome to the actual situation without DP. These counterfactual simulations suggest that a well-calibrated system of DP mitigates procyclicality in provisioning costs and thus earnings and capital. Second, using Monte-Carlo simulations we show that the countercyclical buffer that DP builds typically lowers a bank’s probability of default. Finally, we offer a guide to implementation of the DP concept that seeks to clarify issues related to data requirements, choice of formula, parametrization, accounting treatment, and recalibration.
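
The paper's second exercise can likewise be illustrated with a toy Monte Carlo experiment: a default occurs when period losses exceed capital plus any DP buffer, so the buffer mechanically lowers the simulated default frequency. The loss distribution and balance-sheet figures here are illustrative assumptions, not the paper's calibration:

```python
# Toy Monte Carlo: a DP buffer lowers the simulated probability of
# default (PD). Loss distribution and capital levels are illustrative.
import random

random.seed(0)
CAPITAL, TRIALS = 8.0, 100_000

def simulated_pd(dp_buffer: float) -> float:
    defaults = sum(
        1 for _ in range(TRIALS)
        if random.lognormvariate(1.0, 1.0) > CAPITAL + dp_buffer
    )
    return defaults / TRIALS

print(f"PD without DP buffer: {simulated_pd(0.0):.2%}")
print(f"PD with DP buffer:    {simulated_pd(2.0):.2%}")
```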

Other studies that have used counterfactual simulations based on historical data to assess the hypothetical performance under DP include Balla and McKenna (2009), Fillat and Montoriol-Garriga (2010), both using U.S. bank data, and Wezel (2010), using data for Uruguay. All studies find support for the notion that DP, when properly calibrated, can help absorb rising loan losses in a downturn and thus be a useful macroprudential tool in this regard. Some other studies (Lim et al., 2011; Peydró-Alcalde et al., 2011) even find that DP is effective in mitigating swings in credit growth, although this should not be expected of DP in general.



Conclusion

This paper has provided a thorough analysis of the merits and challenges associated with dynamic provisioning—a macroprudential tool that deserves attention from policymakers and regulators for its capacity to distribute the burden of loan impairment evenly over the credit cycle and so quench an important source of procyclicality in banking. Our simulations that apply the Spanish and Peruvian DP formulas to a full cycle of banking data of an advanced emerging market leave little doubt that the countercyclical buffer built under DP not only smoothes costs but actually bolsters financial stability by lowering banks’ PD in severe downturn conditions. We also show that for best countercyclical results DP should be tailored to the different risk exposures of individual banks and the specific circumstances of banking sectors, presenting measures such as bank-specific rates or hybrid systems combining the virtues of formulas.

While the simple concept of providing in good times for lean years is intuitive, it has its operational challenges. When calibrating a DP system great care must be taken to keep countercyclical reserves in line with expected loan losses and so avoid insufficient buffers or excessive coverage. As many of the features and needed restrictions are not easily understood or operationalized, we offer a comprehensive primer for regulators eager to implement one of the variants of DP analyzed in the paper. The discussion of practical challenges also includes thorny issues like compliance with accounting standards. In fact, policymakers have long tended to dismiss DP on grounds that it is not legitimate from an accounting perspective and therefore focused on other tools such as countercyclical capital. To remedy this problem, we propose ways to recalibrate the formula periodically and so keep it in line with expected loan losses. Further, while recognizing that countercyclical capital has its definite place in the macroprudential toolkit, we argue that DP acts as a first line of defense by directly shielding bank profits, thereby lowering the degree to which other countercyclical instruments are needed. However, given the limited impact of DP in restraining excessive credit growth, there should be no doubt that supervisors must not grow complacent on account of DP buffers and that DP needs to be accompanied by other macroprudential tools aimed at mitigating particular systemic risks.

Clearly, further research is needed on the interaction between DP and countercyclical capital as well as other macroprudential tools to answer the question of how they can complement one another in providing an integrated countercyclical buffer. As an early example, Saurina (2011) analyzes DP and countercyclical capital side-by-side but not their possible interaction. Another area of needed research is the impact of DP on credit cycles and other macroeconomic aggregates. Newer studies (e.g., Peydró-Alcalde et al., 2011; Chan-Lau, 2012) evaluate the implications of DP for credit availability, yet broader-based results are certainly warranted. The ongoing efforts by a number of countries towards adopting DP systems and other forms of forward-looking provisioning will provide a fertile ground for such future research.

Tuesday, May 1, 2012

Pharma: New Tufts Report Shows Academic-Industry Partnerships Are Mutually Beneficial


New Tufts Report Shows Academic-Industry Partnerships Are Mutually Beneficial 
http://www.innovation.org/index.cfm/NewsCenter/Newsletters?NID=200
April 30, 2012 -

According to a new study by the Tufts Center for the Study of Drug Development, collaboration among organizations is becoming increasingly important to advancing basic research and developing new medicines. This study specifically explores the breadth and nature of partnerships between biopharmaceutical companies and academic medical centers (AMCs),[1] which are likely to play an increasingly important role in making progress in treating unmet medical needs.

In the study, researchers examine a subset of public-private partnerships, including more than 3,000 grants to AMCs from approximately 450 biopharmaceutical company sponsors that were provided through 22 medical schools. Findings show that while it is generally accepted that these partnerships have become an increasingly common approach both to promote public health objectives and to produce healthcare innovations, it is anticipated that their nature will continue to evolve over time and their full potential is yet to be realized.

Tufts researchers also found that the nature of these relationships is varied, ever-changing, and expanding. They often involve company and AMC scientists and other researchers working side-by-side on cutting-edge science, applying advanced tools and resources. This type of innovative research has enabled the United States to advance biomedical research in a number of areas, such as the development of personalized medicines and the understanding of rare diseases.

The report outlines the 12 primary models of academic-industry collaborations and highlights other emerging models, which reflect a shift in the nature of academic-industry relationships toward more risk- and resource-sharing partnerships. While unrestricted research support has generally represented the most common form of academic-industry collaboration, Tufts research found that this model is becoming less frequently used. A range of innovative partnership models are emerging, from corporate venture capital funds to pre-competitive research centers to increasingly used academic drug discovery centers.

These collaborations occur across all aspects of drug discovery and the partnerships benefit both industry and academia since they provide the opportunity for the leading biomedical researchers in both sectors to work together to explore new technologies and scientific discoveries. Such innovation in both the science and technology has the potential to treat the most challenging diseases and conditions facing patients today.

According to Tufts, “[t]he industry is funding and working collaboratively with the academic component of the public sector on basic research that contributes broadly across the entire spectrum of biomedical R&D, not just for products in its portfolio.” In conclusion, the report notes that in the face of an increasingly challenging R&D environment and overall global competition, we are likely to witness the continued proliferation of AMC-industry partnerships.



[1] C.P. Milne, et al., “Academic-Industry Partnerships for Biopharmaceutical Research & Development: Advancing Medical Science in the U.S.,” Tufts Center for the Study of Drug Development, April 2012.

Tuesday, April 24, 2012

From Bail-out to Bail-in: Mandatory Debt Restructuring of Systemic Financial Institutions

From Bail-out to Bail-in: Mandatory Debt Restructuring of Systemic Financial Institutions. By Zhou, Jian-Ping; Rutledge, Virginia; Bossu, Wouter; Dobler, Marc; Jassaud, Nadege; Moore, Michael
IMF Staff Discussion Notes No. 12/03
April 24, 2012            
ISBN/ISSN: 978-1-61635-392-6 / 2221-030X
Stock No: SDNETEA2012003
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25858.0

Excerpts

Executive Summary

Large-scale government support of the financial institutions deemed too big or too important to fail during the recent crisis has been costly and has potentially increased moral hazard. To protect taxpayers from exposure to bank losses and to reduce the risks posed by too-big-to-fail (TBTF) institutions, various reform initiatives have been undertaken at both national and international levels, including expanding resolution powers and tools.

One example is bail-in, which is a statutory power of a resolution authority (as opposed to contractual arrangements, such as contingent capital requirements) to restructure the liabilities of a distressed financial institution by writing down its unsecured debt and/or converting it to equity. The statutory bail-in power is intended to achieve a prompt recapitalization and restructuring of the distressed institution. This paper studies its effectiveness in restoring the viability of distressed institutions, discusses potential risks when a bail-in power is activated, and proposes design features to mitigate these risks. The main conclusions are:

1. As a going-concern form of resolution, bail-in could mitigate the systemic risks associated with disorderly liquidations, reduce deleveraging pressures, and preserve asset values that might otherwise be lost in a liquidation. With a credible threat of stock elimination or dilution by debt conversion and assumption of management by resolution authorities, financial institutions may be incentivized to raise capital or restructure debt voluntarily before the triggering of the bail-in power.

2. However, if the use of a bail-in power is perceived by the market as a sign of the concerned institution’s insolvency, it could trigger a run by short-term creditors and aggravate the institution’s liquidity problem. Ideally, therefore, bail-in should be activated when a capital infusion is expected to restore a distressed financial institution to viability, with official liquidity support as a backstop until the bank is stabilized.

3. Bail-in is not a panacea and should be considered as one element of a comprehensive solution to the TBTF problem. It should supplement, not replace, other resolution tools that would allow for an orderly closure of a failed institution.

4. Most importantly, the bail-in framework needs to be carefully designed to ensure its effective implementation.
  • The triggers for bail-in power should be consistent with those used for other resolution tools. They should be set at the point when a firm would have breached the regulatory minima but before it became balance-sheet insolvent. To make bail-in a transparent tool, its scope should be limited to (i) elimination of existing equity shares as a precondition for a bail-in; and (ii) conversion of, and haircuts on, subordinated and unsecured senior debt. Debt restructuring under a bail-in should take into account the order of priorities applicable in a liquidation.
  • A clear and coherent legal framework for bail-in is essential. The legal framework needs to be designed to establish an appropriate balance between the rights of private stakeholders and the public policy interest in preserving financial stability. Debt restructuring ideally would not be subject to creditor consent, but a “no creditor worse off” test may be introduced to safeguard creditors’ and shareholders’ interests. The framework also needs to provide mechanisms for addressing issues associated with the bail-in of debt issued by an entity of a larger banking group and with the cross-border operations of that entity or banking group.
  • The contribution of new capital will come from debt conversion and/or an issuance of new equity, with an elimination or significant dilution of the pre-bail-in shareholders. Bail-in will need to be accompanied by mechanisms to ensure the suitability of new shareholders. Some measures (e.g., a floor price for debt/equity conversion) might be necessary to reduce the risk of a “death spiral” in share prices.
  • It may be necessary to impose minimum requirements on banks for issuing unsecured debt or to set limits on the encumbrance of assets (which have been introduced by many advanced countries). This would help reassure the market that a bail-in would be sufficient to recapitalize the distressed institution, thus forestalling potential runs by short-term creditors and averting a downward share-price spiral. The framework should also include measures to mitigate contagion risks to other systemic financial institutions, for example, by limiting their cross-holding of unsecured senior debt.

Conclusions

Bail-in power needs to be considered as an additional and complementary tool for the resolution of SIFIs. Bail-in is a statutory power of a resolution authority, as opposed to contractual arrangements, such as contingent capital requirements. It involves recapitalization through relatively straightforward mandatory debt restructuring and could therefore avoid some of the operational and legal complexities that arise when using other tools (such as P&A transactions), which require transferring assets and liabilities between different legal entities and across borders. By restoring the viability of a distressed SIFI, the pressure on the institution to post more collateral, for example against its repo contracts, could be significantly reduced, thereby minimizing liquidity risks and preventing runs by short-term creditors.
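
As a stylized illustration of the mechanics described here, the sketch below applies the note's sequencing (eliminate or dilute equity, then convert contingent capital, subordinated debt, and unsecured senior debt) to a toy balance sheet. All figures and the 8% target ratio are illustrative assumptions, not numbers from the note:

```python
# Stylized bail-in waterfall: convert liabilities, from most to least
# junior, into new equity until a target capital ratio is restored.
# Balance-sheet figures and the target ratio are illustrative.

def bail_in(assets, equity, layers, target_ratio):
    """layers: ordered (name, amount) pairs, most junior first."""
    shortfall = target_ratio * assets - equity
    converted = []
    for name, amount in layers:
        if shortfall <= 0:
            break
        take = min(amount, shortfall)   # convert only what is needed
        converted.append((name, take))
        equity += take                  # debt written into new equity
        shortfall -= take
    return equity, converted

layers = [("contingent capital", 2.0),
          ("subordinated debt", 3.0),
          ("senior unsecured debt", 10.0)]
equity, converted = bail_in(assets=100.0, equity=1.0,
                            layers=layers, target_ratio=0.08)
print(equity)     # 8.0
print(converted)  # senior unsecured is only partially converted (2.0)
```

Note that conversion leaves total assets unchanged; liabilities are simply reclassified as equity, which is what makes bail-in a going-concern tool.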

The design and implementation of a bail-in power, however, need to take into careful consideration its potential market impact and its implications for financial stability. It is especially important that the triggering of a bail-in power is not perceived by the market as a sign of the concerned institution’s non-viability, a perception that could trigger a run by short-term creditors and aggravate the institution’s liquidity problem. An effective bail-in framework generally includes the following key design elements:

  • The scope of the statutory power should be limited to (i) eliminating or diluting existing shareholders; and (ii) writing down or converting, in the following order, any contractual contingent capital instruments, subordinated debt, and unsecured senior debt, accompanied by the power of the resolution authority to change bank management.
  • The triggers for bail-in power should be consistent with those used for other resolution tools and set at the point when an institution would have breached the regulatory minima but before it became balance-sheet insolvent, to allow for a prompt response to an SIFI’s financial distress. The intervention criteria (a combination of quantitative and qualitative assessments) need to be as transparent and predictable as possible to avoid market uncertainty.
  • It may be necessary to require banks or bank holding companies to maintain a minimum amount of unsecured liabilities (as a percentage of total liabilities) beforehand, which could be subject to bail-in afterwards. This would help reassure the market that bail-in is sufficient to recapitalize the distressed institution and restore its viability, thus reducing the risk of runs by short-term creditors.
  • To fund potential liquidity outflows, and given the probable temporary loss of market access, bail-in may need to be coupled with adequate official liquidity assistance.
  • Bail-in needs to be considered as one element of a comprehensive framework that includes effective supervision to reduce the likelihood of bank failures and an effective overall resolution framework that allows for an orderly resolution of a failed SIFI, facilitated by up-to-date recovery and resolution plans. In general, statutory bail-in should be used in instances where a capital infusion is likely to restore a distressed financial institution to viability, possibly because, other than a lack of capital, the institution is viable and has a decent business model and good risk-management systems. Otherwise, bail-in capital could simply delay the inevitable failure.

Central Bank Independence and Macro-prudential Regulation. By Kenichi Ueda & Fabian Valencia

Central Bank Independence and Macro-prudential Regulation. By Kenichi Ueda & Fabian Valencia
IMF Working Paper No. 12/101
Apr 2012
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25872.0

Summary: We consider the optimality of various institutional arrangements for agencies that conduct macro-prudential regulation and monetary policy. When a central bank is in charge of price and financial stability, a new time inconsistency problem may arise. Ex-ante, the central bank chooses the socially optimal level of inflation. Ex-post, however, the central bank chooses inflation above the social optimum to reduce the real value of private debt. This inefficient outcome arises when macro-prudential policies cannot be adjusted as frequently as monetary policy. Importantly, this result arises even when the central bank is politically independent. We then consider the role of political pressures in the spirit of Barro and Gordon (1983). We show that if either the macro-prudential regulator or the central bank (or both) are not politically independent, separation of price and financial stability objectives does not deliver the social optimum.

Excerpts

Introduction

A growing literature based on models where pecuniary externalities reinforce shocks in the aggregate advocates the use of macro-prudential regulation (e.g. Bianchi (2010), Bianchi and Mendoza (2010), Jeanne and Korinek (2010), and Jeanne and Korinek (2011)). Most research in this area has focused on understanding the distortions that lead to financial amplification and to assess their quantitative importance. The natural next question is how to implement macro-prudential regulation.

Implementing macro-prudential policy requires, among other things, figuring out the optimal institutional design. In this context, there is an intense policy debate about the desirability of formally assigning the central bank responsibility for financial stability. This debate has spurred interest in studying the interactions between monetary and macro-prudential policies with the objective of understanding the conflicts and synergies that may arise from different institutional arrangements.

This paper contributes to this debate by exploring the circumstances under which it may be suboptimal to have the central bank in charge of macro-prudential regulation. We differ from a rapidly expanding literature on macro-prudential and monetary interactions, including De Paoli and Paustian (2011) and Quint and Rabanal (2011), mainly in that our focus is on the potential time-inconsistency problems that can arise, which are not addressed in existing work. Our departure point is the work pioneered by Kydland and Prescott (1977) and Barro and Gordon (1983), who studied how time-inconsistency problems and political pressures distort the monetary authority’s incentives under various institutional arrangements. In our model, there are two stages. In the first stage, the policymaker (possibly a single institution or several) makes simultaneous monetary policy and macro-prudential regulation decisions. In the second stage, monetary policy decisions can be revised or “fine-tuned” after the realization of a credit shock. This setup captures the fact that macro-prudential regulation is intended to be used preemptively; once a credit shock (boom or bust) has taken place, it can do little to change the stock of debt. Monetary policy, on the other hand, can be used ex-ante and ex-post.

The key finding of the paper is that a dual-mandate central bank is not socially optimal. In this setting, a time inconsistency problem arises. While it is ex-ante optimal for the dual-mandate central bank to deliver the socially optimal level of inflation, it is not so ex-post. This central bank has the ex-post incentive to reduce the real burden of private debt through inflation, similar to the incentives to monetize public sector debt studied in Calvo (1978) and Lucas and Stokey (1983).  This outcome arises because ex-post the dual-mandate central bank has only one tool, monetary policy, to achieve financial and price stability.
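
A stylized one-period version of this ex-post problem (the notation is ours, not the paper's) makes the mechanism transparent. Suppose that, once macro-prudential policy is locked in, the dual-mandate central bank minimizes

$$L = \tfrac{1}{2}\pi^{2} + \tfrac{\lambda}{2}\left(d - \theta\pi\right)^{2},$$

where π is inflation (with the social optimum normalized to zero), d is the real burden of private debt, θ the degree to which inflation erodes that burden, and λ the weight on financial stability. The first-order condition gives

$$\pi^{\text{ex post}} = \frac{\lambda\theta d}{1 + \lambda\theta^{2}} > 0,$$

so for any positive debt burden the bank inflates above the social optimum once regulation can no longer be adjusted.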

We then examine the role of political factors with a simple variation of our model in the spirit of Barro and Gordon (1983). We find that the above result prevails if policy is conducted by politically independent institutions. However, when institutions are not politically independent (the central bank, the macro-prudential regulator, or both), neither separate institutions nor a combination of objectives in a single institution delivers the social optimum. As in Barro and Gordon (1983), the non-independent institution will use its policy tool at hand to try to generate economic expansions. The non-independent central bank will use monetary policy for this purpose and the non-independent macro-prudential regulator will use regulation. Which arrangement generates lower welfare losses in the case of non-independence depends on parameter values. A calibration of the model using parameter values from the literature suggests, however, that a regime with a non-independent dual-mandate central bank almost always delivers a worse outcome than a regime with a non-independent but separate macro-prudential regulator.

Finally, if the only distortion of concern is political interference (i.e., ignoring the time-inconsistency problem highlighted earlier), all that is needed to achieve the social optimum is political independence, with separation or combination of objectives yielding the same outcome. From a policy perspective, our analysis suggests that a conflict between price and financial stability objectives may arise if pursued by a single institution. Our results also extend the earlier findings by Barro and Gordon (1983) and many others on political independence of the central bank to show that these results are also applicable to a macro-prudential regulator. We should note that we have abstracted from considering the potential synergies that may arise in having dual-mandate institutions. For instance, benefits from information sharing and use of central bank expertise may mitigate the welfare losses we have shown may arise (see Nier, Osinski, Jácome and Madrid (2011)), although information sharing would also benefit fiscal and monetary interactions. However, we have also abstracted from other aspects that could exacerbate the welfare loss, such as loss of reputation.


Conclusions

We consider macro-prudential regulation and monetary policy interactions to investigate the welfare implications of different institutional arrangements. In our framework, monetary policy can re-optimize following a realization of credit shocks, but macro-prudential regulation cannot be adjusted immediately after the credit shock. This feature of the model captures the ability to adjust monetary policy more frequently than macro-prudential regulation, because macro-prudential regulation is an ex-ante tool, whereas monetary policy can be used ex-ante and ex-post. In this setting, a central bank with a price and financial stability mandate does not deliver the social optimum because of a time-inconsistency problem. This central bank finds it optimal ex-ante to deliver the socially optimal level of inflation, but it does not do so ex-post. This is because the central bank finds it optimal ex-post to let inflation rise to repair private balance sheets, since ex-post it has only monetary policy to do so. Achieving the social optimum in this case requires separating the price and financial stability objectives.

We also consider the role of political independence of institutions, as in Barro and Gordon (1983). Under this extension, separation of price and financial stability objectives delivers the social optimum only if both institutions are politically independent. If the central bank or the macro-prudential regulator (or both) are not politically independent, they would not achieve the social optimum. Numerical analysis in our model suggests, however, that in most cases a non-independent macro-prudential regulator (with an independent monetary authority) delivers a better outcome than a non-independent central bank in charge of both price and financial stability.

Wednesday, April 18, 2012

Principles for financial market infrastructures, assessment methodology and disclosure framework

CPSS Publications No 101
April 2012

Final version of the Principles for financial market infrastructures

The report Principles for financial market infrastructures contains new and more demanding international standards for payment, clearing and settlement systems, including central counterparties. Issued by the CPSS and the International Organization of Securities Commissions (IOSCO), the  new standards (called "principles") are designed to ensure that the infrastructure supporting global financial markets is more robust and thus well placed to withstand financial shocks.

The principles apply to all systemically important payment systems, central securities depositories, securities settlement systems, central counterparties and trade repositories (collectively "financial market infrastructures"). They replace the three existing sets of international standards set out in the Core principles for systemically important payment systems (CPSS, 2001); the Recommendations for securities settlement systems (CPSS-IOSCO, 2001); and the Recommendations for central counterparties (CPSS-IOSCO, 2004). CPSS and IOSCO have strengthened and harmonised these three sets of standards by raising minimum requirements, providing more detailed guidance and broadening the scope of the standards to cover new risk-management areas and new types of FMIs.

The principles were issued for public consultation in March 2011. The finalised principles being issued now have been revised in light of the comments received during that consultation.

CPSS and IOSCO members will strive to adopt the new standards by the end of 2012. Financial market infrastructures (FMIs) are expected to observe the standards as soon as possible.

Consultation versions of an assessment methodology and disclosure framework

At the same time as publishing the final version of the principles, CPSS and IOSCO have issued two related documents for public consultation, namely an assessment methodology and a disclosure framework for these new principles.

Comments on these two documents are invited from all interested parties and should be sent by 15 June 2012 to both the CPSS secretariat (cpss@bis.org) and the IOSCO secretariat (fmi@iosco.org). The comments will be published on the websites of the Bank for International Settlements (BIS) and IOSCO unless commentators request otherwise. After the consultation period, the CPSS and IOSCO will review the comments received and publish final versions of the two documents later in 2012.

Other documents

A cover note that explains the background to the three documents above and sets out some specific points on the two consultation documents on which the committees are seeking comments during the public consultation period is also available.

A summary note that provides background on the report and an overview of its contents is also available.

Saturday, April 14, 2012

America's Voluntary Standards System--A "Best Practice" Model for Innovation Policy?

America's Voluntary Standards System--A "Best Practice" Model for Innovation Policy? By Dieter Ernst
East-West Center, Apr 2012
http://www.eastwestcenter.org/publications/americas-voluntary-standards-system-best-practice-model-innovation-policy

For its proponents, America's voluntary standards system is a "best practice" model for innovation policy. Foreign observers, however, are concerned about possible drawbacks of a standards system that is largely driven by the private sector. There are doubts, especially in Europe and China, whether the American system can balance public and private interests in times of extraordinary national and global challenges to innovation. To assess the merits of these conflicting perceptions, the paper reviews the historical roots of the American voluntary standards system, examines its current defining characteristics, and highlights its strengths and weaknesses. On the positive side, a tradition of decentralized local self-government has given voice to diverse stakeholders in innovation, avoiding the pitfalls of top-down government-centered standards systems. However, a lack of effective coordination of multiple stakeholder strategies tends to constrain effective and open standardization processes, especially in the management of essential patents and in the timely provision of interoperability standards. To correct these drawbacks of the American standards system, the government has an important role to play as an enabler, coordinator, and, if necessary, an enforcer of the rules of the game in order to prevent abuse of market power by companies with large accumulated patent portfolios. The paper documents the ups and downs of the Federal Government’s role in standardization, and examines current efforts to establish robust public-private standards development partnerships, focusing on the Smart Grid Interoperability project coordinated by the National Institute of Standards and Technology (NIST). In short, countries that seek to improve their standards systems should study the strengths and weaknesses of the American system. However, persistent differences in economic institutions, levels of development and growth models are bound to limit convergence to a US-style market-led voluntary standards system.

BCBS: Implementation of stress testing practices by supervisors

Implementation of stress testing practices by supervisors: Basel Committee publishes peer review
BCBS
April 13, 2012
http://www.bis.org/press/p120413.htm

The Basel Committee on Banking Supervision has today published a peer review of the implementation by national supervisory authorities of the Basel Committee's principles for sound stress testing practices and supervision.

Stress testing is an important tool used by banks to identify the potential for unexpected adverse outcomes across a range of risks and scenarios. In 2009, the Committee reviewed the performance of stress testing practices during the financial crisis and published recommendations for banks and supervisors entitled Principles for sound stress testing practices and supervision. The guidance set out a comprehensive set of principles for the sound governance, design and implementation of stress testing programmes at banks, as well as high-level expectations for the role and responsibilities of supervisors.

As part of its mandate to assess the implementation of standards across countries and to foster the promotion of good supervisory practice, the Committee's Standards Implementation Group (SIG) conducted a peer review during 2011 of supervisory authorities' implementation of the principles. The review found that stress testing has become a key component of the supervisory assessment process as well as a tool for contingency planning and communication. Countries are, however, at varying stages of maturity in the implementation of the principles; as a result, more work remains to be done to fully implement the principles in many countries.

Overall, the review found the 2009 stress testing principles to be generally effective. The Committee, however, will continue to monitor implementation of the principles and determine whether, in the future, additional guidance might be necessary.

Friday, April 13, 2012

Conference on macrofinancial linkages and their policy implications

Bank of Korea - Bank for International Settlements - International Monetary Fund: joint conference concludes on macrofinancial linkages and their policy implications
April 12, 2012
http://www.bis.org/press/p120412.pdf
 
The Bank of Korea, the Bank for International Settlements and the International Monetary Fund have today brought to a successful conclusion their joint conference on "Macrofinancial linkages: Implications for monetary and financial stability policies". Held on April 10-11 in Seoul, Korea, the event brought together central bankers, regulators and researchers to discuss a variety of topics related to interactions between the financial system and the real economy. The goal of the conference was to promote a continuing dialogue on the policy implications of recent research findings.

The conference programme included the presentation and discussion of research on the following issues:
  • Banks, shadow banks and the macroeconomy;
  • Bank liquidity regulation;
  • The macroeconomic impact of regulatory measures;
  • Macroprudential policies in theory and in practice;
  • Monetary policy and financial stability.

Efforts to recast monetary and financial stability policies to reduce the frequency and severity of financial crises have focused attention on the interactions between the financial system and the macroeconomy. The crisis demonstrated that financial system weaknesses can have sudden and long-lasting macroeconomic effects.

The conference concluded with a panel discussion chaired by Stephen Cecchetti (BIS), and including Jun Il Kim (Bank of Korea), Jan Brockmeijer (IMF), Hiroshi Nakaso (Bank of Japan), and David Fernandez (JP Morgan). The panel discussion focused on the lessons or guideposts for the formulation and implementation of macroprudential and monetary policies that can be drawn from the intensive research efforts on macrofinancial issues in recent years, as well as on the empirical evidence on the effectiveness of policy measures. The roundtable also included a discussion of weaknesses in our understanding of macrofinancial linkages and touched on priorities for future research, analysis, and continuing cooperation between central banks, regulatory authorities, international organisations and academics.

Introducing the conference, Choongsoo Kim, Governor of the Bank of Korea, said, "Since major countries' measures to reform financial regulations, including Basel III of the BCBS, focus mostly on the prevention of crisis recurrence, we need to continuously monitor and track how these measures will affect the sustainability of world economic growth in the medium- and long-term. In doing so, we should be careful so that the strengthening of financial regulation does not weaken the benign function of finance, which is to drive the growth of the real economy through seamless financial intermediation. Moreover, in today's more closely interconnected world economy, the strengthening of financial regulation with a primary focus on advanced countries does not equally affect the financial system in emerging market countries with their significantly different financial structure. Hence, in examining the implementation of regulations, an in-depth analysis should be conducted of how these regulations will affect the financial industries of emerging market countries and all other countries other than the advanced economies and their careful monitoring is called for."

Stephen Cecchetti, Economic Adviser and Head of the BIS Monetary and Economic Department, remarked that "It is important that we continue to learn about the mechanisms through which financial regulation helps to stabilize the economic and financial system. We are not only exploring the effectiveness of existing tools, but also working to fashion new ones. Doing this means refining the intellectual framework, including both the theoretical models and empirical analysis, that forms the basis for macroprudential policy and microprudential policy, as well as conventional and unconventional monetary policy. The papers presented and discussed in this conference are part of the foundation of this new and essential stability-oriented policy framework."

Jan Brockmeijer, Deputy Director of the IMF Monetary and Capital Markets Department, added that "All the institutions involved in developing macroprudential policy frameworks are on a learning curve both with regard to monitoring systemic risks and in using tools to limit such risks. In such circumstances, sharing of views and experiences is crucial to identifying best practices and moving up the learning curve quickly. The Fund is eager to help its members in this regard, and the conference co-organised by the Fund is one way to serve this purpose."

Wednesday, April 11, 2012

IMF Global Financial Stability Report: Risks of stricter prudential regulations

IMF Global Financial Stability Report
Apr 2012
http://www.imf.org/External/Pubs/FT/GFSR/2012/01/index.htm

Chapter 3 of the April 2012 Global Financial Stability Report probes the implications of recent reforms in the financial system for market perception of safe assets. Chapter 4 investigates the growing public and private costs of increased longevity risk from aging populations.

Excerpts from Ch. 3, Safe Assets: Financial System Cornerstone?:

In the future, there will be rising demand for safe assets, but fewer of them will be available, increasing the price for safety in global markets.  In principle, investors evaluate all assets based on their intrinsic characteristics. In the absence of market distortions, asset prices tend to reflect their underlying features, including safety. However, factors external to asset markets—including the required use of specific assets in prudential regulations, collateral practices, and central bank operations—may preclude markets from pricing assets efficiently, distorting the price of safety. Before the onset of the global financial crisis, regulations, macroeconomic policies, and market practices had encouraged the underpricing of safety. Some safety features are more accurately reflected now, but upcoming regulatory and market reforms and central bank crisis management strategies, combined with continued uncertainty and a shrinking supply of assets considered safe, will increase the price of safety beyond what would be the case without such distortions.

The magnitude of the rise in the price of safety is highly uncertain [...]

However, it is clear that market distortions pose increasing challenges to the ability of safe assets to fulfill all their various roles in financial markets. [...] For banks, the common application of zero percent regulatory risk weights on debt issued by their own sovereigns, irrespective of risks, created perceptions of safety detached from underlying economic risks and contributed to the buildup of demand for such securities. [...]

[...] Although regulatory reforms to make institutions safer are clearly needed, insufficient differentiation across eligible assets to satisfy some regulatory requirements could precipitate unintended cliff effects—sudden price drops when some safe assets become unsafe and no longer satisfy various regulatory criteria. Moreover, the burden of mispriced safety across types of investors may be uneven. For instance, prudential requirements could lead to stronger pressures in the markets for shorter-maturity safe assets, with greater impact on investors with higher potential allocations at shorter maturities, such as banks.
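A back-of-the-envelope calculation makes the cliff-effect mechanism concrete. The Python sketch below is purely illustrative (the eligibility set, the 40-basis-point safety discount, and the bond's terms are all invented, and this is not the GFSR's methodology): a bond's yield carries a safety discount only while its rating keeps it eligible for regulatory buffers, so a one-notch downgrade across the threshold produces a discontinuous price drop with no change in the bond's cash flows.

    # Stylized "cliff effect": a bond loses its safety premium the moment it
    # falls out of a hypothetical regulatory eligibility set. Illustrative only.

    def bond_price(coupon, face, years, y):
        """Present value of a fixed annual-coupon bond at yield y."""
        pv = sum(coupon / (1 + y) ** t for t in range(1, years + 1))
        return pv + face / (1 + y) ** years

    ELIGIBLE = {"AAA", "AA"}     # hypothetical buffer-eligibility set
    SAFETY_DISCOUNT = 0.004      # 40bp lower yield while eligible (invented)

    def market_price(rating, base_yield=0.05):
        y = base_yield - (SAFETY_DISCOUNT if rating in ELIGIBLE else 0.0)
        return bond_price(coupon=5.0, face=100.0, years=10, y=y)

    for rating in ("AA", "A"):   # a one-notch downgrade across the threshold
        print(rating, round(market_price(rating), 2))
    # AA prices at about 103.15, A at par (100.00): a ~3% drop driven entirely
    # by lost regulatory eligibility, not by any change in cash flows.

In practice the discount would erode gradually as markets anticipate the downgrade, but the regulatory boundary still concentrates the repricing at a single point.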

Money and Collateral, by Manmohan Singh & Peter Stella

Money and Collateral, by Manmohan Singh & Peter Stella
IMF Working Paper No. 12/95
Apr 2012
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25851.0

Summary: Between 1980 and the onset of the recent crisis, the ratio of financial market debt to liquid assets rose exponentially in the U.S. (and in other financial markets), reflecting in part the greater use of securitized assets to collateralize borrowing. The subsequent crisis has reduced the pool of assets considered acceptable as collateral, resulting in a liquidity shortage. In trying to address this shortage, policy makers will need to consider concepts of liquidity besides the traditional metric of excess bank reserves, and to do more than merely substitute central bank money for collateral that currently remains highly liquid.

Excerpts:

Introduction

In the traditional view of a banking system, credit and money are largely counterparts to each other on different sides of the balance sheet. In the process of maturity transformation, banks are able to create liquid claims on themselves, namely money, which is the counterpart to the less liquid loans or credit. Owing to the law of large numbers, banks have, for centuries, been able to conduct this business safely with relatively little in liquid reserves, as long as basic confidence in the soundness of the bank portfolio is maintained.
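The "traditional view" sketched here rests on textbook money-multiplier arithmetic. With r the ratio of liquid reserves banks hold against deposits and c the public's currency-to-deposit ratio, broad money M and the monetary base B are linked by

    \[
    M = m B, \qquad m = \frac{1 + c}{r + c},
    \]

so that, for example, r = 0.05 and c = 0.10 give m = 1.10 / 0.15 ≈ 7.3: a small reserve cushion supports a much larger stock of money-like claims, which is why confidence in the bank portfolio is doing the heavy lifting.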

In recent decades, with the advent of securitization and electronic means of trading and settlement, it became possible to greatly expand the scope of assets that could be transformed directly, through their use as collateral, into highly liquid or money-like assets. The expansion in the scope of the assets that could be securitized was in part facilitated by the growth of the shadow financial system, which was largely unregulated, and by the ability to borrow from non-deposit sources. This meant deposits no longer equaled credit (Schularick and Taylor, 2008). The justification for light-touch or no regulation of this new market was that collateralization was sufficient (and of high quality) and that market forces would ensure appropriate risk taking and dispersion, placing risks, often tailor-made to their demands, with those educated investors best able to bear them. Where regulation fell short was in failing to recognize the growing interconnectedness of the shadow and regulated sectors, and the growing tail risk that sizable leverage entailed (Gennaioli, Shleifer and Vishny, 2011).

Post-Lehman, there has been a disintermediation process leading to a fall in the money multiplier, related to the shortage of collateral (Singh 2011). This has real effects: deleveraging is more pronounced because less collateral is available. Section II of the paper focuses on money as legal tender and the money multiplier; we then introduce the adjusted money multiplier. Section III discusses collateral, including tail-risk collateral. Section IV tries to bridge the money and collateral aspects from a “safe assets” angle. Section V introduces collateral chains and describes the economics behind the private pledged collateral market. Section VI brings the monetary and collateral issues together under an overall financial lubrication framework. In our conclusion (Section VII) we offer a useful basis for understanding monetary policy in the current environment.
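Section VI's "financial lubrication" framework can be previewed with a stylized calculation in the spirit of Singh's collateral-velocity work: total lubrication is conventional money plus pledged collateral, where pledged collateral equals the primary source collateral times the number of times it is re-used along collateral chains. The Python sketch below uses invented magnitudes, not the paper's estimates:

    # Toy "financial lubrication" calculation in the spirit of Singh & Stella.
    # All magnitudes are illustrative, not the paper's data.

    def lubrication(broad_money, source_collateral, velocity):
        """Money plus pledged collateral: source collateral re-used `velocity` times."""
        pledged = source_collateral * velocity
        return broad_money + pledged

    M2 = 10.0        # broad money, $ trillions (illustrative)
    source = 3.0     # primary collateral from hedge funds, custodians, etc. (illustrative)

    pre_crisis = lubrication(M2, source, velocity=3.0)   # longer re-pledging chains
    post_crisis = lubrication(M2, source, velocity=2.0)  # chains shorten after Lehman

    print(pre_crisis, post_crisis)   # 19.0 vs 16.0
    # A drop in collateral velocity from 3 to 2 removes as much "lubrication"
    # as a $3 trillion fall in broad money would, with bank reserves unchanged.

This is why the paper argues that excess reserves alone are a misleading liquidity metric: the collateral term can contract even while central bank money expands.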



Conclusion

“Monetary” policy is currently being undertaken in uncharted territory and may change some fundamental assumptions that link monetary and macro-financial policies. Central banks are considering whether and how to augment the apparently ‘failed’ transmission mechanism and in so doing will need to consider the role that collateral plays as financial lubrication (see also Debelle, 2012). Swaps of “good” for “bad” collateral may become part of the standard toolkit. If so, the fiscal aspects and risks associated with such policies—which are virtually nil in conventional QE swaps of central bank money for treasuries—are important and cannot be ignored. Furthermore, the issue of institutional accountability and authority to engage in such operations touches at the heart of central bank independence in a democratic society.

These fundamental questions concerning new policy tools and institutional design have arisen at the same time as developed countries have issued massive amounts of new debt. Although the traditional bogeyman of pure seigniorage financing, that is, massive monetary purchases of government debt, may have disappeared from the dark corners of central banks, this does not imply that inflation has been forever arrested. Thus a central bank may “stand firm” yet witness rises in the price level that occur to “align the market value of government debt to the value of its expected real backing.” Hence current concerns as to the potential limitations fiscal policy places on monetary policy are well founded, and indeed are novel only to those unfamiliar with similar concerns raised for decades in emerging and developing countries, as well as in the “mature” markets before World War II.
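The quoted mechanism, price-level rises that "align the market value of government debt to the value of its expected real backing", is the fiscal theory of the price level. In its simplest form, with B_{t-1} the nominal debt outstanding, P_t the price level, β a real discount factor, and s real primary surpluses, the government valuation equation reads

    \[
    \frac{B_{t-1}}{P_{t}} = E_{t} \sum_{j=0}^{\infty} \beta^{j} s_{t+j},
    \]

so a fall in expected surpluses with an unchanged nominal debt stock forces P_t up even if the central bank "stands firm", which is precisely the concern the authors describe.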

Thursday, April 5, 2012

IMF Background Material for its Assessment of China under the Financial Sector Assessment Program

IMF Releases Background Material for its Assessment of China under the Financial Sector Assessment Program
Press Release No. 12/123
April 5, 2012

A joint International Monetary Fund (IMF) and World Bank assessment of China's financial system was undertaken during 2010 under the Financial Sector Assessment Program (FSAP). The Financial System Stability Assessment (FSSA) report, which is the main IMF output of the FSAP process, was discussed by the Executive Board of the IMF at the time of the annual Article IV discussion in July 2011.

The FSSA report was published on Monday, November 14, 2011. As background for the FSSA, comprehensive assessments of the financial regulatory infrastructure were undertaken by the FSAP mission, and Detailed Assessment Reports of China's observance of international financial standards were prepared during the FSAP exercise. At the request of the Chinese authorities, these five reports are being released today.

The documents published are as follows:

Detailed Assessment of Observance Reports
  1. Observance of Basel Core Principles for Effective Banking Supervision
  2. Observance of IAIS Insurance Core Principles
  3. Observance of IOSCO Objectives and Principles of Securities Regulation
  4. Observance of CPSS Core Principles for Systemically Important Payment Systems
  5. Observance of CPSS-IOSCO Recommendations for Securities Settlement Systems and Central Counterparties

The FSAP is a comprehensive and in-depth analysis of a country’s financial sector. The FSAP findings provide inputs to the IMF’s broader surveillance of its member countries’ economies, known as Article IV consultations. The focus of the FSAP assessments is to gauge the stability of the financial sector and to assess its potential contribution to growth. To assess financial stability, an FSAP examines the soundness of the banks and other financial institutions, conducts stress tests, rates the quality of financial regulation and supervision against accepted international standards, and evaluates the ability of country authorities to intervene effectively in case of a financial crisis. Assessments in developing and emerging market countries are done by the IMF jointly with the World Bank; those in advanced economies are done by the IMF alone.

This is the first time the Chinese financial system has undergone an FSAP assessment.

Since the FSAP was launched in 1999, more than 130 countries have volunteered to undergo these examinations (many countries more than once), with another 35 or so currently underway or in the pipeline. Following the recent global financial crisis, demand for FSAP assessments has been rising, and all G-20 countries have made a commitment to undergo regular assessments.

For additional information on the program, see the Factsheet and FAQs.

Original link: http://www.imf.org/external/np/sec/pr/2012/pr12123.htm

Management Tips from the Wall Street Journal

Management Tips from the Wall Street Journal


Developing a Leadership Style

        Leadership Styles
        What do Managers do?
        Leadership in a Crisis – How To Be a Leader
        What are the Common Mistakes of New Managers?
        What is the Difference Between Management and Leadership?
        How Can Young Women Develop a Leadership Style?

Managing Your People

        How to Motivate Workers in Tough Times
        Motivating Employees
        How to Manage Different Generations
        How to Develop Future Leaders
        How to Reduce Employee Turnover
        Should I Rank My Employees?
        How to Keep Your Most Talented People
        Should I Use Email?
        How to Write Memos

Recruiting, Hiring and Firing

        Conducting Employment Interviews – Hiring How To
        How to Hire New People
        How to Make Layoffs
        What are Alternatives to Layoffs?
        How to Reduce Employee Turnover
        Should I Rank My Employees?
        How to Keep Your Most Talented People

Building a Workplace Culture

        How to Increase Workplace Diversity
        How to Create a Culture of Candor
        How to Change Your Organization’s Culture
        How to Create a Culture of Action in the Workplace

Strategy

        What is Strategy?
        How to Set Goals for Employees
        What Management Strategy Should I Use in an Economic Downturn?
        What is Blue Ocean Strategy?

Execution

        What are the Keys to Good Execution?
        How to Create a Culture of Action in the Workplace

Innovation

        How to Innovate in a Downturn
        How to Change Your Organization’s Culture
        What is Blue Ocean Strategy?

Managing Change

        How to Motivate Workers in Tough Times
        Leadership in a Crisis – How To Be a Leader
        What Management Strategy Should I Use in an Economic Downturn?
        How to Change Your Organization’s Culture

guides.wsj.com/management/

Sunday, April 1, 2012

Encouraging workers to keep track of what they're doing can make them healthier and more productive

Employees, Measure Yourselves. By H. James Wilson
Encouraging workers to keep track of what they're doing can make them healthier and more productive. The Wall Street Journal, Apr 2012
http://online.wsj.com/article/SB10001424052970204520204577249691204802060.html

Imagine how much better workers could do their jobs if they knew exactly how they spend their day.

Suppose they could get a breakdown of how much time they spend actually working on their computer, as opposed to surfing the Web. Suppose they could tell how much an afternoon workout boosts their productivity, or how much a stressful meeting raises their heart rate.

Thanks to a new wave of technologies called auto-analytics, they can do just that. These devices—from computer software and smartphone apps to gadgets that you wear—let users gather data about what they do at work, analyze that information and use it to do their job better. They give workers a fascinating window into the unseen, unconscious little things that can make such a big difference in their daily work lives. And by encouraging workers to start tracking their own activities—something many already are doing on their own—companies can end up with big improvements in job performance, satisfaction and possibly even well-being.

The key word here is encouragement. It is not the same as insistence. Bosses should be careful to stay out of workers' way, letting employees experiment at their own pace and find their own solutions. They should offer them plenty of privacy safeguards along the way. Too much managerial interference could make the programs seem like Big Brother and dissuade workers from signing on. There's a big difference between employees wanting to measure themselves, and bosses demanding it.

Here's a look at three areas of auto-analytics that are gaining followers in the workplace—and that merit encouragement from managers.



Tracking Screen Time
Many companies monitor what their employees are doing on the computer all day, by watching network traffic or even taking screenshots at random times. But all that oversight is designed to make sure people aren't slacking off; it doesn't help them figure out how to do their jobs better. And besides, a lot of workers probably think it's kind of creepy to have someone watching over their shoulder.

On the other hand, workers are a lot more comfortable with close scrutiny when they're the ones doing the watching.

People are signing on in droves to a new technology called knowledge workload tracking—recording how you use your computer. Software like RescueTime measures things like how long you spend on an open window, how long you're idle and how often you switch from one window to another. The software turns all those measurements into charts so you can see where you're spending your time. From there, you can set up automatic alerts to keep yourself away from distractions; you might send yourself a message if you, say, spend too much time on Twitter.
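A minimal version of this kind of tracking is easy to sketch. The Python below is a toy, not RescueTime's actual product or API; the log format, the app names, and the 20-minute Twitter budget are invented for illustration. It aggregates window-focus events into time per application and fires an alert when a distracting site exceeds a daily budget.

    # Toy knowledge-workload tracker: aggregate window-focus logs and flag
    # distractions. Log format and thresholds are invented for illustration.
    from collections import defaultdict

    # (app_or_site, seconds_in_focus) events, e.g. exported from a focus logger
    events = [
        ("editor", 1500), ("twitter.com", 700), ("editor", 2400),
        ("email", 600), ("twitter.com", 650), ("terminal", 900),
    ]

    DISTRACTION_BUDGET = {"twitter.com": 1200}  # max seconds per day (invented)

    totals = defaultdict(int)
    for app, seconds in events:
        totals[app] += seconds

    for app, spent in sorted(totals.items(), key=lambda kv: -kv[1]):
        print(f"{app:12s} {spent / 60:5.1f} min")
        budget = DISTRACTION_BUDGET.get(app)
        if budget is not None and spent > budget:
            print(f"  ALERT: {app} exceeded its {budget // 60}-minute budget")

The same per-app totals are what the charts described above are drawn from; the alert rule is the self-imposed nudge, not employer surveillance.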

Programs like these also let you look a lot deeper into your behavior. One employee I observed saw that he got a lot more done when he switched tasks at set intervals. So he had the software remind him to change things up every 20 minutes. (He also set up an algorithm that suggested the best activity to do next.)

Another employee, a programmer, thought his online chats were eating into his work time. So he tested the theory: He looked at how long he spent chatting during certain periods, then looked at how much code he wrote during those times. But in fact, the more he talked, the more code he wrote. Gabbing online with colleagues and customers helped his work.

Managers should encourage experiments and help workers get the ball rolling. They might, for instance, find workers who got good results from the software and have them give presentations to other employees.

Again, though, companies need to use a light touch in encouraging employees: Many workers might be reluctant to track what they do if they think the company might get access to the information, or use it against them. Companies should emphasize that this type of software usually comes with lots of privacy controls. Workers can often store their data in the cloud, for instance, or locally on their machines. In some cases, they can pause tracking and delete pieces of personal data they choose. Likewise, they can also create a list of sites that they want to track by name and label all the other sites they visit as generic.



Collecting Thoughts

Tracking clicks and keystrokes is one thing. But another set of tools goes one step deeper and lets employees track their mental performance—and maybe even improve it.

These tools come in a variety of styles. For example, there's Lumosity, from Lumos Labs Inc., an online system that serves up games employees can play during downtime at work. The games promise to develop memory, thinking speed, attention and problem-solving abilities.

You might have to sort a batch of words into two piles depending on whether or not they follow a certain rule. Or you might be presented with two equations and have to figure if the one on the left is greater than, less than or equal to the one on the right. The software will feed you tougher challenges once you've mastered one level of difficulty.

So far, that might not sound much different than other games you might play at the office. (Minesweeper, anyone?) The difference is tracking. The games offer a scorecard of your performance and let you follow changes in performance over time, so you can see if you're getting better or backsliding. You can also choose what skills you want to improve. If you're having trouble remembering things, for instance, you might ask for memory-boosting games. So, while it may seem like just another game, it can home in on skills you're trying to sharpen for work—and improve them.
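Behind the scenes, the "tougher challenges once you've mastered one level" behavior is typically a simple staircase rule: difficulty rises after consecutive successes and falls after a miss. A minimal sketch follows (Python; the 2-up/1-down rule and the starting level are invented for illustration and are not Lumosity's actual algorithm):

    # Toy staircase difficulty adjuster; not Lumosity's actual algorithm.
    def adjust_difficulty(level, streak, correct, up_after=2):
        """Raise level after `up_after` consecutive correct answers; drop on a miss."""
        if correct:
            streak += 1
            if streak >= up_after:
                return level + 1, 0
            return level, streak
        return max(1, level - 1), 0

    level, streak = 3, 0
    for answer in [True, True, True, False, True, True]:
        level, streak = adjust_difficulty(level, streak, answer)
        print(level, streak)
    # Level climbs to 4 on the streak, falls back to 3 after the miss, then
    # climbs again, keeping the player near the edge of their ability.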

Another set of tools promises to help with a couple of age-old problems: forgetting ideas or the context in which you thought of them (or having so many of them you can't decide which will work best for the task at hand).

The method, called cognitive mapping, powers software like TheBrain, from TheBrain Technologies LP. When you get an idea related to work, you type it into the software on your desktop or mobile device. You place it near related ideas by clicking on a visual map that shows clusters of concepts grouped together by category like constellations on your screen.

Let's say your job is designing products for a household-goods company, and you get an idea about a new kind of sponge. You might click on the cluster of ideas for kitchen-cleaning products, which covers mops and paper towels as well as sponges. Then you'd click on the smaller cluster of ideas about sponges and type in your new notion. You'd also be able to attach things like links to websites, photos and meeting notes.

Later on, if you need to come up with some ideas in a particular area, you might type in a few search terms to see the thoughts you've had on the topic and the clusters of ideas and information you originally associated with those terms. Thus, you not only have a historical record of your thoughts, but also detailed insight into the context in which they were created.
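Under the hood, a cognitive map of this kind is essentially a tagged graph with keyword search. A minimal sketch follows (Python; the class design is invented for illustration and is not TheBrain's actual data model, though the sponge example follows the article):

    # Minimal cognitive-map sketch: tagged ideas with attachments and search.
    # Invented design for illustration; not TheBrain's actual data model.
    from dataclasses import dataclass, field

    @dataclass
    class Idea:
        text: str
        cluster: str
        attachments: list = field(default_factory=list)  # links, photos, notes

    class CognitiveMap:
        def __init__(self):
            self.ideas = []

        def add(self, text, cluster, attachments=()):
            self.ideas.append(Idea(text, cluster, list(attachments)))

        def search(self, term):
            term = term.lower()
            return [i for i in self.ideas
                    if term in i.text.lower() or term in i.cluster.lower()]

    brain = CognitiveMap()
    brain.add("self-wringing sponge", cluster="kitchen-cleaning/sponges",
              attachments=["http://example.com/sketch", "meeting-notes.txt"])
    brain.add("reusable paper towel", cluster="kitchen-cleaning/towels")

    for idea in brain.search("sponge"):
        print(idea.text, "->", idea.cluster, idea.attachments)

Because each idea carries its cluster and attachments, a search returns not just the thought but the context it was filed in, which is the point of the method.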

As with knowledge workload tracking, employers should encourage workers to use these systems and give them freedom to experiment. But companies can probably be more active in pushing these products, since they don't have the same Big Brother associations as tracking work. So managers might buy subscriptions for influential employees who can help seed interest across the company. If they think it's warranted, managers might even buy companywide subscriptions, as they do for other types of software.




The Physical Side

There's one area where employers are already doing a lot to encourage workers to track themselves: company-sponsored wellness programs. More than two-thirds of companies around the world run wellness programs, and self-tracking tools are fast becoming a common feature.

Usually, the third-party companies that manage the programs give workers tracking devices that can synch up with an external database through a smartphone or work computer. That way, employees can crunch their own data and come up with options for improving health and job performance.

For instance, you might wear a device like Jawbone's UP wristband, which tracks sleep quantity and quality. You could then analyze your data to see how different amounts of sleep affect your work. Do you close more sales on days when you get more quality sleep? Or do you post better numbers when you sacrifice some shut-eye to entertain clients until all hours?
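Answering that kind of question is a simple correlation exercise once the data are exported. A minimal sketch follows (Python; the eight days of data are fabricated for illustration, and a real analysis would want many more observations and proper controls):

    # Toy sleep-vs-performance analysis. Data are fabricated for illustration.

    hours_of_quality_sleep = [5.0, 6.5, 7.5, 8.0, 6.0, 7.0, 5.5, 8.5]
    sales_closed           = [1,   2,   4,   4,   2,   3,   1,   5]

    def pearson_r(xs, ys):
        """Pearson correlation coefficient of two equal-length samples."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        vx = sum((x - mx) ** 2 for x in xs)
        vy = sum((y - my) ** 2 for y in ys)
        return cov / (vx * vy) ** 0.5

    r = pearson_r(hours_of_quality_sleep, sales_closed)
    print(f"correlation between quality sleep and sales closed: r = {r:.2f}")
    # A strongly positive r favors the early night; a negative r would favor
    # entertaining clients until all hours.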

Another approach is tracking how your body works over the course of a workday with a tool such as the emWave2, from HeartMath LLC, which monitors your pulse. You can then look at your stats on a desktop dashboard to see, for instance, what sorts of situations cause you the most stress. The program can then recommend ways to reduce anxiety, such as breathing techniques that can help you reduce your heart rate during a big presentation.

Tracking things at this intimate level might set off all sorts of alarm bells for workers. Many might wonder if an employer could get hold of the information and use it against them. So bosses should ensure that workers have the chance to encrypt or otherwise protect their data.


Mr. Wilson is a senior researcher at Babson Executive Education.

Thursday, March 29, 2012

Revisiting Risk-Weighted Assets

Revisiting Risk-Weighted Assets. By Vanessa Le Leslé & Sofiya Avramova
IMF Working Paper No. 12/90
Mar 2012
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25807.0

Summary: In this paper, the authors provide an overview of the concerns surrounding the variations in the calculation of risk-weighted assets (RWAs) across banks and jurisdictions and how this might undermine the Basel III capital adequacy framework. They discuss the key drivers behind the differences in these calculations, drawing upon a sample of systemically important banks from Europe, North America, and Asia Pacific. Then, the authors discuss a range of policy options that could be explored to fix the actual and perceived problems with RWAs, and improve the use of risk-sensitive capital ratios.


Introduction

Strengthening capital ratios is a key priority in the aftermath of the global financial crisis.  Increasing the quantity, quality, and transparency of capital is of paramount importance to restore the banking sector to health. Recent regulatory reforms have primarily focused on improving the numerator of capital ratios, while changes to the denominator, i.e., risk-weighted assets (RWAs), have been more limited.

Why look at RWAs now? Confidence in reported RWAs is ebbing. Market participants question the reliability and comparability of capital ratios, and contend that banks may not be as strong as they are portrayed by risk-based capital ratios. The Basel Committee recently announced it will review the measurement of RWAs and formulate policy responses to foster greater consistency across banks and jurisdictions.

The academic literature on capital is vast, but the focus on RWAs is more limited. Current studies mostly emanate from market participants, who highlight the wide variations existing in RWAs across banks. There is no convergence in views about the materiality and relative importance of these differences, and thus no consensus on policy implications.

This paper aims to shed light on the scale of the RWA variation issue and identify possible policy responses. The paper (i) discusses the importance of RWAs in the regulatory capital framework; (ii) highlights the main concerns and the controversy surrounding RWA calculations; (iii) identifies key drivers behind the differences in RWA calculations across jurisdictions and business models; and (iv) concludes with a discussion on the range of options that could be considered to restore confidence in banks’ RWA numbers.
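To see why comparability is at stake, recall that a risk-based capital ratio divides capital by RWAs, and RWAs are exposures weighted by risk weights that, under internal-model approaches, banks estimate themselves. The toy calculation below (Python; the portfolio and weights are invented for illustration) shows two banks with identical assets and identical capital reporting very different risk-based ratios, while an unweighted measure cannot distinguish them:

    # Toy illustration of RWA variability: identical portfolios and capital,
    # different model-driven risk weights. All numbers are invented.

    portfolio = {"mortgages": 60.0, "corporate_loans": 30.0, "sovereigns": 10.0}
    capital = 5.0  # same capital for both banks

    # Hypothetical internal-model risk weights (fractions of exposure)
    bank_a_weights = {"mortgages": 0.35, "corporate_loans": 1.00, "sovereigns": 0.0}
    bank_b_weights = {"mortgages": 0.15, "corporate_loans": 0.60, "sovereigns": 0.0}

    def capital_ratio(weights):
        rwa = sum(portfolio[k] * weights[k] for k in portfolio)
        return capital / rwa

    total_assets = sum(portfolio.values())
    print(f"Bank A risk-based ratio: {capital_ratio(bank_a_weights):.1%}")  # ~9.8%
    print(f"Bank B risk-based ratio: {capital_ratio(bank_b_weights):.1%}")  # ~18.5%
    print(f"Leverage ratio (both):   {capital / total_assets:.1%}")         # 5.0%
    # Whether Bank B is genuinely safer or simply models its way to a higher
    # ratio is the question the paper's policy options are meant to answer.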

A comprehensive analysis of broader questions, such as what is the best way to measure risk or predict losses, and what is the optimal amount of capital that banks should hold per unit of risk, is beyond the scope of this study. A comparison of the respective merits of the leverage and risk-based capital ratios is also outside our discussion.


Conclusion

Perceived differences in RWAs within and across countries have eroded trust in the reliability of RWAs and capital ratios and, if not addressed, could affect the credibility of the regulatory framework in general. This paper is a first step towards shedding light on the extent and causes of RWA variability and fostering policy debate.

The paper seeks to disentangle key factors behind observed differences in RWAs, but does not quantify how much of the RWA variance can be explained by each factor. It concludes that a host of factors drive differences in RWA outputs between firms within a region and indeed across regions; many of these factors can be justified, but some less so. Differences in RWAs are not only the result of banks’ business model, risk profile, and RWA methodology (good or bad), but also the result of different supervisory practices. Aiming for full harmonization and convergence of RWA practices may not be achievable, and we would expect some differences to remain. It may be more constructive to focus on improving the transparency and understanding of outputs, and on providing common guidance on methodologies, for banks and supervisors alike.

The paper identifies a range of policy options to address the RWA issue, and contends that a multipronged approach seems the most effective path of reform. A combination of regulatory changes to the RWA regime, enhanced supervision, increased market disclosure, and more robust internal risk management may help restore confidence in RWAs and safeguard the integrity of the capital framework. Finally, the paper contends that even if RWAs are not perfect, retaining risk-sensitive capital ratios is still very important, and the latter can be backstopped by using them in tandem with unweighted capital measures.

This paper aims to encourage discussion and policy suggestions, while the Basel Committee undertakes a more extensive review of the RWA framework.

Accounting Devices and Fiscal Illusions

Accounting Devices and Fiscal Illusions. By Timothy C. Irwin
IMF Staff Discussion Note SDN/12/02
March 28, 2012
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25795.0
ISBN/ISSN: 978-1-61635-386-5 / 2221-030X

A government seeking to reduce its deficit can be tempted to replace genuine spending cuts or tax increases with accounting devices that give the illusion of change without its substance, or that make the change appear larger than it actually is. Under ideal accounting standards, this would not be possible, but in real accounting it sometimes is. For example, governments can sometimes sell assets or borrow money and count the proceeds as revenue, or defer unavoidable spending without recognizing a liability. In each case, this year’s reported deficit is reduced, but only at the expense of future deficits. The result is that the reported deficit loses some of its accuracy as a fiscal indicator.

The use of accounting stratagems cannot be eliminated, but several things can be done to reduce their use or at least bring them quickly to light. Governments can be encouraged to prepare audited financial statements—income statement, cash-flow statement, and balance sheet—according to international accounting standards, and statisticians, who in many countries use accounting data to compile the most important (“headline”) fiscal indicators, can be given the resources and independence to be both expert and impartial, as well as the authority to revise standards in the light of emerging problems. To help reveal remaining problems in headline fiscal indicators, a variety of alternative fiscal indicators can be monitored, since a problem suppressed in one fiscal indicator is likely to show up in another.  Many of the devices documented in this note would be revealed if governments also reported change in net worth and high-quality long-term forecasts of the headline indicator of the deficit under current policy. 
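A two-period example makes the mechanics concrete. The sketch below (Python; all figures invented) books the sale of an income-producing asset as revenue: the reported deficit improves this year, net worth does not, and later deficits are larger because the asset's income is gone.

    # Toy illustration of an accounting device: selling an income-producing
    # asset and booking the proceeds as revenue. All figures are invented.

    taxes, spending = 100.0, 110.0
    asset_value, asset_income = 20.0, 1.5   # asset sold at fair value

    # Year 1, honest accounting: the sale is a financing item, not revenue
    honest_deficit = spending - (taxes + asset_income)

    # Year 1, with the device: sale proceeds counted as revenue
    reported_deficit = spending - (taxes + asset_income + asset_value)

    # Year 2 onward: the asset's income stream is gone
    future_deficit = spending - taxes

    print(f"honest deficit, year 1:    {honest_deficit:+.1f}")   # +8.5
    print(f"reported deficit, year 1:  {reported_deficit:+.1f}") # -11.5, a "surplus"
    print(f"deficit, every later year: {future_deficit:+.1f}")   # +10.0
    # Net worth is unchanged by the sale (cash replaces the asset), so the
    # year-1 "improvement" is pure illusion; reporting the change in net
    # worth, as the note recommends, would expose it.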

Wednesday, March 14, 2012

Could leveraging Public Credit Registries’ information improve supervision and regulation of financial systems?

Could leveraging Public Credit Registries’ information improve supervision and regulation of financial systems? By Jane Hwang
World Bank blogs, Mar 13, 2012

http://blogs.worldbank.org/allaboutfinance/could-leveraging-public-credit-registries-information-improve-supervision-and-regulation-of-financia

Monday, March 12, 2012

Capital Regulation, Liquidity Requirements and Taxation in a Dynamic Model of Banking

Capital Regulation, Liquidity Requirements and Taxation in a Dynamic Model of Banking. By Gianni De Nicolo, Andrea Gamba and Marcella Lucchetta
IMF Working Paper No. 12/72
March 01, 2012
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25767.0

This paper studies the impact of bank regulation and taxation in a dynamic model with banks exposed to credit and liquidity risk. We find an inverted U-shaped relationship between capital requirements and bank lending, efficiency, and welfare, with their benefits turning into costs beyond a certain requirement threshold. By contrast, liquidity requirements reduce lending, efficiency and welfare significantly. The costs of high capital and liquidity requirements represent a lower bound on the benefits of these regulations in abating systemic risks. On taxation, corporate income taxes generate higher government revenues and entail lower efficiency and welfare costs than taxes on non-deposit liabilities.

Excerpts:
Introduction

The 2007-2008 financial crisis has been a catalyst for significant bank regulation reforms, as the pre-crisis regulatory framework has been judged inadequate to cope with large financial shocks. The new Basel III framework envisions a rise in bank capital requirements and the introduction of new liquidity requirements, while several proposals have recently been advanced to use forms of taxation with the twin objectives of raising funding to pay for resolution costs in stressed times and controlling bank risk-taking behavior. To date, however, the relatively large literature on bank regulation offers no formal analysis in which a joint assessment of these policies can be made in a dynamic model of banking where banks play a role and are exposed to multiple sources of risk. The formulation of such a dynamic banking model is the main contribution of this paper.

Our model is novel in three important dimensions. First, we analyze a bank that dynamically transforms short term liabilities into longer-term partially illiquid assets whose returns are uncertain. This feature is consistent with banks' special role in liquidity transformation emphasized in the literature (see e.g. Diamond and Dybvig (1983) and Allen and Gale (2007)).

Second, we model the bank's financial distress explicitly. This allows us to examine banks' optimal choices on whether, when, and how to continue operations in the face of financial distress. The bank in our model invests in risky loans and risk-less bonds financed by (random) government-insured deposits and short-term debt. Financial distress occurs when the bank is unable to honor part or all of its debt and tax obligations for given realizations of credit and liquidity shocks. The bank has the option to resolve distress in three costly forms: by liquidating assets at a cost, by issuing fully collateralized bonds, or by issuing equity. The liquidation costs of assets are interpreted as fire sale costs, and modeled by introducing asymmetric costs of adjustment of the bank's risky asset portfolio. The importance of fire sale costs in amplifying banks' financial distress has been brought to the fore in the recent crisis (see e.g. Acharya, Shin, and Yorulmazer (2010) and Hanson, Kashyap, and Stein (2011)).

Third, we evaluate the impact of bank regulations and taxation not only on bank optimal policies, but also in terms of metrics of bank efficiency and welfare. The first metric is the enterprise value of the bank, which can be interpreted as the efficiency with which the bank carries out its maturity transformation function. The second one, called “social value”, proxies welfare in our risk-neutral world, as it summarizes the total expected value of bank activities to all bank stakeholders and the government. To our knowledge, this is the first study that evaluates the joint welfare implications of bank regulation and taxation.

Our benchmark bank is unregulated, but its deposits are fully insured. We consider this bank as the appropriate benchmark, since one of the asserted roles of bank regulation is the abatement of the excessive bank risk-taking arising from moral hazard under partial or total insurance of its liabilities. We use a standard calibration of the parameters of the model (with regulatory and tax parameters mimicking current capital regulation, liquidity requirement, and tax proposals) to solve for the optimal policies and the metrics of efficiency and welfare.

We obtain three sets of results. First, if capital requirements are mild, a bank subject only to capital regulation invests more in lending and its probability of default is lower than its unregulated counterpart. This additional lending is financed by higher levels of retained earnings or equity issuance. Importantly, under mild capital regulation bank efficiency and social values are higher than under no regulation, and their benefits are larger the higher are fire sale costs. However, if capital requirements become too stringent, then the efficiency and welfare benefits of capital regulation disappear and turn into costs, even though default risk remains subdued: lending declines, and the metrics of bank efficiency and social value drop below those of the unregulated bank. Thus, there exists an inverted-U-shaped relationship between bank lending, efficiency, welfare and the stringency of capital requirements. These novel findings suggest the existence of an optimal level of bank-specific regulatory capital under deposit insurance.
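The inverted-U finding can be mimicked in a radically simplified one-period toy (the sketch below uses invented reduced-form parameters and is not the authors' dynamic model): raising the capital requirement k first adds welfare by cutting expected deposit-insurance losses, then subtracts welfare once the requirement starts to crowd out lending and the private cost of equity dominates.

    # Reduced-form toy reproducing an inverted U between capital requirements
    # and a welfare proxy. Parameters are invented; this sketches only the
    # mechanism the paper describes, not its dynamic model.
    import math

    def welfare(k):
        """Welfare proxy at capital requirement k (equity as a fraction of assets)."""
        lending = 1.0 - 20.0 * max(0.0, k - 0.08) ** 2   # binds only when k is high
        default_prob = 0.05 * math.exp(-25.0 * k)        # more capital, less default
        insurer_loss = 0.5 * default_prob                # expected insurance payout
        equity_cost = 0.03 * k                           # private cost of equity
        return 0.06 * lending - insurer_loss - equity_cost

    grid = [i / 100 for i in range(21)]                  # k = 0% .. 20%
    best = max(grid, key=welfare)
    for k in (0.0, 0.05, best, 0.20):
        print(f"k = {k:4.0%}  welfare proxy = {welfare(k):+.4f}")
    # Welfare rises with k while falling default risk dominates, peaks at an
    # interior k (about 9% here), then falls as the lending constraint bites.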

Second, the introduction of liquidity requirements reduces bank lending, efficiency, and social value significantly, since these requirements hamper bank maturity transformation. In addition, the reduction in lending, efficiency, and social value increases monotonically with their stringency. When liquidity requirements are added to capital requirements, they also eliminate the benefits of mild capital requirements, since bank lending, efficiency, and social value are reduced relative to the bank subject to capital regulation only. We should stress that these results need not be interpreted as an indictment of liquidity requirements.

If liquidity requirements were found to be optimal regulations to correct negative externalities arising from banks' excessive reliance on short-term debt (which we do not model), then our results indicate how large the costs associated with these externalities would have to be to rationalize the need for liquidity requirements.

On taxation, an increase in corporate income taxes reduces lending, bank efficiency, and social value due to standard negative income effects. However, tax receipts increase, generating higher government revenues. With the introduction of a tax on non-deposit liabilities, which in our model is short-term debt, the decline in bank lending, efficiency, and social value is larger than under an increase in corporate taxation, while the increase in government tax receipts is lower. Therefore, in our model corporate taxation is preferable to a tax on non-deposit liabilities, although both forms of taxation reduce lending, efficiency, and social value.


Conclusions

This paper has formulated a dynamic model of a bank exposed to credit and liquidity risk that can face financial distress by reducing loans, issuing secured debt, or issuing equity at a cost. We evaluated the joint impact of capital regulation, liquidity requirements, and taxation on banks' optimal policies and on metrics of bank efficiency and welfare.

We have uncovered an important inverted U-shaped relationship between bank lending, bank efficiency, social value, and regulatory capital ratios. This result suggests the existence of optimal levels of regulatory capital, which are likely to be highly bank-specific, depending crucially on the configuration of risks a bank is exposed to as a function of its chosen business strategies. Similarly, our results on the high costs of liquidity requirements point out the adverse consequences of repressing the key maturity transformation role of bank intermediation. Given our finding of the adverse effects of liquidity requirements, the argument by Admati, DeMarzo, Hellwig, and Pfleiderer (2011) that capital requirements can be designed to substitute for liquidity requirements is reinforced. Finally, for the purpose of raising tax revenues, corporate income taxation seems preferable to taxation of non-deposit liabilities, since the former generates higher revenues and lower efficiency and welfare costs.

Overall, our results suggest that implementing non-trivial increases in capital requirements, liquidity requirements, and taxation may be associated with costs significantly larger than proponents of these policies may have thought. This implies that the benefits of these requirements in terms of their ability to abate systemic risk should at least offset the costs we have identified.