Monday, January 30, 2012

Liberals and Conservatives on Padilla's Fourth Circuit appeal

1  Liberals

In Padilla ruling, Fourth Circuit Court ignores U.S. international obligations.
January 24, 2012, 12:30 pm
http://compliancecampaign.wordpress.com/2012/01/24/in-padilla-ruling-fourth-circuit-court-ignores-u-s-international-obligations/

In a decision with international implications, a U.S. court has demonstrated a decided indifference to the United States’ international obligations on matters of human rights. On Monday the Fourth Circuit Court in Richmond, Va., ruled that the military policies of detention without charge and the harsh interrogation methods established by the Bush administration and continued in part by the Obama administration cannot be challenged in damage lawsuits in federal courts.

Issues raised by the case regarding the detention of terrorist suspects – in particular the treatment of Jose Padilla, a U.S. citizen held for nearly four years without charge as an “enemy combatant” – have been addressed specifically by international bodies to which the U.S. belongs, but these concerns did not factor into the judges’ deliberations.

In dismissing the Padilla case, the court declared that under the Constitution, the making of counter-terrorism policy is entrusted solely to Congress and the President, and the courts may not “trespass” on this authority. The court therefore threw out the lawsuit brought by Padilla, who was seeking damages of one dollar from each of the defendants: Donald H. Rumsfeld, Former Secretary of Defense; Catherine T. Hanft, Former Commander Consolidated Brig; Melanie A. Marr, Former Commander Consolidated Brig; Lowell E. Jacoby, Vice Admiral, Former Director Defense Intelligence Agency; Paul Wolfowitz, Former Deputy Secretary of Defense; William Haynes, Former General Counsel Department of Defense; Leon E. Panetta, Secretary of Defense.

Padilla had contended that he was entitled to sue the defendants because the government deprived him of other ways to seek remedies for his treatment, even under military code.

In its ruling, however, the court recognized the President’s purported absolute authority to hold terrorist suspects – even U.S. citizens – indefinitely and incommunicado as enemy combatants:

    On June 9, 2002, acting pursuant to his authority under the AUMF [2001 Authorization for Use of Military Force], President George W. Bush issued an order to defendant Donald Rumsfeld, then Secretary of Defense, to detain Padilla as an enemy combatant, the President having determined that Padilla possessed vital intelligence and posed an ongoing threat to the national security of the United States.

    That day, Padilla was removed from civilian custody and transferred to the Naval Consolidated Brig at Charleston, South Carolina. While in military custody, Padilla claims that he was repeatedly abused, threatened with torture, deprived of basic necessities, and unjustifiably cut off from access to the outside world. Over time, these conditions were relaxed, and he was allowed monitored meetings with his attorneys.


The ruling, though, seemed to downplay Padilla’s actual allegations, which are not simply that he was “threatened with torture,” but that he was in fact tortured. According to his attorneys, Padilla was routinely mistreated and abused in ways designed to cause pain, anguish, depression and, ultimately, the loss of the will to live.

“The extended torture visited upon Mr. Padilla has left him damaged, both mentally and physically,” said a court filing by Orlando do Campo, one of Padilla’s lawyers. The filing says that Padilla was subjected to sleep deprivation and extremes of heat and cold, and forced to stand for extended periods in painful “stress positions.”

His lawyers have also claimed that Padilla was forced to take LSD and PCP, administered as truth serums, during his interrogations.

As forensic psychiatrist Dr. Angela Hegarty, who interviewed Jose Padilla for 22 hours to determine the state of his mental health, told Democracy Now in 2007:

    What happened at the brig was essentially the destruction of a human being’s mind. That’s what happened at the brig. His personality was deconstructed and reformed.

    And essentially, like many abuse victims, whether it’s torture survivors or battered women or even children who are abused by parents, as long as the parents or the abuser is in control in their minds, essentially they identify with the primary aims of the abuser. And all abusers, whoever they are, have one absolute requirement, and that is that you keep their secret. I mean, it’s common knowledge that people who abuse children or women will say, “Look at what you made me do,” putting the blame on the victim, trying to instill guilt. “People will judge you. People will think you’re crazy if you tell them about this. You will be an enemy. You will be seen as an enemy. You will be seen as a bad person if this comes out. There will be dire and terrible consequences, not only for you.” Jose was very, very concerned that if torture allegations were made on his behalf, that somehow it would interfere with the government’s ability to detain people at Guantanamo, and this was something he couldn’t sign onto. He was very identified with the goals of the government.

Dr. Hegarty commented specifically on the psychological effect of the prolonged isolation and sensory deprivation that Padilla was subjected to:

    This was the first time I ever met anybody who had been isolated for such an extraordinarily long period of time. I mean, the sensory deprivation studies, for example, tell us that without sleep, especially, people will develop psychotic symptoms, hallucinations, panic attacks, depression, suicidality within days. And here we had a man who had been in this situation, utterly dependent on his interrogators, who didn’t treat him all that nicely, for years. And apart from – the only people I ever met who had such a protracted experience were people who were in detention camps overseas, that would come close, but even then they weren’t subjected to the sensory deprivation. So, yes, he was somewhat of a unique case in that regard.

Glossing over the specifics of Padilla’s four years of mistreatment, the Fourth Circuit’s decision instead treated these issues as mere policy decisions that were made expeditiously by the Executive and Legislative Branches – decisions that the Judiciary constitutionally has no say in.

The ruling makes clear the court’s opinion that the Judicial Branch has no competence to inject itself into matters that pertain to Congress’s war-making authority or the President’s powers as Commander-in-Chief, even when constitutional rights of U.S. citizens are involved:

    Special factors do counsel judicial hesitation in implying causes of action for enemy combatants held in military detention. First, the Constitution delegates authority over military affairs to Congress and to the President as Commander in Chief. It contemplates no comparable role for the judiciary. Second, judicial review of military decisions would stray from the traditional subjects of judicial competence.

The court noted that:

    Padilla’s complaint seeks quite candidly to have the judiciary review and disapprove sensitive military decisions made after extensive deliberations within the executive branch as to what the law permitted, what national security required, and how best to reconcile competing values. It takes little enough imagination to understand that a judicially devised damages action would expose past executive deliberations affecting sensitive matters of national security to the prospect of searching judicial scrutiny. It would affect future discussions as well, shadowed as they might be by the thought that those involved would face prolonged civil litigation and potential personal liability.

Further,

    This is a case in which the political branches, exercising powers explicitly assigned them by our Constitution, formulated policies with profound implications for national security. One may agree or not agree with those policies. One may debate whether they were or were not the most effective counterterrorism strategy. But the forum for such debates is not the civil cause of action pressed in the case at bar.

So, essentially, the Fourth Circuit Court in Richmond, Va., has washed the Judiciary’s hands of any responsibility in determining the constitutionality of any treatment of U.S. citizens who are designated by the Executive Branch as “enemy combatants.” Anything goes if the government calls you a terrorist, according to the court.

As Padilla’s attorney, Ben Wizner, said in a statement Monday:

    Today is a sad day for the rule of law and for those who believe that the courts should protect American citizens from torture by their own government. By dismissing this lawsuit, the appeals court handed the government a blank check to commit any abuse in the name of national security, even the brutal torture of a U.S. citizen on U.S. soil. This impunity is not only anathema to a democracy governed by laws, but contrary to history’s lesson that in times of fear our values are a strength, not a hindrance.

It could also be pointed out that since the Constitution provides that treaties entered into by the United States are “the supreme law of the land,” the court has issued the U.S. government a blank check to disregard this clause and violate international treaties at will, in particular the International Covenant on Civil and Political Rights, ratified by the United States in 1992.

As Padilla was held in military custody for nearly four years without charge or trial, it appears the U.S. has violated Article 9 of the ICCPR, which states:

    1. Everyone has the right to liberty and security of person. No one shall be subjected to arbitrary arrest or detention. No one shall be deprived of his liberty except on such grounds and in accordance with such procedure as are established by law.

    2. Anyone who is arrested shall be informed, at the time of arrest, of the reasons for his arrest and shall be promptly informed of any charges against him.

    3. Anyone arrested or detained on a criminal charge shall be brought promptly before a judge or other officer authorized by law to exercise judicial power and shall be entitled to trial within a reasonable time or to release. It shall not be the general rule that persons awaiting trial shall be detained in custody, but release may be subject to guarantees to appear for trial, at any other stage of the judicial proceedings, and, should occasion arise, for execution of the judgement.

    4. Anyone who is deprived of his liberty by arrest or detention shall be entitled to take proceedings before a court, in order that that court may decide without delay on the lawfulness of his detention and order his release if the detention is not lawful.

By denying Padilla a right to compensation in civil courts, the Fourth Circuit appears to have also overlooked this provision of the ICCPR: “Anyone who has been the victim of unlawful arrest or detention shall have an enforceable right to compensation.”

As a party to the Covenant, the U.S. is required to submit a report to the UN Human Rights Committee every five years on its compliance with the Covenant’s provisions.

The last report submitted by the United States – in 2005 – was seven years overdue. Regarding the matter of indefinite detention, the 2005 report pointed out that the U.S. Supreme Court has stated “that the United States is entitled to detain enemy combatants, even American citizens, until the end of hostilities, in order to prevent the enemy combatants from returning to the field of battle and again taking up arms.”

The U.S. asserted that “the detention of such individuals is such a fundamental and accepted incident of war that it is part of the ‘necessary and appropriate’ force that Congress authorized the President to use against nations, organizations, or persons associated with the September 11 terrorist attacks.”

The Human Rights Committee objected to this “restrictive interpretation made by the State party of its obligations under the Covenant,” and urged the U.S. to “review its approach and interpret the Covenant in good faith, in accordance with the ordinary meaning to be given to its terms in their context, including subsequent practice, and in the light of its object and purpose.”

The HRC had particularly harsh words for the U.S.’s indefinite detention policies: “The State party [the U.S.] should ensure that its counter-terrorism measures are in full conformity with the Covenant and in particular that the legislation adopted in this context is limited to crimes that would justify being assimilated to terrorism, and the grave consequences associated with it.”

The Committee reminded the United States of its obligations under the Covenant to both prosecute those responsible for using torture or cruel, inhuman or degrading treatment, and to provide compensation to the victims of such policies:

    The State party should conduct prompt and independent investigations into all allegations concerning suspicious deaths,  torture or cruel, inhuman or degrading treatment or punishment inflicted by its personnel (including commanders) as well as contract employees, in detention facilities in Guantanamo Bay, Afghanistan, Iraq and other overseas locations.  The State party should ensure that those responsible are prosecuted and punished in accordance with the gravity of the crime.  The State party should adopt all necessary measures to prevent the recurrence of such behaviors, in particular by providing adequate training and clear guidance to its personnel (including commanders) and contract employees, about their respective obligations and responsibilities, in line with articles 7 and 10 of the Covenant.  During the course of any legal proceedings, the State party should also refrain from relying on evidence obtained by treatment incompatible with article 7.  The Committee wishes to be informed about the measures taken by the State party to ensure the respect of the right to reparation for the victims.

By dismissing Padilla’s lawsuit, the Fourth Circuit Court has essentially done the opposite of what the UN Human Rights Committee has recommended to bring the U.S. into compliance with the ICCPR regarding its detention policies. The court has ensured, at least for now, that the right of reparations for the victims of U.S. detention and torture policies will remain unrecognized by the United States. It has ensured that the U.S. will remain in violation of its obligations under international law.


2  Conservatives

'Lawfare' Loses Big
The ACLU loses its nasty suit against former defense officials.
WSJ, Jan 28, 2012
http://online.wsj.com/article/SB10001424052970203718504577181191271527180.html

The guerrilla legal campaign against national security suffered a big defeat this week, and the good news deserves more attention. The victory for legal sanity came Monday when the Fourth Circuit Court of Appeals upheld a lower court decision to toss out a suit brought by aspiring terrorist Jose Padilla against a slew of Bush Administration officials.

Readers may remember that Padilla was arrested in 2002 for plotting to set off a dirty bomb on U.S. soil. He was detained as an enemy combatant, convicted in a Miami court and sentenced to 17 years in prison. But Padilla has been adopted as a legal mascot by the ACLU and the National Litigation Project at Yale Law School, which have sued far and wide alleging mistreatment and lack of due process.

Padilla may in fact have had more due process than any defendant in history. His case has been ruled on by no fewer than 10 civilian courts, and as a prisoner in the Navy brig in Charleston, South Carolina, from 2002 to 2006, he received the benefit of protections under the highly disciplined Uniform Code of Military Justice. Your average bank robber should be so lucky.

But the lawyers suing for Padilla aren't interested in justice. They're practicing "lawfare," which is an effort to undermine the war on terror by making U.S. officials afraid to pursue it for fear of personal liability.

The ACLU and the rest of the legal left have failed to persuade several Congresses and two Administrations to agree to their anti-antiterror policies. So instead they're suing former officials in civilian court to harass them and damage their reputations. It's shameful stuff, and if it succeeds it would have the effect of making Pentagon officials look over their shoulder at potential lawsuits every time they had to make a difficult military or interrogation decision.

In Lebron v. Rumsfeld et al., the ACLU sued under the Supreme Court's 1971 Bivens decision, which has been interpreted as creating a right of action against individual federal officials. Their targets included a retinue of Pentagon officials, starting with former Secretary of Defense Donald Rumsfeld and going down to the commander of the Navy brig where Padilla was held. Mr. Rumsfeld doesn't have to worry about getting another job, but the ACLU wants to make lower-level officials politically radioactive so they have a difficult time getting promoted or working in any influential position.

The good news is that the Fourth Circuit's three-judge panel saw this for what it was and unanimously rejected the claims. In his 39-page opinion, the influential Judge J. Harvie Wilkinson wrote that the Constitution gives authority over military affairs to Congress and to the President as Commander in Chief, but it never created a similar role for the courts.

"It takes little enough imagination," Judge Wilkinson wrote, "to understand that a judicially devised damages action would expose past executive deliberations . . . [and] would affect future discussions as well, shadowed as they might be by the thought that those involved would face prolonged civil litigation and potential personal liability."

The decision is especially notable because one of the three judges is Clinton appointee Diana Motz, who has been a skeptic of the Bush Administration's detainee policies and has dissented from her colleagues in cases like 2003's Hamdi v. Rumsfeld.

The ACLU may appeal to all of the Fourth Circuit judges, but Judge Wilkinson's ruling is comprehensive enough that an appeal is unlikely to prevail. The judges deserve credit for understanding that the Constitution gave war powers to the political branches, not to courts. The country will be safer for it.

Friday, January 27, 2012

China's Cyber Thievery Is National Policy—And Must Be Challenged

China's Cyber Thievery Is National Policy—And Must Be Challenged. By MIKE MCCONNELL, MICHAEL CHERTOFF AND WILLIAM LYNN
It is more efficient for the Chinese to steal innovations and intellectual property than to incur the cost and time of creating their own.
WSJ, Jan 27, 2012
http://online.wsj.com/article/SB10001424052970203718504577178832338032176.html

Only three months ago, we would have violated U.S. secrecy laws by sharing what we write here—even though, as a former director of national intelligence, secretary of homeland security, and deputy secretary of defense, we have long known it to be true. The Chinese government has a national policy of economic espionage in cyberspace. In fact, the Chinese are the world's most active and persistent practitioners of cyber espionage today.

Evidence of China's economically devastating theft of proprietary technologies and other intellectual property from U.S. companies is growing. Only in October 2011 were details declassified in a report to Congress by the Office of the National Counterintelligence Executive. Each of us has been speaking publicly for years about the ability of cyber terrorists to cripple our critical infrastructure, including financial networks and the power grid. Now this report finally reveals what we couldn't say before: The threat of economic cyber espionage looms even more ominously.

The report is a summation of the catastrophic impact cyber espionage could have on the U.S. economy and global competitiveness over the next decade. Evidence indicates that China intends to help build its economy by intellectual-property theft rather than by innovation and investment in research and development (two strong suits of the U.S. economy). The nature of the Chinese economy offers a powerful motive to do so.

According to 2009 estimates by the United Nations, China has a population of 1.3 billion, with 468 million (about 36% of the population) living on less than $2 a day. While Chinese poverty has declined dramatically in the last 30 years, income inequality has increased, with much greater benefits going to the relatively small portion of educated people in urban areas, where about 25% of the population lives.

The bottom line is this: China has a massive, inexpensive work force ravenous for economic growth. It is much more efficient for the Chinese to steal innovations and intellectual property—the source code of advanced economies—than to incur the cost and time of creating their own. They turn those stolen ideas directly into production, creating products faster and cheaper than the U.S. and others.

Cyberspace is an ideal medium for stealing intellectual capital. Hackers can easily penetrate systems that transfer large amounts of data, while corporations and governments have a very hard time identifying specific perpetrators.

Unfortunately, it is also difficult to estimate the economic cost of these thefts to the U.S. economy. The report to Congress calls the cost "large" and notes that this includes corporate revenues, jobs, innovation and impacts to national security. Although a rigorous assessment has not been done, we think it is safe to say that "large" easily means billions of dollars and millions of jobs.

So how to protect ourselves from this economic threat? First, we must acknowledge its severity and understand that its impacts are more long-term than immediate. And we need to respond with all of the diplomatic, trade, economic and technological tools at our disposal.

The report to Congress notes that the U.S. intelligence community has improved its collaboration to better address cyber espionage in the military and national-security areas. Yet today's legislative framework severely restricts us from fully addressing domestic economic espionage. The intelligence community must gain a stronger role in collecting and analyzing this economic data and making it available to appropriate government and commercial entities.

Congress and the administration must also create the means to actively force more information-sharing. While organizations (both in government and in the private sector) claim to share information, the opposite is usually the case, and this must be actively fixed.

The U.S. also must make broader investments in education to produce many more workers with science, technology, engineering and math skills. Our country reacted to the Soviet Union's 1957 launch of Sputnik with investments in math and science education that launched the age of digital communications. Now is the time for a similar approach to build the skills our nation will need to compete in a global economy vastly different from 50 years ago.

Corporate America must do its part, too. If we are to ever understand the extent of cyber espionage, companies must be more open and aggressive about identifying, acknowledging and reporting incidents of cyber theft. Congress is considering legislation to require this, and the idea deserves support. Companies must also invest more in enhancing their employees' cyber skills; it is shocking how many cyber-security breaches result from simple human error such as coding mistakes or lost discs and laptops.

In this election year, our economy will take center stage, as will China and its role in issues such as monetary policy. If we are to protect ourselves against irreversible long-term damage, the economic issues behind cyber espionage must share some of that spotlight.

Mr. McConnell, a retired Navy vice admiral and former director of the National Security Agency (1992-96) and director of national intelligence (2007-09), is vice chairman of Booz Allen Hamilton. Mr. Chertoff, a former secretary of homeland security (2005-09), is senior counsel at Covington & Burling. Mr. Lynn has served as deputy secretary of defense (2009-11) and undersecretary of defense (1997-2001).

Thursday, January 26, 2012

Sovereign Risk, Fiscal Policy, and Macroeconomic Stability

Sovereign Risk, Fiscal Policy, and Macroeconomic Stability. By Giancarlo Corsetti, Keith Kuester, Andre Meier, and Gernot J. Mueller
IMF Working Paper No. 12/33
January, 2012
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25681.0

Abstract
This paper analyzes the impact of strained government finances on macroeconomic stability and the transmission of fiscal policy. Using a variant of the model by Cúrdia and Woodford (2009), we study a “sovereign risk channel” through which sovereign default risk raises funding costs in the private sector. If monetary policy is constrained, the sovereign risk channel exacerbates indeterminacy problems: private-sector beliefs of a weakening economy may become self-fulfilling. In addition, sovereign risk amplifies the effects of negative cyclical shocks. Under those conditions, fiscal retrenchment can help curtail the risk of macroeconomic instability and, in extreme cases, even stimulate economic activity.

Conclusion
The present paper analyzes how the “sovereign risk channel” affects macroeconomic dynamics and stabilization policy. Through this channel, rising sovereign risk drives up private-sector borrowing costs, unless higher risk premia are offset by looser monetary policy. If the central bank is constrained in counteracting higher risk premia, sovereign risk becomes a critical determinant of macroeconomic outcomes. Its implications for stabilization policy have not been fully appreciated in earlier formal analyses, although they are likely to be of great importance for many advanced economies currently facing intense fiscal strain.

Building on the model proposed by Cúrdia and Woodford (2009), we show that the sovereign risk channel makes the economy (more) vulnerable to problems of indeterminacy. In particular, private-sector beliefs about a weakening economy can become self-fulfilling, driving up risk premia and choking off demand. In this environment, a procyclical fiscal stance—that is, tighter fiscal policy during economic downturns—can help to ensure determinacy.

Further, we find that sovereign risk tends to exacerbate the effects of negative cyclical shocks: recessionary episodes will be deeper the stronger the sovereign risk channel, which in our specification is a nonlinear function of public-sector indebtedness. Moreover, in deep recessions that force the central bank down to the zero lower bound (ZLB) for nominal interest rates, sovereign risk delays the exit from the ZLB, hence prolonging macroeconomic distress.

The sovereign risk channel also has a significant bearing on fiscal multipliers. Specifically, the effect of government spending on aggregate output hinges on (i) the responsiveness of private-sector risk premia to indicators of fiscal strain; and (ii) the length of time during which monetary policy is expected to be constrained. Our analysis suggests that upfront fiscal retrenchment is less detrimental to economic activity (i.e., multipliers are smaller) in the presence of significant sovereign risk, as lower public deficits improve private-sector financing conditions. In relatively extreme cases where fiscal strains are severe and monetary policy is constrained for an extended period, fiscal tightening may even exert an expansionary effect.

That being said, fiscal retrenchment is no miracle cure. Indeed, all our simulations feature a deep recession, even if tighter fiscal policy, under the aforementioned conditions, may stimulate economic activity relative to an even bleaker baseline.
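
To make the mechanics concrete, here is a minimal numerical sketch of the channel: a sovereign spread that rises nonlinearly with public debt passes through to private borrowing costs, and a central bank stuck at the zero lower bound cannot cut rates to offset it. The functional form and every parameter below are illustrative assumptions, not the paper's calibration.

    # Illustrative sketch of the sovereign risk channel (all numbers assumed).

    def sovereign_spread_bp(debt_to_gdp):
        """Sovereign risk premium in basis points, convex in indebtedness."""
        return 500.0 * max(debt_to_gdp - 0.6, 0.0) ** 2

    def private_rate_pct(policy_rate_pct, debt_to_gdp, passthrough=0.5):
        # Private funding cost: policy rate plus partial pass-through of the
        # sovereign spread. At the zero lower bound the policy rate cannot
        # fall, so a rising spread tightens private financing conditions
        # one-for-one (scaled by the pass-through coefficient).
        return max(policy_rate_pct, 0.0) + passthrough * sovereign_spread_bp(debt_to_gdp) / 100.0

    for debt in (0.6, 0.9, 1.2, 1.5):
        print(f"debt/GDP {debt:.0%}: private rate at the ZLB = "
              f"{private_rate_pct(0.0, debt):.2f}%")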

As an additional caveat, we note that our analysis has focused on fiscal multipliers under a go-it-alone policy that does not involve external financial support at below-market rates.  Availability of such support could allow countries to stretch out the necessary fiscal adjustment as they benefit from lower funding costs and, possibly, positive credibility effects. Indeed, if and where announcements of future fiscal adjustment are credible, delaying some of the planned spending cuts remains a superior strategy in terms of protecting short-term growth.  How countries end up dealing with the challenges summarized here may prove to be a defining feature of global economic developments over the coming years.

Bank Funding Structures and Risk: Evidence from the Global Financial Crisis

Bank Funding Structures and Risk: Evidence from the Global Financial Crisis. By Francisco Vazquez and Pablo Federico
IMF Working Paper WP/12/29
http://www.imfbookstore.org/ProdDetails.asp?ID=WPIEA2012029
Jan, 2012

Summary: This paper analyzes the evolution of bank funding structures in the run up to the global financial crisis and studies the implications for financial stability, exploiting a bank-level dataset that covers about 11,000 banks in the U.S. and Europe during 2001–09. The results show that banks with weaker structural liquidity and higher leverage in the pre-crisis period were more likely to fail afterward. The likelihood of bank failure also increases with bank risk-taking. In the cross-section, the smaller domestically-oriented banks were relatively more vulnerable to liquidity risk, while the large cross-border banks were more susceptible to solvency risk due to excessive leverage. The results support the proposed Basel III regulations on structural liquidity and leverage, but suggest that emphasis should be placed on the latter, particularly for the systemically-important institutions. Macroeconomic and monetary conditions are also shown to be related to the likelihood of bank failure, providing a case for the introduction of a macro-prudential approach to banking regulation.


Introduction
The global financial crisis raised questions on the adequacy of bank risk management practices and triggered a deep revision of the regulatory and supervisory frameworks governing bank liquidity risk and capital buffers. Regulatory initiatives at the international level included, inter alia, the introduction of liquidity standards for internationally-active banks, binding leverage ratios, and a revision of capital requirements under Basel III (BCBS 2009; BCBS 2010a, b). In addition to these micro-prudential measures, academics and policymakers argued for the introduction of a complementary macro-prudential framework to help safeguard financial stability at the systemic level (Hanson, Kashyap and Stein, 2010).

This regulatory response was implicitly based on two premises. First, the view that individual bank decisions regarding the size of their liquidity and capital buffers in the run up to the crisis were not commensurate with their risk-taking—and were therefore suboptimal from the social perspective. Second, the perception that the costs of bank failures spanned beyond the interests of their direct stakeholders due, for example, to supply-side effects in credit markets, or network externalities in the financial sector (Brunnermeier, 2009).

The widespread bank failures in the U.S. and Europe at the peak of the global financial crisis provided casual support to the first premise. Still, empirical work on the connection between bank liquidity and capital buffers and their subsequent probability of failure is incipient. Background studies carried out in the context of the Basel III proposals, which are based on aggregate data, concluded that stricter regulations on liquidity and leverage were likely to ameliorate the probability of systemic banking crises (BCBS, 2010b). In turn, studies based on micro data for U.S. banks also support the notion that banks with higher asset liquidity, stronger reliance on retail insured deposits, and larger capital buffers were less vulnerable to failure during the global financial crisis (Berger and Bouwman, 2010; Bologna, 2011). Broadly consistent results are reported in Ratnovski and Huang (2009), based on data for large banks from the OECD.

This paper makes two contributions to previous work. First, it measures structural liquidity and leverage in bank balance sheets in a way consistent with the formulations of the Net Stable Funding Ratio (NSFR) and the leverage ratio (EQUITY) proposed in Basel III. Second, it explores systematic differences in the relationship between structural liquidity, leverage, and subsequent probability of failure across bank types. In particular, we distinguish between large, internationally-active banks (henceforth Global banks) and (typically smaller) banks that focus on their domestic retail markets (henceforth Domestic banks).
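
As a rough illustration of how such balance-sheet measures can be constructed, consider the sketch below. The line items and stable-funding weights are simplified assumptions for exposition; they are not the paper's exact NSFR proxy or the Basel III schedules.

    # Simplified NSFR-style structural liquidity proxy and leverage ratio.
    # Weights are illustrative assumptions; Basel III prescribes detailed ones.

    def nsfr_proxy(bank):
        # Available stable funding: stable sources receive higher weights.
        asf = (1.00 * bank["equity"]
               + 0.90 * bank["retail_deposits"]
               + 0.50 * bank["long_term_wholesale"]
               + 0.00 * bank["short_term_wholesale"])
        # Required stable funding: illiquid assets receive higher weights.
        rsf = (0.05 * bank["government_bonds"]
               + 0.85 * bank["retail_loans"]
               + 1.00 * bank["fixed_assets"])
        return asf / rsf  # values of 1 or more indicate stronger structural liquidity

    def leverage_ratio(bank):
        return bank["equity"] / bank["total_assets"]

    stylized_bank = {"equity": 8, "retail_deposits": 60, "long_term_wholesale": 12,
                     "short_term_wholesale": 20, "government_bonds": 15,
                     "retail_loans": 70, "fixed_assets": 5, "total_assets": 100}
    print(round(nsfr_proxy(stylized_bank), 2), round(leverage_ratio(stylized_bank), 2))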

This sample partition is suitable from the financial stability perspective. Global banks are systemically important and extremely challenging to resolve, due to the complexity of their business and legal structures, and because their operations span across borders, entailing differences in bank insolvency frameworks and difficult fiscal considerations. Furthermore, the relative role of liquidity and capital buffers for bank financial soundness is likely to differ systematically across these two types of banks. All else equal, Global banks benefit from the imperfect co-movement of macroeconomic and monetary conditions across geographic regions (Griffith-Jones, Segoviano, and Spratt, 2002; Garcia-Herrero and Vazquez, 2007) and may exploit their internal capital markets to reshuffle liquidity and capital between business units. In addition, Global banks tend to enjoy a more stable funding base than Domestic banks due to flight to safety, particularly during times of market distress. To the extent that these factors are incorporated in bank risk management decisions, optimal choices on structural liquidity and leverage are likely to differ across these two types of banks.

The paper exploits a bank-level dataset that covers about 11,000 U.S. and European banks during 2001–09. This sample coverage allows us to study bank dynamics leading to, and during, the global financial crisis. As a by-product, we document the evolution of structural liquidity and leverage in the pre-crisis period, and highlight some patterns across bank types to motivate further research. Contrary to expectations, the average structural liquidity in bank balance sheets in the run-up to the global financial crisis (as measured by a proxy of the NSFR) was close to the target values proposed in the Basel III recommendations. However, we find a wide dispersion in structural liquidity across banks. A mild (albeit sustained) increase in structural liquidity mismatches in the run-up to the crisis was driven by banks located at the lower extreme of the distribution. Pre-crisis leverage was also widely uneven across banks, with the Global banks displaying thinner capital buffers and wider gaps between their leverage ratios and their Basel capital-to-risk-weighted-assets ratios.

In line with alleged deficiencies in bank risk management practices, we find that banks with weaker structural liquidity and banks with higher leverage ratios in the run up to the crisis were more vulnerable to failure, after controlling for their pre-crisis risk-taking. However, the average effects of stronger structural liquidity and capital buffers on the likelihood of bank failure are not large. On the other hand, there is evidence of substantial threshold effects, and the benefits of stronger buffers appear substantial for the banks located at the lower extremes of the distributions. In addition, we find systematic differences in the relative importance of liquidity and leverage for financial fragility across groups of banks. Global banks were more susceptible to failure on excessive leverage, while Domestic banks were more susceptible to failure on weak structural liquidity (i.e., excessive liquidity transformation) and overreliance on short-term wholesale funding. 

In the estimations, we include bank-level controls for pre-crisis risk-taking, and for country-specific macroeconomic conditions (i.e., common to all banks incorporated in a given country). The use of controls for pre-crisis risk-taking is critical to this study. To the extent that banks perform active risk management, higher risk-taking would tend to be associated with stronger liquidity and capital buffers, introducing a bias to the results. In fact, we find that banks engaging in more aggressive risk-taking in the run-up to the crisis—as measured by the rate of growth of their credit portfolios and by their pre-crisis distance to default—were more likely to fail afterward. Macroeconomic conditions in the pre-crisis period are also found to affect bank probabilities of default, suggesting that banks may have failed to internalize risks stemming from overheated economic activity and exuberant asset prices.
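
The estimation described here is, in essence, a logit of a failure indicator on pre-crisis liquidity, leverage, risk-taking and macro controls. The sketch below shows the shape of such a regression on synthetic data; the variable names, coefficients and sample are all invented for illustration.

    # Stylized bank-failure logit on synthetic data (illustrative only).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 1000
    nsfr = rng.normal(1.0, 0.2, n)             # pre-crisis structural liquidity proxy
    leverage = rng.normal(0.08, 0.03, n)       # equity / total assets
    credit_growth = rng.normal(0.10, 0.08, n)  # pre-crisis risk-taking control
    gdp_growth = rng.normal(0.02, 0.01, n)     # country-level macro control

    # Assumed data-generating process: weaker liquidity, thinner capital and
    # faster pre-crisis credit growth all raise the probability of failure.
    latent = -3.0 - 2.0 * (nsfr - 1.0) - 25.0 * (leverage - 0.08) + 4.0 * credit_growth
    failed = rng.binomial(1, 1.0 / (1.0 + np.exp(-latent)))

    X = sm.add_constant(np.column_stack([nsfr, leverage, credit_growth, gdp_growth]))
    result = sm.Logit(failed, X).fit(disp=0)
    print(result.summary(xname=["const", "nsfr", "leverage", "credit_growth", "gdp_growth"]))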

All in all, these results provide support to the proposed regulations on liquidity and capital, as well as to the introduction of a macro-prudential approach to bank regulation. From the financial stability perspective, however, the evidence indicates that regulations on capital—particularly for the larger banking groups—are likely to be more relevant.

Concluding remarks
Overall, the findings of this paper provide broad support to Basel III initiatives on structural liquidity and leverage, and show the complementary nature of these two areas. Banks with weaker structural liquidity and higher leverage before the global financial crisis were more vulnerable to subsequent failure. The results are driven by banks in the lower extremes of the distributions, suggesting the presence of threshold effects. In fact, the marginal stability gains associated with stronger liquidity and capital cushions do not appear to be large for the average bank, but seem substantial for the weaker institutions.

At the same time, there is evidence of systematic differences across bank types. The smaller banks were more susceptible to failure on liquidity problems, while the large cross-border banking groups typically failed on insufficient capital buffers. This difference is crucial from the financial stability perspective, and implies that regulatory and supervisory emphasis should be placed on ensuring that the capital buffers of the systemically important banks are commensurate with their risk-taking.

The evidence also indicates that bank risk-taking in the run-up to the crisis was associated with increased financial vulnerability, suggesting that bank decisions regarding the associated liquidity and capital buffers were not commensurate with the underlying risks, resulting in excessive hazard to their business continuity. Country-specific macroeconomic conditions also played a role in the likelihood of subsequent bank failure, implying that banks failed to properly internalize the associated risks in their individual decision-making processes. Thus, while more intrusive regulations entail efficiency costs, the results point to associated gains in terms of financial stability that have to be weighed. This also supports the introduction of a macro-prudential framework as a complement to the traditional micro-prudential approach. In this regard, further work is needed to deepen the understanding of the role of the macroeconomic environment in financial stability.

Wednesday, January 25, 2012

No More Résumés, Say Some Firms

No More Résumés, Say Some Firms. By RACHEL EMMA SILVERMAN
WSJ, Jan 25, 2012
http://online.wsj.com/article/SB10001424052970203750404577173031991814896.html

Union Square Ventures recently posted an opening for an investment analyst.

Instead of asking for résumés, the New York venture-capital firm—which has invested in Twitter, Foursquare, Zynga and other technology companies—asked applicants to send links representing their "Web presence," such as a Twitter account or Tumblr blog. Applicants also had to submit short videos demonstrating their interest in the position.

Union Square says its process nets better-quality candidates—especially for a venture-capital operation that invests heavily in the Internet and social media—and the firm plans to use it going forward to fill analyst positions and other jobs.

Companies are increasingly relying on social networks such as LinkedIn, video profiles and online quizzes to gauge candidates' suitability for a job. While most still request a résumé as part of the application package, some are bypassing the staid requirement altogether.

A résumé doesn't provide much depth about a candidate, says Christina Cacioppo, an associate at Union Square Ventures who blogs about the hiring process on the company's website and was herself hired after she compiled a profile comprising her personal blog, Twitter feed, LinkedIn profile, and links to social-media sites Delicious and Dopplr, which showed places where she had traveled.

[Photo caption: StickerGiant's John Fischer, right, and interviewee Adam Thackeray shoot a video Monday. Mr. Fischer uses an online survey to screen applicants.]

"We are most interested in what people are like, what they are like to work with, how they think," she says.

John Fischer, founder and owner of StickerGiant.com, a Hygiene, Colo., company that makes bumper and marketing stickers, says a résumé isn't the best way to determine whether a potential employee will be a good social fit for the company. Instead, his firm uses an online survey to help screen applicants.

Questions are tailored to the position. A current opening for an Adobe Illustrator expert asks applicants about their skills, but also asks questions such as "What is your ideal dream job?" and "What is the best job you've ever had?" Applicants have the option to attach a résumé, but it isn't required. Mr. Fischer says he started using online questionnaires several years ago, after receiving too many résumés from candidates who had no qualifications or interest. Having applicants fill out surveys is a "self-filter," he says.

A previous posting for an Internet marketing position had applicants rate their marketing and social-media skills on a scale of one to 10 and select from a list of words how friends or co-workers would describe them. Options included: high energy, type-A, laid back, perfect, creative or fun.

In times of high unemployment, bypassing résumés can also help companies winnow out candidates from a broader labor pool.

IGN Entertainment Inc., a gaming and media firm, launched a program dubbed Code Foo, in which it taught programming skills to passionate gamers with little experience, paying participants while they learned. Instead of asking for résumés, the firm posted a series of challenges on its website aimed at gauging candidates' thought processes. (One challenge: Estimate how many pennies lined side by side would span the Golden Gate Bridge.)
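
For what it is worth, that challenge yields to a quick back-of-the-envelope calculation; the penny diameter and bridge length below are assumptions, and the point of the exercise is the reasoning rather than the exact figure.

    # Fermi estimate: pennies laid side by side across the Golden Gate Bridge.
    # Assumes a U.S. penny diameter of 19.05 mm and a total bridge length of
    # roughly 2,737 m.
    penny_diameter_m = 0.01905
    bridge_length_m = 2737.0
    print(f"~{bridge_length_m / penny_diameter_m:,.0f} pennies")  # on the order of 140,000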

It also asked candidates to submit a video demonstrating their love of gaming and the firm's products.

IGN is a unit of News Corp., which also owns The Wall Street Journal.

Nearly 30 people out of about 100 applicants were picked for the six-week Code Foo program, and six were eventually hired full-time. Several of the hires were nontraditional applicants who didn't attend college or who had thin work experience.

"If we had just looked at their résumés at the moment we wouldn't have hired them," says Greg Silva, IGN's vice president of people and places. The company does require résumés for its regular job openings.

At most companies, résumés are still the first step of the recruiting process, even at supposedly nontraditional places like Google Inc., which hired about 7,000 people in 2011, after receiving some two million résumés. Google has an army of "hundreds" of recruiters who actually read every one, says Todd Carlisle, the technology firm's director of staffing.

But Dr. Carlisle says he reads résumés in an unusual way: from the bottom up.

Candidates' early work experience, hobbies, extracurricular activities or nonprofit involvement—such as painting houses to pay for college or touring with a punk rock band through Europe—often provide insight into how well an applicant would fit into the company culture, Dr. Carlisle says.

Plus, "It's the first sample of work we have of yours," he says.

Tuesday, January 24, 2012

Pricing of Sovereign Credit Risk: Evidence from Advanced Economies During the Financial Crisis

Pricing of Sovereign Credit Risk: Evidence from Advanced Economies During the Financial Crisis. By C. Emre Alper, Lorenzo Forni and Marc Gerard
IMF Working Paper WP/12/24
January, 2012

Summary: We investigate the pricing of sovereign credit risk over the period 2008-2010 for selected advanced economies by examining two widely-used indicators: sovereign credit default swap (CDS) and relative asset swap (RAS) spreads. Cointegration analysis suggests the existence of an imperfect market arbitrage relationship between the cash (RAS) and the derivatives (CDS) markets, with price discovery taking place in the latter. Likewise, panel regressions aimed at uncovering the fundamental drivers of the two indicators show that the CDS market, although less liquid, has provided a better signal for sovereign credit risk during the period of the recent financial crisis.

IV. CONCLUDING REMARKS
This paper addressed the linkages and determinants of two widely used indicators of sovereign risk: CDS and RAS spreads. It focused on advanced economies during the recent financial crisis and the sovereign market tensions that followed. It showed strong co-movements between both series, especially for those countries that have come under significant market pressure. At the same time, arbitrage distortions have remained pervasive in the biggest economies. This suggests that the liquidity of the derivatives market is of paramount importance for CDS spreads to fully reflect sovereign credit risk. For those economies where the evidence stands in favor of a cointegration relationship, deviations from arbitrage have been long lasting, though in line with results in the literature. Also, CDS spreads were found to anticipate changes in RAS, suggesting that the derivatives market has been leading in the process of pricing sovereign credit risk. Regarding the role of fundamentals, we showed that variables related to fiscal sustainability are able to explain only a limited share of the variation of CDS spreads. Spreads seem to respond more to financial variables (such as domestic banking sector capitalization, short-term liquidity conditions, large-scale long-term bond purchases by major central banks) or purely global variables (global growth, global risk aversion, dummies for the different stages of the crisis).

These results refer to a specific group of advanced countries over a short span of time. They suggest that movements in CDS and RAS spreads need to be interpreted with caution. First, while in theory they should be strictly connected, CDS and RAS spreads do not, generally, follow the pattern suggested by the no-arbitrage condition. Moreover, they are affected by several factors, with global and financial considerations playing a dominant role, while at the same time leaving room for a large unexplained component. In general, however, CDS spreads seem to have provided better signals than RAS regarding the market assessment of sovereign risk: over the period covered by the analysis, they have led the process of price discovery in those countries under market pressure and have been more correlated than RAS with those fundamentals that are expected to affect sovereign risk.
PDF here: http://www.imf.org/external/pubs/ft/wp/2012/wp1224.pdf
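
The paper's two workhorse tools, a cointegration test between the two spread series and a Granger-causality test of whether CDS leads RAS, can be sketched as follows. The data here are synthetic, constructed so that CDS leads RAS by one period; the paper's actual methodology is considerably richer.

    # Minimal sketch: cointegration and Granger causality between two spread
    # series (synthetic data in which CDS leads RAS by construction).
    import numpy as np
    from statsmodels.tsa.stattools import coint, grangercausalitytests

    rng = np.random.default_rng(1)
    n = 500
    cds = np.cumsum(rng.normal(0.0, 1.0, n))   # random-walk CDS spread
    ras = np.empty(n)
    ras[0] = cds[0]
    for t in range(1, n):                      # RAS tracks lagged CDS plus noise
        ras[t] = 0.2 * ras[t - 1] + 0.8 * cds[t - 1] + rng.normal(0.0, 0.5)

    # Engle-Granger style test for cointegration between the two levels.
    t_stat, p_value, _ = coint(cds, ras)
    print(f"cointegration test p-value: {p_value:.3f}")

    # Do lagged CDS changes help predict RAS changes? In statsmodels'
    # convention the second column is the candidate "causing" series.
    data = np.column_stack([np.diff(ras), np.diff(cds)])
    grangercausalitytests(data, maxlag=2)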

The Challenge of Public Pension Reform in Advanced and Emerging Economies

The Challenge of Public Pension Reform in Advanced and Emerging Economies. Prepared by the Fiscal Affairs Department
IMF
December 28, 2011

Summary: This paper reviews past trends in public pension spending and provides projections for 27 advanced and 25 emerging economies over 2011–2050. In constructing these projections, the paper incorporates the impact of recent pension reforms and highlights the key assumptions underlying these projections and associated risks. The paper also presents reform options to address future pension spending pressures in the advanced and emerging economies. These reforms—mainly increasing retirement ages, reducing replacement rates, or increasing payroll taxes—are discussed in the context of their role in fiscal consolidation, and their implications for both equity and economic growth. In addition, the paper examines the challenge for emerging economies of expanding pension coverage in a fiscally sustainable manner.

Executive Summary
Public pension reform will be a key policy challenge in both advanced and emerging economies over coming decades. Many economies will need to achieve significant fiscal consolidation over the next two decades. Given high levels of taxation, particularly in advanced economies, fiscal consolidation will often need to focus on the expenditure side. As public pension spending comprises a significant share of total spending, and is projected to rise further, efforts to contain these increases will in most cases be a necessary part of fiscal consolidation packages. Pension reforms can also help avoid the need for even larger cuts in pro-growth spending, such as public investment, and help prevent the worsening of intergenerational equity caused by rising life expectancies (at a pace faster than expected) and longer periods of retirement. Finally, some pension reforms, such as increases in retirement ages, can raise potential growth. Thus, while the appropriate level of pension spending and the design of the pension system are ultimately matters of public preference, there are several potential benefits for countries that choose to undertake pension reform. Against this background, this paper provides: (i) an assessment of the main drivers underlying spending trends over recent decades; (ii) new projections for public pension spending in advanced and emerging economies over the next 20 to 40 years; (iii) an assessment of the sensitivity of the country projections to demographic and macroeconomic factors, and risks of reform reversal; and (iv) country-specific policy recommendations to respond to pension spending pressures.

Pension spending is projected to rise in advanced and emerging economies by an average of 1 and 2½ percentage points of GDP over the next two and four decades, respectively, and is subject to a number of risks. During 2010–2030, increases in spending in excess of 2 percentage points of GDP are projected in nine advanced and six emerging economies. There is considerable uncertainty with respect to these projections, but risks are on the upside for a number of countries. Under a scenario where life expectancy is higher than anticipated—life expectancy projections have in the past underestimated actual increases—pension spending would be over 1 percentage point of GDP higher than projected in 2030 in five economies.  Under a low labor productivity scenario, pension spending would be over ½ percentage point of GDP higher in three economies. Sizable risks are also associated with implementing enacted reforms as well as contingent fiscal risks if governments have to supplement private pensions should these fail to deliver adequate benefits.

The appropriate reform mix depends on country circumstances and preferences, although increasing retirement ages has many advantages. It is important that pension reforms do not undermine the ability of public pensions to alleviate poverty among the elderly.  Raising retirement ages avoids the need for further cuts in replacement rates on top of those already legislated, and in many countries the scope for raising contributions may be limited in light of high payroll tax burdens. Longer working lives also raise potential output over time. In many advanced economies there is room for more ambitious increases in statutory retirement ages in light of continued gains in life expectancy, but this should be accompanied by measures that protect the incomes of those who cannot continue to work. In emerging Europe, one possible strategy would be to equalize retirement ages of men and women. In other emerging economies, where pension coverage is low, expansion of non-contributory “social pensions” could be considered, combined with reforms that place pension systems on sound financial footing, including raising the statutory age of retirement. Where average pensions are high relative to average wages, efforts to increase statutory ages could be complemented by reductions in the generosity of pensions. Where taxes on labor income are relatively low, increasing revenues could be considered, and all countries should strive to improve the efficiency of payroll contribution collections.

PDF here: http://www.imf.org/external/np/pp/eng/2011/122811.pdf

Sunday, January 22, 2012

Are Rating Agencies Powerful? An Investigation into the Impact and Accuracy of Sovereign Ratings

Are Rating Agencies Powerful? An Investigation into the Impact and Accuracy of Sovereign Ratings. By John Kiff, Sylwia Nowak, and Liliana Schumacher
IMF Working Paper WP/12/23
Jan 2012
http://www.imfbookstore.org/ProdDetails.asp?ID=WPIEA2012023

Abstract
We find that Credit Rating Agencies’ (CRAs’) opinions have an impact on the cost of funding of sovereign issuers, and consequently ratings are a concern for financial stability. While ratings produced by the major CRAs perform reasonably well when it comes to rank ordering default risk among sovereigns, there is evidence of rating stability failure during the recent global financial crisis. These failures suggest that ratings should incorporate the obligor’s resilience to stress scenarios. The empirical evidence also supports: (i) reform initiatives to reduce the impact of CRAs’ certification services; (ii) more stringent validation requirements for ratings if they are to be used in capital regulations; and (iii) more transparency with regard to the quantitative parameters used in the rating process.

Excerpts
I. INTRODUCTION
1. Recent rating activities by the Credit Rating Agencies (CRAs) have induced some to ask whether ratings represent accurate risk assessments and to question how influential they are. The contention that ratings represent accurate default risk metrics was brought into question by the sheer volume and intensity of the multiple downgrades to U.S. mortgage-related structured finance securities in the wake of the crisis. Voices have also been raised against the timing of recent downgrades of European sovereigns, amid criticism that these downgrades promoted uncertainty in financial markets, leading to “cliff effects” and, as a consequence, affecting these sovereigns’ ability to fund themselves. Rating agencies have also been accused of behaving oligopolistically.

2. These criticisms are not new. CRAs’ downgrades have been criticized before as untimely and procyclical. It was argued that “the Mexican crisis of 1994-95 brought out that credit rating agencies, like almost anybody else, were reacting to events rather than anticipating them” (Reisen, 2003). During the late-1990s Asian crisis, CRAs were also blamed for downgrading East Asian countries too late and by more than the worsening in these countries’ economic fundamentals justified, exacerbating the cost of borrowing.

3. The goal of this paper is to assess these concerns. We first examine CRAs’ role and whether CRAs are influential or just lag the market once new information is available and priced into fixed income securities. This is an important point. If CRAs influence the market, their opinions are important from a financial stability perspective. If they do not and just reflect information available to the market, their actions are not relevant and there is no policy concern.  In this regard, we test three hypotheses regarding the services that CRAs provide to the market: information, certification and monitoring. We find evidence that CRAs’ opinions are influential and favor the information and certification role. We then attempt to determine what ratings actually measure and how accurate they are. We conclude with some policy recommendations based on these findings. This study is limited to sovereign ratings (of emerging markets and advanced economies) and covers the period January 2005-June 2010.

4. Our analysis of the interaction between the market and CRAs indicates that ratings have information value beyond the information already publicly available to the market.  Specifically, the following results were evident:
  • An event study shows that negative credit warnings (i.e., “reviews,” “watches,” and “outlooks”) have a significant impact on CDS spreads. This evidence is also supported by a Granger-causality test that finds that negative credit warnings Granger-cause changes in CDS spreads (a minimal sketch of such a test appears after this list). These findings are consistent with the view that rating agencies provide additional information to the markets, beyond what is publicly available and used by markets to price fixed income securities.
  • Although upgrades and downgrades in general do not have a significant impact on CDS spreads, upgrades and downgrades into and out of investment-grade categories are statistically significant. This supports the view that the certification services provided by rating agencies do matter and likely create a pure liquidity effect (e.g., purchases and sales of assets forced by regulations or other formal mandates rather than prompted by any additional information in the market).
  • We do not find evidence in favor of the most important testable implication of the monitoring services theory. The impact of downgrades preceded by an outlook review in the same direction is not statistically significant. 
  • From an informational point of view, the market appears to discriminate more than rating agencies among different kinds of issuers—in particular at lower rating grades and during crisis periods. This finding may indicate that ratings need to incorporate more granularity and leads to the second question of what ratings measure and how accurate they are.
  • A common element among ratings by the major CRAs is that they represent a rank ordering of credit risk. This ordering is based on qualitative and quantitative inputs such as default probabilities, expected losses, and downgrade risk. However, there is no one-to-one mapping between any of these quantitative measures of credit risk and credit ratings. There is also no disclosure of the quantitative parameters that characterize each rating grade. For this reason, validation tests undertaken by outsiders can only apply to the ability of ratings to differentiate potential defaulters from non-defaulters, but not to estimating cardinal measures such as default probabilities.
  • The point highlighted above implies that—in spite of playing a similar role to internal ratings in the Basel II internal ratings-based (IRB) approach—ratings produced by the CRAs are subject to lower validation standards than are the banks using the IRB approach. In the Basel II IRB approach, financial institutions use measures of default probabilities (PD), losses given default (LGD) and exposure at default (EAD) to produce their internal ratings and are subject to calibration tests.  Although validation is foremost the responsibility of banks, both bank risk managers and bank supervisors need to develop a thorough understanding of validation methods in evaluating whether banks’ rating systems comply with the operating standards set forth by Basel II.
  • Ratings produced by the major CRAs perform reasonably well when it comes to rank-ordering default risk among sovereigns, i.e., defaults tend to take place among the lowest-rated issuers. Accuracy ratios (ARs) indicate that agencies are more successful at sorting out potential defaulters among sovereign issuers (average ARs in the 80 to 90 percent range) than among corporate and structured finance issuers (average ARs in the 63 to 87 percent range), with the latter having deteriorated sharply over the global financial crisis (a stylised AR computation is sketched after this list). For all classes of products, though, the ARs indicate that rating accuracy deteriorates as the evaluation horizon increases.
  • In general, long-term credit transition matrices show that higher ratings are more stable than lower ones and tend to remain unchanged. This was not the case during the global crisis period, however, when downgrade activity was heavier among higher-rated sovereigns than among lower-rated ones. There is evidence of significant rating failure (defined here as three or more rating changes in one year) during the recent global financial crisis, although less than in the Asian crisis.
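For readers who want the flavour of the Granger-causality evidence referred to in the first bullet, here is a minimal, self-contained Python sketch using statsmodels. The daily data are simulated, and the variable names and two-lag specification are illustrative assumptions, not the paper's actual setup.

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(0)
    n = 500
    # Hypothetical daily data: 'warning' flags days with a negative review,
    # watch, or outlook action; 'd_cds' is the daily change in the CDS spread
    # (basis points), built so that warnings feed into next-day spread moves.
    warning = (rng.random(n) < 0.02).astype(float)
    d_cds = 25.0 * np.roll(warning, 1) + rng.normal(0.0, 5.0, n)
    data = pd.DataFrame({"d_cds": d_cds, "warning": warning})

    # Null hypothesis: lagged warnings add no predictive power for d_cds.
    # Small p-values reject the null -- the pattern the paper reports for
    # negative credit warnings.
    grangercausalitytests(data[["d_cds", "warning"]], maxlag=2)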
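Similarly, the accuracy ratios (ARs) cited above can be made concrete with a small computation. The sketch below, with invented grades and default outcomes, traces the cumulative accuracy profile (CAP) and reports the AR as the ratio of the rating system's area over the random diagonal to that of a perfect model.

    import numpy as np

    def accuracy_ratio(ratings, defaulted):
        """ratings: numeric grades, higher = safer; defaulted: 0/1 outcomes."""
        ratings = np.asarray(ratings, dtype=float)
        defaulted = np.asarray(defaulted, dtype=float)
        order = np.argsort(ratings)              # riskiest (lowest grade) first
        hits = np.cumsum(defaulted[order]) / defaulted.sum()
        frac = np.arange(1, len(ratings) + 1) / len(ratings)
        hits = np.concatenate(([0.0], hits))     # CAP curve starts at the origin
        frac = np.concatenate(([0.0], frac))
        auc_model = np.trapz(hits, frac)
        auc_perfect = 1.0 - defaulted.mean() / 2.0  # perfect model's CAP area
        return (auc_model - 0.5) / (auc_perfect - 0.5)

    # 20 hypothetical sovereigns: low grades default more often.
    grades = [1, 2, 2, 3, 3, 4, 5, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17]
    default = [1, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
    print(f"AR = {accuracy_ratio(grades, default):.2f}")  # 0.88 for these data

With these invented inputs the AR comes out around 0.88, the same order as the 80 to 90 percent sovereign range cited above.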

IV. SOME POLICY RECOMMENDATIONS
  • Based on the evidence of the impact of CRAs’ certification services, removing regulations’ excessive reliance on ratings is warranted. This will not affect the information value of ratings—which appears to work mostly through outlook reviews—and will help lessen the additional liquidity impact due to the need to meet regulations, reducing potential cliff effects.
  • To the extent that ratings continue to play a significant role in regulations, an issue arises as to whether CRAs should be more transparent about the quantitative measures they calibrate in the rating process (PDs, LGDs, and stability assumptions), how these measures are mapped into ratings, and whether the final ratings can be used to infer the parameters used to obtain these measures. This is particularly relevant in the use of external ratings by banks employing the standardized approach in Basel II, since internal ratings systems are subject to rigorous back-testing.
  • Moreover, recent heavy downgrade activity suggests that ratings should embed the notion that risk is a forward-looking dimension conditional on the macroeconomic scenario. In this regard, ratings should be better tied to macroeconomic conditions, including obligors’ resilience to stress scenarios.

PDF here: http://www.imf.org/external/pubs/ft/wp/2012/wp1223.pdf

Thursday, January 19, 2012

Oil-price shocks have had substantial and statistically significant effects during the last 25 years

Measuring Oil-Price Shocks Using Market-Based Information. By Tao Wu & Michele Cavallo
IMF Working Paper No. 12/19
January 01, 2012
http://www.imfbookstore.org/ProdDetails.asp?ID=WPIEA2012019

Summary

We study the effects of oil-price shocks on the U.S. economy combining narrative and quantitative approaches. After examining daily oil-related events since 1984, we classify them into various event types. We then develop measures of exogenous shocks that avoid endogeneity and predictability concerns. Estimation results indicate that oil-price shocks have had substantial and statistically significant effects during the last 25 years. In contrast, traditional VAR approaches imply much weaker and insignificant effects for the same period. This discrepancy stems from the inability of VARs to separate exogenous oil-supply shocks from endogenous oil-price fluctuations driven by changes in oil demand.

Excerpts:

I. INTRODUCTION
The relationship between oil-price shocks and the macroeconomy has attracted extensive scrutiny by economists over the past three decades. The literature, however, has not reached a consensus on how these shocks affect the economy, or by how much. A large number of studies have relied on vector autoregression (VAR) approaches to identify exogenous oil-price shocks and estimate their effects. Nevertheless, estimation results generally have not provided compelling support for the conventional-wisdom view that following a positive oil-price shock, real GDP declines and the overall price level increases. In addition, the estimated relationship is often unstable over time. This is why, after a careful examination of various approaches, Bernanke, Gertler, and Watson (1997) conclude that “finding a measure of oil price shocks that ‘works’ in a VAR context is not straightforward. It is also true that the estimated impacts of these measures on output and prices can be quite unstable over different samples.”

Traditional VAR-based measures of oil-price shocks exhibit two recurrent weaknesses: endogeneity and predictability. With regard to the first one, VAR approaches often cannot separate oil-price movements driven by exogenous shocks from those reflecting endogenous responses to other kinds of structural shocks. For instance, the oil price increases that occurred over the 2002–2008 period were viewed by many as the result of “an expanding world economy driven by gains in productivity” (The Wall Street Journal, August 11, 2006). The occurrence of such endogenous movements will undoubtedly lead to biased estimates of the effects of oil shocks.

On the other hand, part of the observed oil-price changes might have been anticipated by private agents well in advance. Therefore, they can hardly be considered “shocks.” Most measures of oil-price shocks in the literature are constructed using only spot oil prices. However, when the market senses any substantial supply-demand imbalances in the future, changes in the spot prices may not fully reflect such imbalances. A number of authors (e.g., Wu and McCallum, 2005; Chinn, LeBlanc, and Coibion, 2005) have found that oil futures prices are indeed quite powerful in predicting spot oil price movements, indicating that at least a portion of such movements may have been anticipated a few months or more in advance. Both these concerns underscore the need to pursue a different approach to obtain more reliable measures of exogenous oil-price shocks.

In this paper, we combine narrative and quantitative approaches to develop new measures of exogenous oil-price shocks that avoid the endogeneity and predictability concerns. We begin by identifying the events that have driven oil-price fluctuations on a daily basis from 1984 to 2007. To achieve this goal, we first collect information from daily oil-market commentaries published in a number of oil-industry trade journals, such as Oil Daily, Oil & Gas Journal, and Monthly Energy Chronology. This leads to the construction of a database that identifies the oil-related events that have occurred each day since January 1984. We then classify these daily events into a number of different event types based on their specific features, such as weather changes in the U.S., military actions in the Middle East, OPEC announcements on oil production, U.S. oil inventory announcements, etc. (see Table 1). Next, for each event type we construct a measure of oil-price shocks by running oil-price forecasting equations on a daily basis. Finally, shock series from exogenous oil events are selected and aggregated into a single measure of exogenous oil-price shocks. By construction, these shock measures should be free of endogeneity and predictability problems, and statistical tests are also conducted to confirm their exogeneity. For robustness, we also provide a number of alternative definitions of exogenous oil-price shocks and construct corresponding shock measures for each one of them.
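To make the construction concrete, here is a heavily simplified Python sketch of the idea: regress daily spot-price changes on lagged predictors that include the futures basis, and keep the residuals on exogenous-event days as the shock measure. The column names ('spot', 'fut1'), the predictor set, and the single event flag are assumptions for illustration; the paper's forecasting equations are richer.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    def daily_oil_shocks(df: pd.DataFrame, event_col: str) -> pd.Series:
        """df: daily data with 'spot' and 'fut1' (front-month futures) prices;
        event_col: 0/1 flag for days with an exogenous oil-market event."""
        ret = np.log(df["spot"]).diff()
        X = pd.DataFrame({
            # lagged futures basis: the market's anticipated price change
            "basis": (np.log(df["fut1"]) - np.log(df["spot"])).shift(1),
            "lag_ret": ret.shift(1),
        })
        data = pd.concat([ret.rename("ret"), sm.add_constant(X)], axis=1).dropna()
        fit = sm.OLS(data["ret"], data.drop(columns="ret")).fit()
        # The residual is the unpredictable component; keep it only on days
        # flagged as exogenous events, zero elsewhere.
        return fit.resid.where(df.loc[fit.resid.index, event_col] == 1, 0.0)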

We employ our new, market-information based measures to study the responses of U.S. output, consumer prices, and monetary policy to exogenous oil-price shocks. We also compare the estimated responses with those obtained following two traditional VAR-based identification strategies that are very popular in the literature. Estimation results reveal substantial and statistically significant output and price responses to exogenous oil-price shocks identified by our market-based methodology. In contrast, responses implied by the VAR-based approaches are much weaker, statistically insignificant, and unstable over time. Moreover, we find that following a demand-driven oil-price shock, real GDP increases and the price level declines. This finding is consistent with scenarios in which oil-price fluctuations are endogenous responses to changes in the level of economic activity rather than reflecting exogenous oil shocks. We argue that traditional VAR-based approaches cannot separate the effects of these two kinds of shocks and consequently lead to biased estimates of the dynamic responses.

Our approach is similar in spirit to the narrative approach pursued in a number of existing studies. Romer and Romer (2004, 2010) adopt it in their analyses of monetary policy and tax shocks, Alexopoulos (2011) and Alexopoulos and Cohen (2009) in the context of technology shocks, and Ramey (2009) in her analysis of government spending shocks. With regard to oil-price shocks, several earlier studies have tried to isolate some geopolitical events associated with abrupt oil-price increases and examine their effects on the U.S. economy. Hamilton (1983, 1985) identifies a number of “oil-price episodes” before 1981, mainly Middle East tensions, and concludes that such oil shocks had effectively contributed to postwar recessions in the U.S. Hoover and Perez (1994) revise Hamilton’s (1983) quarterly dummies into a monthly dummy series and find that oil shocks had led to declines in U.S. industrial production. Bernanke, Gertler, and Watson (1997) construct a quantitative measure, weighting Hoover and Perez’s dummy variable by the log change in the producer price index for crude oil, yet they were not able to find statistically significant macroeconomic responses to oil shocks in a VAR setting. Hamilton (2003) identifies five military conflicts during the postwar period and reexamines the effects of the associated oil shocks on U.S. GDP growth. Finally, Kilian (2008) also analyzes six geopolitical events since 1973, five in the Middle East and one in Venezuela, and examines their effects on the U.S. economy. Our study contributes to the literature by constructing a database of all oil-related events on a daily basis. This allows us to identify all kinds of oil shocks and conduct a more comprehensive analysis than earlier studies. Extracting the “unpredictable” component of oil-price fluctuations using an oil futures price-based forecasting model represents another novelty of our work.

More recently, Kilian (2009) has also used information from the oil market to disentangle different kinds of oil-price shocks. In particular, he has constructed an index of global real economic activity, including it in a tri-variate VAR, along with data on world oil production and real oil prices. Using a recursive ordering of these variables, he recovers an oil-supply shock, a global aggregate demand-driven shock, and an oil market-specific demand shock.  Although his approach is completely different from ours, the effects on the U.S. economy of all three kinds of structural shocks estimated in his work are quite close to our empirical estimates.  This, in turn, corroborates the validity of our approach. We present detailed evidence in subsequent sections.

Our study is also related to the ongoing debate about how the real effects of oil-price shocks have changed over time. For instance, VAR studies, such as those of Hooker (1996) and Blanchard and Galí (2009), have usually found a much weaker and statistically insignificant relationship between their identified oil-price shocks and real GDP growth in the U.S. and other developed economies during the last two to three decades. These results are often cited as evidence suggesting that the U.S. economy has become less volatile and more insulated from external shocks, the result of better economic policy, a lack of large adverse shocks, or a smaller degree of energy dependence (e.g., a more efficient use of energy resources and a larger share of the service sector in the U.S. economy), all contributing to a “Great Moderation” starting in the first half of the 1980s. Although we do not challenge this general characterization of the “Great Moderation,” our estimation results reveal a substantial and significant adverse effect of exogenous oil shocks on the U.S. economy, even during the last two and a half decades. Results from VAR studies, in particular the time variation in coefficient estimates, may simply reflect an inadequate identification strategy.

V. CONCLUDING REMARKS
This paper combines narrative and quantitative approaches to examine the dynamic effects of oil-price shocks on the U.S. economy. To correctly identify exogenous oil shocks, we first collect oil-market-related information from a number of oil-industry trade journals, and compile a database identifying all the events that have affected the global oil market on a daily basis since 1984. Based on this information, we are able to isolate events that are exogenous to the U.S. economy and construct corresponding measures of exogenous oil-price shocks. Furthermore, shock magnitudes are calculated by running a real-time oil-price forecasting model incorporating oil futures prices. These procedures help alleviate the endogeneity and predictability problems that have plagued the traditional VAR identification strategies in the literature.

One contribution of our work is the thorough examination of all kinds of oil-related events in the past two and a half decades, more comprehensive than focusing only on geopolitical or military events, as most of the earlier literature has done. Moreover, in constructing the database, we have preserved as much primitive information on oil-market developments as possible, in the hope of facilitating future studies by other researchers on the nature and implications of these events.

After deriving our measures of various kinds of oil shocks, we go on to examine their dynamic macroeconomic effects. We find that exogenous oil-price shocks have had substantial and statistically significant effects on the U.S. economy during the past two and a half decades. In contrast, traditional VAR identification strategies imply a substantially weaker and insignificant real effect for the same period. Further analysis reveals that this discrepancy is likely to stem from the inability of VAR-based approaches to separate exogenous oil-supply shocks from endogenous oil-price fluctuations driven by changes in oil demand. Notably, our study also suggests that the U.S. economy may not have become as insulated from oil shocks during the last two and a half decades as earlier studies have suggested. To examine fully how the oil price-macroeconomy relationship has evolved during the whole postwar period, a thorough study along the same narrative and quantitative approach for the period prior to the “Great Moderation” is called for. This will be the topic for future research.

Wednesday, January 18, 2012

Volatility, rather than abundance per se, drives the "resource curse" paradox

Commodity Price Volatility and the Sources of Growth. By Tiago V. de V. Cavalcanti, Kamiar Mohaddes, and Mehdi Raissi
IMF Working Paper No. 12/12
http://www.imfbookstore.org/ProdDetails.asp?ID=WPIEA2012012

Summary

This paper studies the impact of the level and volatility of the commodity terms of trade on economic growth, as well as on the three main growth channels: total factor productivity, physical capital accumulation, and human capital acquisition. We use the standard system GMM approach as well as a cross-sectionally augmented version of the pooled mean group (CPMG) methodology of Pesaran et al. (1999) for estimation. The latter takes account of cross-country heterogeneity and cross-sectional dependence, while the former controls for biases associated with simultaneity and unobserved country-specific effects. Using both annual data for 1970-2007 and five-year non-overlapping observations, we find that while commodity terms of trade growth enhances real output per capita, volatility exerts a negative impact on economic growth operating mainly through lower accumulation of physical capital. Our results indicate that the negative growth effects of commodity terms of trade volatility offset the positive impact of commodity booms, and that export diversification of primary-commodity-abundant countries contributes to faster growth. Therefore, we argue that volatility, rather than abundance per se, drives the "resource curse" paradox.


Excerpts

I. INTRODUCTION

Finally, while the resource curse hypothesis predicts a negative effect of commodity booms on long-run growth, our empirical findings (in line with the results reported in Cavalcanti et al.  (2011a) and elsewhere in the literature) show quite the contrary: a higher level of commodity terms of trade significantly raises growth. Therefore, we argue that it is volatility, rather than abundance per se, that drives the "resource curse" paradox. Indeed, our results confirm that the negative growth effects of CTOT volatility offset the positive impact of commodity booms on real GDP per capita.



VI. CONCLUDING REMARKS

This paper examined empirically the effects of commodity price booms and terms of trade volatility on GDP per capita growth and its sources using two econometric techniques. First, we employed a system GMM dynamic panel estimator to deal with the problems of simultaneity and omitted variables bias, derived from unobserved country-specific effects.  Second, we created an annual panel dataset to exploit the time-series nature of the data and used a cross-sectionally augmented pooled mean group (PMG) estimator to account for both cross-country heterogeneity and cross-sectional dependence which arise from unobserved common factors. The maintained hypothesis was that commodity terms of trade volatility affects output growth negatively, operating mainly through the capital accumulation channel.  This hypothesis is shown to be largely validated by our time series panel data method, as well as by the system GMM technique used, suggesting the importance of volatility in explaining the under-performance of primary commodity abundant countries.

While the resource curse hypothesis postulates a negative effect of resource abundance (proxied by commodity booms) on output growth, the empirical results presented in this paper show the contrary: commodity terms of trade growth seems to have affected primary-product exporters positively. Since the negative impact of CTOT volatility on GDP per capita is larger than the growth-enhancing effects of commodity booms, we argue that volatility, rather than abundance per se, drives the resource curse paradox.

An important contribution of our paper was to stress the importance of the overall negative impact of CTOT volatility on economic growth, and to investigate the channels through which this effect operates. We illustrated that commodity price uncertainty mainly lowers the accumulation of physical capital. The GMM results also implied that CTOT volatility adversely affects human capital formation. However, this latter effect was not robust when we used an alternative GARCH methodology to calculate CTOT volatility (a sketch of such a GARCH-based volatility measure follows below). Therefore, an important research and policy agenda is to determine how countries can offset the negative effects of commodity price uncertainty on physical and human capital investment.
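For readers curious how a GARCH-based volatility series of this kind might be produced, the following Python sketch fits a GARCH(1,1) model to a simulated commodity-terms-of-trade growth series using the arch package. The simulated data, the (1,1) order, and the constant-mean choice are assumptions for illustration; the authors' exact specification may differ.

    import numpy as np
    from arch import arch_model

    rng = np.random.default_rng(1)
    # Stand-in for a country's CTOT growth series (percent); a real
    # application would use actual commodity-terms-of-trade data.
    ctot_growth = rng.normal(0.0, 1.0, 500)

    model = arch_model(ctot_growth, mean="Constant", vol="GARCH", p=1, q=1)
    result = model.fit(disp="off")
    # The conditional volatility is the time-varying CTOT volatility proxy
    # that would enter the growth regressions.
    ctot_volatility = result.conditional_volatility
    print(result.summary())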

Another notable aspect of our results was to show the asymmetric effects of commodity terms of trade volatility on GDP per capita growth in the two country groups considered. While CTOT instability had a significant negative effect on output growth in the sample of 62 primary product exporters, the same pattern was not observed for the remaining 56 countries (or for the full sample of 118 countries). One explanation for this observation is that the latter group of countries, with more diversified export structures, were better able to insure against price volatility than the primary product exporters. Finally, we offered some empirical evidence on the growth-enhancing effects of export diversification, especially for countries whose GDP is highly dependent on revenues from just a handful of primary products.

The empirical results presented here have strong policy implications. Improvements in the conduct of macroeconomic policy, better management of resource income volatility through sovereign wealth funds (SWF) as well as stabilization funds, a suitable exchange rate regime, and export diversification can all have beneficial growth effects. Moreover, recent academic research has placed emphasis on institutional reform. By establishing the right institutions, one can ensure the proper conduct of macroeconomic policy and better use of resource income revenues, thereby increasing the potential for growth. We await better data on institutional quality to test this hypothesis. Clearly, fully articulated structural models are needed to properly investigate the channels through which the negative growth effects of volatility could be attenuated. This remains an important challenge for future research.

Tuesday, January 17, 2012

CPSS-IOSCO's Requirements for OTC derivatives data reporting and aggregation: final report

Requirements for OTC derivatives data reporting and aggregation: CPSS-IOSCO publishes final report
January 17, 2012
http://www.bis.org/press/p120117.htm

The Committee on Payment and Settlement Systems (CPSS) and the Technical Committee of the International Organization of Securities Commissions (IOSCO) have published their final report on the OTC derivatives data that should be collected, stored and disseminated by trade repositories (TRs).

The committees support the view that TRs, by collecting such data centrally, would provide authorities and the public with better and more timely information on OTC derivatives. This would make markets more transparent, help to prevent market abuse, and promote financial stability.

The final report reflects public comments received in response to a consultative version of the report published in August 2011. Following the consultation exercise, the report was expanded to elaborate on the description of possible options to address data gaps.

The report was also updated to reflect recent international developments in data reporting and aggregation requirements stemming from the Legal Entity Identifier (LEI) workshop in September 2011 and from other efforts under the auspices of the Financial Stability Board (FSB), in support of the G20's request at the Cannes Summit to advance the development of a global LEI.

As the report indicates, some questions remain regarding how best to address current data gaps and define authorities' access to TRs. As requested by the G20, two internationally coordinated working groups will address these questions in the coming year. The FSB will establish an ad hoc group of experts to further consider means of filling current data gaps, while the CPSS and IOSCO will establish a joint group to examine authorities' access to trade repositories.


Report Background

The report addresses Recommendation 19 in the October 2010 report of the FSB, Implementing OTC derivatives market reforms, which called on the CPSS and IOSCO to consult with the authorities and the OTC Derivatives Regulators Forum in developing:

(i)    minimum data reporting requirements and standardised formats, and

(ii)   the methodology and mechanism for data aggregation on a global basis. A final report is due by the end of 2011.

The requirements and data formats will apply both to market participants reporting to TRs and to TRs reporting to the public and to regulators. The report also finds that certain information currently not supported by TRs would be helpful in assessing systemic risk and financial stability, and discusses options for bridging these gaps.

Issues relating to data access for the authorities and reporting entities are discussed, including methods and tools that could provide the authorities with better access to data. Public dissemination of data, it is noted, promotes the understanding of OTC derivatives markets by all stakeholders, underpins investor protection, and facilitates the exercise of market discipline.

The report also covers the mechanisms and tools that the authorities will need for the purpose of aggregating OTC derivatives data.


Notes to editors

  • This report was originally published in August 2011 as a consultative report.
  • The CPSS serves as a forum for central banks in their efforts to monitor and analyse developments in payment and settlement arrangements as well as in cross-border and multicurrency settlement schemes. The CPSS secretariat is hosted by the Bank for International Settlements (BIS).
  • IOSCO is an international policy forum for securities regulators. The Technical Committee, a specialised working committee established by IOSCO's Executive Committee, comprises 18 agencies that regulate some of the world's larger, more developed and internationalised markets. Its objective is to review major regulatory issues related to international securities and futures transactions and to coordinate practical responses to these concerns.
  • Both committees are recognised as international standard-setting bodies by the Financial Stability Board (www.financialstabilityboard.org).

Saturday, January 7, 2012

The sustainability of pension schemes

The sustainability of pension schemes, by Srichander Ramaswamy
BIS Working Papers No 368
http://www.bis.org/publ/work368.htm


Abstract

Poor financial market returns and low long-term real interest rates in recent years have created challenges for the sponsors of defined benefit pension schemes. At the same time, lower payroll tax revenues in a period of high unemployment, and rising fiscal deficits in many advanced economies as economic activity has fallen, are also testing the sustainability of pay-as-you-go public pension schemes. Amendments to pension accounting rules that require corporations to regularly report the valuation differences between their defined benefit pension assets and plan liabilities on their balance sheet have made investors more aware of the pension risk exposure for the sponsors of such schemes. This paper sheds light on what effects these developments are having on the design of occupational pension schemes, and also provides some estimates for the post-employment benefits that could be delivered by these schemes under different sets of assumptions. The paper concludes by providing some policy perspectives.


8  Summary and policy issues (edited)

A weak macroeconomic environment and unusually low real interest rates in many countries have put the funding challenges faced by occupational and public pension schemes in the spotlight. This paper used a simple actuarial model to quantify how the cost of funding DB pension schemes increases as the real rate of return in asset markets falls. If real returns on pension assets are assumed to be lower by 0.5% compared to their historical averages, service costs of DB schemes would be 15% higher than in the past for the same benefit payments. Converting final salary pension schemes to career average schemes (without altering the accrual percentages applied) would lower pensions by 20–25%, assuming that real wages grow at a rate of 1–1.5% per annum.
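The direction and rough magnitude of this sensitivity are easy to verify with a toy present-value calculation. The Python sketch below prices one unit of deferred real pension under assumed deferral and payout horizons (25 and 20 years, both invented for illustration); because the paper's actuarial model is richer, the percentage it prints differs somewhat from the paper's 15%, but it is of the same order.

    def service_cost(real_return: float, defer_years: int = 25,
                     payout_years: int = 20) -> float:
        """Present value today of a level real pension of 1 per year that
        starts after defer_years and runs for payout_years."""
        v = 1.0 / (1.0 + real_return)
        annuity_at_retirement = (1.0 - v**payout_years) / real_return
        return v**defer_years * annuity_at_retirement

    base = service_cost(0.030)       # assumed baseline real return of 3%
    stressed = service_cost(0.025)   # real return lower by 0.5%
    print(f"cost increase: {stressed / base - 1.0:.0%}")  # ~18% here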

Declining mortality rates will put further upward pressure on the contribution rates needed to fund these schemes. When the expected increases in longevity are priced into the actuarial model for computing the service cost, this cost is likely to be 10% higher than the estimates presented in the paper. Increasing longevity, as well as demographic changes that point to a rise in the old-age dependency ratio, poses challenges to the sustainability of PAYG schemes. The projected increase in the old-age dependency ratio suggests that in many countries contributions to PAYG schemes will have to increase by 20% from current levels by 2020 to pay pensions. But as PAYG schemes that service current pensions from employee contributions and taxes do not report the contractual pension liabilities, estimating the funding shortfalls these schemes might face going forward is a challenge.

In contrast to PAYG schemes and some funded public pension schemes, occupational DB schemes have to comply with accounting standards to report the market value of their pension liabilities and the assets that back them so that potential funding shortfalls faced by these schemes can be quantified. Unusually low real interest rates and poor financial market returns in the past decade have had an adverse impact on the coverage ratio of these schemes through the valuation effects on liabilities and lower returns on pension assets. Estimates of the coverage ratio of occupational DB schemes based on these returns would point to a funding deficit of 10 to 20 per cent against their pension liabilities. The size of any deficit that eventually materialises over the long lives of these schemes, however, would depend on future returns – which are unknown.

For occupational DB schemes that face large funding shortfalls, employer contributions will have to rise to improve the coverage ratio of these schemes. At the same time, increasing longevity and falling real yields against the backdrop of a weak macroeconomic environment are raising the service costs of DB schemes and adding to the upward pressures on required contribution rates. Recent amendments to pension accounting standards, which require companies to provide more disclosures in their financial statements on the risks the DB scheme poses to the entity and to report the net gains or losses from their DB pension plans on their balance sheet, are likely to accelerate the shift out of occupational DB plans into DC plans. This is because DC plans limit the contractual liabilities of employers to the contribution rates to be paid for the current service period of the employee.

A progressive shift from DB to DC schemes can have material implications for post-employment benefits because it exposes employees to the investment risks on the pension assets. In addition to this risk, beneficiaries of DC plans will also be exposed to the principal risk factors that determine annuity payments, namely the level of real interest rates and projected mortality rates over the period when the annuity payments will actually be made. Using a simple model to estimate the retirement income from DC schemes, the numerical results presented in Table 2 showed that when contributions to DC schemes are 18% of salaries over a 30-year period and the returns net of administrative expenses on plan assets are 2% higher than the rate at which wages grow, post-employment benefits from a DC scheme would be roughly 43% of the final salary. The excess return assumption of 2% is based on the following input variables in the model to compute retirement income for DC plans: a real yield on long-term bonds of 2%; an equity risk premium over the returns on long-term government bonds of 3%; plan assets with an equal share of bonds and equities; administrative expenses of 0.5% of plan assets; and an annual real wage growth rate of 1.25%.
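The arithmetic behind the 43% figure can be reproduced in a few lines of Python. The contribution rate, horizon, wage growth, and excess return below come from the paper's stated inputs; the annuity factor of 17 (years of pension purchased per unit of fund) is a hypothetical value chosen to make the conversion step explicit, since the paper derives it from real-yield and mortality assumptions.

    def dc_replacement_rate(contrib_rate: float = 0.18, years: int = 30,
                            wage_growth: float = 0.0125,
                            excess_return: float = 0.02,
                            annuity_factor: float = 17.0) -> float:
        """Pension as a fraction of final salary for a stylised DC plan."""
        r = wage_growth + excess_return      # net real return on plan assets
        salary, fund = 1.0, 0.0
        for _ in range(years):
            fund = fund * (1.0 + r) + contrib_rate * salary  # year-end contribution
            salary *= 1.0 + wage_growth
        final_salary = salary / (1.0 + wage_growth)  # salary in the last year worked
        pension = fund / annuity_factor              # buy a level real annuity
        return pension / final_salary

    print(f"replacement rate: {dc_replacement_rate():.0%}")  # ~43%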

The quantitative analysis presented in this paper provides some insights on the possible trade-offs that may be available for public policy on the design of sustainable pension schemes. For example, the internal rate of return on the notional assets of PAYG schemes will be approximately equal to the rate of real GDP growth of the local economy, which is expected to be 2% or lower in advanced economies. The actuarial model showed that the service cost of a pension scheme will be high when the rate of return on the pension assets is low. A funded public pension scheme, on the other hand, will be able to raise the level of return on pension fund assets by investing them in higher-growth markets. Estimates using the actuarial model suggest that a 50 basis point increase in real returns lowers the service cost of the pension scheme by 15%. Funded pension schemes therefore offer the prospect of lowering service costs and of better aligning the pension benefits they offer with the contribution rates received.

Public policy may also be needed to develop efficient markets for pricing annuity risk as occupational DC plans become the preferred post-employment benefit scheme offered by employers. Efficient markets for pricing annuities will in turn depend on how the market for managing and hedging longevity risk develops. As more employers progressively shift towards DC schemes for providing post-employment benefits, regulatory policies might be needed to restrict the range of permissible investment options available for plan assets to avoid unintended risks being taken by the plan beneficiaries, and to set mandatory minimum contribution rates for participating in DC schemes. Finally, considering that plan beneficiaries in DC schemes are exposed to interest rate risk at the time of converting plan assets into an annuity, the pros and cons of providing insurance policies that guarantee a minimum real yield at which these assets can be converted into an annuity will have to be examined.

Wednesday, December 21, 2011

BCBS: Application of own credit risk adjustments to derivatives - consultative document

Application of own credit risk adjustments to derivatives - Basel Committee consultative document
December 21, 2011
http://www.bis.org/press/p111221.htm

The Basel Committee today issued a consultative document on the application of own credit risk adjustments to derivatives.

The Basel III rules seek to ensure that a deterioration in a bank's own creditworthiness does not at the same time lead to an increase in its common equity as a result of a reduction in the value of the bank's liabilities. Paragraph 75 of the Basel III rules requires a bank to "[d]erecognise in the calculation of Common Equity Tier 1, all unrealised gains and losses that have resulted from changes in the fair value of liabilities that are due to changes in the bank's own credit risk".

The application of paragraph 75 to fair valued derivatives is not straightforward since their valuations depend on a range of factors other than the bank's own creditworthiness. The consultative paper proposes that debit valuation adjustments (DVAs) for over-the-counter derivatives and securities financing transactions should be fully deducted in the calculation of Common Equity Tier 1. It briefly reviews other options for applying the underlying concept of paragraph 75 to these products and the reasons these alternatives were not supported by the Basel Committee.
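As a toy illustration of the adjustment the proposal implies, the following Python snippet works through invented numbers: an unrealised gain arising from own-credit DVA is removed in full from Common Equity Tier 1. The figures are hypothetical and chosen only to make the direction of the adjustment concrete.

    # Hypothetical CET1 before the paragraph-75 adjustment.
    reported_cet1 = 100.0
    # Hypothetical unrealised gain from the DVA on OTC derivatives, created
    # by a deterioration in the bank's own creditworthiness.
    dva_gain = 4.0
    # Full deduction, as the consultative paper proposes.
    adjusted_cet1 = reported_cet1 - dva_gain
    print(adjusted_cet1)  # 96.0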

The Basel Committee welcomes comments on all aspects of this consultative document by Friday 17 February 2012. Comments should be sent to baselcommittee@bis.org. Alternatively, comments may be submitted to the following address: Basel Committee on Banking Supervision, Bank for International Settlements, Centralbahnplatz 2, 4002 Basel, Switzerland. All comments may be published on the BIS website unless a commenter specifically requests confidential treatment.


Summary (http://www.bis.org/publ/bcbs214.htm, edited):

A deterioration in a bank's own creditworthiness can lead to an increase in the bank's common equity as a result of a reduction in the value of its liabilities. The Basel III rules seek to prevent this. Paragraph 75 of the Basel III rules requires a bank to "[d]erecognise in the calculation of Common Equity Tier 1, all unrealised gains and losses that have resulted from changes in the fair value of liabilities that are due to changes in the bank's own credit risk". The application of paragraph 75 to fair valued derivatives is not straightforward since their valuations depend on a range of factors other than the bank's own creditworthiness. The consultative paper proposes that debit valuation adjustments (DVAs) for over-the-counter derivatives and securities financing transactions should be fully deducted in the calculation of Common Equity Tier 1. It briefly reviews other options for applying the underlying concept of paragraph 75 to these products and the reasons these alternatives were not supported by the Basel Committee.

PDF: http://www.bis.org/publ/bcbs214.pdf

2011: A Year of Important Pharmacological Advances for Patients

2011: A Year of Important Pharmacological Advances for Patients
December 21, 2011
http://www.innovation.org/index.cfm/NewsCenter/Newsletters?NID=193

New advances in biopharmaceuticals came as welcome news for U.S. patients. In recent years, despite increasing investments in R&D, fewer medicines have received approval, reminding us just how difficult drug discovery is. But in 2011 there were more new medicines approved than in recent years, and these approvals represented important advances in many areas.

The Food and Drug Administration reported in November that in fiscal year 2011 (10/1/10–9/30/11) there were 35 new medicines approved,[i] among the highest totals in the last decade. According to the FDA report, "few years have seen as many important advances for patients." The final tally of medicines approved in calendar year 2011 remains to be seen, but it is clear that 2011 was a strong year for advancing the fight on many disease fronts. Below is information on some of the treatments highlighted in the FDA report.

Cancer: With improvements in early detection and a steady stream of new and enhanced treatments, cancer can be more effectively managed and even beaten. Two new personalized medicines for lung cancer and melanoma now provide effective options for patients whose tumors express certain genetic markers.[ii] The personalized melanoma treatment and another new melanoma medicine became the first new approvals for the disease in 13 years. Read about continued efforts to improve cancer treatment and the 887 medicines currently in development.

Rare Diseases: An estimated 25-30 million Americans suffer from rare or "orphan" diseases, which are often among the most devastating to patients and the most complex for researchers.[iii] However, advances in science have allowed researchers to home in on the causes of many rare diseases and translate those findings into new treatments. Between January 1 and December 7, 2011, eleven new medicines were made available to patients for rare diseases such as congenital Factor XIII deficiency (a genetic defect), several cancers, and scorpion poisoning.[iv] A record 460 new medicines for rare diseases are in clinical trials or awaiting FDA review.[v] Read more about the ongoing commitment to improve treatment for rare diseases.

Lupus: Lupus is a serious and potentially fatal autoimmune disease that attacks healthy organs and tissues of the body. For the approximately 300,000 to 1.5 million lupus sufferers in the U.S., the last drug approval for the disease came in 1955. This year a newly approved medicine ended that drought with a new approach to treating lupus. Read more about medicines in development for autoimmune diseases.

Hepatitis C: Hepatitis C is a chronic viral disease that affects the liver and can lead to liver cancer and liver failure. It affects approximately 3 million people in the United States. Two new medicines approved this year are the first in a new class and offer a greater chance of cure for some patients compared with existing therapies. For more information on medicines in development for infectious diseases, click here.

This is only a partial list of the many advances approved in 2011; for a more complete listing, visit www.fda.gov.

Looking ahead to 2012, biomedical research continues to draw us in new directions and helps us to better understand the cause and progression of disease. Coupled with innovative approaches at the bench and in the clinic, we can better prevent, detect, and treat disease to save and improve lives.


References
[i]US Food and Drug Administration, FY2011 Innovative Drug Approvals, (November 2011) http://www.fda.gov/AboutFDA/ReportsManualsForms/Reports/ucm276385.htm

[ii]US Food and Drug Administration, FY2011 Innovative Drug Approvals, (November 2011) http://www.fda.gov/AboutFDA/ReportsManualsForms/Reports/ucm276385.htm

[iii]PhRMA, Orphan Drugs in Development for Rare Diseases (February 2011) http://www.phrma.org/sites/default/files/878/rarediseases2011.pdf

[iv]US Food and Drug Administration, CDER New Drug Review: 2011 Update (December 2011) http://www.fda.gov/downloads/AboutFDA/CentersOffices/OfficeofMedicalProductsandTobacco/CDER/UCM282984.pdf

[v]PhRMA, Orphan Drugs in Development for Rare Diseases (February 2011) http://www.phrma.org/sites/default/files/878/rarediseases2011.pdf

BCBS: revised "Core principles for effective banking supervision" - consultative paper

Consultative paper on revised "Core principles for effective banking supervision" issued by the Basel Committee
December 20, 2011
http://www.bis.org/press/p111220.htm

The Basel Committee on Banking Supervision today issued for public comment its revised "Core principles for effective banking supervision" [http://www.bis.org/publ/bcbs213.htm].

The consultative paper updates the Committee's 2006 "Core principles for effective banking supervision" [http://www.bis.org/publ/bcbs129.htm] and the associated "Core principles methodology" [http://www.bis.org/publ/bcbs130.htm], and merges the two documents into one. The Core Principles have also been re-ordered, highlighting the difference between what supervisors do themselves and what they expect banks to do: Principles 1 to 13 address supervisory powers, responsibilities and functions, focusing on effective risk-based supervision, and the need for early intervention and timely supervisory actions. Principles 14 to 29 cover supervisory expectations of banks, emphasising the importance of good corporate governance and risk management, as well as compliance with supervisory standards.

Among other things, the revision of the Core Principles builds on the lessons of the last financial crisis. The Core Principles have been enhanced to strengthen supervisory practices and risk management. In addition, the revised Core Principles respond to several key trends and developments that emerged during the last few years of market turmoil: the need for greater intensity and resources to deal effectively with systemically important banks; the importance of applying a system-wide, macro perspective to the microprudential supervision of banks to assist in identifying, analysing and taking pre-emptive action to address systemic risk; and the increasing focus on effective crisis management, recovery and resolution measures in reducing both the probability and impact of a bank failure.

Ms Sabine Lautenschläger, Co-chair of the Core Principles Group and Vice-President of the Deutsche Bundesbank, noted that "the revised Core Principles contribute to the broader ongoing effort by the Basel Committee to raise the bar for banking supervision in the post-crisis era". She added that "the Committee has achieved a lot in terms of rule-making over the past five years and this work will be instrumental in firmly entrenching many of the supervisory lessons and regulatory developments since the Core Principles were last revised".

The latest revision ensures the continued relevance of the Core Principles in providing a benchmark for supervisory practices that will withstand the test of time and changing environments. The total number of Core Principles has increased from 25 to 29; 36 new essential and additional criteria have been introduced and another 33 additional criteria have been upgraded to essential criteria that represent minimum baseline requirements for all countries.

The Core Principles are the de facto framework of minimum standards for sound supervisory practices and are universally applicable. The Committee believes that implementation of the revised Core Principles by all countries will be a significant step towards improving financial stability domestically and internationally, and will provide a good basis for the further development of effective supervisory systems.

"With the advent of various policy measures for addressing both bank-specific and broader systemic risks, the key challenge in this revision of the Core Principles has been to uphold their relevance for different jurisdictions and banking systems," stated Ms Teo Swee Lian, Co-chair of the Core Principles Group and Deputy Managing Director of the Monetary Authority of Singapore. "As highlighted in the paper, a proportionate approach achieves this through advocating risk-based supervision and supervisory expectations that are commensurate with a bank's risk profile and systemic importance."

The revised Core Principles represent the collective effort of the Basel Committee, banking supervisors from around the world, the International Monetary Fund, and the World Bank.

For information purposes, a document comparing the 2006 assessment methodology with the revised version has also been posted. This document is provided to facilitate a direct comparison between the two versions of assessment criteria.

The Basel Committee welcomes comments on the revised Core Principles. Comments should be submitted by Tuesday 20 March 2012 by email to: baselcommittee@bis.org. Alternatively, comments may be sent by post to the Secretariat of the Basel Committee on Banking Supervision, Bank for International Settlements, CH-4002 Basel, Switzerland. All comments may be published on the Bank for International Settlements' website unless a commenter specifically requests confidential treatment.


PDF: http://www.bis.org/publ/bcbs213.pdf (84 pages)

Tuesday, December 20, 2011

BCBS: Proposed regulatory capital disclosure requirements

Proposed regulatory capital disclosure requirements issued by the Basel Committee

December 19, 2011
http://www.bis.org/press/p111219a.htm

The Basel Committee on Banking Supervision today published for consultation a set of requirements for banks to disclose the composition of their regulatory capital. These aim to improve the transparency and comparability of banks' capital bases, including on a cross-border basis.

During the financial crisis, market participants and supervisors attempted to undertake detailed assessments of the capital positions of banks and make cross-jurisdictional comparisons. These efforts were often hampered by insufficiently detailed disclosure and a lack of consistency in reporting between banks and across jurisdictions. A lack of clarity on the quality of capital may have contributed to uncertainty during the financial crisis.

In addition to improving the quality and level of required capital, Basel III established certain high level disclosure requirements to improve transparency of regulatory capital and enhance market discipline. The Basel Committee noted that it would issue more detailed Pillar 3 disclosure requirements in 2011. Today's publication sets out these detailed requirements for consultation.

The Basel Committee welcomes comments on the proposed consultative document. Comments should be submitted by Friday 17 February 2012 by email to: baselcommittee@bis.org. Alternatively, comments may be sent by post to the Secretariat of the Basel Committee on Banking Supervision, Bank for International Settlements, CH-4002 Basel, Switzerland. All comments may be published on the Bank for International Settlements' website unless a commenter specifically requests confidential treatment.