Wednesday, January 23, 2013

Liquidity and Transparency in Bank Risk Management. By Lev Ratnovski

Liquidity and Transparency in Bank Risk Management. By Lev Ratnovski
IMF Working Paper No. 13/16
http://www.imf.org/external/pubs/cat/longres.aspx?sk=40258.0

Summary: Banks may be unable to refinance short-term liabilities in case of solvency concerns. To manage this risk, banks can accumulate a buffer of liquid assets, or strengthen transparency to communicate solvency. While a liquidity buffer provides complete insurance against small shocks, transparency also covers large shocks, but imperfectly. Due to leverage, an unregulated bank may choose insufficient liquidity buffers and transparency. The regulatory response is constrained: while liquidity buffers can be imposed, transparency is not verifiable. Moreover, liquidity requirements can compromise banks' transparency choices, and increase refinancing risk. To be effective, liquidity requirements should be complemented by measures that increase bank incentives to adopt transparency.

Conclusion

The paper emphasized that both liquidity buffers and — in a novel perspective — bank transparency (better communication that enhances access to external refinancing) are important in bank liquidity risk management. In a liquidity event, a liquidity buffer can cover small withdrawals with certainty. Transparency allows the bank to refinance large withdrawals too, but it is not always effective. Banks may choose insufficient liquidity and transparency; the optimal policy response is constrained by the fact that bank transparency is not verifiable.

The paper offers important policy implications, particularly for the ongoing liquidity regulation debate. The results caution that the focus on liquidity requirements needs to be complemented by measures to improve bank transparency and access to market refinancing. Without such measures, liquidity requirements may not achieve the full potential of improvements in social welfare, and under some conditions may have unintended effects. We also highlight the need for better corporate governance as a way to improve bank transparency, and the scope to use net stable funding ratios to increase the effectiveness of liquidity requirements.

Q&A session with a Japanese citizen on contemporary politics

The comments of a Japanese citizen on politics and our questions:
[...]

- What do supporters of Tokyo's governor say about him? I always read comments against him, but he keeps winning elections, so there must be lots of (silent) supporters.

Our present PM Shinzo Abe has pretty solid support from citizens so far. He recently announced an economic policy called "Abenomics". Even though nothing has actually been done yet beyond announcing it, stock prices have been rising and rising, and the currency rate has become much better (the yen had become too strong, and even big international exporters such as Sony, Panasonic and Toyota have been taking heavy losses for the past three years).

Also, his international strategy is well regarded. The former government always gave in to unreasonable accusations from China or Korea just for economic reasons, but he is trying to build strong relationships with South-east Asian countries, Australia, India and Russia. SE Asian countries especially welcome this, because they have been threatened by Chinese forces and have actually wanted Japan to take leadership against China.
He seems to be doing very well now.

I have to say many Japanese media outlets have big problems... many people in the media are kind of "traitors". They pick up the "noisy minority" and present it as if it were the majority, to give Japan a bad name in the world... It would be fine if their opinions were meant to make things better, but they just criticise. It would take too long to explain all of this.

Question: What do supporters of nuclear arms say? I always read assurances that the Japanese do not want the A-bomb, but from time to time some politician says that Japan is considering protecting herself (against North Korea and China).

I think there are not many supporters of nuclear arms. Some politicians and scholars are saying we should at least discuss it, because almost all Japanese have a sort of allergy to nuclear weapons (it comes from the trauma of WW2) and even discussion is taboo, even though we are targeted by Chinese and maybe Korean missiles. So they are urging us to get over that trauma now. They also say that even just having the discussion would be a deterrent to those countries. (Personally I agree with them; we don't need to have the bomb, but it's nonsense not to even talk about it.) Recently the number of people who agree with this seems to be increasing.

PM Abe is trying to keep a good relationship with the US, but on the other hand he is also trying to protect ourselves with proper forces, and this is generally supported by the majority (at least it seems so to me). Almost everyone likes the US better than China in politics, but lately people have started to think we should stand on our own.
Actually, I think this is the turning point for post-WW2 Japan...

If you want to understand how the Japanese are, it might be important to understand "Shinto". It is a native way of thinking/philosophy of Japan, quite religious but not a religion. It says we have eight million gods in our land - the god of fire, the god of wood, the god of the sea, the god of marriage, the god of traffic, the god of study... so we should be thankful for everything, be good and respect others. It is the base of Japanese morals; we are brought up with it by our parents. It is easy for Shinto to accept other religions, because any god or buddha of another religion can become one of the gods. (Japanese Buddhism is heavily influenced by and mixed with Shinto.) Our Tenno (emperor) is regarded as an offspring of the gods. http://en.wikipedia.org/wiki/Shinto

Last year, the South Korean president insulted the Tenno, and then all Japanese citizens got angry (I have never seen Japanese people so angry!). At that moment I understood that the Tenno is the symbol of this country and of Shinto.

Sorry, this is getting off topic.

[...]

take care,

[...]

---
Remember Ozawa: "If Japan desires, it can possess thousands of nuclear warheads." 2009. https://www.bipartisanalliance.com/2009/06/remember-ozawa-if-japan-desires-it-can.html

Sunday, January 20, 2013

Are there clear affinities of Communism with Fascism?

Political philosopher John N. Gray on liberals' totalitarian temptation
Times Literary Supplement, Jan 02, 2013:

One of the features that distinguished Bolshevism from Tsarism was the insistence of Lenin and his followers on the need for a complete overhaul of society. Old-fashioned despots may modernize in piecemeal fashion if doing so seems necessary to maintain their power, but they do not aim at remaking society on a new model, still less at fashioning a new type of humanity. Communist regimes engaged in mass killing in order to achieve these transformations, and paradoxically it is this essentially totalitarian ambition that has appealed to liberals. Here as elsewhere, the commonplace distinction between utopianism and meliorism is less than fundamental. In its predominant forms, liberalism has been in recent times a version of the religion of humanity, and with rare exceptions— [Bertrand] Russell is one of the few that come to mind—liberals have seen the Communist experiment as a hyperbolic expression of their own project of improvement; if the experiment failed, its casualties were incurred for the sake of a progressive cause. To think otherwise—to admit the possibility that the millions who were judged to be less than fully human suffered and died for nothing—would be to question the idea that history is a story of continuing human advance, which for liberals today is an article of faith. That is why, despite all evidence to the contrary, so many of them continue to deny Communism's clear affinities with Fascism. Blindness to the true nature of Communism is an inability to accept that radical evil can come from the pursuit of progress.

John Gray is professor emeritus at the London School of Economics

Friday, January 18, 2013

Relying on financial models to set loan-loss reserves could hurt small banks and their customers

Bank Reform Takes One Flawed Step Forward. By Eugene A Ludwig and Paul A Volcker
Relying on financial models to set loan-loss reserves could hurt small banks and their customers. The Wall Street Journal
January 18, 2013, on page A15
http://online.wsj.com/article/SB10001424127887323468604578245421482083936.html

The Financial Accounting Standards Board finished 2012 on a high note, issuing a draft new rule to change the way banks build reserves against losses on loans. It is a major step forward from our current system. Still, FASB's proposed rule is flawed conceptually and in its application, and in itself it cannot achieve the international consistency that is desirable.

The good news: The board recognizes that its existing rules on the Allowance for Loan and Lease Losses may have worsened the 2008 financial crisis. These rules limited bank reserves to those that are already "incurred." This all but ensures that banks' rainy day funds will be too skinny, particularly in periods when credit markets are under stress. Worse yet, limiting loss estimates to events that have already occurred makes the allowance for loan and lease losses procyclical—reported earnings are too high in good times and losses hit hardest in bad times.

The FASB's draft proposal to reform these rules incorporates what is known as the "Current Expected Credit Loss Model." It is meant to expand reserves to reflect losses that are expected over the life of the loan, and it is a big improvement over the existing regime. But as it stands, the proposal could create risks for the financial system.
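
To make the idea of reserving for "losses that are expected over the life of the loan" concrete, here is a stylized Python sketch of a lifetime expected-loss calculation. It is an illustration only, not the FASB's prescribed method; the probability-of-default (PD), loss-given-default (LGD), exposure (EAD) and discount-rate inputs are invented for the example.

    # Stylized lifetime expected-credit-loss calculation (illustration only).
    # All inputs are invented; they are not figures from the FASB proposal.
    def lifetime_expected_loss(pd_by_year, lgd, ead_by_year, discount_rate):
        """Sum of discounted expected losses over the remaining life of a loan."""
        ecl = 0.0
        for t, (pd, ead) in enumerate(zip(pd_by_year, ead_by_year), start=1):
            ecl += pd * lgd * ead / (1 + discount_rate) ** t
        return ecl

    # A three-year loan amortizing from 100 to 40, with rising default probabilities.
    print(lifetime_expected_loss(pd_by_year=[0.01, 0.02, 0.03],
                                 lgd=0.45,
                                 ead_by_year=[100.0, 70.0, 40.0],
                                 discount_rate=0.05))   # roughly 1.47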

In an effort to ensure that everything is "auditable," the proposal ties the loan-loss reserve to what the accounting profession will decide is an acceptable "model." While the proposal is well-intentioned and makes clear that various models can be used, this model-driven approach is dangerous.

Modeling by its very nature is backward looking. It would push bankers to address only risks that are readily and historically quantifiable. It would discourage them from acting on forward-looking but less well-defined risks, like broader economic trends, that can be just as damaging.

A focus on modeling also unnecessarily favors large institutions. Banks with smaller loan books and more hands-on experience have some advantages when setting their reserves. But what community bank has a sufficient data set, a team of "modelers," or complex statistical analysis software on hand? The FASB proposal could hurt small banks and their customers.

That is not to say that some quantitative models have no place in establishing reserves. Some institutions may choose to use models, even slavishly. But this should not be a requirement, unless experience and judgment lead the bank's prudential regulator to think otherwise.

There are other ways to go about setting reserves. A bank can follow a rigorous, board-approved process, for example by drawing on well-documented reviews from its CEO, chief credit officer, and the credit committee of the board of directors. The assumptions used in these judgmental reviews can be audited by regulators and outside accountants, and implementation of the process itself can be audited. This approach can be honest and effective without relying entirely on mathematical models.

The FASB proposal may have at least one smaller-scale but serious flaw. Although the text is unclear, the proposal appears to base reserves on cash flows above all other credit factors, such as collateral. We understand that this is not what was intended, and that "cash flows" is meant to include monies derived from collateral liquidation too. If this is the case, the language should be clarified.

While we do believe it is critical to allow bankers to use their expertise in estimating losses for reserve purposes, we also believe it is critical that they disclose to regulators and the public both the methodology they employ to set reserves and the quarter-by-quarter decisions on reserves they actually make. That way investors can follow a bank's net revenue picture before and after loan reserves are set aside, and the methods they use to establish these reserves.

It would be highly desirable to have one international rule in this area, as with accounting standards in the financial services area generally. The International Accounting Standards Board is preparing a new standard for bank reserves. Both the FASB and the IASB approaches will be open to comment. The goal should be to achieve consistency along the broad lines opened by the FASB proposal.

In sum, the FASB's draft proposal is a positive step. But it will require revision so that small banks are not put at a disadvantage, and so that all banks can employ rational and effective methods to set aside their rainy day funds.

Mr. Ludwig, CEO of Promontory Financial Group, was comptroller of the currency from 1993 to 1998. Mr. Volcker was chairman of the Federal Reserve System from 1979 to 1987.

Monday, January 14, 2013

Incentive Audits: A New Approach to Financial Regulation

Incentive Audits: A New Approach to Financial Regulation. By Martin Cihak
World Bank Blogs, Jan 14, 2013
http://blogs.worldbank.org/allaboutfinance/incentive-audits-a-new-approach-to-financial-regulation

Economists often disagree on policy advice. If you ask 10 of them, you may get 10 different answers, or more. But from time to time, economists actually do agree. One such area of agreement relates to the role of incentives in the financial sector. A large and growing literature points to misaligned incentives playing a key role in the run-up to the global financial crisis. In a recent paper, co-authored with Barry Johnston, we propose to address the incentive breakdowns head-on by performing “incentive audits”.

The global financial crisis has highlighted the destructive impact of misaligned incentives in the financial sector. This includes bank managers’ incentives to boost short-term profits and create banks that are “too big to fail”, regulators’ incentives to forebear and withhold information from other regulators in stressful times, credit rating agencies’ incentives to keep issuing high ratings for subprime assets, and so on. Of course, incentives play an important role in many economic activities, not just the financial ones. But nowhere are they as prominent, and nowhere can their impact be as damaging, as in the financial sector, due to its leverage, interconnectedness, and systemic importance. A large body of recent literature examines these issues in depth. For example, Caprio, Demirgüç-Kunt and Kane (2010) show that incentive conflicts explain how securitization went wrong and why credit ratings proved so inaccurate; Barth, Caprio and Levine (2012) highlight incentive failures in regulatory authorities. Incentives were not the only factor – they were accentuated by problems of insufficient information, herd behavior, and so on – but breakdowns in incentives clearly had a central role in the run-up to the crisis.

Despite the broad agreement among economists, the focus of financial sector regulation and supervision has often been on other things, leaving incentives to be addressed indirectly at best. At the global level, substantial efforts have been devoted to issues such as calibrating risk weights to calculate banks’ minimum capital requirements. Numerous outside observers have called for more concerted efforts to address the incentive breakdowns that led to the crisis (e.g., LSE 2010; Squam Lake Working Group 2010; and Beck 2010). At the individual country level, regulatory changes have taken place in recent years, but in-depth analyses show that there is major scope to better address incentive problems (see Čihák, Demirgüç-Kunt, Martínez Pería, and Mohseni 2012, based on data from the World Bank’s 2011–12 Bank Regulation and Supervision Survey). The World Bank’s 2013 Global Financial Development Report also called for more vigorous steps to address incentive issues, rather than leaving them as an afterthought.

In a recent paper, co-authored with Barry Johnston, we propose a pragmatic approach that re-orients financial regulation to put the ongoing assessment of incentives at its core. The paper, which of course represents our views and not necessarily those of the World Bank, proposes “incentive audits” as a tool to help in identifying incentive misalignments in the financial sector. The paper is an extended version of an earlier piece recognized by the International Centre for Financial Regulation and the Financial Times as among the top essays on “what good regulation should look like”.

The incentive audit approach aims to address systemic risk buildup directly at its source. While traditional, regulation-based approaches focus on building up capital and liquidity buffers in financial institutions, the incentive-based approach seeks to identify and correct distortions and frictions that contribute to the buildup of excessive risk. It goes beyond the symptoms to their source. For example, the buildup of massive risk concentrations before the crisis could be attributed to information gaps that prevented the assessment of exposures and network risks, to incentive failures in the monitoring of the risks due to conflicts of interest and moral hazard, and to regulatory incentives that encouraged risk transfers. Building up buffers can help, but to address systemic risk effectively, it is crucial to tackle the underlying incentives that give rise to it. Focusing on increasingly complex capital and liquidity charges risks creating incentives for circumvention, and can run into limited capacity for implementation and enforcement. In the incentive-based approach, more emphasis is given to methods for identifying incentive failures that result in systemic risk. The remedies go beyond narrowly defined prudential tools and also include other measures, such as the elimination of tax incentives that encourage excessive borrowing.

What would an incentive audit involve? It would entail an analysis of structural and organizational features that affect incentives to conduct and monitor financial transactions. It would comprise a sequenced set of analyses proceeding from higher level questions on market structure, government safety nets and legal and regulatory framework, to progressively more detailed questions aimed at identifying the incentives that motivate and guide financial decisions (Figure 1). This sequenced approach enables drilling down and identifying factors leading to market failures and excessive risk taking.

Figure 1. The Design of Incentive Audits

The incentive audit is a novel concept, but analyses of incentives have been done before. One example is the report of a parliamentary commission examining the roots of the Icelandic financial crisis. The report (Special Investigation Commission 2010) notes the rapid growth of Icelandic banks as a major contributor to the crisis. It documents the underlying “strong incentives for growth”, which included the banks’ incentive schemes and the high leverage of their owners. It maps out the network of conflicting interests of the key owners, who were also the largest debtors of these banks. Another example of work that is close to an incentive audit is the analysis by Calomiris (2011). He examines incentive failures in the U.S. financial market and identifies a subset of reforms that are “incentive-robust,” that is, reforms that improve market incentives, market discipline, and the incentives of regulators and supervisors by making rules and their enforcement more transparent, increasing credibility and accountability. These examples illustrate that an incentive audit is doable and useful.

Who would perform incentive audits? Our paper offers some suggestions. The governance of the institution performing the audits is important: its own incentives to act need to be appropriately aligned. Also, to be effective, incentive audits would have to be performed regularly, and their outcomes would have to be used to address incentive issues by adapting regulation, supervision, and other measures. In Iceland, the analysis of incentives was a part of a “post mortem” on the crisis, but it is feasible to do such analysis ex ante. Indeed, much of the information used in the above-mentioned report was available even before the crisis. The Commission had modest resources, illustrating that incentive audits need not be very costly or overly complicated to perform. As the Commission’s report points out, “it should have been clear to the supervisory authorities that such incentives existed and that there was reason for concern,” but supervisors “did not keep up with the rapid changes in the banks’ practices”. Instead of examining the reasons for the changes, the supervisors took comfort in banks’ capital ratios exceeding a statutory minimum and appearing robust in narrowly-defined stress tests (Čihák and Ong 2010).

An incentive audit needs to be complemented by other tools. It needs to be combined with quantitative risk assessment and with assessments of the regulatory, supervisory, and crisis preparedness frameworks. The audit provides an organizing framework, putting the identification and correction of incentive misalignments front and center.

Incentive audits are not a panacea, of course. Financial markets suffer from issues that go beyond misaligned incentives, such as limited rationality, herd behavior and so on. But better identifying and addressing incentive misalignments is a key practical step, and the incentive audits can help.

References
Barth, James, Gerard Caprio, and Ross Levine. 2012. Guardians of Finance: Making Regulators Work for Us, MIT Press.
Beck, Thorsten (ed). 2010. Future of Banking. Centre for Economic Policy Research (CEPR). Published by vox.eu.
Caprio, Gerard, Asli Demirgüç-Kunt, and Edward J. Kane. 2010. “The 2007 Meltdown in Structured Securitization: Searching for Lessons, not Scapegoats.” World Bank Research Observer 25 (1): 125-55.
Calomiris, Charles. 2011. “Incentive-Robust Financial Reform.” Cato Journal 31 (3): 561–589.
Čihák, Martin, Asli Demirgüç-Kunt, Maria Soledad Martínez Pería, and Amin Mohseni. 2012. “Banking Regulation and Supervision around the World: Crisis Update.” Policy Research Working Paper 6286, World Bank, Washington, DC.
Čihák, Martin, Asli Demirgüç-Kunt, and R. Barry Johnston. 2013. “Incentive Audits: A New Approach to Financial Regulation.” Policy Research Working Paper 6308, World Bank, Washington, DC.
Čihák, Martin, and Li Lian Ong. 2010. “Of Runes and Sagas: Perspectives on Liquidity Stress Testing Using an Iceland Example.” Working Paper 10/156, IMF, Washington, DC.
London School of Economics. 2010. The Future of Finance: The LSE Report. London: London School of Economics.
Special Investigation Commission. 2010. Report on the collapse of the three main banks in Iceland. Icelandic Parliament, April 12.
Squam Lake Working Group. 2010. Regulation of Executive Compensation in Financial Services. Squam Lake Working Group on Financial Regulation
World Bank. 2012. Global Financial Development Report 2013: Rethinking the Role of the State in Finance, World Bank, Washington DC.

Thursday, January 10, 2013

BCBS Principles for effective risk data aggregation and risk reporting

BCBS Principles for effective risk data aggregation and risk reporting
January 2013
http://www.bis.org/publ/bcbs239.htm

The financial crisis that began in 2007 revealed that many banks, including global systemically important banks (G-SIBs), were unable to aggregate risk exposures and identify concentrations fully, quickly and accurately. This meant that banks' ability to take risk decisions in a timely fashion was seriously impaired with wide-ranging consequences for the banks themselves and for the stability of the financial system as a whole.

The Basel Committee's Principles for effective risk data aggregation will strengthen banks' risk data aggregation capabilities and internal risk reporting practices. Implementation of the principles will strengthen risk management at banks - in particular, G-SIBs - thereby enhancing their ability to cope with stress and crisis situations.

An earlier version of the principles published today was issued for consultation in June 2012. The Committee wishes to thank those who provided feedback and comments as these were instrumental in revising and finalising the principles.

Objectives (excerpted):

The adoption of these Principles will enable fundamental improvements to the management of banks. The Principles are expected to support a bank’s efforts to:

• Enhance the infrastructure for reporting key information, particularly that used by the board and senior management to identify, monitor and manage risks;
• Improve the decision-making process throughout the banking organisation;
• Enhance the management of information across legal entities, while facilitating a comprehensive assessment of risk exposures at the global consolidated level;
• Reduce the probability and severity of losses resulting from risk management weaknesses;
• Improve the speed at which information is available and hence decisions can be made; and
• Improve the organisation’s quality of strategic planning and the ability to manage the risk of new products and services.

Tuesday, January 8, 2013

Capital Requirements for Over-the-Counter Derivatives Central Counterparties

Capital Requirements for Over-the-Counter Derivatives Central Counterparties. By Li Lin and Jay Surti
IMF Working Paper No. 13/3, January 08, 2013
http://www.imf.org/external/pubs/cat/longres.aspx?sk=40220.0

Summary: The central counterparties dominating the market for the clearing of over-the-counter interest rate and credit derivatives are globally systemic. Employing methodologies similar to the calculation of banks’ capital requirements against trading book exposures, this paper assesses the sensitivity of central counterparties’ required risk buffers, or capital requirements, to a range of model inputs. We find them to be highly sensitive to whether key model parameters are calibrated on a point-in-time versus stress-period basis, whether the risk tolerance metric adequately captures tail events, and the ability—or lack thereof—to define exposures on the basis of netting sets spanning multiple risk factors. Our results suggest that there are considerable benefits from having prudential authorities adopt a more prescriptive approach to central counterparties’ risk buffers, in line with recent enhancements to the capital regime for banks.

ISBN: 9781475535501
ISSN: 2227-8885
Stock No: WPIEA2013003

Sunday, January 6, 2013

Group of Governors and Heads of Supervision endorses revised liquidity standard for banks

Group of Governors and Heads of Supervision endorses revised liquidity standard for banks
January 6, 2013
http://www.bis.org/press/p130106.htm

The Group of Governors and Heads of Supervision (GHOS), the oversight body of the Basel Committee on Banking Supervision, met today to consider the Basel Committee's amendments to the Liquidity Coverage Ratio (LCR) as a minimum standard. It unanimously endorsed them. Today's agreement is a clear commitment to ensure that banks hold sufficient liquid assets to prevent central banks becoming the "lender of first resort".

The GHOS also endorsed a new Charter for the Committee, and discussed the Committee's medium-term work agenda.

The GHOS reaffirmed the LCR as an essential component of the Basel III reforms. It endorsed a package of amendments to the formulation of the LCR announced in 2010. The package has four elements: revisions to the definition of high quality liquid assets (HQLA) and net cash outflows; a timetable for phase-in of the standard; a reaffirmation of the usability of the stock of liquid assets in periods of stress, including during the transition period; and an agreement for the Basel Committee to conduct further work on the interaction between the LCR and the provision of central bank facilities.

A summary description of the agreed LCR is in Annex 1. The changes to the definition of the LCR, developed and agreed by the Basel Committee over the past two years, include an expansion in the range of assets eligible as HQLA and some refinements to the assumed inflow and outflow rates to better reflect actual experience in times of stress. These changes are set out in Annex 2. The full text incorporating these changes will be published on Monday 7 January.

The GHOS agreed that the LCR should be subject to phase-in arrangements which align with those that apply to the Basel III capital adequacy requirements. Specifically, the LCR will be introduced as planned on 1 January 2015, but the minimum requirement will begin at 60%, rising in equal annual steps of 10 percentage points to reach 100% on 1 January 2019. This graduated approach is designed to ensure that the LCR can be introduced without disruption to the orderly strengthening of banking systems or the ongoing financing of economic activity.
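
As a rough illustration of the phase-in arithmetic, the Python sketch below encodes the schedule described above and checks a made-up HQLA stock against it; the ratio shown is just the headline definition, the stock of HQLA divided by total net cash outflows over 30 days, not the full rule text.

    # Sketch of the agreed LCR phase-in schedule and a simple compliance check.
    # The HQLA and outflow figures in the example are invented for illustration.
    LCR_MINIMUM = {2015: 0.60, 2016: 0.70, 2017: 0.80, 2018: 0.90, 2019: 1.00}

    def meets_lcr(hqla, net_cash_outflows_30d, year):
        """LCR = stock of HQLA / total net cash outflows over the next 30 days."""
        lcr = hqla / net_cash_outflows_30d
        return lcr, lcr >= LCR_MINIMUM[year]

    print(meets_lcr(hqla=65.0, net_cash_outflows_30d=100.0, year=2015))  # (0.65, True)
    print(meets_lcr(hqla=65.0, net_cash_outflows_30d=100.0, year=2017))  # (0.65, False)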

The GHOS agreed that, during periods of stress, it would be entirely appropriate for banks to use their stock of HQLA, thereby falling below the minimum. Moreover, it is the responsibility of bank supervisors to give guidance on usability according to circumstances.

The GHOS also agreed today that, since deposits with central banks are the most - indeed, in some cases, the only - reliable form of liquidity, the interaction between the LCR and the provision of central bank facilities is critically important. The Committee will therefore continue to work on this issue over the next year.

GHOS members endorsed two other areas of further analysis. First, the Committee will continue to develop disclosure requirements for bank liquidity and funding profiles. Second, the Committee will continue to explore the use of market-based indicators of liquidity to supplement the existing measures based on asset classes and credit ratings.

The GHOS discussed and endorsed the Basel Committee's medium-term work agenda. Following the successful agreement of the LCR, the Committee will now press ahead with the review of the Net Stable Funding Ratio. This is a crucial component in the new framework, extending the scope of international agreement to the structure of banks' debt liabilities. This will be a priority for the Basel Committee over the next two years.

Over the next few years, the Basel Committee will also: complete the overhaul of the policy framework currently under way; continue to strengthen the peer review programme established in 2012 to monitor the implementation of reforms in individual jurisdictions; and monitor the impact of, and industry response to, recent and proposed regulatory reforms. During 2012 the Committee has been examining the comparability of model-based internal risk weightings and considering the appropriate balance between the simplicity, comparability and risk sensitivity of the regulatory framework. The GHOS encouraged continuation of this work in 2013 as a matter of priority. Furthermore, the GHOS supported the Committee's intention to promote effective macro- and microprudential supervision.

The GHOS also endorsed a new Charter for the Basel Committee. The new Charter sets out the Committee's objectives and key operating modalities, and is designed to improve understanding of the Committee's activities and decision-making processes.

Finally, the GHOS reiterated the importance of full, timely and consistent implementation of Basel III standards.

Mervyn King, Chairman of the GHOS and Governor of the Bank of England, said, "The Liquidity Coverage Ratio is a key component of the Basel III framework. The agreement reached today is a very significant achievement. For the first time in regulatory history, we have a truly global minimum standard for bank liquidity. Importantly, introducing a phased timetable for the introduction of the LCR, and reaffirming that a bank's stock of liquid assets is usable in times of stress, will ensure that the new liquidity standard will in no way hinder the ability of the global banking system to finance a recovery."

Stefan Ingves, Chairman of the Basel Committee and Governor of the Sveriges Riksbank, noted that "the amendments to the LCR are designed to ensure that it provides a sound minimum standard for bank liquidity - a standard that reflects actual experience during times of stress. The completion of this work will allow the Basel Committee to turn its attention to refining the other component of the new global liquidity standards, the Net Stable Funding Ratio, which remains subject to an observation period ahead of its implementation in 2018."
Listen to the press conference

To listen to introductory remarks from GHOS Chairman Mervyn King and the Basel Committee on Banking Supervision's Chairman Stefan Ingves as well as the question and answer session which followed, please dial +41 58 262 07 00 and enter the following access code: 2641523333.

 
Saturday, January 5, 2013

We, Too, Are Violent Animals. By Jane Goodall, Richard Wrangham, and Dale Peterson

We, Too, Are Violent Animals. By Jane Goodall, Richard Wrangham, and Dale Peterson
Those who doubt that human aggression is an evolved trait should spend more time with chimpanzees and wolves. The Wall Street Journal, January 5, 2013, on page C3
http://online.wsj.com/article/SB10001424127887323874204578220002834225378.html

Where does human savagery come from? The animal behaviorist Marc Bekoff, writing in Psychology Today after last month's awful events in Newtown, Conn., echoed a common view: It can't possibly come from nature or evolution. Harsh aggression, he wrote, is "extremely rare" in nonhuman animals, while violence is merely an odd feature of our own species, produced by a few wicked people. If only we could "rewild our hearts," he concluded, we might harness our "inborn goodness and optimism" and thereby return to our "nice, kind, compassionate, empathic" original selves.

If only it were that simple. Calm and cooperative behavior indeed predominates in most species, but the idea that human aggression is qualitatively different from that of every other species is wrong.

The latest report from the research site that one of us (Jane Goodall) directs in Tanzania gives a quick sense of what a scientist who studies chimpanzees actually sees: "Ferdinand [the alpha male] is rather a brutal ruler, in that he tends to use his teeth rather a lot…a number of the males now have scars on their backs from being nicked or gashed by his canines…The politics in Mitumba [a second chimpanzee community] have also been bad. If we recall that: they all killed alpha-male Vincent when he reappeared injured; then Rudi as his successor probably killed up-and-coming young Ebony to stop him helping his older brother Edgar in challenging him…but to no avail, as Edgar eventually toppled him anyway."

A 2006 paper reviewed evidence from five separate chimpanzee populations in Africa, groups that have all been scientifically monitored for many years. The average "conservatively estimated risk of violent death" was 271 per 100,000 individuals per year. If that seems like a low rate, consider that a chimpanzee's social circle is limited to about 50 friends and close acquaintances. This means that chimpanzees can expect a member of their circle to be murdered once every seven years. Such a rate of violence would be intolerable in human society.
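
A quick back-of-envelope check of that "once every seven years" figure, treating the reported rate as a constant risk spread evenly across a 50-member circle (a simplifying assumption):

    # Back-of-envelope check of the "once every seven years" claim.
    rate_per_individual = 271 / 100_000      # annual risk of violent death per chimpanzee
    circle_size = 50                         # friends and close acquaintances
    expected_deaths_per_year = rate_per_individual * circle_size
    print(expected_deaths_per_year)          # ~0.14 violent deaths per year in the circle
    print(1 / expected_deaths_per_year)      # ~7.4 years between violent deaths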

The violence among chimpanzees is impressively humanlike in several ways. Consider primitive human warfare, which has been well documented around the world. Groups of hunter-gatherers who come into contact with militarily superior groups of farmers rapidly abandon war, but where power is more equal, the hostility between societies that speak different languages is almost endless. Under those conditions, hunter-gatherers are remarkably similar to chimpanzees: Killings are mostly carried out by males, the killers tend to act in small gangs attacking vulnerable individuals, and every adult male in the society readily participates. Moreover, with hunter-gatherers as with chimpanzees, the ordinary response to encountering strangers who are vulnerable is to attack them.

Most animals do not exhibit this striking constellation of behaviors, but chimpanzees and humans are not the only species that form coalitions for killing. Other animals that use this strategy to kill their own species include group-living carnivores such as lions, spotted hyenas and wolves. The resulting mortality rate can be high: Among wolves, up to 40% of adults die from attacks by other packs.

Killing among these carnivores shows that ape-sized brains and grasping hands do not account for this unusual violent behavior. Two other features appear to be critical: variable group size and group-held territory. Variable group size means that lone individuals sometimes encounter small, vulnerable parties of neighbors. Having group territory means that by killing neighbors, the group can expand its territory to find extra resources that promote better breeding. In these circumstances, killing makes evolutionary sense—in humans as in chimpanzees and some carnivores.

What makes humans special is not our occasional propensity to kill strangers when we think we can do so safely. Our unique capacity is our skill at engineering peace. Within societies of hunter-gatherers (though only rarely between them), neighboring groups use peacemaking ceremonies to ensure that most of their interactions are friendly. In state-level societies, the state works to maintain a monopoly on violence. Though easily misused in the service of those who govern, the effect is benign when used to quell violence among the governed.

Under everyday conditions, humans are a delightfully peaceful and friendly species. But when tensions mount between groups of ordinary people or in the mind of an unstable individual, emotion can lead to deadly events. There but for the grace of fortune, circumstance and effective social institutions go you and I. Instead of constructing a feel-good fantasy about the innate goodness of most people and all animals, we should strive to better understand ourselves, the good parts along with the bad.

—Ms. Goodall has directed the scientific study of chimpanzee behavior at Gombe Stream National Park in Tanzania since 1960. Mr. Wrangham is the Ruth Moore Professor of Biological Anthropology at Harvard University. Mr. Peterson is the author of "Jane Goodall: The Woman Who Redefined Man."

Wednesday, January 2, 2013

Gross inflows, financial booms and crises. By Cesar Calderon and Megumi Kubota

Gross inflows, financial booms and crises. By Cesar Calderon and Megumi Kubota
World Bank Blogs, Wed, Jan 2nd, 2013
http://blogs.worldbank.org/allaboutfinance/gross-inflows-financial-booms-and-crises

Favorable growth prospects and higher asset returns in emerging market economies have led to a sharp increase in flows of foreign finance in recent years. Massive inflows to the domestic economy may fuel activity in financial markets and — if not properly managed — booms in credit and asset prices may arise (Reinhart and Reinhart, 2009; Mendoza and Terrones, 2008, 2012). In turn, the expansion of credit and overvalued asset prices have been good predictors not only of the current financial crisis but also of past ones (Schularick and Taylor, 2012; Gourinchas and Obstfeld, 2012).

In a recent paper, Megumi Kubota and I synthesize both strands of the empirical literature and examine whether gross private inflows can predict the incidence of credit booms — and, especially, those financial booms that end up in a systemic banking crisis.1 More specifically, our paper finds that surges in gross private capital inflows can help explain the incidence of subsequent credit booms — and, especially, those financial booms that are followed by systemic banking crises. When looking at the predictive power of capital flows, we argue that not all types of flows behave alike. We find that gross private other investment (OI) inflows robustly predict the incidence of credit booms — while portfolio investment (FPI) has no systematic link and FDI surges will at best mitigate the probability of credit booms. Consequently, gross private OI inflows are a good predictor of credit booms.

Our paper evaluates the linkages between surges in gross private capital inflows and the incidence of booms in credit markets. In contrast to previous research papers in this literature: (i) we use data on gross inflows rather than net inflows; and, (ii) we use quarterly data for 71 countries from 1975q1 to 2010q4 instead of annual frequency. In this context, we argue that the dynamic behavior of capital flows and credit markets along the business cycle is better captured using quarterly data.2 As a result, we can evaluate more precisely the impact on credit booms of (the overall amount and the different types of) financing flows coming from abroad. On the other hand, we are more interested in the impact on credit markets of investment inflows coming from foreign investors. Using information on net inflows — especially since the mid-1990s for emerging markets — would not allow us to appropriately differentiate the behavior of foreign investors from that of domestic ones and it may provide misleading inference on the amount of capital supplied from abroad (Forbes and Warnock, 2012).3

Credit booms are identified using two different methodologies: (a) Mendoza and Terrones (2008), and (b) Gourinchas, Valdés and Landerretche (2001) — also applied in Barajas, Dell’Ariccia, and Levchenko (2009). Moreover, we look deeper into credit boom episodes and differentiate bad booms from those that may come along with a soft landing of the economy. In general, the literature finds that credit booms are not always followed by a systemic banking crisis — see Tornell and Westermann (2002) and Barajas et al. (2009). For instance, Calderón and Servén (2011) find that only 4.6 percent of lending booms end up in a full-blown banking crisis for advanced countries, whereas the corresponding probabilities are 8.3 and 4.6 percent for Latin America and the Caribbean (LAC) and non-LAC emerging markets, respectively. Those credit booms that end up in an episode of systemic banking crisis are denoted “bad” credit booms — see Barajas et al. (2009).
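
To make the boom-identification step concrete, here is a minimal Python sketch in the spirit of the Mendoza and Terrones (2008) approach: flag quarters in which the cyclical deviation of (log real per capita) credit from its long-run trend exceeds a multiple of that deviation's standard deviation. The HP smoothing parameter (1600 for quarterly data) and the 1.75 threshold are assumptions for this sketch; the papers' exact implementations differ in their details.

    # Minimal Mendoza-Terrones-style credit boom flag (illustrative assumptions).
    import numpy as np
    from statsmodels.tsa.filters.hp_filter import hpfilter

    def flag_credit_booms(log_real_credit_pc, threshold=1.75, lamb=1600):
        """Mark quarters whose cyclical deviation from the HP trend exceeds
        `threshold` times the standard deviation of the cyclical component."""
        cycle, trend = hpfilter(log_real_credit_pc, lamb=lamb)
        return cycle > threshold * np.std(cycle)

    # Synthetic example: a smooth credit trend with a surge in the middle.
    t = np.arange(120)
    series = 0.01 * t + np.where((t > 60) & (t < 75), 0.15, 0.0)
    print(np.where(flag_credit_booms(series))[0])   # indices of flagged quarters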

Our panel probit regressions show that gross private capital inflows are a good predictor of the incidence of credit booms. This result is robust to the sample of countries, the criteria used to identify credit booms, and the set of control variables. Second, the probability of credit booms is higher when the surges in capital flows are driven by gross OI inflows and, to a lesser extent, by increases in gross portfolio investment (FPI) inflows. Surges of gross foreign direct investment (FDI) inflows would, at best, reduce the likelihood of credit booms. When we unbundle the effect of gross private OI inflows on credit booms, the main conduit is gross OI bank inflows. Third, we find that capital flows do explain the incidence of bad credit booms and that the overall impact is significantly positive and greater than the impact on overall credit booms.
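
The kind of regression described in this paragraph can be sketched as follows, using synthetic data; the paper's actual specification (a panel with additional controls and alternative boom definitions) is richer than this toy example, and the variable names are placeholders.

    # Illustrative probit of a credit-boom indicator on gross inflow surges.
    # Synthetic data only; not the paper's specification or results.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 2000
    oi, fpi, fdi = rng.normal(size=(3, n))            # standardized gross OI, FPI, FDI inflows
    latent = 0.8 * oi + 0.3 * fpi - 0.2 * fdi + rng.normal(size=n)
    boom = (latent > 1.0).astype(int)                 # 1 if a credit boom occurs

    X = sm.add_constant(np.column_stack([oi, fpi, fdi]))
    probit = sm.Probit(boom, X).fit(disp=False)
    print(probit.params)      # positive OI coefficient, smaller FPI, negative FDI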

Finally, the likelihood of bad credit booms is greater when surges in capital inflows are driven by increases in OI inflows. As a result, the overall positive impact of gross OI inflows significantly predicts an increase in credit booms although the evidence on the impact of gross FDI and FPI inflows is somewhat mixed. So far, the literature has shown that increasing leverage in the financial system and overvalued currencies are the best predictors of financial crisis (Schularick and Taylor, 2012; Gourinchas and Obstfeld, 2012). Moreover, our findings suggest that surges in capital flows (especially, rising cross-border banking flows) are also a good indicator of future financial turmoil.

References
Barajas, A., G. Dell’Ariccia, and A. Levchenko, 2009.  “Credit Booms: the Good, the Bad, and the Ugly.” Washington, DC: IMF, manuscript
Calderón, C., and M. Kubota, 2012. “Gross Inflows Gone Wild: Gross Capital inflows, Credit Booms and Crises.” The World Bank Policy Research Working Paper 6270, December.
Calderón, C., and M. Kubota, 2012. “Sudden stops: Are global and local investors alike?” Journal of International Economics 89(1), 122-142
Calderón, C., and L. Servén, 2011. “Macro-Prudential Policies over the Cycle in Latin America.” Washington, DC: The World Bank, manuscript
Forbes, K.J., and F.E. Warnock, 2012. “Capital Flow Waves: Surges, Stops, Flight, and Retrenchment.” Journal of International Economics 88(2), 235-251
Gourinchas, P.O., and M. Obstfeld, 2012. “Stories of the Twentieth Century for the Twenty-First.” American Economic Journal: Macroeconomics 4(1), 226-265
Gourinchas, P.O., R. Valdes, and O. Landerretche, 2001. “Lending Booms: Latin America and the World.” Economia, Spring Issue, 47-99.
Mendoza, E.G., and M.E. Terrones, 2008. “An anatomy of credit booms: Evidence from macro aggregates and micro data.” NBER Working Paper 14049, May
Mendoza, E.G. and M.E. Terrones, 2012. “An Anatomy of Credit Booms and their Demise,” NBER Working Paper 18379, September.
Reinhart, C.M., and V. Reinhart, 2009. “Capital Flow Bonanzas: An Encompassing View of the Past and Present.” In: Frankel, J.A., and C. Pissarides, Eds., NBER International Seminar on Macroeconomics 2008. Chicago, IL: University of Chicago Press for NBER, pp. 9-62
Rothenberg, A., Warnock, F., 2011. “Sudden flight and true sudden stops.” Review of International Economics 19(3), 509-524.
Schularick, M., and A.M. Taylor, 2012. “Credit Booms Gone Bust: Monetary Policy, Leverage Cycles, and Financial Crises, 1870–2008.” American Economic Review 102(2), 1029–1061

______________________
1 Read Working Paper.
2 Rothenberg and Warnock (2011), Forbes and Warnock (2012) and Calderón and Kubota (2012) already provide a more accurate analysis of extreme movement in (net and gross) capital flows using quarterly data.
3 The “two-way capital flows” phenomena cannot be identified using net inflows.

Sunday, December 30, 2012

From "Weiwei-isms." By Ai Weiwei

Selection from "Weiwei-isms," by Ai Weiwei. Edited by Larry Warsh. Princeton University Press, 152 pp, ISBN-13: 978-0691157665

Living in a system under the communist ideology, an artist cannot avoid fighting for freedom of expression. You always have to be aware that art is not only a self-expression but a demonstration of human rights and dignity. To express yourself freely, a right as personal as it is, has always been difficult, given the political situation.—NY Arts, March-April 2008

Tips on surviving the regime: Respect yourself and speak for others. Do one small thing every day to prove the existence of justice.—Twitter, Aug. 6, 2009

Choices after waking up: To be true or to lie? To take action or be brainwashed? To be free or be jailed? —Twitter, Sept. 4, 2009

No outdoor sports can be more elegant than throwing stones at autocracy; no melees can be more exciting than those in cyberspace. —Twitter, March 10, 2010

Nothing can silence me as long as I am alive. I don't give any kind of excuse. If I cannot come out [of China] or I cannot go in [to China] this is not going to change my belief. But when I am there, I am in this condition: I see it, I see people who need help. Then you know, I just want to offer my possibility to help them.—The Paley Center for Media, March 15, 2010

The officials want China to be seen as a cultured, creative nation, but in this anti-liberal political society everything outside the direct control of the state is seen as a potential threat.—CNBC.com, May 12, 2010

During my detention, they kept asking me: Ai Weiwei, what is the reason you have become like this today? My answer is: First, I refuse to forget. My parents, my family, their whole generation and my generation all paid a great deal in the struggle for freedom of speech. Many people died just because of one sentence or even one word. Somebody has to take responsibility for that. —Der Spiegel, Nov. 21, 2011

In a society like this there is no negotiation, no discussion, except to tell you that power can crush you any time they want—not only you, your whole family and all people like you.—Financial Times, Feb. 24, 2012

China might seem quite successful in its controls, but it has only raised the water level. It's like building a dam: It thinks there is more water so it will build higher. But every drop of water is still in there. It doesn't understand how to let the pressure out. It builds up a way to maintain control and push the problem to the next generation. —Guardian, April 15, 2012

I will never leave China, unless I am forced to. Because China is mine. I will not leave something that belongs to me in the hands of people I do not trust.—Reuters, May 29, 2012

Thursday, December 27, 2012

Brookings: The Exaggerated Death of the Middle Class

The Exaggerated Death of the Middle Class. By Ron Haskins and Scott Winship
Brookings, December 11, 2012
http://www.brookings.edu/research/opinions/2012/12/11-middle-class-haskins-winship?cid=em_es122712

Excerpts:

The most easily obtained income figures are not the most appropriate ones for assessing changes in living standards; those are also the figures that are often used to reach unwarranted conclusions about “middle class decline.” For example, analysts and pundits often rely on data that do not include all sources of income. Consider data on comprehensive income assembled by Cornell University economist Richard Burkhauser and his colleagues for the period between 1979—the year it supposedly all went wrong for working Americans—and 2007, before the Great Recession.

When Burkhauser looked at market income as reported to the Internal Revenue Service (IRS), the basis for the top 1 percent inequality figures that inspired Occupy Wall Street, he found that incomes for the bottom 60 percent of tax filers stagnated or declined over the nearly three-decade period. Incomes in the middle fifth of tax returns grew by only 2 percent on average, and those in the bottom fifth declined by 33 percent.

Things appeared somewhat better when Burkhauser looked at the definition of income favored by the Census Bureau which, unlike IRS figures, includes government cash payments from programs like Social Security and welfare, and looks at households rather than tax returns.

Still, the income of the middle fifth only rose by 15 percent over the entire three decades, much less than 1 percent per year. The Census Bureau reports that from 2000 to 2010, the income of the middle fifth actually fell by 8 percent. With numbers like these, it’s understandable why so many people think the American middle class is under threat and in decline.
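
A quick arithmetic check of the "much less than 1 percent per year" characterization, assuming the same 1979 to 2007 window as the Burkhauser comparison above:

    # Annualizing a 15% cumulative rise over the 28 years from 1979 to 2007.
    cumulative_growth = 0.15
    years = 28
    annual = (1 + cumulative_growth) ** (1 / years) - 1
    print(round(annual * 100, 2))    # about 0.5 percent per year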

But there are three reasons why even the Census Bureau figures are deceiving. First, the size of U.S. households, which has been declining, is not taken into account. Second, the figures ignore the net impact on income of government taxes and non-cash transfers like food stamps and health insurance, which benefit the poor and middle class much more than richer households. Third, the value of health insurance provided by employers is also left out.

Burkhauser and his colleagues show that if these factors are taken into account, the incomes of the bottom fifth of households actually increased by 26 percent, rather than declining by 33 percent. Those of the middle fifth increased by 37 percent, rather than by only 2 percent. There is no disappearing middle class in these data; nor can household income, even at the bottom, be characterized as stagnant, let alone declining. Even after 2000, estimates from the Congressional Budget Office (CBO) show the bottom 60 percent of households got 10 percent richer by 2009, the most recent year available.


Making sense of income trends
Aside from the brighter picture presented by the Burkhauser and CBO analyses, there are more complicated trends emerging in the United States. Four factors, both inside and outside the market, explain those trends.

The first market factor affecting middle-class income is a longtime trend of low literacy and math achievement in U.S. schools, which partially explains why conventional analyses of income show stagnation and decline. Young Americans entering the job market need skills valuable in a modern economy if they expect to earn a decent wage. Education and technical training are key to acquiring these skills. Yet the achievement test scores of children in literacy and math have been stagnant for more than two decades and are consistently far down the list in international comparisons.

It is true that African American and Hispanic students have closed part of the gap between themselves and Caucasian and Asian students; but the gap between students from economically advantaged families and students from disadvantaged ones has widened substantially—by 30 to 40 percent over the past 25 years.1

In a nation committed to educational equality and economic mobility, the income gap in achievement test scores is deeply problematic. Far from increasing educational equality as an important route to boosting economic opportunity, the American educational system reinforces the advantages that students from middle-class families bring with them to the classroom. Thus, the nation has two education problems that are limiting the income of workers at both the bottom and middle of the distribution: the average student is not learning enough, compared with students from other nations, and students from poor families are falling further and further behind.

It is difficult to see how students with a poor quality of education will be able to support a family comfortably in our technologically advanced economy if they rely exclusively on their earnings.

The second market factor is the increasing share of our economy devoted to health care. According to the Kaiser Foundation, employer-sponsored health insurance premiums for families increased 113 percent between 2001 and 2011. Most economists would say that this money comes directly out of worker wages. In other words, if it weren’t for the remarkable increase in the cost of health care, workers’ wages would be higher. When the portion of market compensation received in the form of health insurance is ignored in conventional analyses, income gains over time are understated.

Turning to non-market factors, marriage and childbearing increasingly distinguish the haves and have-nots.

Families have fewer children, and more U.S. adults are living alone today than in the past. As a result, households on average are better off since there are fewer mouths to feed, regardless of income. At the same time, single parenthood has grown more common, thereby increasing inequality between the poor and the middle class. Female-headed families are more than four times as likely to be in poverty, and children from these families are more likely to have trouble in school as compared with children in married-couple families. The increasing tendency of similarly educated men and women to marry each other also contributes to rising inequality.

The most important non-market factor is the net impact of government taxes and transfer payments on household income. The budget of the U.S. government for 2012 is $3.6 trillion. About 65 percent of that amount is spent on transfer payments to individuals. The biggest transfer payments are: $770 billion for Social Security, $560 billion for Medicare, $262 billion for Medicaid, and nearly $100 billion for nutrition programs. In addition to these federal expenditures, state governments also spend tens of billions of dollars on programs for low-income households. Almost all of the over $1 trillion in state and federal spending on means-tested programs (those that provide benefits only to people below some income cutoff) goes to low-income households.

Thus, taking into account the progressive nature of Social Security and Medicare benefits, the effect of government expenditures is to greatly increase household income at the bottom and reduce economic inequality.

Similarly, federal taxation—and to a lesser extent state taxation—is progressive. Americans in the bottom 40 percent of the income distribution pay negative federal income taxes because the Earned Income Tax Credit and the Child Tax Credit actually pay cash to millions of low-income families with children.

IRS data on incomes incorporate only the small fraction of transfer income that is taxable. Census data includes all cash transfer payments but leaves out non-cash transfers—among which Medicaid and Medicare benefits are the most important—and taxes.

The bottom line is that market income has grown, and government programs have greatly increased the well-being of low-income and middle-class households. The middle class is not shrinking or becoming impoverished. Rather, changes in workers’ skills and employers’ demand for them, along with changes in families’ size and makeup, have caused the incomes of the well-off to climb much faster than the incomes of most Americans.

Rising inequality can occur even as everyone experiences improvement in living standards.

Even so, unless the nation’s education system improves, especially for children from poor families, millions of working Americans will continue to rely on government transfer payments. This signals a real problem. Millions of individuals and families at the bottom and in the middle of the income distribution are dependent on government to enjoy a decent or rising standard of living. While the U.S. middle class may not be shrinking, the trends outlined above make clear why this is no reason for complacency. Today’s form of widespread dependency on government benefits has helped stem a decline in income, but far better would be to have more people earning all or nearly all their income through work. Getting there, though, will require deeper reforms in the structure of the U.S. education system.

---
1 Sean F. Reardon, Whither Opportunity? Rising Inequality and the Uncertain Life Chances of Low-Income Children (New York: Russell Sage Foundation Press, 2011).

Tuesday, December 25, 2012

Smarter Ways to Discipline Children

Smarter Ways to Discipline Children. By Andrea Petersen
Research Suggests Which Strategies Really Get Children to Behave; How Timeouts Can Work Better. WSJ, December 24, 2012
http://online.wsj.com/article/SB10001424127887323277504578189680452680490.html

When it comes to disciplining her generally well-behaved kids, Heather Henderson has tried all the popular tricks. She's tried taking toys away. (Her boys, ages 4 and 6, never miss them.) She's tried calm explanations about why a particular behavior—like hitting your brother—is wrong. (It doesn't seem to sink in.) And she's tried timeouts. "The older one will scream and yell and bang on walls. He just loses it," says the 41-year-old stay-at-home mother in Syracuse, N.Y.

What can be more effective are techniques that psychologists often use with the most difficult kids, including children with attention deficit hyperactivity disorder and oppositional defiant disorder. Approaches with names like "parent management training" and "parent-child interaction therapy" are backed by hundreds of research studies, and they work on typical kids, too. But while some of their components find their way into popular advice books, the tactics remain little known among the general public.

The general strategy is this: Instead of just focusing on what happens when a child acts out, parents should first decide what behaviors they want to see in their kids (cleaning their room, getting ready for school on time, playing nicely with a sibling). Then they praise those behaviors when they see them. "You start praising them and it increases the frequency of good behavior," says Timothy Verduin, clinical assistant professor of child and adolescent psychiatry at the Child Study Center at NYU Langone Medical Center in New York.

This sounds simple, but in real life can be tough. People's brains have a "negativity bias," says Alan E. Kazdin, a professor of psychology and child psychiatry at Yale University and director of the Yale Parenting Center. We pay more attention to when kids misbehave than when they act like angels. Dr. Kazdin recommends at least three or four instances of praise for good behavior for every timeout a kid gets. For young children, praise needs to be effusive and include a hug or some other physical affection, he says.

According to parent management training, when a child does mess up, parents should use mild negative consequences (a short timeout or a verbal reprimand without shouting).

Giving a child consequences runs counter to some popular advice that parents should only praise their kids. But reprimands and negative nonverbal responses like stern looks, timeouts and taking away privileges led to greater compliance by kids, according to a review article published this month in the journal Clinical Child and Family Psychology Review.

"There's a lot of fear around punishment out there," says Daniela J. Owen, a clinical psychologist at the San Francisco Bay area Center for Cognitive Therapy in Oakland, Calif. and the lead author of the study. "Children benefit from boundaries and limits." The study found that praise and positive nonverbal responses like hugs and rewards like ice cream or stickers, however, didn't lead to greater compliance in the short term. "If your child is cleaning up and he puts a block in the box and you say 'great job,' it doesn't mean the child is likely to put another block in the box," says Dr. Owen.

But in the long run, regular praise does make a child more likely to comply, possibly because the consistent praise strengthens the parent-child relationship overall, Dr. Owen says. The article reviewed 41 studies looking at discipline strategies and child compliance.

Parents who look for discipline guidance often find conflicting advice from the avalanche of books and mommy blogs and the growing number of so-called parent coaches. (In 2011, 3,520 parenting books were published or distributed in the U.S., up from 2,774 in 2007, according to Bowker Books In Print database.)

"Many of the things that are recommended we know now to be wrong," says Dr. Kazdin, a leading expert on parent management training. "It is the equivalent of telling people to smoke a lot for their health."

Parents often torpedo their discipline efforts by giving vague, conditional commands and not giving kids enough time to comply with them, says Dr. Verduin, who practices parent-child interaction therapy. When crossing the street, "A bad command would be, 'be careful.' A good command would be 'hold my hand,' " he says. He also instructs parents to count to five to themselves after giving a child a directive, like, for example, "Put on your coat." "Most parents wait a second or two," he says, before making another command, which can easily devolve into yelling and threats.

The techniques are applicable to all ages, but psychologists note that starting early is better. Once kids hit about 10 or 11, discipline gets a lot harder. "Parents don't have as much leverage" with tweens and teens, says Dr. Verduin. "Kids don't care as much what the parents think about them."

Some parents try to reason with young children, which Dr. Kazdin says is bound to fail to change a kid's behavior. Reason doesn't change behavior, which is why stop-smoking messages don't usually work, Dr. Kazdin says. Overly harsh punishments also fail. "One of the side effects of punishment is noncompliance and aggression," he says.

Spanking, in particular, has been linked to aggressive behavior in kids and anger problems and increased marital conflict later on in adulthood. Still, 26% of parents "often" or "sometimes" spank their 19-to-35-month-old children, according to a 2004 study in the journal Pediatrics, which analyzed survey data collected by the federal government from 2,068 parents of young children.

At the Yale Parenting Center, psychologists have found that getting kids to "practice" temper tantrums can lessen their frequency and intensity. Dr. Kazdin recommends that parents have their kids "practice" once or twice a day. Gradually, ask the child to delete certain unwanted behaviors from the tantrum, like kicking or screaming. Then effusively praise those diluted tantrums. Soon, for most children, "the real tantrums start to change," he says. "From one to three weeks, they are kind of over." As for whining, Dr. Kazdin recommends whining right along with your child. "It changes the stimulus. You will likely end up laughing," he says.

Researchers noted that not every technique is effective for every child. Some parents find other creative solutions that work for their kids.

Karen Pesapane has found that yelling "pillow fight" when her two kids are arguing can put a halt to the bickering. "Their sour attitudes change almost immediately into silliness, and I inevitably become their favorite target," said Ms. Pesapane, a 34-year-old from Silver Spring, Md., who works in fundraising for a nonprofit and has a daughter, 10, and a son, 6.

Dayna Even has found that spending one hour a day fully focused on her 6-year-old son, Maximilian, means "he's less likely to act out, he's more likely to play independently and less likely to interrupt adults," says the 51-year-old writer and tutor in Kailua, Hawaii.

Parents need to take a child's age into account. Benjamin Siegel, professor of pediatrics at the Boston University School of Medicine, notes that it isn't until about age 3 that children can really start to understand and follow rules. Dr. Siegel is the chair of the American Academy of Pediatrics' committee that is currently reworking the organization's guidelines on discipline, last updated in 1998.

Monday, December 24, 2012

A case study in the dangers of the Law of the Sea Treaty

Lawless at Sea. WSJ Editorial
A case study in the dangers of the Law of the Sea Treaty.
The Wall Street Journal, December 24, 2012, on page A12
http://online.wsj.com/article/SB10001424127887324407504578187523862827016.html

The curious case of the U.S. hedge fund, the Argentine ship and Ghana is getting curiouser, and now it has taken a turn against national sovereignty. That's the only reasonable conclusion after a bizarre ruling this month from the International Tribunal for the Law of the Sea in Hamburg.

The tribunal—who knew it existed?—ordered the Republic of Ghana to overrule a decision of its own judiciary that had enforced a U.S. court judgment. The Hamburg court is the misbegotten child of the 1982 United Nations Convention on the Law of the Sea. Sold as a treaty to ensure the free movement of people and goods on the high seas, it was rejected by Ronald Reagan as an effort to control and redistribute the resources of the world's oceans.

The U.S. never has ratified the treaty, despite a push by President Obama, and now the solons of Hamburg have demonstrated the wisdom of that decision. While debates on the treaty have centered around the powers a country might enjoy hundreds of miles off its coast, many analysts have simply assumed that nations would still exercise control over the waters just offshore.

Now the Hamburg court has trampled local law in a case involving a ship sitting in port, and every country is on notice that a Hamburg court is claiming authority over its internal waters.

Specifically, Hamburg ordered Ghana to release a sailing ship owned by the Argentine navy. On October 2, a subsidiary of U.S. investment fund Elliott Management persuaded a Ghanaian judge to order the seizure of the vessel. The old-fashioned schooner, used to train cadets, was on a tour of West Africa.

U.S. hedge funds don't normally seize naval ships, but in this case Elliott and the Ghanaian court are on solid ground. Elliott owns Argentine bonds on which Buenos Aires has been refusing to pay since its 2001 default. Elliott argues that a contract is a contract, and a federal court in New York agrees. Argentina had freely decided to issue its debt in U.S. capital markets and had agreed in its bond contracts to waive the sovereign immunity that would normally prevent lenders from seizing things like three-masted frigates.

To his credit, Judge Richard Adjei-Frimpong of Ghana's commercial court noted that Argentina had specifically waived its immunity when borrowing the money and that under Ghanaian law the ship could therefore be attached by creditors with a valid U.S. judgment registered in Ghana. He ordered the ship held at port until Buenos Aires starts following the orders of the U.S. court.

But in its recent ruling, which ordered Ghana to release the ship by December 22, the Hamburg court claimed that international law requires immunity for the Argentine "warship," as if Argentina never waived immunity and as if this is an actual warship. On Wednesday, Ghana released the vessel, and the ship set sail from the port of Tema for its trans-Atlantic voyage.

So here we have a case in which a small African nation admirably tried to adhere to the rule of law. Yet it was bullied by a global tribunal serving the ends of Argentina, which has brazenly violated the law in refusing to pay its debts and defying Ghana's court order. The next time the Senate moves to ratify the Law of the Sea Treaty, Ghana should be exhibit A for opponents.

Saturday, December 22, 2012

Novel Drug Approvals Strong in 2012

Novel Drug Approvals Strong in 2012
Dec 21, 2012
http://www.innovation.org/index.cfm/NewsCenter/Newsletters?NID=208

Over the past year, biopharmaceutical researchers' work has continued to yield innovative treatments to improve the lives of patients. In fiscal year (FY) 2012 (October 1, 2011 – September 30, 2012), the U.S. Food and Drug Administration (FDA) approved 35 new medicines, keeping pace with the previous fiscal year’s approvals and representing one of the highest levels of FDA approvals in recent years.[i] For the calendar year FDA is on track to approve more new medicines than any year since 2004.[ii]

A recent report from the FDA highlights groundbreaking medicines to treat diseases ranging from the very common to the most rare. Some are the first treatment option available for a condition; others improve care for treatable diseases.

Notable approvals in FY 2012 include:
  • A breakthrough personalized medicine for a rare form of cystic fibrosis;
  • The first approved human cord blood product;
  • A total of ten drugs to treat cancer, including the first treatments for advanced basal cell carcinoma and myelofibrosis and a targeted therapy for HER2-positive metastatic breast cancer;
  • Nine treatments for rare diseases; and
  • Important new therapies for HIV, macular degeneration, and meningitis.
The number of new drugs approved this year reflects the continuing commitment of the biomedical research community – from biopharmaceutical companies to academia to government researchers to patient groups – to advance basic science and translate that knowledge into novel treatments that will advance our understanding of disease and improve patient outcomes.

Building on these noteworthy approvals, we look to the new year where continued innovation is needed to leverage our growing understanding of the underpinnings of human disease and to harness the power of scientific research tools to discover and develop new medicines.

To learn more about the more than 3,200 new medicines in development visit http://www.innovation.org/index.cfm/FutureOfInnovation/NewMedicinesinDevelopment.

Thursday, December 20, 2012

IMF's "European Union: Financial Sector Assessment," Preliminary Conclusions

"European Union: Financial Sector Assessment," Preliminary Conclusions by the IMF Staff
Press Release No. 12/500
Dec 20, 2012
http://www.imf.org/external/np/sec/pr/2012/pr12500.htm

A Financial Sector Assessment Program (FSAP) team led by the Monetary and Capital Markets Department of the International Monetary Fund (IMF) visited the European Union (EU) during November 27–December 13, 2012, to conduct a first-ever overall EU-wide assessment of the soundness and stability of the EU’s financial sector (EU FSAP). The EU FSAP builds on the 2011 European Financial Stability Framework Exercise (EFFE) and on recent national FSAPs in EU member states.

The mission arrived at the following preliminary conclusions, which are subject to review and consultation with European institutions and national authorities:

The EU is facing great challenges, with continuing banking and sovereign debt crises in some parts of the Union. Significant progress has been made in recent months in laying the groundwork for strengthening the EU’s financial sector. Implementation of policy decisions is needed. Although the breadth of the necessary agenda is significant, the details of the agreed frameworks need to be put in place to avoid delays in reaching consensus on key issues.

The present conjuncture makes management of the situation particularly difficult. The crisis reveals that handling financial system problems at the national level has been costly, calling for a Europe-wide approach. Interlinkages among the countries of the EU are particularly pronounced, and the need to provide more certainty on the health of banks has led to proposals for establishing a single supervisory mechanism (SSM) associated with the European Central Bank (ECB), initially for the euro area but potentially more widely in the EU.


The mission’s recommendations include the following:

Steps toward banking union
The December 13 EU Council agreement on the SSM is a strong achievement. It needs to be followed up with a structure that has as few gaps as possible, including with regard to the interaction of the SSM with national authorities under the prospective harmonized resolution and deposit guarantee arrangements. The SSM is only an initial step toward an effective Banking Union—actions toward a single resolution authority with common backstops, a deposit guarantee scheme, and a single rulebook will also be essential.

Reinvigorating the single financial market in Europe
Harmonization of the regulatory structure across Europe needs to be expedited. EU institutions should accelerate passage of the Fourth Capital Requirements Directive, the Capital Requirements Regulation, the directives for harmonizing resolution and deposit insurance, as well as the regulatory regime for insurance (Solvency II), by mid-2013 at the latest, thus enabling the issuance of single rulebooks for banking, insurance, and securities. Moreover, the European Commission should increase the resources and powers of the European Supervisory Authorities as needed to successfully achieve those mandates, while also enhancing their operational independence.

Improved and expanded stress testing
European stress testing needs to go beyond microprudential solvency, and increasingly serve to identify other vulnerabilities, such as liquidity risks and structural weaknesses. Confidence in the results of stress tests can be enhanced by an asset quality review, harmonized definitions of non-performing loans, and standardized loan classification, while maintaining a high level of disclosure. Experience suggests that the benefits of a bold approach outweigh the risks.

Splitting bank and sovereign risk
Measures must be pursued to separate bank and sovereign risk, including by making the ESM operational expeditiously for bank recapitalizations. Strong capital buffers will be important for the banks to perform their intermediating role effectively, to stimulate growth, and so safeguard financial stability.

Effective crisis management framework to minimize costs to taxpayers
Taxpayers’ potential liability following bank failures can be reduced by resolution regimes that include statutory bail-in powers. A common deposit insurance fund, preferably financed ex ante by levies on the banking sector, could also reduce the cost to taxpayers, even if it takes time to build up reserves. Granting preferential rights to depositor guarantee schemes in the creditor hierarchy could also reduce costs, particularly while guarantee funds are being built.

The European Commission and member states should assess the costs and benefits of the various plans for structural measures aimed at reducing banks’ complexity and potential taxpayer liability, with a view towards formulating a coordinated proposal. If adopted, it would be important to ensure that such measures are complementary to the international reform agenda, do not cause distortions in the single market, and do not lead to regulatory arbitrage.

Lastly, the mission would like to extend its thanks to the European institutions for their close cooperation and assistance in completing this FSAP analysis.

Wednesday, December 19, 2012

The future of financial globalisation - BIS Annual Conference

The future of financial globalisation
http://www.bis.org/publ/bppdf/bispap69.htm

The BIS 11th Annual Conference took place in Lucerne, Switzerland on 21-22 June 2012. The event brought together senior representatives of central banks and academic institutions, who exchanged views on the conference theme of "The future of financial globalisation". This volume contains the opening address of Stephen Cecchetti (Economic Adviser, BIS), a keynote address from Amartya Sen (Harvard University), and the available contributions of the policy panel on "Will financial globalisation survive?". The participants in the policy panel discussion, chaired by Jaime Caruana (General Manager, BIS), were Ravi Menon (Monetary Authority of Singapore), Jacob Frenkel (JP Morgan Chase International) and José Dario Uribe Escobar (Banco de la República).

The papers presented at the conference and the discussants' comments are released as BIS Working Papers 397 to 400:

Financial Globalisation and the Crisis, BIS Working Papers No 397
by  Philip R. Lane
Comments by Dani Rodrik

The global financial crisis provides an important testing ground for the financial globalisation model. We ask three questions. First, did financial globalisation materially contribute to the origination of the global financial crisis? Second, once the crisis occurred, how did financial globalisation affect the incidence and propagation of the crisis across different countries? Third, how has financial globalisation affected the management of the crisis at national and international levels?


The great leveraging, BIS Working Papers No 398
by  Alan M. Taylor
Comments by Barry Eichengreen and Dr. Y V Reddy 

What can history tell us about the relationship between the banking system, financial crises, the global economy, and economic performance? Evidence shows that in the advanced economies we live in a world that is more financialized than ever before, as measured by the importance of credit in the economy. I term this long-run evolution "The Great Leveraging" and present a ten-point examination of its main contours and implications.


Global safe assets, BIS Working Papers No 399
by Pierre-Olivier Gourinchas and Olivier Jeanne
Comments by Peter R Fisher and Fabrizio Saccomanni

Will the world run out of 'safe assets' and what would be the consequences on global financial stability? We argue that in a world with competing private stores of value, the global economic system tends to favor the riskiest ones. Privately produced stores of value cannot provide sufficient insurance against global shocks. Only public safe assets may, if appropriately supported by monetary policy. We draw some implications for the global financial system.


Capital Flows and the Risk-Taking Channel of Monetary Policy, BIS Working Papers No 400
by Valentina Bruno and Hyun Song Shin
Comments by Lars E O Svensson and John B Taylor

This paper examines the relationship between the low interest rates maintained by advanced economy central banks and credit booms in emerging economies. In a model with cross-border banking, low funding rates increase credit supply, but the initial shock is amplified through the "risk-taking channel" of monetary policy, where greater risk-taking interacts with dampened measured risks that are driven by currency appreciation to create a feedback loop. In an empirical investigation using VAR analysis, we find that expectations of lower short-term rates dampen measured risks and stimulate cross-border banking sector capital flows.
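For readers unfamiliar with the method, the sketch below shows the general shape of such a VAR exercise using the statsmodels library on synthetic placeholder series; the variable names, data, and lag choice are illustrative assumptions, not the authors' actual dataset or specification.

# Minimal VAR sketch of the kind of empirical exercise described above.
# The three series (expected short-term rates, a measured-risk proxy, and
# cross-border banking flows) are synthetic placeholders, not the paper's data.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
n = 200
data = pd.DataFrame({
    "expected_short_rate": rng.normal(0, 0.1, n).cumsum(),
    "measured_risk":       rng.normal(0, 0.2, n).cumsum(),
    "crossborder_flows":   rng.normal(0, 0.3, n).cumsum(),
})

model = VAR(data)
results = model.fit(maxlags=4, ic="aic")    # lag order chosen by AIC

# Impulse responses trace how a shock to one variable (e.g. a fall in expected
# short-term rates) feeds through to measured risk and cross-border flows;
# irf.plot() would chart them.
irf = results.irf(periods=10)
print(results.summary())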

Tuesday, December 18, 2012

The rise of the older worker

The rise of the older worker, by Jim Hillage, research director
Institute for Employment Studies
December 12, 2012
http://www.employment-studies.co.uk/press/26_12.php

There are more people working in the UK today than at any time in our history. Today's labour market statistics show another increase in the numbers employed, taking the total to 29,600,000, up 40,000 on the previous quarter and 500,000 on a year ago.

Almost half of the rise has been among people aged 50 or over, with the fastest rate of increase occurring among those 65 or over, particularly among older women.

There are now almost a million people aged 65 or over in jobs, double the number ten years ago and up 13 per cent over the past year. Although these older workers comprise only three per cent of the working population, they account for 20 per cent of the recent growth in employment. However, this group has a very different labour market profile to the rest of the working population, particularly younger people, and there is no evidence to suggest older workers are gaining employment at the expense of the younger generation. For example:
  • 30 per cent of older workers (ie aged 65+) work in managerial and professional jobs, compared with only nine per cent of younger workers (aged 16 to 24). Conversely, 34 per cent of young people work in sales, care and leisure jobs, compared with only 14 per cent of their older counterparts.
  • Nearly four in ten older workers are self-employed, compared with five per cent of younger workers.
  • Most (69 per cent) of those aged 65 plus work part-time, compared with 39 per cent of young workers (and 27 per cent of all those in work).
Jim Hillage, Director of Research at the Institute for Employment Studies, explains that:

‘There are a number of reasons why older workers are staying on in work. In some cases employers want to retain their skills and experience and encourage them to stay on, albeit on a part-time basis, and most older employees have been working for their employer for at least ten years and often in smaller workplaces. Conversely, some older people have to stay in work as their pensions are inadequate and it is interesting to note that employment of older workers is highest in London and the South East, where living costs are highest. Finally, there is also a growing group of self-employed who still want to retain their work connections and interests.’

2012 © Institute for Employment Studies


---
Update: Long-Term Jobless Begin to Find Work. By Ben Casselman
The Wall Street Journal, January 11, 2013, on page A2
http://online.wsj.com/article/SB10001424127887323442804578233390580359994.html

The epidemic of long-term unemployment, one of the most pernicious and persistent challenges bedeviling the U.S. economy, is finally showing signs of easing.

The long-term unemployed—those out of work more than six months—made up 39.1% of all job seekers in December, according to the Labor Department, the first time that figure has dropped below 40% in more than three years.

[http://si.wsj.net/public/resources/images/P1-BJ893_ECONOM_NS_20130110190005.jpg]

The problem is far from solved. Nearly 4.8 million Americans have been out of work for more than six months, down from a peak of more than 6.5 million in 2010 but still a level without precedent since World War II.

The recent signs of progress mark a reversal from earlier in the recovery, when long-term unemployment proved resistant to improvement elsewhere in the labor market.

Total unemployment peaked in late 2009 and has dropped relatively steadily since then, while the number of long-term unemployed continued to rise into 2010 and then fell only slowly through much of 2011.

More recently, however, unemployment has fallen more quickly among the long-term jobless than among the broader population. In the past year, the number of long-term unemployed workers has dropped by 830,000, accounting for nearly the entire 843,000-person drop in overall joblessness.

When Michael Leahy lost his job as a manager at a Connecticut bank in 2010, the state had already shed about 10,000 financial-sector jobs in the previous two years and he had difficulty even landing an interview. By the time banks started hiring again, Mr. Leahy, now 59, had been out of work for more than a year and found himself getting passed over for candidates with jobs or ones who had been laid off more recently.

In July, however, Mr. Leahy was accepted into a program for the long-term unemployed run by the Work Place, a local workforce development agency. The program helped Mr. Leahy improve his resume and interviewing skills, and ultimately connected him with a local bank that was hiring.

Mr. Leahy began a new job in December. The chance to work again in his chosen field, he said, was more than worth the roughly 15% pay cut from his previous job.

"The thing that surprised me is this positive feeling I have every day of getting up in the morning and knowing I have a place to go to and a place where people are waiting for me," Mr. Leahy said.

[http://si.wsj.net/public/resources/images/NA-BU523B_ECONO_D_20130110211902.jpg]

The decline in long-term unemployment is good news for the broader economy. Many economists, including Federal Reserve Chairman Ben Bernanke, feared that many long-term unemployed workers would become permanently unemployable, creating a "structural" unemployment problem akin to what Europe suffered in the 1980s. But those fears are beginning to recede along with the ranks of the long-term unemployed.

"I don't think it's the case that the long-term unemployed are no longer employable," said Omair Sharif, an economist for RBS Securities Inc. "In fact, they've been the ones getting the jobs."

Not all the drop in long-term joblessness can be attributed to workers finding positions. In recent years, millions of Americans have given up looking for work, at which point they no longer count as "unemployed" in official statistics.

The recent drop in long-term unemployment, however, doesn't appear to be due to such dropouts. The number of people who aren't in the labor force but say they want a job has risen by only about 400,000 in the past year, while the number of Americans with jobs has risen by 2.4 million. That suggests that much of the improvement, at least, is due to people finding jobs rather than dropping out, Mr. Sharif said.

The average unemployed worker has now been looking for 38 weeks, down from a peak of nearly 41 weeks and the lowest level since early 2011.

The long-term unemployed still face grim odds of finding work. About 10% of long-term job seekers found work in April, the most recent month for which a detailed breakdown is available, compared with about a quarter of more recently laid-off workers. The ranks of the short-term jobless are more quickly refreshed by newly laid-off workers, however. As a result, the total number of short-term unemployed has fallen more slowly in recent months, even though individual workers still stand a far better chance of finding work early in their search.

And when the long-term unemployed do find work, their new jobs generally pay less than their old ones—often much less. A recent study from economists at Boston University, Columbia University and the Institute for Employment Research found that every additional year out of work reduces workers' wages by 11% when they do find a job.

Moreover, the recent gains have yet to reach the longest of the long-term unemployed: While the number of people unemployed for between six months and two years has fallen by 12% in the past year, the ranks of those jobless for three years or longer have barely budged.

Patricia Soprych, a 51-year-old widow in Skokie, Ill., recently got a job as a grocery-store cashier after more than a year of looking for work. But the job is part-time and pays the minimum wage, which she finds barely enough to make ends meet.

"You say the job market's getting better. Yeah, for these $8.25-an-hour jobs," Ms. Soprych said.

Economists cite several reasons for the drop in long-term unemployment. Most significant is the gradual healing of the broader labor market, which has seen the unemployment rate drop to 7.8% in December from a high of 10% in 2009. After initially benefiting mostly the more recently laid-off, that progress is now being felt among the longer-term jobless as well.

The gradual strengthening in the housing market could lead to more improvement. Many of the long-term unemployed are former construction workers who lost jobs when the housing bubble burst. Rising home building has yet to lead to a surge in construction employment, but many experts expect hiring to pick up in 2013.

Another possible factor behind the recent progress: the gradual reduction in emergency unemployment benefits available to laid-off workers. During the recession, Congress extended unemployment benefits to as long as 99 weeks in some states. Today, benefits last 73 weeks at most, and less time in many states. Research suggests that unemployment payments lead some recipients not to look as hard for jobs, and the loss of benefits may have pushed some job seekers to accept work they might otherwise have rejected, said Gary Burtless, an economist at the Brookings Institution. 

Monday, December 17, 2012

Optimal Oil Production and the World Supply of Oil

Optimal Oil Production and the World Supply of Oil. By Nikolay Aleksandrov, Raphael Espinoza, and Lajos Gyurko
IMF Working Paper No. 12/294
Dec 2012
http://www.imf.org/external/pubs/cat/longres.aspx?sk=40169.0

Summary: We study the optimal oil extraction strategy and the value of an oil field using a multiple real option approach. The numerical method is flexible enough to solve a model with several state variables, to discuss the effect of risk aversion, and to take into account uncertainty in the size of reserves. Optimal extraction in the baseline model is found to be volatile. If the oil producer is risk averse, production is more stable, but spare capacity is much higher than what is typically observed. We show that decisions are very sensitive to expectations on the equilibrium oil price, using a mean-reverting model of the oil price in which the equilibrium price is also a random variable. Oil production was cut during the 2008–2009 crisis, and we find that the cut in production was larger for OPEC, for countries facing a lower discount rate, as predicted by the model, and for countries whose governments’ finances are less dependent on oil revenues. However, the net present value of a country’s oil reserves would increase significantly (by 100 percent, in the most extreme case) if production were cut completely when prices fall below the country's threshold price. If several producers were to adopt such strategies, world oil prices would be higher but more stable.

Excerpts:

In this paper we investigate the optimal oil extraction strategy of a small oil producer facing uncertain oil prices. We use a multiple real option approach. Extracting a barrel of oil is similar to exercising a call option, i.e. oil production can be modeled as the right to produce a barrel of oil with the payoff of the strategy depending on uncertain oil prices. Production is optimal if the payoff of extracting oil exceeds the value of leaving oil under the ground for later extraction (the continuation value). For an oil producer, the optimal extraction path corresponds to the optimal strategy of an investor holding a multiple real option with finite number of exercises (finite reserves of oil). At any single point in time, the oil producer is also limited in the number of options he can exercise, because of capacity constraints.

Our first contribution is to present the solution to the stochastic optimization problem as an exercise rule for a multiple real option and to solve the problem numerically using the Monte Carlo methods developed by Longstaff and Schwartz (2001), Rogers (2002), and extended by Aleksandrov and Hambly (2010), Bender (2011), and Gyurko, Hambly and Witte (2011). The Monte Carlo regression method is flexible, and it remains accurate even for high-dimensional problems, i.e. when there are several state variables, for instance when the oil price process is driven by two state variables, when extraction costs are stochastic, or when the size of reserves is a random variable.
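As a flavor of the technique, the sketch below shows the basic Longstaff-Schwartz least-squares Monte Carlo step the authors build on, reduced to a single exercise decision (the option to defer extracting one barrel under a mean-reverting price). The parameters are illustrative assumptions, and the paper's multiple-exercise and capacity-constraint features are omitted.

import numpy as np

rng = np.random.default_rng(42)

# Illustrative parameters (assumptions, not the paper's calibration).
S0, S_bar = 60.0, 70.0        # current and long-run ("equilibrium") oil price, $/bbl
kappa, sigma = 0.3, 0.25      # mean-reversion speed and volatility (annualized)
cost = 40.0                   # extraction cost per barrel
r = 0.05                      # discount rate
T, n_steps, n_paths = 5.0, 60, 20_000
dt = T / n_steps
disc = np.exp(-r * dt)

# 1. Simulate mean-reverting oil-price paths.
S = np.empty((n_steps + 1, n_paths))
S[0] = S0
for t in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), n_paths)
    S[t + 1] = S[t] + kappa * (S_bar - S[t]) * dt + sigma * S[t] * dW

# 2. Backward induction with least-squares regression (Longstaff-Schwartz):
#    extract now if the immediate payoff exceeds the estimated continuation
#    value of leaving the barrel in the ground.
payoff = np.maximum(S - cost, 0.0)
cash = payoff[-1].copy()                    # cashflow if never exercised before T
for t in range(n_steps - 1, 0, -1):
    cash *= disc                            # discount future cashflow back one step
    itm = payoff[t] > 0                     # regress on in-the-money paths only
    if itm.sum() > 10:
        X = np.vander(S[t, itm], 3)         # quadratic polynomial basis in the price
        beta, *_ = np.linalg.lstsq(X, cash[itm], rcond=None)
        exercise = payoff[t, itm] > X @ beta
        idx = np.flatnonzero(itm)[exercise]
        cash[idx] = payoff[t, idx]          # exercise now: take the immediate payoff

option_value = disc * cash.mean()           # discount from the first step back to t=0
print(f"Option value of the deferrable barrel: ${option_value:.2f}")
print(f"Payoff from extracting immediately:    ${max(S0 - cost, 0.0):.2f}")

In the paper's multiple-exercise setting, the same comparison of immediate payoff against an estimated continuation value is made period by period, with the continuation value also depending on remaining reserves and the capacity constraint.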

We solve the real option problem for a small producer (with reserves of 12 billion barrels) and for a large producer (with reserves of 100 billion barrels) and compute the threshold below which it is optimal to defer production. In our baseline model, we find that the small producer should only produce when prices are high (higher than US$73 per barrel at 2000 constant prices), whereas for the large producer, full production is optimal as soon as prices exceed US$39. Optimal production is found to be volatile given the stochastic process of oil prices. As a result, we show that the net present value of oil reserves would be substantially higher if countries were willing to vary production when oil prices change. This result has important implications for oil production policy and for the design of macroeconomic policies that depend on inter-temporal and inter-generational equity considerations. It also implies that the world supply curve would be very elastic to prices if all countries were optimizing production as in the baseline model—and as a result, prices would tend to be higher but much less volatile.

We investigate why observed production is not as volatile as what is predicted by the baseline calibration of the model. One possible explanation is that producers are risk averse. Under this assumption, production is accelerated and is more stable, but a risk averse producer should also maintain large spare capacity, a result at odds with the evidence that oil producers almost always produce at full capacity. A second potential explanation is that producers are uncertain about the actual size of their oil reserves. Using panel data on recoverable reserves, we show however that, historically, this uncertainty has been diminishing with time and therefore this explanation is incomplete, since even mature oil exporters maintain low spare capacity. A third explanation may be that the oil price process, and in particular the equilibrium oil price, is unknown to the decision makers. Indeed, the optimal reaction to an increase in oil prices depends on whether the price increase is perceived to be temporary or to reflect a permanent shift in prices. If shocks are known to be primarily temporary, production should increase in the face of oil price increases. But if shocks are thought to be accompanied by movements in the equilibrium price, the continuation value jumps at the same time as the immediate payoff from extracting oil. In that case an increase in price may not result in an increase in production. Faced with uncertain views on the optimal strategy, the safe decision might well be to remain prudent with changes in production.

In practice, world oil production is partially cut in the face of negative demand shocks. The last section of the paper investigates whether the reduction in oil production during the 2008–2009 crisis can be explained by the determinants predicted by the model. We find that the cut in production was larger for OPEC, for countries facing a lower discount rate, as predicted by the model, and for countries with government finances less dependent on oil revenues.