Wednesday, February 8, 2012

The Global Macroeconomic Costs of Raising Bank Capital Adequacy Requirements

The Global Macroeconomic Costs of Raising Bank Capital Adequacy Requirements. By Scott Roger & Francis Vitek
IMF Working Paper No. 12/44
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25716.0

Summary: This paper examines the transitional macroeconomic costs of a synchronized global increase in bank capital adequacy requirements under Basel III, as well as a capital increase covering globally systemically important banks. The analysis, using an estimated multi-country model, contributed to the work of the Macroeconomic Assessment Group, especially in estimating the potential international spillovers associated with a global increase in capital requirements. The magnitude of the effects found in this analysis is relatively modest, especially if monetary policies have scope to ease in response to a widening of interest rate spreads by banks.

Excerpts:

Introduction

1. This paper analyzes the transitional macroeconomic costs of strengthening bank capital adequacy requirements, including a general increase in capital requirements as well as an increase specifically for globally systemically important banks (GSIBs). In addition to estimating the impact of introducing higher capital requirements in each of 15 major economies, the analysis also includes estimates of the international spillover effects associated with the simultaneous introduction of higher capital requirements by all 15 countries. The simulations are generated within the framework of an extended and refined version of the multi-country macroeconometric model of the world economy developed and estimated by Vitek (2009).

2. This analysis contributed to the work of the Macroeconomic Assessment Group (MAG), chaired by the Bank for International Settlements (BIS), and the Long-term Economic Impact (LEI) group of the Basel Committee on Banking Supervision (BCBS). The MAG participants, including the IMF, used a variety of models to estimate the medium-term macroeconomic costs of strengthening capital and liquidity requirements. The analysis presented in this paper, reflecting the MAG mandate, focuses solely on the short-term to medium-term output costs of the proposed new regulatory measures. Estimates of the net benefits of these regulatory measures can be found in the LEI report (BCBS 2010).

3. The macroeconomic effects of an increase in capital adequacy requirements are assumed in this analysis to be transmitted exclusively via increases in the spread between commercial bank lending rates and the central bank policy rate. We estimate that, in the absence of any monetary policy response, a permanent synchronized global increase of 1 percentage point in capital requirements for all banks would cause a peak reduction in GDP of around 0.5 percentage points, of which around 0.1 percentage points would result from international spillovers. Losses in emerging market economies are found to be somewhat higher than in advanced economies. If monetary policy is able to respond, however, the adverse impact of higher capital requirements could be largely offset.

4. With regard to strengthening capital requirements specifically for GSIBs, we estimate that a 1 percentage point increase in capital requirements for the top 30 GSIBs would cause a median peak reduction in GDP of around 0.17 percentage points, of which 0.04 percentage points, or about 25 percent, would result from international spillovers. These aggregate figures conceal a wide range of outcomes, however, and for some countries international spillovers would be the main source of macroeconomic effects.
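
As a quick consistency check, the spillover shares implied by these estimates can be computed directly. The short Python sketch below uses only the figures reported in paragraphs 3 and 4; nothing in it comes from the underlying model.

    # Spillover shares implied by the estimates quoted above:
    # all-banks scenario: peak GDP loss ~0.5 pp, of which ~0.1 pp from spillovers;
    # GSIB scenario: median peak GDP loss ~0.17 pp, of which ~0.04 pp from spillovers.
    scenarios = {
        "all banks, +1 pp capital": (0.5, 0.1),
        "top 30 GSIBs, +1 pp capital": (0.17, 0.04),
    }
    for name, (total_loss, spillover) in scenarios.items():
        print(f"{name}: spillover share of peak GDP impact ~ {spillover / total_loss:.0%}")
    # Prints roughly 20% and 24%, consistent with the 20-25 percent range cited
    # in the paper's concluding comments.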

5. It is important to bear in mind the limitations of the model and assumptions used in the analysis. In particular, the analysis does not take account of other possible responses by banks or other financial institutions to changes in capital requirements, or non-linearities in the response of financial systems, monetary policy, or the real economy. Nor does the model allow for changes in the macroeconomic steady state associated with very persistent widening of lending spreads. Additionally, the analysis does not take account of the different initial starting points of different countries in raising capital requirements, or differences in the speed of implementation. 


Concluding comments and caveats

28. The multi-country macroeconomic model used in this analysis contributed importantly to the MAG assessments of the potential impact over the medium term of a global increase in capital requirements, both for all banks and for a smaller group of GSIBs. The results of the multi-country analysis indicate that international spillovers associated with coordinated policy measures are important—our analysis suggests that spillovers typically account for 20-25 percent of the total impact on output. Moreover, in the case of an increase in capital requirements for GSIBs, international spillovers may be the primary source of macroeconomic effects.

29. At the same time, it is important to recognize the limitations associated both with the model and with the exercise in which it was used. With regard to the model, the main limitations to emphasize are that:
* As discussed earlier, the model is not geared to dealing with changes in the steady state associated with permanent or very persistent shocks. Although the quantitative significance of this does not appear to be large in the context of this exercise, it suggests that the estimated effects of a permanent increase in interest rate spreads should be interpreted with caution, particularly at long horizons.

* The model has only one channel through which an increase in capital requirements affects the real economy: a widening of bank lending spreads over the policy rate. As discussed in the MAG reports, there are several ways in which banks can respond to higher capital requirements; some could have much more significant effects on output, while others would be more benign.


30. The exercises themselves have some important limitations that should be borne in mind in assessing the quantitative results and risks surrounding them. These include:
* The implementation of the higher capital requirements is assumed to be linear over the alternative implementation periods. In practice, the speed of implementation is quite likely to be non-linear; indeed, markets may be forcing a front-loading of adjustment.
* The scope for monetary policy responses may well vary over time and differ from one country to another. Not all countries are close to the zero lower bound for interest rates, and even those that are may not remain so over the entire implementation period.  Consequently, macroeconomic outcomes and spillovers are bound to differ from those suggested by the model analysis. The analysis should be thought of as showing bounds for potential outcomes associated with different monetary policies.

* The analyses only consider standardized increases in capital requirements of 1 percentage point. However, the effects of increases in requirements may well be nonlinear, so that the effect of increasing requirements by 2 percentage points may not be simply twice that of a 1 percentage point increase, and the degree of non-linearity may not be the same across time or countries. The zero lower bound constraint is one such nonlinearity, but there are likely to be others.

* The analysis of the global increase in capital requirements assumed an identical increase in capital requirements in all countries. In reality, banks in some countries will have much further to go in meeting higher capital requirements than banks in other countries. As a consequence, the pace of increases in interest rate spreads will vary across countries. As seen in the exercise with GSIBs, where spreads increased by different amounts in different countries, this would significantly modify the pattern of macroeconomic effects and their spillovers between countries.

Tuesday, February 7, 2012

U.S.-China Competition in Asia: Legacies Help America

U.S.-China Competition in Asia: Legacies Help America. BY ROBERT SUTTER
East-West Center
Feb 2012
http://www.eastwestcenter.org/sites/default/files/private/apb147.pdf

As Sino-American competition for influence enters a new stage with the Obama administration’s re-engagement with Asia, each power’s legacies in the region add to economic, military and diplomatic factors determining which power will be more successful in the competition. How the United States and China deal with their respective histories in regional affairs and the role of their non-government relations with the Asia-Pacific represent important legacies that on balance favor the United States.


The Role of History
From the perspective of many regional government officials and observers, the United States and the People’s Republic of China both have historically very mixed records, often resorting to highly disruptive and violent measures to preserve their interests. The record of the United States in the Cold War and later included major wars in Korea and Vietnam and constant military friction along Asia’s rim as it sought to preserve military balance and deter perceived aggression. Many in Asia benefited from America’s resolve and major sacrifices. Most today see the United States as a mature power well aware of the pros and cons of past behavior as it crafts a regional strategy to avoid a potentially dangerous withdrawal and to preserve stability amid U.S. economic and budget constraints.

In contrast, rising China shows little awareness of the implications of its record in the region. Chinese officials and citizens remain deeply influenced by an officially encouraged erroneous claim that China has always been benign and never expansionist. The highly disruptive policies and practices of the People’s Republic of China under the revolutionary leadership of Mao Zedong and the more pragmatic leadership of Deng Xiaoping are not discussed. Well-educated audiences at foreign policy forums at universities and related venues show little awareness of such legacies as consistent Chinese support for the Khmer Rouge as a means to preserve Chinese interests in Southeast Asia. China’s military invasion of Vietnam and Chinese-directed insurgencies against major governments in Southeast Asia, both Western-aligned states and the strictly neutral government of Burma, seem widely unknown.

Chinese officials who should know better also refuse or are unable to deal honestly with the recent past. Speaking last year to a group of Asia-Pacific officials and scholars, including Vietnamese, American and Chinese participants, deliberating over recent trends in Asia, a Chinese foreign affairs official emphasized in prepared remarks that China “has always been a source of stability in Asia.” After watching the Vietnamese participants squirm in their seats, others raised objections to such gross inaccuracy.

The Chinese lacuna regarding how China has been perceived by its neighbors encumbers its efforts to gain influence in the region. China has a lot to live down. Regional governments need steady reassurance that China will not employ its growing power to return to the domineering and disruptive practices that marked forty of the sixty years of the People’s Republic of China. Educated Chinese citizens and at least some responsible officials appear insensitive to this need because of ignorance. They see no requirement to compensate for the past, and many criticize Chinese government actions that try to accommodate the concerns of regional neighbors. Nationalistic rhetoric coming from China portrays neighbors as overly sensitive to Chinese assertions and coercive measures on territorial, trade and other issues, reviving regional wariness that the antagonistic China of the recent past may be reemerging with greater power in the current period.


Non-government Relations

Like many countries, China relies heavily on its government and other official organizations in its interactions with neighbors. Even areas such as trade, investment, media, education and other interchange are heavily influenced by administrative support and guidance. An exception is the large number of ethnic Chinese who have lived for generations in neighboring countries, especially in Southeast Asia, and who represent a source of non-government influence for China. On balance, the influence of these groups is positive for China, although suspicions about them remain in some countries.

By contrast, for much of its history, the United States exerted influence in Asia and the Pacific much more through business, religious, media, foundation, educational and other interchange than through channels dependent on government leadership and support. Active American non-government interaction with the region continues today, putting the United States in a unique position in which the American non-government sector has a strong and usually positive impact on the influence the United States exerts in the region. Meanwhile, almost 50 years of generally color-blind U.S. immigration policy since the ending of discriminatory U.S. restrictions on Asian immigration in 1965 has resulted in the influx of millions of Asia-Pacific migrants who call America home and who interact with their countries of origin in ways that undergird and reflect well on the U.S. position in the region. No other country, with the exception of Canada, has such an active and powerfully positive channel of influence in the Asia-Pacific.


Outlook: Advantage U.S.

The primary concerns in the Asia-Pacific with stability and development mean that U.S.-Chinese competition for influence probably will focus more on persuasion than coercion. The strong American foundation of webs of positive non-government regional interchange and the Obama administration’s widely welcomed re-engagement with the region contrast with rising China’s poor awareness of its historical impact on the region and limited non-government connections.

Friday, February 3, 2012

Why did the U.S. recover faster from the Panic of 1907 than from the 2008 recession and the Great Depression?

Why did the U.S. recover faster from the Panic of 1907 than from the 2008 recession and the Great Depression?
By PHIL GRAMM AND MIKE SOLON
WSJ, Feb 02, 2012
http://online.wsj.com/article/SB10001424052970204740904577193382505500756.html

Commerce Department data released last Friday show that four years after the recession began, real gross domestic product per person is down $1,112, while 5.8 million fewer Americans are working than when the recession started.

Never before in postwar America has either real per capita GDP or employment still been lower four years after a recession began. If in this "recovery" our economy had grown and generated jobs at the average rate achieved following the 10 previous postwar recessions, GDP per person would be $4,528 higher and 13.7 million more Americans would be working today.

Behind the startling statistics of lost income and jobs are the real and painful stories of American families falling further behind: record high poverty levels, record low teenage employment, record high long-term unemployment, shrinking birthrates, exploding welfare benefits, and a crippled middle class.

As the recovery faltered, President Obama first claimed the weakness of the recovery was due to the depth of the recession, saying that it was "going to take a while for us to get out of this. I think even I did not realize the magnitude . . . of the recession until fairly far into it."

But, in fact, the 1981-82 recession was deeper and unemployment was higher. Moreover, the 1982 recovery was constrained by a contractionary monetary policy that pushed interest rates above 21%, a tough but necessary step to break inflation. It was also a recovery that required a painful restructuring of American businesses to become more competitive in the increasingly globalized economy. By way of comparison, our current recovery has benefited from the most expansionary monetary policy in U.S. history and a rapid return to profitability by corporate America.

Despite the significant disadvantages the economy faced in 1982, President Ronald Reagan's policies ignited a recovery so powerful that if it were being repeated today, real per capita GDP would be $5,694 higher than it is now—an extra $22,776 for a family of four. Some 16.9 million more Americans would have jobs.

The most recent excuse for the failed recovery is that financial crises, by their very nature, result in slower, more difficult recoveries. Yet the 1981-82 recession was at least in part financially induced by inflation, record interest rates and the dislocations they generated. The high interest rates wreaked havoc on long-term lenders like S&Ls, whose net worth turned negative in mid-1982. But even if we ignore the financial roots of the 1981-82 recession, the financial crisis rationalization of the current, weak recovery does not stand up to scrutiny.

The largest economic crisis of the 20th century was the Great Depression, but the second most significant economic upheaval was the panic of 1907. It was from beginning to end a banking and financial crisis. With the failure of the Knickerbocker Trust Company, the stock market collapsed, loan supply vanished and a scramble for liquidity ensued. Banks defaulted on their obligations to redeem deposits in currency or gold.

Milton Friedman and Anna Schwartz, in their classic "A Monetary History of the United States," found "much similarity in its early phases" between the Panic of 1907 and the Great Depression. So traumatic was the crisis that it gave rise to the National Monetary Commission and the recommendations that led to the creation of the Federal Reserve. The May panic triggered a massive recession that saw real gross national product shrink in the second half of 1907 and plummet by an extraordinary 8.2% in 1908. Yet the economy came roaring back and, in two short years, was 7% bigger than when the panic started.

It is certainly true that the economy languished in the Great Depression as it has over the past four years. But today's malaise is similar to that of the Depression not because of the financial events that triggered the disease but because of the virtually identical and equally absurd policy prescriptions of the doctors.

Under President Franklin Roosevelt, federal spending jumped by 3.6% of GDP from 1932 to 1936, an unprecedented spending spree, as the New Deal was implemented. Under President Obama, spending exploded by 4.6% of GDP from 2008 to 2011. The federal debt by the end of 1938 was almost 150% above the 1929 level. Publicly held debt is projected to be double the 2008 level by the end of 2012. The regulatory burden mushroomed under Roosevelt, as it has under Mr. Obama.

Tax policy then and now was equally destructive. The top individual income tax rate rose from 24% to 63% and then to 79% during the Hoover and Roosevelt administrations. Corporate rates were increased by 36%. Under Mr. Obama, capital gains taxes are set to rise by one third, the top effective tax rate on dividends will more than triple, and the highest marginal tax rate will effectively rise by 21.4%.

Moreover, the Obama administration's populist tirades against private business are hauntingly similar to the Roosevelt administration's tirades. FDR's demagoguery against "the privileged few" and "economic royalists" has evolved into Mr. Obama's "the richest 1%" and America's "millionaires and billionaires."

Yet, in his signature style, Mr. Obama now claims our weak recovery is not because a Democratic Congress said yes to his policy prescriptions in 2009-10 but because a Republican House said no in 2011. The sad truth is this president sowed his policies and America is reaping the results.

Faced with the failed results of his own governing strategy of tax, spend and control, the president will have no choice but to follow an election strategy of blame, vilify and divide. But come Nov. 6, American voters need only ask themselves the question Reagan asked in 1980: "Are you better off than you were four years ago?"

Sadly, with their income reduced by thousands, the number of U.S. jobs down by millions, and the nation trillions deeper in debt, the answer will be a resounding "No."

Mr. Gramm, a former U.S. senator from Texas, is the senior partner at U.S. Policy Metrics, where Mr. Solon, a former senior budget staffer in both houses of Congress, is also a partner.

Tuesday, January 31, 2012

Macroeconomic and Welfare Costs of U.S. Fiscal Imbalances

Macroeconomic and Welfare Costs of U.S. Fiscal Imbalances. By Bertrand Gruss and Jose L. Torres
IMF Working Paper No. 12/38
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25691.0

Summary: In this paper we use a general equilibrium model with heterogeneous agents to assess the macroeconomic and welfare consequences in the United States of alternative fiscal policies over the medium term. We find that failing to address the fiscal imbalances associated with current federal fiscal policies for a prolonged period would result in a significant crowding-out of private investment and a severe drag on growth. Compared to adopting a reform that gradually reduces federal debt to its pre-crisis level, postponing debt stabilization for two decades would entail a permanent output loss of about 17 percent and a welfare loss of almost 7 percent of lifetime consumption. Moreover, the long-run welfare gains from the adjustment would more than compensate for the initial losses associated with the consolidation period.

The authors start the paper this way:

“History makes clear that failure to put our fiscal house in order will erode the vitality of our economy, reduce the standard of living in the United States, and increase the risk of economic and financial instability.”

Ben S. Bernanke, 2011 Annual Conference of the Committee for a Responsible Federal Budget


Excerpts
Introduction
One of the main legacies of the Great Recession has been the sharp deterioration of public finances in most advanced economies. In the U.S., federal debt held by the public surged from 36 percent of GDP in 2007 to around 70 percent in 2011. This rise in debt, however impressive, is dwarfed by the medium-term fiscal imbalances associated with entitlement programs and revenue-constraining measures. For example, the non-partisan Congressional Budget Office (CBO) projects that debt held by the public will exceed 150 percent of GDP by 2030 (see Figure 1). Similarly, Batini et al. (2011) estimate that closing the federal “fiscal gap” associated with current fiscal policies would require a permanent fiscal adjustment of about 15 percent of GDP.

While the crisis brought the need to address the U.S. medium-term fiscal imbalances to the center of the policy debate, the costs they entail are not necessarily well understood. Most of the long-term fiscal projections regularly produced in the U.S. and used to guide policy discussions are derived from debt accounting exercises. A shortcoming of such an approach is that relative prices and economic activity are unaffected by different fiscal policies, and that it cannot be used for welfare analysis. To overcome those limitations and contribute to the debate, in this paper we use a rational expectations general equilibrium framework to assess the medium-term macroeconomic and welfare consequences of alternative fiscal policies in the U.S. We find that failing to address the federal fiscal imbalances for a prolonged period would result in a significant crowding-out of private investment and drag on growth, entailing a permanent output loss of about 17 percent and a welfare loss of almost 7 percent of lifetime consumption. Moreover, we find that the long-run welfare gains from stabilizing the federal debt at a low level more than compensate for the welfare losses associated with the consolidation period. Our results also suggest that the crowding-out effects of public debt are an order of magnitude bigger than the policy mix effects: Reducing promptly the level of public debt is significantly more important for activity and welfare than differences in the size of government or the design of the tax reform.
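
For reference, the "debt accounting exercises" mentioned above typically iterate the standard debt-dynamics identity, in which interest rates, growth, and relative prices are held fixed by assumption. A generic formulation, not taken from the paper, is:

    % d_t : debt-to-GDP ratio,  pb_t : primary balance in percent of GDP,
    % r   : effective nominal interest rate on debt,  g : nominal GDP growth rate.
    \[
      d_{t+1} \;=\; \frac{1+r}{1+g}\, d_t \;-\; pb_{t+1}
    \]

Holding r and g fixed regardless of the fiscal path is precisely the limitation that the general equilibrium framework used in the paper is meant to overcome.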

The focus of this study is on the costs and benefits of fiscal consolidation for the U.S. over the medium-term to long-term. In this sense, we explicitly leave aside some questions on fiscal consolidation that, while very relevant for the short-run, cannot be appropriately tackled in this framework. One example is assessing the effects of back-loading the pace of consolidation in the near term—while announcing a credible medium-run adjustment—in the current context of growth below potential and nominal interest rates close to zero. A related relevant question is what mix of fiscal instruments in the near term would make fiscal consolidation less costly in such context. While interesting, these questions are beyond the scope of this paper.

The quantitative framework we use is a dynamic stochastic general equilibrium model with heterogeneous agents, and endogenous occupational choice and labor supply. In the model, ex-ante identical agents face idiosyncratic entrepreneurial ability and labor productivity shocks, and choose their occupation. Agents can become either entrepreneurs and hire other workers, or they can become workers and decide what fraction of their time to work for other entrepreneurs. In order to make a realistic analysis of the policy options, we assume that the government does not have access to lump sum taxation. Instead, the government raises distortionary taxes on labor, consumption, and income, and issues one period non-contingent bonds to finance lump sum transfers to all agents, other noninterest spending, and service its debt. Given that the core issue threatening debt sustainability in the U.S. is the explosive path of spending on entitlement programs, the heterogeneous agents assumption is crucial: Our model allows for a meaningful tradeoff between distortionary taxation and government transfers, as the latter insure households from attaining very low levels of consumption. The complexity this introduces forces us to sacrifice on some dimension: Agents in our model face individual uncertainty but have perfect foresight about future paths of fiscal instruments and prices. Allowing for uncertainty about the timing and composition of the adjustment would be interesting, but would severely increase the computational cost.
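
To fix ideas, the fiscal block described in this paragraph can be summarized by a flow budget constraint of the following schematic form. The notation here is ours, inferred from the description rather than copied from the paper:

    % B_t : one-period non-contingent debt,  r_t : real interest rate,
    % TR_t: lump-sum transfers,  G_t : other noninterest spending,
    % tau^c, tau^l, tau^y : tax rates on consumption, labor income, and income.
    \[
      B_{t+1} + \tau^{c} C_t + \tau^{l} w_t L_t + \tau^{y} Y_t
      \;=\; (1+r_t)\, B_t + TR_t + G_t
    \]

New debt issuance plus distortionary tax revenue must cover debt service, transfers, and other spending; the scenarios compared below differ in how, and how soon, this constraint is brought onto a sustainable path.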

We compare model simulations from four alternative fiscal scenarios. The benchmark scenario maintains current fiscal policies for about twenty years. More precisely, in this scenario we feed the model with the spending (noninterest mandatory and discretionary) and revenue projections from CBO’s Alternative Fiscal Scenario (CBO 2011)—allowing all other variables to adjust endogenously—until about 2030, when we assume that the government increases all taxes to stabilize the debt at its prevailing level. Three alternative scenarios assume, instead, the immediate adoption of a fiscal reform aimed at gradually reducing federal debt to its pre-crisis level. There are of course many possible parameterizations for such a reform reflecting, among other things, different views about the desired size of the public sector and the design of the tax system. We first consider an adjustment scenario assuming the same size of government and tax structure as the benchmark scenario, in order to disentangle the sole effect of delaying fiscal adjustment—and stabilizing the debt ratio at a high level. We then explore the effect of alternative designs for the consolidation plan by considering two alternative adjustment scenarios that incorporate spending and revenue measures proposed by the bipartisan December 2010 Bowles-Simpson Commission.

This paper is related to different strands of the macro literature on fiscal issues. First, it is related to studies using general equilibrium models to analyze the implications of fiscal consolidations. Forni et al. (2010) use perfect-foresight simulations from a two-country dynamic model to compute the macroeconomic consequences of reducing the debt to GDP ratio in Italy. Coenen et al. (2008) analyze the effects of a permanent reduction in public debt in the Euro Area using the ECB NAWM model. Clinton et al. (2010) use the IMF GIMF model to examine the macroeconomic effects of permanently reducing government fiscal deficits in several regions of the world at the same time. Davig et al. (2010) study the effects of uncertainty about when and how policy will adjust to resolve the exponential growth in entitlement spending in the U.S.

The main difference from our paper is that these works rely on representative agent models that cannot adequately capture the redistributive and insurance effects of fiscal policy. As a result, such models have by construction a positive bias towards fiscal reforms that lower transfers, reduce the debt, and eventually lower distortions by lowering tax rates. Another unappealing feature of representative agent models for analyzing the merits of a fiscal consolidation is that, in steady state, the equilibrium real interest rate is independent of the debt level, whereas in our model the equilibrium real interest rate is endogenously affected by the level of government debt, which is consistent with the empirical literature.

Second, the paper is related to previous work using general equilibrium models with infinitely lived heterogeneous agents, occupational choice, and borrowing constraints to analyze fiscal reforms, such as Li (2002), Meh (2005) and Kitao (2008). Differently from these papers, which impose a balanced budget every period, we focus on the effects of debt dynamics and fiscal consolidation reforms. Also, since we focus on reforms over an extended period of time, we augment our model to include growth. Moreover, as in Kitao (2008), we explicitly compute the transitional dynamics after the reforms and analyze the welfare costs associated with the transition.

Results: The long-run effects


What is the effect of delaying fiscal consolidation on...?
Capital and Labor. The high interest rates in the delay scenario imply that, for entrepreneurs who do not have enough internal funding, the cost of borrowing sufficient capital is too high to compensate for the income they would earn under the outside option (i.e., wage income). As a result, the share of entrepreneurs in the delay scenario is roughly one half of the share under the passive adjust scenario and the aggregate capital stock is about 17 percent lower. The higher share of workers in the delay scenario implies a higher labor supply. Together with lower labor demand (due to a lower capital stock), this leads to a real wage that is more than 19 percent lower. Total hours worked are similar in the two steady states, as lower individual hours offset the higher share of workers.
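
Schematically, the occupational-choice margin at work here can be written as follows (our notation, not the paper's): an agent with entrepreneurial ability z and wealth a chooses entrepreneurship only if the profits attainable at the equilibrium interest rate r and wage w exceed the outside option of supplying labor.

    % z : entrepreneurial ability, a : own wealth, k : capital, n : hired labor,
    % \bar{\ell} : the agent's labor endowment if working for the wage w instead.
    \[
      \max_{k,\,n}\;\Big\{ z\, f(k,n) - w\, n - r\,(k - a) \Big\}
      \;\;\ge\;\; w\,\bar{\ell}
    \]

A higher equilibrium interest rate in the delay scenario raises the cost of the borrowed capital k - a, so fewer agents satisfy this condition, which is why the share of entrepreneurs roughly halves.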

Output and Consumption. The crowding-out effect of fiscal policy under the delay scenario leads to large permanent losses in output and consumption. The level of GDP is about 16 percent lower in the delay than in the passive adjust scenario and aggregate consumption is 3.5 percent lower. Moreover and as depicted in Figure 4, the wealth distribution is significantly more concentrated under the delay scenario.

Welfare. The lower aggregate consumption and more concentrated wealth distribution under the delay scenario imply that welfare is significantly lower than in the passive adjust scenario. Using a consumption-equivalent welfare metric, we find that the average difference in steady-state welfare across scenarios is equivalent to permanently increasing consumption for each agent in the delay scenario economy by 6 percent while leaving their amount of leisure unchanged. We interpret this differential as the permanent welfare gain from stabilizing public debt at its pre-crisis level. A breakdown of the steady-state welfare comparison by wealth deciles, shown in Figure 5, suggests that all agents up to the 7th decile of the wealth distribution would be better off under fiscal consolidation.
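
The consumption-equivalent metric used here is the standard one: it solves for the uniform percentage increase in consumption, with leisure held fixed, that would make agents in the delay economy as well off on average as in the adjust economy. In generic notation (ours, not the paper's):

    % c^D, l^D : consumption and leisure paths in the delay scenario;
    % c^A, l^A : the corresponding paths in the adjust scenario;
    % u(.) : period utility,  beta : the subjective discount factor.
    \[
      \mathbb{E}\sum_{t=0}^{\infty} \beta^{t}\,
        u\!\left((1+\lambda)\,c^{D}_{t},\; l^{D}_{t}\right)
      \;=\;
      \mathbb{E}\sum_{t=0}^{\infty} \beta^{t}\,
        u\!\left(c^{A}_{t},\; l^{A}_{t}\right)
    \]

The 6 percent figure reported above corresponds to a value of lambda of roughly 0.06.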


What are the effects of alternative fiscal consolidation plans?

Capital and Output. The smaller size of government in the two active adjust scenarios relative to the passive one translates into higher capital stocks and higher output, increasing the gap with the delay scenario. Regarding the tax reform, the comparison between the two active adjust scenarios reveals that distributing the higher tax pressure across all taxes, including consumption taxes, lowers distortions and results in a higher capital stock and a growth-friendlier consolidation: the difference in the output level between the delay and active (1) adjust scenarios stands at 17.7 percent—while this difference is 17.1 and 15.7 percent for the active (2) adjust and passive adjust scenarios respectively.

Consumption and Welfare. While all adjust scenarios reveal a significant difference in long-run per-capita consumption and welfare with respect to postponing fiscal consolidation, the relative performance among them also favors a smaller size of government and a balanced tax reform. The difference in per-capita consumption with the delay scenario is 3.5, 5.8 and 5.4 percent respectively for the passive, active (1) and active (2) adjustment scenarios. The policy mix under the active (1) adjust scenario also ranks the best in terms of welfare, with the welfare differential with respect to the delay scenario being more than 7 percent of lifetime consumption.

Overall Welfare Cost of Delaying Fiscal Consolidation

In the long run, average welfare in the adjust scenario is higher than in the delay scenario by 6.7 percent of lifetime consumption. However, along the transition to the new steady state the adjust scenario is characterized by a costly fiscal adjustment that entails a lower path for per capita consumption, so it is not necessarily true that an adjustment is optimal.

To assess the overall welfare ranking of the alternative fiscal paths, we extend the analysis of section III.A. by computing, for the delay and adjust scenarios, the average expected discounted lifetime utility starting in 2011. We find that even taking into account the costs along the transition, the adjust scenario entails an average welfare gain for the economy. The infinite horizon welfare comparison suggests that consumption under the delay scenario would have to be raised by 0.8 percent for all agents in the economy in all periods to attain the same average utility as under the adjust scenario (while leaving leisure unchanged). A breakdown of this result by wealth deciles (see Figure 9) suggests that, as in the long-run comparison, the wealthiest decile of the population is worse off under the adjust scenario. Differently from the steady state comparison, however, the first four deciles also face welfare losses in the adjust scenario.

A few elements suggest that the average welfare gain reported (0.8 percent in consumption-equivalent terms) can be considered a lower bound. First, the calibrated subjective discount factor from the model used to compute the present value of the utility paths entails a yearly discount rate of about 9.9 percent. With such a high discount rate, the long-run benefits of the adjustment are heavily discounted. Using a discount rate of 3 percent, the one used by the CBO for calculating the present value of future streams of revenues and outlays of the government’s trust funds, would imply a consumption-equivalent welfare gain of 5.9 percent (instead of 0.8 percent). Second, the model we are using has infinitely lived agents, so we are not explicitly accounting for the distribution of costs and benefits across generations.
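
The role of the discount rate can be illustrated with a stylized calculation. The cost and gain figures in the Python sketch below are invented placeholders, not the paper's simulated consumption paths; the point is only that front-loaded costs and permanent distant gains trade off very differently at 9.9 percent and at 3 percent.

    # Stylized illustration of why the discount rate matters for the overall
    # welfare comparison. The flow path below is invented for illustration only.

    def present_value(flows, rate):
        """Discounted sum of a list of annual flows at a constant discount rate."""
        return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

    # Hypothetical path: consolidation costs 1 unit per year for 10 years,
    # then yields a permanent gain of 2 units per year for the next 90 years.
    flows = [-1.0] * 10 + [2.0] * 90

    for rate in (0.099, 0.03):   # 9.9% (model's calibrated rate) vs 3% (CBO's rate)
        print(f"discount rate {rate:.1%}: net present value = {present_value(flows, rate):+.1f}")

    # At 9.9% the distant gains are heavily discounted and the net benefit is small;
    # at 3% the same path looks far more favorable -- the same qualitative pattern
    # as the 0.8 versus 5.9 percent figures reported above.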

Conclusions
We compare the macroeconomic and welfare effects of failing to address the fiscal imbalances in the U.S. for an extended period with those of reducing federal debt to its pre-crisis level and find that the stakes are quite high. Our model simulations suggest that the continuous rise in federal debt implied by current policies would have sizeable effects on the economy, even under certainty that the federal debt will be fully repaid. The model predicts that the mounting debt ratio would increase the cost of borrowing and crowd out private capital from productive activities, acting as a significant drag on growth. Compared to stabilizing federal debt at its pre-crisis level, continuation of current policies for two decades would entail a permanent output loss of around 17 percent. The associated drop in per-capita consumption, combined with the worsening of wealth concentration that the model suggests, would cause a large average welfare loss in the long run, equivalent to about 7 percent of lifetime consumption. Our results also suggest that promptly reducing the level of public debt is significantly more important for activity and welfare than differences in the size of government or the design of the tax reform. Accordingly, even under consensus on the desirability of increasing primary spending in the medium run, it would be preferable to start with the fiscal house in order.

The model adequately captures that the fiscal consolidation needed to reduce federal debt to its pre-crisis level would be very costly. Still, extending the welfare comparison to include the transition period as well suggests that a fiscal consolidation would be beneficial on average. After taking into account the short-term costs, the average welfare gain from fiscal consolidation stands at 0.8 percent of lifetime consumption.

We argue that our welfare results can be interpreted as a lower bound. This is because, first, we abstract from default so our simulations ignore the potential effect of higher public debt on the risk premium. However, as the debt crisis in Europe has revealed, interest rates can soar quickly if investors lose confidence in the ability of a government to manage its fiscal policy. Considering this effect would have magnified the long-run welfare costs of stabilizing the debt ratio at a higher level. Second, the high discount rate we use in the computation of the present value of utility exacerbates the short-term costs. If we recomputed the overall welfare effects in our scenarios using a discount rate of 3 percent, the welfare gain from a consolidation would be 5.9 percent of lifetime utility, instead of 0.8 percent. An argument for considering a lower rate to compute the present value of welfare is that by assuming infinitely lived agents we are not attaching any weight to unborn agents that would be affected by the permanent costs of delaying the resolution of fiscal imbalances and do not enjoy the expansionary effects of the unsustainable policy along the transitional dynamics.

The results in this paper are not exempt from the perils inherent to any model-dependent analysis. In order to address features that we believe are crucial for the issue at hand, we needed to simplify the model on other dimensions. For example, given the current reliance of the U.S. on foreign financing, the closed economy assumption used in this paper may be questionable. However, we believe that it would also be problematic to assume that the world interest rate will remain unaffected if the U.S. continues to considerably increase its financing needs. Moreover and as mentioned before, the model ignores the effect of higher debt on the perceived probability of default, which would likely counteract the effect in our results from failing to incorporate the government’s access to foreign borrowing. The model also abstracts from nominal issues and real and nominal rigidities typically introduced in the new Keynesian models commonly used for policy analysis. However, we believe that while these features are particularly relevant for short-term cyclical considerations, they matter much less for the longer-term issues addressed in this paper.

How Risky Are Banks’ Risk Weighted Assets? Evidence from the Financial Crisis

How Risky Are Banks’ Risk Weighted Assets? Evidence from the Financial Crisis. By Sonali Das & Amadou N. R. Sy
IMF Working Paper No. 12/36
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25687.0

Summary: We study how investors account for the riskiness of banks’ risk-weighted assets (RWA) by examining the determinants of stock returns and market measures of risk. We find that banks with higher RWA had lower stock returns over the US and European crises. This relationship is weaker in Europe where banks can use Basel II internal risk models. For large banks, investors paid less attention to RWA and rewarded instead lower wholesale funding and better asset quality. RWA do not, in general, predict market measures of risk although there is evidence of a positive relationship before the US crisis which becomes negative afterwards.

Introduction:
“The leverage ratio - a simple ratio of capital to balance sheet assets - and the more complex risk-based requirements work well together. The leverage requirement provides a baseline level of capital to protect the safety net, while the risk-based requirement can capture additional risks that are not covered by the leverage framework. The more advanced and complex the models become, the greater the need for such a baseline. The leverage ratio ensures that a capital backstop remains even if model errors or other miscalculations impair the reliability of risk-based capital. This is a crucial consideration - particularly as we work through the implementation of Basel II standard. By restraining balance sheet growth, the leverage ratio promotes stability and resilience during difficult economic periods.”– Remarks by Sheila Bair, Chairman, Federal Deposit Insurance Corporation before the Basel Committee on Banking Supervision, Merida, Mexico, October 4, 2006.

The financial crisis that began in 2007 has exposed a number of important weaknesses in banking regulation. A key challenge is how to appropriately determine the riskiness of banks’ assets. The principle that regulatory capital requirements should be tied to the risks taken by banks was accepted internationally and formalized with the Basel I accord in 1988, and the definition of capital and the measurement of risks have undergone several revisions since that time. The second Basel accord, published in 2004, recommended that banks hold total regulatory capital equal to at least 8 percent of their risk-weighted assets (RWA). The recently updated Basel III guidelines emphasize higher quality forms of capital, but make limited strides in the measurement of risks. Instead, Basel III proposes, as a complementary measure, a non-risk-weighted leverage ratio.

Risk-weighted assets are an important element of risk-based capital ratios. Indeed, banks can increase their capital adequacy ratios in two ways: (i) by increasing the amount of regulatory capital held, which boosts the numerator of the ratio, or (ii) by decreasing risk-weighted assets, which is the denominator of the regulatory ratio. A key concern about current methods of determining risk-weighted assets is that they leave room for individual banks to “optimize” capital requirements by underestimating their risks and thus being permitted to hold lower capital. Jones (2000) discusses techniques banks can use to engage in regulatory capital arbitrage and provides evidence on the magnitude of these activities in the United States. Even under the Basel I system, in which particular classes of assets are assigned fixed risk weights, the capital ratio denominator can be circumvented. Merton (1995) provides an example in which, in place of a portfolio of mortgages, a bank can hold the economic equivalent of that portfolio at a risk weight one-eighth as large. Innovations in financial products since the first Basel accord have also likely made it easier for financial institutions to manipulate their regulatory risk measure. Acharya, Schnabl, and Suarez (2010) analyze asset-backed commercial paper and find results suggesting that banks used this form of securitization to concentrate, rather than disperse, financial risks in the banking sector while reducing bank capital requirements.
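
The denominator effect described above is easy to see with a stylized balance sheet. The figures in the Python sketch below are hypothetical, chosen only to illustrate the arithmetic of the risk-based ratio and the 8 percent Basel minimum mentioned earlier.

    # Hypothetical balance sheet illustrating the two ways to raise a risk-based
    # capital ratio: raise capital (numerator) or lower RWA (denominator).
    capital = 8.0          # regulatory capital (hypothetical units)
    assets = 200.0         # total, unweighted balance sheet assets
    avg_risk_weight = 0.5  # assumed average risk weight on those assets

    rwa = assets * avg_risk_weight
    print(f"capital ratio: {capital / rwa:.1%}")              # 8.0%, right at the minimum

    # Option (i): raise capital by 1 unit.
    print(f"after +1 capital: {(capital + 1) / rwa:.1%}")     # 9.0%

    # Option (ii): keep capital fixed but re-engineer exposures so the average
    # risk weight falls (holding the economic equivalent of a portfolio at a
    # fraction of its original risk weight, as in the Merton example above).
    rwa_optimized = assets * (avg_risk_weight * 0.8)
    print(f"after RWA 'optimization': {capital / rwa_optimized:.1%}")   # 10.0%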

In addition to concerns about underestimating the riskiness of assets, there are differences in the calculation of risk-weighted assets across countries that may have unintended effects on financial stability. Lord Adair Turner, chairman of the UK Financial Services Authority, warned in June that international differences in the calculation of risk-weighted assets could undermine Basel III, and Sheila Bair, former chairman of the US Federal Deposit Insurance Corporation, added her concern that Europe and the US may be diverging in their calculation of RWA: “The risk weightings are highly variable in Europe and have led to continuing declines in capital levels, even in the recession. There's pretty strong evidence that the RWA calculation isn't working as it's supposed to.”

In this paper, we study whether equity investors find banks’ reported risk-weighted assets to be a credible measure of risk. First, did banks with lower risk-weighted assets have higher stock returns during the recent financial crisis? And second, do measures of risk based on equity market information correspond to risk-weighted assets? Demirgüç-Kunt, Detragiache, and Merrouche (2010) and Beltratti and Stulz (2010) study banks’ stock return performance during the financial crisis as well, focusing primarily on the effect of different measures of capital and bank governance, respectively. Our paper studies whether markets price bank risk as measured by RWA, to inform the debate on how best to measure the risks embedded in banks’ portfolios.  Addressing the first question, we find that banks with higher RWA performed worse during the severe phase of the crisis, from July 2007 to September 2008, suggesting that equity investors did look at RWA as a determinant of banks’ stock returns in this period. This relationship is weaker in Europe where banks can use Basel II internal risk models. For large banks, investors paid less attention to RWA and rewarded instead lower wholesale funding and better asset quality.

We find as in Demirguc-Kunt, Detragiache, and Merrouche (2010) that markets do not respond to all measures of capital, but respond positively to higher quality measures – that is, capital with greater loss-absorbing potential. We also investigate the possibility of a capital-liquidity trade-off in the market assessment of banks. Our results indicate that there is indeed a capital-liquidity trade-off: (i) banks with more stable sources of short-term funding are not rewarded as highly for having higher capital, and (ii) banks with liquid assets are not rewarded as highly for having higher capital.
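
A rough sketch of the kind of cross-sectional regression behind these findings appears below, including an interaction term for the capital-liquidity trade-off just described. The data are synthetic and the variable names are placeholders; this is not the authors' code or their exact specification.

    # Sketch of a crisis-return regression on RWA intensity and controls.
    # Synthetic data; coefficients used to generate it are arbitrary placeholders.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 200
    banks = pd.DataFrame({
        "rwa_to_assets": rng.uniform(0.2, 0.8, n),    # RWA / total assets, pre-crisis
        "tier1_ratio": rng.uniform(0.06, 0.14, n),    # Tier 1 capital ratio
        "wholesale_fund": rng.uniform(0.1, 0.6, n),   # non-deposit short-term funding share
        "npl_ratio": rng.uniform(0.0, 0.08, n),       # non-performing loans / total loans
    })
    # Placeholder crisis-period return (July 2007 - September 2008 in the paper).
    banks["ret_crisis"] = (-0.5 * banks["rwa_to_assets"] + 1.5 * banks["tier1_ratio"]
                           - 0.4 * banks["wholesale_fund"] - 2.0 * banks["npl_ratio"]
                           + rng.normal(0, 0.1, n))

    model = smf.ols(
        "ret_crisis ~ rwa_to_assets + tier1_ratio + wholesale_fund + npl_ratio"
        " + tier1_ratio:wholesale_fund",   # capital-liquidity interaction
        data=banks,
    ).fit(cov_type="HC1")                  # heteroskedasticity-robust standard errors
    print(model.params)

A negative coefficient on rwa_to_assets would correspond to the finding that higher-RWA banks fared worse, and a negative interaction coefficient to the result that more stable funding reduces the reward to holding extra capital.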

Regarding the relationship between RWA and stock market measures of bank risk, we find that RWA do not, in general, predict market measures of banks’ riskiness. There is evidence, however, of a positive relationship between RWA and market risk in the three years prior to the crisis, from 2004 to 2006, and this relationship becomes negative after the crisis. This could result from the large increase since the crisis in market measures of risk, which reflect the volatility of a bank’s stock price, while banks have not adjusted their RWA to account for the increased risk.

Conclusions
There has been a steady decline over the last decade in the measure of asset risk that banks report to regulators—risk-weighted assets (RWA). In light of this trend and other indications that banks can “optimize” their capital by under-reporting RWA in an attempt to minimize regulatory burdens, we study how equity market investors account for the riskiness of RWA by examining the determinants of stock returns and stock-market measures of risk for an international panel of banks.

Regarding bank stock returns, we find a negative relationship between RWA and stock returns over periods of financial crisis, suggesting that investors use RWA as an indicator of bank portfolio risk. Indeed, banks with higher risk-weighted assets performed worse during the severe phase of the crisis, from July 2007 to September 2008. We find a similar result when we focus on the ongoing crisis in Europe.

Comparing regions with different regulatory structures, we find, however, that the relationship between stock returns and RWA is weaker in countries where banks have more discretion in the calculation of RWA. Specifically, in countries that had implemented Basel II before the onset of the recent financial crisis, allowing banks to use their own internal models to assess credit risks, investors look to other balance-sheet measures of risk exposure but not RWA. Our results also suggest that for large banks, investors paid less attention to the quality of capital and RWA during the crisis and rewarded instead lower reliance on wholesale funding and better asset quality, as measured by the relative size of customer deposits and non-performing loans, respectively.

We confirm results from previous studies that only capital with the greatest loss-absorbing potential matters for stock returns. In addition, we find a trade-off between capital and liquidity in terms of their positive effects on bank stock returns. The more stable a bank’s funding, the less positive the effect of higher capital on its stock return; the more liquid a bank’s assets, the less an increase in capital will increase its stock return.

When it comes to stock-market measures of risk, we find that RWA do not, in general, predict market measures of bank risk. There is evidence, however, of a break in the relationship between stock market measures of risk and RWA since the start of the crisis. Indeed, we find a positive relationship between RWA and market risk in the three years prior to the crisis, from 2004 to 2006, and this relationship becomes negative after the crisis. This could result from the large increase since the crisis in market measures of risk, which reflect the volatility of a bank’s stock price, while banks have not adjusted their RWA to reflect the increased risk.

In light of increasing risk-aversion in markets during times of crisis, the question of how market assessments of risk should be incorporated into banking regulation and supervision remains. Indeed, the asymmetry of information between banks, supervisors, and market participants regarding how risky RWA are can lead to increased uncertainty about the adequacy of bank capital, which during a financial crisis, can have damaging effects for financial stability.

Monday, January 30, 2012

Liberals and Conservatives on Padilla's Fourth Circuit appeal

1  Liberals

In Padilla ruling, Fourth Circuit Court ignores U.S. international obligations.
January 24, 2012, 12:30 pm
http://compliancecampaign.wordpress.com/2012/01/24/in-padilla-ruling-fourth-circuit-court-ignores-u-s-international-obligations/

In a decision with international implications, a U.S. court has demonstrated a decided indifference to the United States’ international obligations on matters of human rights. On Monday the Fourth Circuit Court in Richmond, Va., ruled that the military policies of detention without charge and the harsh interrogation methods established by the Bush administration and continued in part by the Obama administration cannot be challenged in damage lawsuits in federal courts.

Issues raised by the case regarding the detention of terrorist suspects – in particular the treatment of Jose Padilla, a U.S. citizen held for nearly four years without charge as an “enemy combatant” – have been addressed specifically by international bodies to which the U.S. belongs, but these concerns did not factor in to the judges’ deliberations.

In dismissing the Padilla case, the court declared that under the Constitution, the making of counter-terrorism policy is entrusted solely to Congress and the President, and the courts may not “trespass” on this authority. The court therefore threw out the lawsuit brought by Padilla, who was seeking damages of one dollar from each of the defendants: Donald H. Rumsfeld, Former Secretary of Defense; Catherine T. Hanft, Former Commander Consolidated Brig; Melanie A. Marr, Former Commander Consolidated Brig; Lowell E. Jacoby, Vice Admiral, Former Director Defense Intelligence Agency; Paul Wolfowitz, Former Deputy Secretary of Defense; William Haynes, Former General Counsel Department of Defense; and Leon E. Panetta, Secretary of Defense.

Padilla had contended that he was entitled to sue the defendants because the government deprived him of other ways to seek remedies for his treatment, even under military code.

In its ruling, however, the court recognized the President’s purported absolute authority to hold terrorist suspects – even U.S. citizens – indefinitely and incommunicado as enemy combatants:

    On June 9, 2002, acting pursuant to his authority under the AUMF [2001 Authorization of Military Force], President George W. Bush issued an order to defendant Donald Rumsfeld, then Secretary of Defense, to detain Padilla as an enemy combatant, the President having determined that Padilla possessed vital intelligence and posed an ongoing threat to the national security of the United States.

    That day, Padilla was removed from civilian custody and transferred to the Naval Consolidated Brig at Charleston, South Carolina. While in military custody, Padilla claims that he was repeatedly abused, threatened with torture, deprived of basic necessities, and unjustifiably cut off from access to the outside world. Over time, these conditions were relaxed, and he was allowed monitored meetings with his attorneys.


The ruling seemed to downplay Padilla’s actual allegations, though, which are not simply that he was “threatened with torture” but that he was in fact tortured. According to his attorneys, Padilla was routinely mistreated and abused in ways designed to cause pain, anguish, depression and ultimately the loss of will to live.

“The extended torture visited upon Mr. Padilla has left him damaged, both mentally and physically,” said a court filing by Orlando do Campo, one of Padilla’s lawyers. The filing says that Padilla was subjected to sleep deprivation and extremes of heat and cold, and forced to stand for extended periods in painful “stress positions.”

His lawyers have also claimed that Padilla was forced to take LSD and PCP to act as truth serums during his interrogations.

As forensic psychiatrist Dr. Angela Hegarty, who interviewed Jose Padilla for 22 hours to determine the state of his mental health, told Democracy Now in 2007:

    What happened at the brig was essentially the destruction of a human being’s mind. That’s what happened at the brig. His personality was deconstructed and reformed.

    And essentially, like many abuse victims, whether it’s torture survivors or battered women or even children who are abused by parents, as long as the parents or the abuser is in control in their minds, essentially they identify with the primary aims of the abuser. And all abusers, whoever they are, have one absolute requirement, and that is that you keep their secret. I mean, it’s common knowledge that people who abuse children or women will say, “Look at what you made me do,” putting the blame on the victim, trying to instill guilt. “People will judge you. People will think you’re crazy if you tell them about this. You will be an enemy. You will be seen as an enemy. You will be seen as a bad person if this comes out. There will be dire and terrible consequences, not only for you.” Jose was very, very concerned that if torture allegations were made on his behalf, that somehow it would interfere with the government’s ability to detain people at Guantanamo, and this was something he couldn’t sign onto. He was very identified with the goals of the government.

Dr. Hegarty commented specifically on the psychological effect of the prolonged isolation and sensory deprivation that Padilla was subjected to:

    This was the first time I ever met anybody who had been isolated for such an extraordinarily long period of time. I mean, the sensory deprivation studies, for example, tell us that without sleep, especially, people will develop psychotic symptoms, hallucinations, panic attacks, depression, suicidality within days. And here we had a man who had been in this situation, utterly dependent on his interrogators, who didn’t treat him all that nicely, for years. And apart from – the only people I ever met who had such a protracted experience were people who were in detention camps overseas, that would come close, but even then they weren’t subjected to the sensory deprivation. So, yes, he was somewhat of a unique case in that regard.

Glossing over the specifics of Padilla’s four years of mistreatment, the Fourth Circuit’s decision instead treated these issues as mere policy decisions that were made expeditiously by the Executive and Legislative Branches – decisions that the Judiciary constitutionally has no say in.

The ruling makes clear the court’s opinion that the Judicial Branch has no competence to inject itself into matters that pertain to Congress’s war-making authority or the President’s powers as Commander-in-Chief, even when constitutional rights of U.S. citizens are involved:

    Special factors do counsel judicial hesitation in implying causes of action for enemy combatants held in military detention. First, the Constitution delegates authority over military affairs to Congress and to the President as Commander in Chief. It contemplates no comparable role for the judiciary. Second, judicial review of military decisions would stray from the traditional subjects of judicial competence.

The court noted that:

    Padilla’s complaint seeks quite candidly to have the judiciary review and disapprove sensitive military decisions made after extensive deliberations within the executive branch as to what the law permitted, what national security required, and how best to reconcile competing values. It takes little enough imagination to understand that a judicially devised damages action would expose past executive deliberations affecting sensitive matters of national security to the prospect of searching judicial scrutiny. It would affect future discussions as well, shadowed as they might be by the thought that those involved would face prolonged civil litigation and potential personal liability.

Further,

    This is a case in which the political branches, exercising powers explicitly assigned them by our Constitution, formulated policies with profound implications for national security. One may agree or not agree with those policies. One may debate whether they were or were not the most effective counterterrorism strategy. But the forum for such debates is not the civil cause of action pressed in the case at bar.

So, essentially, the Fourth Circuit Court in Richmond, Va., has washed the Judiciary’s hands of any responsibility in determining the constitutionality of any treatment of U.S. citizens who are designated by the Executive Branch as “enemy combatants.” Anything goes if the government calls you a terrorist, according to the court.

As Padilla’s attorney, Ben Wizner, said in a statement Monday:

    Today is a sad day for the rule of law and for those who believe that the courts should protect American citizens from torture by their own government. By dismissing this lawsuit, the appeals court handed the government a blank check to commit any abuse in the name of national security, even the brutal torture of a U.S. citizen on U.S. soil. This impunity is not only anathema to a democracy governed by laws, but contrary to history’s lesson that in times of fear our values are a strength, not a hindrance.

It could also be pointed out that since the Constitution provides that treaties entered into by the United States are “the supreme law of the land,” the court has issued the U.S. government a blank check to disregard this clause and violate international treaties at will, in particular the International Covenant on Civil and Political Rights (ICCPR), ratified by the United States in 1992.

As Padilla was held in military custody for nearly four years without charge or trial, it appears the U.S. has violated Article 9 of the ICCPR, which states:

    1. Everyone has the right to liberty and security of person. No one shall be subjected to arbitrary arrest or detention. No one shall be deprived of his liberty except on such grounds and in accordance with such procedure as are established by law.

    2. Anyone who is arrested shall be informed, at the time of arrest, of the reasons for his arrest and shall be promptly informed of any charges against him.

    3. Anyone arrested or detained on a criminal charge shall be brought promptly before a judge or other officer authorized by law to exercise judicial power and shall be entitled to trial within a reasonable time or to release. It shall not be the general rule that persons awaiting trial shall be detained in custody, but release may be subject to guarantees to appear for trial, at any other stage of the judicial proceedings, and, should occasion arise, for execution of the judgement.

    4. Anyone who is deprived of his liberty by arrest or detention shall be entitled to take proceedings before a court, in order that that court may decide without delay on the lawfulness of his detention and order his release if the detention is not lawful.

By denying Padilla a right to compensation in civil courts, the Fourth Circuit appears to have also overlooked paragraph 5 of the same article: “Anyone who has been the victim of unlawful arrest or detention shall have an enforceable right to compensation.”

As a party to the Covenant, the U.S. is required to submit a report to the UN Human Rights Committee every five years on its compliance with the Covenant’s provisions.

The last report submitted by the United States – in 2005 – was seven years overdue. Regarding the matter of indefinite detention, the 2005 report pointed out that the U.S. Supreme Court has stated “that the United States is entitled to detain enemy combatants, even American citizens, until the end of hostilities, in order to prevent the enemy combatants from returning to the field of battle and again taking up arms.”

The U.S. asserted that “the detention of such individuals is such a fundamental and accepted incident of war that it is part of the ‘necessary and appropriate’ force that Congress authorized the President to use against nations, organizations, or persons associated with the September 11 terrorist attacks.”

The Human Rights Committee objected to this “restrictive interpretation made by the State party of its obligations under the Covenant,” and urged the U.S. to “review its approach and interpret the Covenant in good faith, in accordance with the ordinary meaning to be given to its terms in their context, including subsequent practice, and in the light of its object and purpose.”

The HRC had particularly harsh words for the U.S.’s indefinite detention policies: “The State party [the U.S.] should ensure that its counter-terrorism measures are in full conformity with the Covenant and in particular that the legislation adopted in this context is limited to crimes that would justify being assimilated to terrorism, and the grave consequences associated with it.”

The Committee reminded the United States of its obligations under the Covenant to both prosecute those responsible for using torture or cruel, inhuman or degrading treatment, and to provide compensation to the victims of such policies:

    The State party should conduct prompt and independent investigations into all allegations concerning suspicious deaths,  torture or cruel, inhuman or degrading treatment or punishment inflicted by its personnel (including commanders) as well as contract employees, in detention facilities in Guantanamo Bay, Afghanistan, Iraq and other overseas locations.  The State party should ensure that those responsible are prosecuted and punished in accordance with the gravity of the crime.  The State party should adopt all necessary measures to prevent the recurrence of such behaviors, in particular by providing adequate training and clear guidance to its personnel (including commanders) and contract employees, about their respective obligations and responsibilities, in line with articles 7 and 10 of the Covenant.  During the course of any legal proceedings, the State party should also refrain from relying on evidence obtained by treatment incompatible with article 7.  The Committee wishes to be informed about the measures taken by the State party to ensure the respect of the right to reparation for the victims.

By dismissing Padilla’s lawsuit, the Fourth Circuit Court has essentially done the opposite of what the UN Human Rights Committee has recommended to bring the U.S. in compliance with the ICCPR regarding its detention policies. The court has ensured, at least for now, that the right of reparations for the victims of U.S. detention and torture policies will remain unrecognized by the United States. It has ensured that the U.S. will remain in violation of its obligations under international law.


2 Conservatives

'Lawfare' Loses Big
The ACLU loses its nasty suit against former defense officials.
WSJ, Jan 28, 2012
http://online.wsj.com/article/SB10001424052970203718504577181191271527180.html

The guerrilla legal campaign against national security suffered a big defeat this week, and the good news deserves more attention. The victory for legal sanity came Monday when the Fourth Circuit Court of Appeals upheld a lower court decision to toss out a suit brought by aspiring terrorist Jose Padilla against a slew of Bush Administration officials.

Readers may remember that Padilla was arrested in 2002 for plotting to set off a dirty bomb on U.S. soil. He was detained as an enemy combatant, convicted in a Miami court and sentenced to 17 years in prison. But Padilla has been adopted as a legal mascot by the ACLU and the National Litigation Project at Yale Law School, which have sued far and wide alleging mistreatment and lack of due process.

Padilla may in fact have had more due process than any defendant in history. His case has been ruled on by no fewer than 10 civilian courts, and as a prisoner in the Navy brig in Charleston, South Carolina, from 2002 to 2006 he received the benefit of protections under the highly disciplined Uniform Code of Military Justice. Your average bank robber should be so lucky.

But the lawyers suing for Padilla aren't interested in justice. They're practicing "lawfare," which is an effort to undermine the war on terror by making U.S. officials afraid to pursue it for fear of personal liability.

The ACLU and the rest of the legal left have failed to persuade several Congresses and two Administrations to agree to their anti-antiterror policies. So instead they're suing former officials in civilian court to harass them and damage their reputations. It's shameful stuff, and if it succeeds it would have the effect of making Pentagon officials look over their shoulder at potential lawsuits every time they had to make a difficult military or interrogation decision.

In Lebron v. Rumsfeld et al., the ACLU sued under the Supreme Court's 1971 Bivens decision, which has been interpreted as creating an implied damages action against individual federal officials for constitutional violations. Their targets included a retinue of Pentagon officials, starting with former Secretary of Defense Donald Rumsfeld and going down to the commander of the Navy brig where Padilla was held. Mr. Rumsfeld doesn't have to worry about getting another job, but the ACLU wants to make lower-level officials politically radioactive so they have a difficult time getting promoted or working in any influential position.

The good news is that the Fourth Circuit's three-judge panel saw this for what it was and unanimously rejected the claims. In his 39-page opinion, the influential Judge J. Harvie Wilkinson wrote that the Constitution gives authority over military affairs to Congress and to the President as Commander in Chief, but it never created a similar role for the courts.

"It takes little enough imagination," Judge Wilkinson wrote, "to understand that a judicially devised damages action would expose past executive deliberations . . . [and] would affect future discussions as well, shadowed as they might be by the thought that those involved would face prolonged civil litigation and potential personal liability."

The decision is especially notable because one of the three judges is Clinton appointee Diana Motz, who has been a skeptic of the Bush Administration's detainee policies and has dissented from her colleagues in cases like 2003's Hamdi v. Rumsfeld.

The ACLU may appeal to all of the Fourth Circuit judges, but Judge Wilkinson's ruling is comprehensive enough that an appeal is unlikely to prevail. The judges deserve credit for understanding that the Constitution gave war powers to the political branches, not to courts. The country will be safer for it.

Friday, January 27, 2012

China's Cyber Thievery Is National Policy—And Must Be Challenged

China's Cyber Thievery Is National Policy—And Must Be Challenged. By MIKE MCCONNELL, MICHAEL CHERTOFF AND WILLIAM LYNN
It is more efficient for the Chinese to steal innovations and intellectual property than to incur the cost and time of creating their own.
WSJ, Jan 27, 2012
http://online.wsj.com/article/SB10001424052970203718504577178832338032176.html

Only three months ago, we would have violated U.S. secrecy laws by sharing what we write here—even though, as a former director of national intelligence, secretary of homeland security, and deputy secretary of defense, we have long known it to be true. The Chinese government has a national policy of economic espionage in cyberspace. In fact, the Chinese are the world's most active and persistent practitioners of cyber espionage today.

Evidence of China's economically devastating theft of proprietary technologies and other intellectual property from U.S. companies is growing. Only in October 2011 were details declassified in a report to Congress by the Office of the National Counterintelligence Executive. Each of us has been speaking publicly for years about the ability of cyber terrorists to cripple our critical infrastructure, including financial networks and the power grid. Now this report finally reveals what we couldn't say before: The threat of economic cyber espionage looms even more ominously.

The report is a summation of the catastrophic impact cyber espionage could have on the U.S. economy and global competitiveness over the next decade. Evidence indicates that China intends to help build its economy by intellectual-property theft rather than by innovation and investment in research and development (two strong suits of the U.S. economy). The nature of the Chinese economy offers a powerful motive to do so.

According to 2009 estimates by the United Nations, China has a population of 1.3 billion, with 468 million (about 36% of the population) living on less than $2 a day. While Chinese poverty has declined dramatically in the last 30 years, income inequality has increased, with much greater benefits going to the relatively small portion of educated people in urban areas, where about 25% of the population lives.

The bottom line is this: China has a massive, inexpensive work force ravenous for economic growth. It is much more efficient for the Chinese to steal innovations and intellectual property—the source code of advanced economies—than to incur the cost and time of creating their own. They turn those stolen ideas directly into production, creating products faster and cheaper than the U.S. and others.

Cyberspace is an ideal medium for stealing intellectual capital. Hackers can easily penetrate systems that transfer large amounts of data, while corporations and governments have a very hard time identifying specific perpetrators.

Unfortunately, it is also difficult to estimate the economic cost of these thefts to the U.S. economy. The report to Congress calls the cost "large" and notes that this includes corporate revenues, jobs, innovation and impacts to national security. Although a rigorous assessment has not been done, we think it is safe to say that "large" easily means billions of dollars and millions of jobs.

So how to protect ourselves from this economic threat? First, we must acknowledge its severity and understand that its impacts are more long-term than immediate. And we need to respond with all of the diplomatic, trade, economic and technological tools at our disposal.

The report to Congress notes that the U.S. intelligence community has improved its collaboration to better address cyber espionage in the military and national-security areas. Yet today's legislative framework severely restricts us from fully addressing domestic economic espionage. The intelligence community must gain a stronger role in collecting and analyzing this economic data and making it available to appropriate government and commercial entities.

Congress and the administration must also create the means to actively force more information-sharing. While organizations (both in government and in the private sector) claim to share information, the opposite is usually the case, and this must be actively fixed.

The U.S. also must make broader investments in education to produce many more workers with science, technology, engineering and math skills. Our country reacted to the Soviet Union's 1957 launch of Sputnik with investments in math and science education that launched the age of digital communications. Now is the time for a similar approach to build the skills our nation will need to compete in a global economy vastly different from 50 years ago.

Corporate America must do its part, too. If we are to ever understand the extent of cyber espionage, companies must be more open and aggressive about identifying, acknowledging and reporting incidents of cyber theft. Congress is considering legislation to require this, and the idea deserves support. Companies must also invest more in enhancing their employees' cyber skills; it is shocking how many cyber-security breaches result from simple human error such as coding mistakes or lost discs and laptops.

In this election year, our economy will take center stage, as will China and its role in issues such as monetary policy. If we are to protect ourselves against irreversible long-term damage, the economic issues behind cyber espionage must share some of that spotlight.

Mr. McConnell, a retired Navy vice admiral and former director of the National Security Agency (1992-96) and director of national intelligence (2007-09), is vice chairman of Booz Allen Hamilton. Mr. Chertoff, a former secretary of homeland security (2005-09), is senior counsel at Covington & Burling. Mr. Lynn has served as deputy secretary of defense (2009-11) and undersecretary of defense (1997-2001).

Thursday, January 26, 2012

Sovereign Risk, Fiscal Policy, and Macroeconomic Stability

Sovereign Risk, Fiscal Policy, and Macroeconomic Stability. By Giancarlo Corsetti, Keith Kuester, Andre Meier, and Gernot J. Mueller
IMF Working Paper No. 12/33
January, 2012
http://www.imf.org/external/pubs/cat/longres.aspx?sk=25681.0

Abstract
This paper analyzes the impact of strained government finances on macroeconomic stability and the transmission of fiscal policy. Using a variant of the model by Curdia and Woodford (2009), we study a “sovereign risk channel” through which sovereign default risk raises funding costs in the private sector. If monetary policy is constrained, the sovereign risk channel exacerbates indeterminacy problems: private-sector beliefs of a weakening economy may become self-fulfilling. In addition, sovereign risk amplifies the effects of negative cyclical shocks. Under those conditions, fiscal retrenchment can help curtail the risk of macroeconomic instability and, in extreme cases, even stimulate economic activity.

Conclusion
The present paper analyzes how the “sovereign risk channel” affects macroeconomic dynamics and stabilization policy. Through this channel, rising sovereign risk drives up private-sector borrowing costs, unless higher risk premia are offset by looser monetary policy. If the central bank is constrained in counteracting higher risk premia, sovereign risk becomes a critical determinant of macroeconomic outcomes. Its implications for stabilization policy have not been fully appreciated in earlier formal analyses, although they are likely to be of great importance for many advanced economies currently facing intense fiscal strain.

Building on the model proposed by Cúrdia and Woodford (2009), we show that the sovereign risk channel makes the economy (more) vulnerable to problems of indeterminacy. In particular, private-sector beliefs about a weakening economy can become self-fulfilling, driving up risk premia and choking off demand. In this environment, a procyclical fiscal stance—that is, tighter fiscal policy during economic downturns—can help to ensure determinacy.

Further, we find that sovereign risk tends to exacerbate the effects of negative cyclical shocks: recessionary episodes will be deeper the stronger the sovereign risk channel, which in our specification is a nonlinear function of public-sector indebtedness. Moreover, in deep recessions that force the central bank down to the zero lower bound (ZLB) for nominal interest rates, sovereign risk delays the exit from the ZLB, hence prolonging macroeconomic distress.

The sovereign risk channel also has a significant bearing on fiscal multipliers. Specifically, the effect of government spending on aggregate output hinges on (i) the responsiveness of private-sector risk premia to indicators of fiscal strain; and (ii) the length of time during which monetary policy is expected to be constrained. Our analysis suggests that upfront fiscal retrenchment is less detrimental to economic activity (i.e., multipliers are smaller) in the presence of significant sovereign risk, as lower public deficits improve private-sector financing conditions. In relatively extreme cases where fiscal strains are severe and monetary policy is constrained for an extended period, fiscal tightening may even exert an expansionary effect.

That being said, fiscal retrenchment is no miracle cure. Indeed, all our simulations feature a deep recession even if tighter fiscal policy, under the aforementioned conditions, may stimulate economic activity relative to an even bleaker baseline.
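
To fix ideas, here is a minimal numerical sketch of such a channel, assuming a purely illustrative convex spread schedule; the threshold, slope, curvature and full pass-through below are hypothetical choices, not the paper's calibration. Once debt passes the threshold the premium rises steeply, and with the policy rate pinned at zero the whole spread feeds into private borrowing costs.

    # Illustrative sketch of a "sovereign risk channel" with hypothetical parameters
    # (not the calibration of Corsetti, Kuester, Meier and Mueller).

    def sovereign_spread(debt_to_gdp, threshold=0.9, slope=0.5, curvature=3.0):
        """Risk premium (decimal, per year) rising nonlinearly above a debt threshold."""
        excess = max(debt_to_gdp - threshold, 0.0)
        return slope * excess ** curvature

    def private_borrowing_rate(policy_rate, debt_to_gdp, passthrough=1.0):
        """Private funding cost = policy rate (floored at zero) + pass-through of the spread."""
        return max(policy_rate, 0.0) + passthrough * sovereign_spread(debt_to_gdp)

    # With the policy rate stuck at the zero lower bound, the spread is not offset.
    for debt in (0.8, 1.0, 1.1, 1.2, 1.3):
        rate = private_borrowing_rate(policy_rate=0.0, debt_to_gdp=debt)
        print(f"debt/GDP = {debt:.0%}  ->  private borrowing rate = {rate:.2%}")

The convexity is the point of the exercise: near the threshold, small revisions to expected debt move spreads a lot, which is what gives private-sector beliefs their potentially self-fulfilling character.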

As an additional caveat, we note that our analysis has focused on fiscal multipliers under a go-it-alone policy that does not involve external financial support at below-market rates. Availability of such support could allow countries to stretch out the necessary fiscal adjustment as they benefit from lower funding costs and, possibly, positive credibility effects. Indeed, if and where announcements of future fiscal adjustment are credible, delaying some of the planned spending cuts remains a superior strategy in terms of protecting short-term growth. How countries end up dealing with the challenges summarized here may prove to be a defining feature of global economic developments over the coming years.

Bank Funding Structures and Risk: Evidence from the Global Financial Crisis

Bank Funding Structures and Risk: Evidence from the Global Financial Crisis. By Francisco Vazquez and Pablo Federico
IMF Working Paper WP/12/29
http://www.imfbookstore.org/ProdDetails.asp?ID=WPIEA2012029
Jan, 2012

Summary: This paper analyzes the evolution of bank funding structures in the run up to the global financial crisis and studies the implications for financial stability, exploiting a bank-level dataset that covers about 11,000 banks in the U.S. and Europe during 2001–09. The results show that banks with weaker structural liquidity and higher leverage in the pre-crisis period were more likely to fail afterward. The likelihood of bank failure also increases with bank risk-taking. In the cross-section, the smaller domestically-oriented banks were relatively more vulnerable to liquidity risk, while the large cross-border banks were more susceptible to solvency risk due to excessive leverage. The results support the proposed Basel III regulations on structural liquidity and leverage, but suggest that emphasis should be placed on the latter, particularly for the systemically-important institutions. Macroeconomic and monetary conditions are also shown to be related to the likelihood of bank failure, providing a case for the introduction of a macro-prudential approach to banking regulation.


Introduction
The global financial crisis raised questions on the adequacy of bank risk management practices and triggered a deep revision of the regulatory and supervisory frameworks governing bank liquidity risk and capital buffers. Regulatory initiatives at the international level included, inter alia, the introduction of liquidity standards for internationally-active banks, binding leverage ratios, and a revision of capital requirements under Basel III (BCBS 2009; and BCBS 2010 a, b).2 In addition to these micro-prudential measures, academics and policymakers argued for the introduction of a complementary macro-prudential framework to help safeguard financial stability at the systemic level (Hanson, Kashyap and Stein, 2010).

This regulatory response was implicitly based on two premises. First, the view that individual bank decisions regarding the size of their liquidity and capital buffers in the run up to the crisis were not commensurate with their risk-taking—and were therefore suboptimal from the social perspective. Second, the perception that the costs of bank failures spanned beyond the interests of their direct stakeholders due, for example, to supply-side effects in credit markets, or network externalities in the financial sector (Brunnermeier, 2009).

The widespread bank failures in the U.S. and Europe at the peak of the global financial crisis provided casual support to the first premise. Still, empirical work on the connection between bank liquidity and capital buffers and their subsequent probability of failure is incipient. Background studies carried out in the context of Basel III proposals, which are based on aggregate data, concluded that stricter regulations on liquidity and leverage were likely to reduce the probability of systemic banking crises (BCBS, 2010b).3 In turn, studies based on micro data for U.S. banks also support the notion that banks with higher asset liquidity, stronger reliance on retail insured deposits, and larger capital buffers were less vulnerable to failure during the global financial crisis (Berger and Bouwman, 2010; Bologna, 2011). Broadly consistent results are reported in Ratnovski and Huang (2009), based on data for large banks from the OECD.

This paper makes two contributions to previous work. First, it measures structural liquidity and leverage in bank balance sheets in a way consistent with the formulations of the Net Stable Funding Ratio (NSFR) and the leverage ratio (EQUITY) proposed in Basel III. Second, it explores systematic differences in the relationship between structural liquidity, leverage, and subsequent probability of failure across bank types. In particular, we distinguish between large, internationally-active banks (henceforth Global banks), and (typically smaller) banks that focus on their domestic retail markets (henceforth Domestic banks).
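
As a rough illustration of what these two balance-sheet measures capture, the sketch below computes an NSFR-style ratio and a simple leverage ratio from stylized balance-sheet items. The weights and figures are invented for the example; they are neither the Basel III factors nor the authors' exact proxy.

    # Simplified sketch of an NSFR-style structural liquidity measure and a leverage
    # ratio. Weights and figures are illustrative only, not the Basel III factors or
    # the exact proxy used in the paper.

    def nsfr_proxy(equity, retail_deposits, wholesale_lt, wholesale_st,
                   cash, gov_securities, loans, other_assets):
        # Available stable funding: equity, long-term and retail funding count (almost)
        # fully; short-term wholesale funding only partially.
        asf = equity + 0.9 * retail_deposits + wholesale_lt + 0.5 * wholesale_st
        # Required stable funding: illiquid assets need more stable funding than
        # cash or government securities.
        rsf = 0.0 * cash + 0.05 * gov_securities + 0.85 * loans + 1.0 * other_assets
        return asf / rsf

    def leverage_ratio(equity, total_assets):
        return equity / total_assets  # the "EQUITY" measure: capital per unit of assets

    # Stylized bank balance sheet (figures in billions; assets = liabilities + equity)
    bank = dict(equity=8, retail_deposits=60, wholesale_lt=10, wholesale_st=25,
                cash=5, gov_securities=15, loans=70, other_assets=13)
    total_assets = bank["cash"] + bank["gov_securities"] + bank["loans"] + bank["other_assets"]

    print("NSFR proxy:", round(nsfr_proxy(**bank), 2))                                # ~1.15; above 1 is "stable"
    print("Leverage ratio:", round(leverage_ratio(bank["equity"], total_assets), 3))  # ~0.078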

This sample partition is suitable from the financial stability perspective. Global banks are systemically important and extremely challenging to resolve, due to the complexity of their business and legal structures, and because their operations span across borders, entailing differences in bank insolvency frameworks and difficult fiscal considerations. Furthermore, the relative role of liquidity and capital buffers for bank financial soundness is likely to differ systematically across these two types of banks. All else equal, Global banks benefit from the imperfect co-movement of macroeconomic and monetary conditions across geographic regions (Griffith-Jones, Segoviano, and Spratt, 2002; Garcia-Herrero and Vazquez, 2007) and may exploit their internal capital markets to reshuffle liquidity and capital between business units. In addition, Global banks tend to enjoy a more stable funding base than Domestic banks due to flight to safety, particularly during times of market distress. To the extent that these factors are incorporated in bank risk management decisions, optimal choices on structural liquidity and leverage are likely to differ across these two types of banks.

The paper exploits a bank-level dataset that covers about 11,000 U.S. and European banks during 2001–09. This sample coverage allows us to study bank dynamics leading to, and during, the global financial crisis. As a by-product, we document the evolution of structural liquidity and leverage in the pre-crisis period, and highlight some patterns across bank types to motivate further research. Contrary to expectations, the average structural liquidity in bank balance sheets in the run up to the global financial crisis (as measured by a proxy of the NSFR) was close to the target values proposed in Basel III recommendations.4 However, we find a wide dispersion in structural liquidity across banks. A mild (albeit sustained) increase in structural liquidity mismatches in the run up to the crisis was driven by banks located at the lower extreme of the distribution. Pre-crisis leverage was also widely uneven across banks, with the Global banks displaying thinner capital buffers and wider gaps between their leverage ratios and their Basel capital-to-risk-weighted-assets ratios.

In line with alleged deficiencies in bank risk management practices, we find that banks with weaker structural liquidity and banks with higher leverage ratios in the run up to the crisis were more vulnerable to failure, after controlling for their pre-crisis risk-taking. However, the average effects of stronger structural liquidity and capital buffers on the likelihood of bank failure are not large. On the other hand, there is evidence of substantial threshold effects, and the benefits of stronger buffers appear substantial for the banks located at the lower extremes of the distributions. In addition, we find systematic differences in the relative importance of liquidity and leverage for financial fragility across groups of banks. Global banks were more susceptible to failure on excessive leverage, while Domestic banks were more susceptible to failure on weak structural liquidity (i.e., excessive liquidity transformation) and overreliance on short-term wholesale funding. 

In the estimations, we include bank-level controls for pre-crisis risk-taking, and for country-specific macroeconomic conditions (i.e., common to all banks incorporated in a given country). The use of controls for pre-crisis risk-taking is critical to this study. To the extent that banks perform active risk management, higher risk-taking would tend to be associated with stronger liquidity and capital buffers, introducing a bias to the results. In fact, we find that banks engaging in more aggressive risk-taking in the run-up to the crisis—as measured by the rate of growth of their credit portfolios and by their pre-crisis distance to default—were more likely to fail afterward. Macroeconomic conditions in the pre-crisis period are also found to affect bank probabilities of default, suggesting that banks may have failed to internalize risks stemming from overheated economic activity and exuberant asset prices.
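
In stripped-down form, the failure regressions described here amount to something like a cross-sectional logit of a failure indicator on pre-crisis liquidity, leverage and risk-taking controls. The sketch below uses simulated data and hypothetical variable names; the paper's actual specification, sample split and controls are considerably richer.

    # Stripped-down sketch of a bank-failure logit. All data are simulated and the
    # variable names are hypothetical; the paper's specification, thresholds and
    # Global/Domestic sample split are richer.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 2000
    df = pd.DataFrame({
        "nsfr": rng.normal(1.0, 0.2, n),             # pre-crisis structural liquidity proxy
        "leverage": rng.normal(0.08, 0.03, n),       # equity / total assets
        "credit_growth": rng.normal(0.10, 0.08, n),  # pre-crisis risk-taking control
        "gdp_growth": rng.normal(0.03, 0.02, n),     # country-level macro control
    })

    # Simulate failures so that weaker liquidity and thinner capital raise the odds.
    index = (-3.0 - 2.0 * (df["nsfr"] - 1.0) - 25.0 * (df["leverage"] - 0.08)
             + 3.0 * df["credit_growth"] - 10.0 * df["gdp_growth"])
    df["failed"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-index)))

    X = sm.add_constant(df[["nsfr", "leverage", "credit_growth", "gdp_growth"]])
    result = sm.Logit(df["failed"], X).fit(disp=False)
    print(result.summary())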

All in all, these results provide support to the proposed regulations on liquidity and capital, as well as to the introduction of a macro-prudential approach to bank regulation. From the financial stability perspective, however, the evidence indicates that regulations on capital—particularly for the larger banking groups—are likely to be more relevant.

Concluding remarks
Overall, the findings of this paper provide broad support to Basel III initiatives on structural liquidity and leverage, and show the complementary nature of these two areas. Banks with weaker structural liquidity and higher leverage before the global financial crisis were more vulnerable to subsequent failure. The results are driven by banks in the lower extremes of the distributions, suggesting the presence of threshold effects. In fact, the marginal stability gains associated with stronger liquidity and capital cushions do not appear to be large for the average bank, but seem substantial for the weaker institutions.

At the same time, there is evidence of systematic differences across bank types. The smaller banks were more susceptible to failure on liquidity problems, while the large cross-border banking groups typically failed on insufficient capital buffers. This difference is crucial from the financial stability perspective, and implies that regulatory and supervisory emphasis should be placed on ensuring that the capital buffers of the systemically important banks are commensurate with their risk-taking.

The evidence also indicates that bank risk-taking in the run-up to the crisis was associated with increased financial vulnerability, suggesting that bank decisions regarding the associated liquidity and capital buffers were not commensurate with the underlying risks, resulting in excessive hazard to their business continuity. Country-specific macroeconomic conditions also played a role in the likelihood of subsequent bank failure, implying that banks failed to properly internalize the associated risks in their individual decision-making processes. Thus, while more intrusive regulations entail efficiency costs, the results point to associated gains in financial stability that have to be weighed against those costs. This also supports the introduction of a macro-prudential framework as a complement to the traditional micro-prudential approach. In this regard, further work is needed to deepen the understanding of the role of the macroeconomic environment in financial stability.

Wednesday, January 25, 2012

No More Résumés, Say Some Firms

No More Résumés, Say Some Firms. By RACHEL EMMA SILVERMAN
WSJ, Jan 25, 2012
http://online.wsj.com/article/SB10001424052970203750404577173031991814896.html

Union Square Ventures recently posted an opening for an investment analyst.

Instead of asking for résumés, the New York venture-capital firm—which has invested in Twitter, Foursquare, Zynga and other technology companies—asked applicants to send links representing their "Web presence," such as a Twitter account or Tumblr blog. Applicants also had to submit short videos demonstrating their interest in the position.

Union Square says its process nets better-quality candidates—especially for a venture-capital operation that invests heavily in the Internet and social media—and the firm plans to use it going forward to fill analyst positions and other jobs.

Companies are increasingly relying on social networks such as LinkedIn, video profiles and online quizzes to gauge candidates' suitability for a job. While most still request a résumé as part of the application package, some are bypassing the staid requirement altogether.

A résumé doesn't provide much depth about a candidate, says Christina Cacioppo, an associate at Union Square Ventures who blogs about the hiring process on the company's website and was herself hired after she compiled a profile comprising her personal blog, Twitter feed, LinkedIn profile, and links to social-media sites Delicious and Dopplr, which showed places where she had traveled.


"We are most interested in what people are like, what they are like to work with, how they think," she says.

John Fischer, founder and owner of StickerGiant.com, a Hygiene, Colo., company that makes bumper and marketing stickers, says a résumé isn't the best way to determine whether a potential employee will be a good social fit for the company. Instead, his firm uses an online survey to help screen applicants.

Questions are tailored to the position. A current opening for an Adobe Illustrator expert asks applicants about their skills, but also asks questions such as "What is your ideal dream job?" and "What is the best job you've ever had?" Applicants have the option to attach a résumé, but it isn't required. Mr. Fischer says he started using online questionnaires several years ago, after receiving too many résumés from candidates who had no qualifications or interest. Having applicants fill out surveys is a "self-filter," he says.

A previous posting for an Internet marketing position had applicants rate their marketing and social-media skills on a scale of one to 10 and select from a list of words how friends or co-workers would describe them. Options included: high energy, type-A, laid back, perfect, creative or fun.

In times of high unemployment, bypassing résumés can also help companies winnow out candidates from a broader labor pool.

IGN Entertainment Inc., a gaming and media firm, launched a program dubbed Code Foo, in which it taught programming skills to passionate gamers with little experience, paying participants while they learned. Instead of asking for résumés, the firm posted a series of challenges on its website aimed at gauging candidates' thought processes. (One challenge: Estimate how many pennies lined side by side would span the Golden Gate Bridge.)
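
That penny challenge is a classic back-of-envelope exercise. Using approximate figures (a penny is about 19.05 mm across; the bridge runs roughly 2,737 meters end to end, with a main span of about 1,280 meters), a quick sketch gives an answer of roughly 140,000 pennies for the full length:

    # Back-of-envelope answer to the "pennies across the Golden Gate Bridge" challenge.
    # Approximate figures: penny diameter ~19.05 mm; total bridge length ~2,737 m
    # (about 1.7 miles); main span ~1,280 m.
    PENNY_DIAMETER_M = 0.01905
    BRIDGE_TOTAL_M = 2737
    MAIN_SPAN_M = 1280

    print("Whole bridge:", round(BRIDGE_TOTAL_M / PENNY_DIAMETER_M), "pennies")  # ~144,000
    print("Main span only:", round(MAIN_SPAN_M / PENNY_DIAMETER_M), "pennies")   # ~67,000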

It also asked candidates to submit a video demonstrating their love of gaming and the firm's products.

IGN is a unit of News Corp., which also owns The Wall Street Journal.

Nearly 30 people out of about 100 applicants were picked for the six-week Code Foo program, and six were eventually hired full-time. Several of the hires were nontraditional applicants who didn't attend college or who had thin work experience.

"If we had just looked at their résumés at the moment we wouldn't have hired them," says Greg Silva, IGN's vice president of people and places. The company does require résumés for its regular job openings.

At most companies, résumés are still the first step of the recruiting process, even at supposedly nontraditional places like Google Inc., which hired about 7,000 people in 2011, after receiving some two million résumés. Google has an army of "hundreds" of recruiters who actually read every one, says Todd Carlisle, the technology firm's director of staffing.

But Dr. Carlisle says he reads résumés in an unusual way: from the bottom up.

Candidates' early work experience, hobbies, extracurricular activities or nonprofit involvement—such as painting houses to pay for college or touring with a punk rock band through Europe—often provide insight into how well an applicant would fit into the company culture, Dr. Carlisle says.

Plus, "It's the first sample of work we have of yours," he says.