Wednesday, May 27, 2009

Nostalgianomics: income inequality, accuracy and passion

Nostalgianomics, by Brink Lindsey
Cato Institute: Commentary
May 26, 2009

"The America I grew up in was a relatively equal middle-class society. Over the past generation, however, the country has returned to Gilded Age levels of inequality." So sighs Paul Krugman, the Nobel Prize-winning Princeton economist and New York Times columnist, in his recent book The Conscience of a Liberal.

The sentiment is nothing new. Political progressives such as Krugman have been decrying increases in income inequality for many years now. But Krugman has added a novel twist, one that has important implications for public policy and economic discourse in the age of Obama. In seeking explanations for the widening spread of incomes during the last four decades, researchers have focused overwhelmingly on broad structural changes in the economy, such as technological progress and demographic shifts. Krugman argues that these explanations are insufficient. "Since the 1970s," he writes, "norms and institutions in the United States have changed in ways that either encouraged or permitted sharply higher inequality. Where, however, did the change in norms and institutions come from? The answer appears to be politics."

To understand Krugman's argument, we can't start in the 1970s. We have to back up to the 1930s and '40s—when, he contends, the "norms and institutions" that shaped a more egalitarian society were created. "The middle-class America of my youth," Krugman writes, "is best thought of not as the normal state of our society, but as an interregnum between Gilded Ages. America before 1930 was a society in which a small number of very rich people controlled a large share of the nation's wealth." But then came the twin convulsions of the Great Depression and World War II, and the country that arose out of those trials was a very different place. "Middle-class America didn't emerge by accident. It was created by what has been called the Great Compression of incomes that took place during World War II, and sustained for a generation by social norms that favored equality, strong labor unions and progressive taxation."

The Great Compression is a term coined by the economists Claudia Goldin of Harvard and Robert Margo of Boston University to describe the dramatic narrowing of the nation's wage structure during the 1940s. The real wages of manufacturing workers jumped 67 percent between 1929 and 1947, while the top 1 percent of earners saw a 17 percent drop in real income. These egalitarian trends can be attributed to the exceptional circumstances of the period: precipitous declines at the top end of the income spectrum due to economic cataclysm; wartime wage controls that tended to compress wage rates; rapid growth in the demand for low-skilled labor, combined with the labor shortages of the war years; and rapid growth in the relative supply of skilled workers due to a near doubling of high school graduation rates.

Yet the return to peacetime and prosperity did not result in a shift back toward the status quo ante. The more egalitarian income structure persisted for decades. For an explanation, Krugman leans heavily on a 2007 paper by the Massachusetts Institute of Technology economists Frank Levy and Peter Temin, who argue that postwar American history has been a tale of two widely divergent systems of political economy. First came the "Treaty of Detroit," characterized by heavy unionization of industry, steeply progressive taxation, and a high minimum wage. Under that system, median wages kept pace with the economy's overall productivity growth, and incomes at the lower end of the scale grew faster than those at the top. Beginning around 1980, though, the Treaty of Detroit gave way to the free market "Washington Consensus." Tax rates on high earners fell sharply, the real value of the minimum wage declined, and private-sector unionism collapsed. As a result, most workers' incomes failed to share in overall productivity gains while the highest earners had a field day.

This revisionist account of the fall and rise of income inequality is being echoed daily in today's public policy debates. Under the conventional view, rising inequality is a side effect of economic progress—namely, continuing technological breakthroughs, especially in communications and information technology. Consequently, when economists have supported measures to remedy inequality, they have typically shied away from structural changes in market institutions. Rather, they have endorsed more income redistribution to reduce post-tax income differences, along with remedial education, job retraining, and other programs designed to raise the skill levels of lower-paid workers.

By contrast, Krugman sees the rise of inequality as a consequence of economic regress—in particular, the abandonment of well-designed economic institutions and healthy social norms that promoted widely shared prosperity. Such an assessment leads to the conclusion that we ought to revive the institutions and norms of Paul Krugman's boyhood, in broad spirit if not in every detail.

There is good evidence that changes in economic policies and social norms have indeed contributed to a widening of the income distribution since the 1970s. But Krugman and other practitioners of nostalgianomics are presenting a highly selective account of what the relevant policies and norms were and how they changed.

The Treaty of Detroit was built on extensive cartelization of markets, limiting competition to favor producers over consumers. The restrictions on competition were buttressed by racial prejudice, sexual discrimination, and postwar conformism, which combined to limit the choices available to workers and potential workers alike. Those illiberal social norms were finally swept aside in the cultural tumults of the 1960s and '70s. And then, in the 1970s and '80s, restraints on competition were substantially reduced as well, to the applause of economists across the ideological spectrum. At least until now.


Stifled Competition

The economic system that emerged from the New Deal and World War II was markedly different from the one that exists today. The contrast between past and present is sharpest when we focus on one critical dimension: the degree to which public policy either encourages or thwarts competition.

The transportation, energy, and communications sectors were subject to pervasive price and entry regulation in the postwar era. Railroad rates and service had been under federal control since the Interstate Commerce Act of 1887, but the Motor Carrier Act of 1935 extended the Interstate Commerce Commission's regulatory authority to cover trucking and bus lines as well. In 1938 airline routes and fares fell under the control of the Civil Aeronautics Authority, later known as the Civil Aeronautics Board. After the discovery of the East Texas oil field in 1930, the Texas Railroad Commission acquired the effective authority to regulate the nation's oil production. Starting in 1938, the Federal Power Commission regulated rates for the interstate transmission of natural gas. The Federal Communications Commission, created in 1934, allocated licenses to broadcasters and regulated phone rates.

Beginning with the Agricultural Adjustment Act of 1933, prices and production levels on a wide variety of farm products were regulated by a byzantine complex of controls and subsidies. High import tariffs shielded manufacturers from international competition. And in the retail sector, aggressive discounting was countered by state-level "fair trade laws," which allowed manufacturers to impose minimum resale prices on nonconsenting distributors.

Comprehensive regulation of the financial sector restricted competition in capital markets too. The McFadden Act of 1927 added a federal ban on interstate branch banking to widespread state-level restrictions on intrastate branching. The Glass-Steagall Act of 1933 erected a wall between commercial and investment banking, effectively brokering a market-sharing agreement protecting commercial and investment banks from each other. Regulation Q, instituted in 1933, prohibited interest payments on demand deposits and set interest rate ceilings for time deposits. Provisions of the Securities Act of 1933 limited competition in underwriting by outlawing pre-offering solicitations and undisclosed discounts. These and other restrictions artificially stunted the depth and development of capital markets, muting the intensity of competition throughout the larger "real" economy. New entrants are much more dependent on a well-developed financial system than are established firms, since incumbents can self-finance through retained earnings or use existing assets as collateral. A hobbled financial sector acts as a barrier to entry and thereby reduces established firms' vulnerability to competition from entrepreneurial upstarts.

The highly progressive tax structure of the early postwar decades further dampened competition. The top marginal income tax rate shot up from 25 percent to 63 percent under Herbert Hoover in 1932, climbed as high as 94 percent during World War II, and stayed at 91 percent during most of the 1950s and early '60s. Research by the economists William Gentry of Williams College and Glenn Hubbard of Columbia University has found that such rates act as a "success tax," discouraging employees from striking out as entrepreneurs.

Finally, competition in labor markets was subject to important restraints during the early postwar decades. The triumph of collective bargaining meant the active suppression of wage competition in a variety of industries. In the interest of boosting wages, unions sometimes worked to restrict competition in their industries' product markets as well. Garment unions connived with trade associations to set prices and allocate production among clothing makers. Coal miner unions attempted to regulate production by dictating how many days a week mines could be open.

MIT economists Levy and Temin don't mention it, but highly restrictive immigration policies were another significant brake on labor market competition. With the establishment of country-specific immigration quotas under the Immigration Act of 1924, the foreign-born share of the U.S. population plummeted from 13 percent in 1920 to 5 percent by 1970. As a result, competition at the less-skilled end of the U.S. labor market was substantially reduced.


Solidarity and Chauvinism

The anti-competitive effects of the Treaty of Detroit were reinforced by the prevailing social norms of the early postwar decades. Here Krugman and company focus on executive pay. Krugman quotes wistfully from John Kenneth Galbraith's characterization of the corporate elite in his 1967 book The New Industrial State: "Management does not go out ruthlessly to reward itself—a sound management is expected to exercise restraint." According to Krugman, "For a generation after World War II, fear of outrage kept executive salaries in check. Now the outrage is gone. That is, the explosion in executive pay represents a social change…like the sexual revolution of the 1960's—a relaxation of old strictures, a new permissiveness, but in this case the permissiveness is financial rather than sexual."

Krugman is on to something. But changing attitudes about lavish compensation packages are just one small part of a much bigger cultural transformation. During the early postwar decades, the combination of in-group solidarity and out-group hostility was much more pronounced than what we're comfortable with today.

Consider, first of all, the dramatic shift in attitudes about race. Open and unapologetic discrimination by white Anglo-Saxon Protestants against other ethnic groups was widespread and socially acceptable in the America of Paul Krugman's boyhood. How does racial progress affect income inequality? Not the way we might expect. The most relevant impact might have been that more enlightened attitudes about race encouraged a reversal in the nation's restrictive immigration policies. The effect was to increase the number of less-skilled workers and thereby intensify competition among them for employment.

Under the system that existed between 1924 and 1965, immigration quotas were set for each country based on the percentage of people with that national origin already living in the U.S. (with immigration from East and South Asia banned outright until 1952). The explicit purpose of the national-origin quotas was to freeze the ethnic composition of the United States—that is, to preserve white Protestant supremacy and protect the country from "undesirable" races. "Unquestionably, there are fine human beings in all parts of the world," Sen. Robert Byrd (D-W.V.) said in defense of the quota system in 1965, "but people do differ widely in their social habits, their levels of ambition, their mechanical aptitudes, their inherited ability and intelligence, their moral traditions, and their capacity for maintaining stable governments."

But the times had passed the former Klansman by. With the triumph of the civil rights movement, official discrimination based on national origin was no longer sustainable. Just two months after signing the Voting Rights Act, President Lyndon Johnson signed the Immigration and Nationality Act of 1965, ending the "un-American" system of national-origin quotas and its "twin barriers of prejudice and privilege." The act inaugurated a new era of mass immigration: the foreign-born share of the U.S. population has surged from 5 percent in 1970 to 12.5 percent as of 2006.

This wave of immigration exerted a mild downward pressure on the wages of native-born low-skilled workers, with most estimates showing a small effect. Immigration's more dramatic impact has been on measured inequality: by swelling the ranks of less-skilled workers, it depresses average wages at the low end of the income distribution and thereby widens the apparent income spread. According to the American University economist Robert Lerman, excluding recent immigrants from the analysis would eliminate roughly 30 percent of the increase in adult male annual earnings inequality between 1979 and 1996.

Although the large influx of unskilled immigrants has made American inequality statistics look worse, it has actually reduced inequality for the people involved. After all, immigrants experience large wage gains as a result of relocating to the United States, thereby reducing the cumulative wage gap between them and top earners in this country. When Lerman recalculated trends in inequality to include, at the beginning of the period, recent immigrants and their native-country wages, he found that inequality had decreased rather than increased. Immigration has increased inequality at home but decreased it on a global scale.
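A stylized calculation makes this composition effect concrete. The sketch below is my own illustration with invented wage numbers, not Lerman's data or method; it uses the coefficient of variation as a crude stand-in for his inequality measures.

```python
# Toy illustration of the composition effect behind Lerman's finding.
# All wage figures are invented for illustration.
import statistics

def cv(xs):
    """Coefficient of variation: a crude, scale-free inequality gauge."""
    return statistics.pstdev(xs) / statistics.mean(xs)

natives_t0 = [20, 40, 60, 80]   # native wages at the start of the period
natives_t1 = [22, 44, 66, 88]   # same relative spread after uniform 10% growth

immigrants_home = [5, 8]        # the immigrants' wages back home, pre-migration
immigrants_us = [12, 15]        # their (higher) U.S. wages at the end of the period

# Conventional measurement: immigrants show up only in the end-period data,
# so adding them at the bottom makes U.S. inequality look like it rose.
print(cv(natives_t0), cv(natives_t1 + immigrants_us))                  # ~0.45 -> ~0.68

# Lerman-style recalculation: count the immigrants at the start too, at their
# native-country wages. Their move to higher U.S. pay now narrows the spread.
print(cv(natives_t0 + immigrants_home), cv(natives_t1 + immigrants_us))  # ~0.77 -> ~0.68
```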

Just as racism helped to keep foreign-born workers out of the U.S. labor market, another form of in-group solidarity, sexism, kept women out of the paid work force. As of 1950, the labor force participation rate for women 16 and older stood at only 34 percent. By 1970 it had climbed to 43 percent, and as of 2005 it had jumped to 59 percent. Meanwhile, the range of jobs open to women expanded enormously.

Paradoxically, these gains for gender equality widened rather than narrowed income inequality overall. Because of the prevalence of "assortative mating"—the tendency of people to choose spouses with similar educational and socioeconomic backgrounds—the rise in dual-income couples has exacerbated household income inequality: Now richer men are married to richer wives. Between 1979 and 1996, the proportion of working-age men with working wives rose by approximately 25 percent among those in the top fifth of the male earnings distribution, and their wives' total earnings rose by over 100 percent. According to a 1999 estimate by Gary Burtless of the Brookings Institution, this unanticipated consequence of feminism explains about 13 percent of the total rise in income inequality since 1979.
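The arithmetic of assortative mating is worth seeing directly. Here is a minimal sketch with made-up earnings, not Burtless's data: the individual incomes are identical in both scenarios, yet household inequality depends entirely on who marries whom.

```python
# How assortative mating widens household income inequality.
# Earnings figures are hypothetical.
import statistics

men = [30, 50, 80, 120]     # individual earnings, in thousands
women = [20, 40, 60, 100]

# Assortative pairing: top-earning man with top-earning woman, and so on down.
assortative = [m + w for m, w in zip(sorted(men), sorted(women))]

# Fully mixed pairing: top-earning man with the lowest-earning woman, etc.
mixed = [m + w for m, w in zip(sorted(men), sorted(women, reverse=True))]

# Same individuals, same total income; only the pairing differs.
print(statistics.pstdev(assortative))   # ~63.4: wide spread across households
print(statistics.pstdev(mixed))         # ~11.2: much narrower spread
```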

Racism and sexism are ancient forms of group identity. Another form, more in line with what Krugman has in mind, was a distinctive expression of U.S. economic and social development in the middle decades of the 20th century. The journalist William Whyte described this "social ethic" in his 1956 book The Organization Man, outlining a sensibility that defined itself in studied contrast to old-style "rugged individualism." When contemporary critics scorned the era for its conformism, they weren't just talking about the ranch houses and gray flannel suits. The era's mores placed an extraordinary emphasis on fitting into the group.

"In the Social Ethic I am describing," wrote Whyte, "man's obligation is…not so much to the community in a broad sense but to the actual, physical one about him, and the idea that in isolation from it—or active rebellion against it—he might eventually discharge the greater service is little considered." One corporate trainee told Whyte that he "would sacrifice brilliance for human understanding every time." A personnel director declared that "any progressive employer would look askance at the individualist and would be reluctant to instill such thinking in the minds of trainees." Whyte summed up the prevailing attitude: "All the great ideas, [trainees] explain, have already been discovered and not only in physics and chemistry but in practical fields like engineering. The basic creative work is done, so the man you need—for every kind of job—is a practical, team-player fellow who will do a good shirt-sleeves job."

It seems entirely reasonable to conclude that this social ethic helped to limit competition among business enterprises for top talent. When secure membership in a stable organization is more important than maximizing your individual potential, the most talented employees are less vulnerable to the temptation of a better offer elsewhere. Even if they are tempted, a strong sense of organizational loyalty makes them more likely to resist and stay put.


Increased Competition, Increased Inequality

Krugman blames the conservative movement for income inequality, arguing that right-wingers exploited white backlash in the wake of the civil rights movement to hijack first the Republican Party and then the country as a whole. Once in power, they duped the public with "weapons of mass distraction" (i.e., social issues and foreign policy) while "cut[ting] taxes on the rich," "try[ing] to shrink government benefits and undermine the welfare state," and "empower[ing] businesses to confront and, to a large extent, crush the union movement."

Obviously, conservatism has contributed in important ways to the political shifts of recent decades. But the real story of those changes is more complicated, and more interesting, than Krugman lets on. Influences across the political spectrum have helped shape the more competitive, more individualistic, and less equal society we now live in.

Indeed, the relevant changes in social norms were led by movements associated with the left. The women's movement led the assault on sex discrimination. The civil rights campaigns of the 1950s and '60s inspired more enlightened attitudes about race and ethnicity, with results such as the Immigration and Nationality Act of 1965, a law spearheaded by a young Sen. Edward Kennedy (D-Mass.). And then there was the counterculture of the 1960s, whose influence spread throughout American society in the Me Decade that followed. It upended the social ethic of group-minded solidarity and conformity with a stampede of unbridled individualism and self-assertion. With the general relaxation of inhibitions, talented and ambitious people felt less restrained from seeking top dollar in the marketplace. Yippies and yuppies were two sides of the same coin.

Contrary to Krugman's narrative, liberals joined conservatives in pushing for dramatic changes in economic policy. In addition to his role in liberalizing immigration, Kennedy was a leader in pushing through both the Airline Deregulation Act of 1978 and the Motor Carrier Act of 1980, which deregulated the trucking industry—and he was warmly supported in both efforts by the left-wing activist Ralph Nader. President Jimmy Carter signed these two pieces of legislation, as well as the Natural Gas Policy Act of 1978, which began the elimination of price controls on natural gas, and the Staggers Rail Act of 1980, which deregulated the railroad industry.

The three most recent rounds of multilateral trade talks were all concluded by Democratic presidents: the Kennedy Round in 1967 by Lyndon Johnson, the Tokyo Round in 1979 by Jimmy Carter, and the Uruguay Round in 1994 by Bill Clinton. And though it was Ronald Reagan who slashed the top income tax rate from 70 percent to 50 percent in 1981, it was two Democrats, Sen. Bill Bradley of New Jersey and Rep. Richard Gephardt of Missouri, who sponsored the Tax Reform Act of 1986, which pushed the top rate all the way down to 28 percent.

What about the unions? According to the Berkeley economist David Card, the shrinking of the unionized labor force accounted for 15 percent to 20 percent of the rise in overall male wage inequality between the early 1970s and the early 1990s. Krugman is right that labor's decline stems in part from policy changes, but his ideological blinkers lead him to identify the wrong ones.

The only significant change to the pro-union Wagner Act of 1935 came through the Taft-Hartley Act, which outlawed closed shops (contracts requiring employers to hire only union members) and authorized state right-to-work laws (which ban contracts requiring employees to join unions). But that piece of legislation was enacted in 1947—three years before the original Treaty of Detroit between General Motors and the United Auto Workers. It would be a stretch to argue that the Golden Age ended before it even began.

Scrounging for a policy explanation, economists Levy and Temin point to the failure of a 1978 labor law reform bill to survive a Senate filibuster. But maintaining the status quo is not a policy change. They also describe President Reagan's 1981 decision to fire striking air traffic controllers as a signal to employers that the government no longer supported labor unions.

While it is true that Reagan's handling of that strike, along with his appointments to the National Labor Relations Board, made the policy environment for unions less favorable, the effect of those moves on unionization was marginal.

The major reason for the fall in unionized employment, according to a 2007 paper by Georgia State University economist Barry Hirsch, "is that union strength developed through the 1950s was gradually eroded by increasingly competitive and dynamic markets." He elaborates: "When much of an industry is unionized, firms may prosper with higher union costs as long as their competitors face similar costs. When union companies face low-cost competitors, labor cost increases cannot be passed through to consumers. Factors that increase the competitiveness of product markets—increased international trade, product market deregulation, and the entry of low-cost competitors—make it more difficult for union companies to prosper."

So the decline of private-sector unionism was abetted by policy changes, but the changes were not in labor policy specifically. They were the general, bipartisan reduction of trade barriers and price and entry controls. Unionized firms found themselves at a critical disadvantage. They shrank accordingly, and union rolls shrank with them.


Postmodern Progress

The move toward a more individualistic culture is not unique to the United States. As the political scientist Ronald Inglehart has documented in dozens of countries around the world, the shift toward what he calls "postmodern" attitudes and values is a predictable cultural response to rising affluence and expanding choices. "In a major part of the world," he writes in his 1997 book Modernization and Postmodernization, "the disciplined, self-denying, and achievement-oriented norms of industrial society are giving way to an increasingly broad latitude for individual choice of lifestyles and individual self-expression."

The increasing focus on individual fulfillment means, inevitably, less deference to tradition and organizations. "A major component of the Postmodern shift," Inglehart argues, "is a shift away from both religious and bureaucratic authority, bringing declining emphasis on all kinds of authority. For deference to authority has high costs: the individual's personal goals must be subordinated to those of a broader entity."

Paul Krugman may long for the return of self-denying corporate workers who declined to seek better opportunities out of organizational loyalty, and thus kept wages artificially suppressed, but these are creatures of a bygone ethos—an ethos that also included uncritical acceptance of racist and sexist traditions and often brutish intolerance of deviations from mainstream lifestyles and sensibilities.

The rise in income inequality does raise issues of legitimate public concern. And reasonable people disagree hotly about what ought to be done to ensure that our prosperity is widely shared. But the caricature of postwar history put forward by Krugman and other purveyors of nostalgianomics won't lead us anywhere. Reactionary fantasies never do.

China Global Investment Tracker

China Global Investment Tracker, by Derek Scissors, Ph.D.
Heritage White Paper, May 26, 2009

China's role in the global financial arena is becoming increasingly important to the United States and the worldwide community. The China Global Investment Tracker created by The Heritage Foundation is the only available comprehensive dataset relating to large Chinese foreign investments and construction contracts in all areas of the world. Details are available on all attempted transactions over $100 million -- both failed and successful -- in a variety of industries, including energy, transportation and banking.

[China's Global Reach]

Chinese investment and business contracts now span the globe. More of this business is being done with Africa and the Arab world than with China's traditional partners in East Asia.

[Graph 2]

China's investment total could be higher. Tens of billions of dollars in proposed spending have been rejected by Chinese or foreign regulators or simply fallen through.

Download the dataset on large Chinese foreign investments: Chinese Outward Investment

---
For more information on the growing Chinese investments in the rest of the world:

Chinese Foreign Investment: Insist on Transparency. By Derek Scissors, Ph.D.
Backgrounder #2237

China holds more than $1 trillion in American bonds. According to the new Heritage Foundation database on recent Chinese foreign non-bond investment, China has invested more than $15 billion in the U.S. in addition to bonds. China's State Administration of Foreign Exchange (SAFE) is the largest foreign investor in the U.S., but it has refused to make its activities more transparent. An American priority should be to enhance transparency in SAFE's spending.

Conservative views: A Bad Day for Impartiality - empathy as a code word for judicial liberalism

A Bad Day for Impartiality. By Rich Lowry
Obama uses empathy as a code word for judicial liberalism.
National Review Online, May 26, 2009 6:15 PM

It was a historic day when Pres. Barack Obama announced his nomination of Judge Sonia Sotomayor to the Supreme Court. No president had ever nominated a Hispanic woman. Nor had a recent president — or his nominee — expressed less genuine interest in the traditional craft of judging.

Impartiality has been supplanted by empathy. The old-fashioned virtue of objectivity — redolent of dusty law books and the unromantic task of parsing the law and facts — is giving way to an inherently politicized notion of judging based on feelings. Lady Justice is to slip her blindfold and let her decisions be influenced by her life experiences and personal predilections.

Obama and Sotomayor embrace this method of judging with gusto, even though it is deeply antithetical to justice properly understood. This is why Sotomayor is such a radical choice. Not only will she define the court’s left flank, she represents a judicial philosophy that is neither truly judicial nor a philosophy. The political outcome — and the personal biases that drive it — is paramount.

In introducing Sotomayor, Obama said he valued “a rigorous intellect” and “a recognition of the limits of the judicial role,” before pronouncing them both “insufficient.” A justice must have been tested “by hardship and misfortune,” Obama stipulated, so that he has “a common touch and a sense of compassion.”

It’s as if he wants a justice who can break the tension in an oral argument about the intricacies of antitrust law with engaging sports banter. The “Would you want to have a beer with him?” test reasonably applies to a politician, but to a black-robed justice charged with interpreting the Constitution? Justice Clarence Thomas is delightful company. Does that make his opinions any better or worse?

To complement his essentially political conception of the court, Obama has an essentially political conception of a justice. He voted against John Roberts despite Roberts’s qualifications and love of the law. Roberts failed the political test, defined by Obama as “one’s deepest values,” “the depth and breadth of one’s empathy.”

Obama uses empathy as a code word for judicial liberalism, and few nominees could be as starkly empathetic as Sotomayor. She has the requisite inspiring background. She has been a reliable liberal vote (never mind that the Supreme Court has been singularly unimpressed by her reasoning in cases that have reached it). And she believes that her background is one of her most important qualifications.

In a rambling 2001 speech, she disagreed with a colleague who thought judges should transcend their “personal sympathies and prejudices.” Sotomayor said, “I wonder whether achieving that goal is possible in all or even in most cases.” She argued that “the aspiration to impartiality is just that — it’s an aspiration because it denies the fact that we are by our experiences making different choices than others.” In sum, she said, “I would hope that a wise Latina woman with the richness of her experiences would more often than not reach a better conclusion than a white male who hasn’t lived that life.”

This stunning statement of race and gender determinism perhaps explains Sotomayor’s decision in the New Haven firefighter case now before the Supreme Court. A white firefighter studied for an exam to get a promotion. He bought $1,000 worth of books and had someone read them onto audiotapes because he’s dyslexic. He passed, but the city declined to promote him because no blacks had qualified for promotion.

Sotomayor thought this blatantly race-conscious action passed constitutional muster. Does her 2001 speech mean that she would have ruled differently if she were white, dyslexic, or a working-class firefighter struggling to get ahead? If so, she is manifestly unfit for the highest court in a country that puts the law above tribal loyalties.

Sotomayor’s nomination represents an extraordinary personal accomplishment and an important symbolic affirmation for Latinos. Her confirmation, though, would be another step toward eviscerating the constitutional function of the Supreme Court, as empathy trumps impartiality.

— Rich Lowry is the editor of National Review.

WaPo: Kudos, and some questions, for Judge Sonia Sotomayor

The President's Pick. WaPo Editorial
Kudos, and some questions, for Judge Sonia Sotomayor
WaPo, Wednesday, May 27, 2009

THERE IS MUCH to admire in the achievements of Sonia Sotomayor, the New York judge tapped by President Obama to fill a Supreme Court vacancy created by the impending retirement of Justice David H. Souter.

Born to immigrant Puerto Rican parents and raised in a housing project in the Bronx, Judge Sotomayor went on to excel at Princeton and earn a law degree from Yale. She worked as a prosecutor and represented corporate interests in private practice before being named to the federal trial court in New York by President George H.W. Bush; she was later elevated to a slot on the New York-based U.S. Court of Appeals for the 2nd Circuit by President Bill Clinton. As a Hispanic woman with such a diversity of legal experience, she would bring a welcome fresh perspective to the bench.

Judge Sotomayor has spoken about how gender, ethnicity and race influence a judge's views, and that should be one subject for her confirmation hearings. In a 2001 speech, she said: "The aspiration to impartiality is just that -- it's an aspiration because it denies the fact that we are by our experiences making different choices than others. . . . Justice [Sandra Day] O'Connor has often been cited as saying that a wise old man and wise old woman will reach the same conclusion in deciding cases . . . . I am not so sure that I agree with the statement. First, . . . there can never be a universal definition of wise. Second, I would hope that a wise Latina woman with the richness of her experiences would more often than not reach a better conclusion than a white male who hasn't lived that life."

Senators could ask her, then, how, when deciding a case, she balances the quest for objectivity with her personal experiences. They might also ask her views on judicial activism. In a panel discussion in 2005, she said that a "court of appeals is where policy is made." Conservative critics have seized on this statement to argue that she is a judicial activist who believes judges should make, rather than interpret, the law. Yet her statement could just as easily be understood to be explaining correctly that the courts of appeals -- and not the Supreme Court -- are the venues where the vast majority of cases and policies are ultimately decided.

We hope Judge Sotomayor also will discuss her thinking in the case of Ricci v. DeStefano, in which a group of white firefighters sued the city of New Haven for failing to certify promotion tests because no African Americans had scored high enough to qualify for advancement. A trial court ruled against the white firefighters, and on appeal, Judge Sotomayor and two colleagues essentially rubber-stamped the lower court decision without elaboration, even though the case presented important and undecided questions of law. That case is now awaiting a decision by the Supreme Court justices whom Judge Sotomayor soon hopes to join as a colleague.

Senators are right to closely scrutinize Judge Sotomayor's philosophy and qualifications. She has produced a rich record of opinions as an appeals court judge for the Judiciary Committee to discuss. Senators also should remember that Mr. Obama, like any president, is entitled to deference in choosing a justice.

WaPo on California's highest court ruling on Proposition 8

Proposition 8 Stands
California's highest court rules that the voters have the right to be wrong.
WaPo, Wednesday, May 27, 2009

THE JUDGES of the California Supreme Court ruled yesterday that they can be overruled by the people of their state. That's the import of their 6 to 1 decision upholding Proposition 8, which bars same-sex marriage. They're probably right on the law, but the outcome is wrong as a matter of fairness, and our guess is that the people of California will reconsider before too long.

This same court ruled in May 2008 that the state's constitution required recognition of same-sex marriage. In November, voters narrowly repudiated that decision by approving Proposition 8, which amended the constitution to provide that "only marriage between a man and a woman is valid or recognized in California."

Yesterday, three of the four justices who had originally ruled in favor of same-sex marriage nonetheless agreed with the three dissenters in the original case that Proposition 8 should stand. This outcome suggests that those challenging the legality of Proposition 8 had the weaker legal case, however wrongheaded the amendment's content. As the court found, those challenging the proposition, including private plaintiffs and state Attorney General Jerry Brown, essentially complained "that it is just too easy to amend the California Constitution through the initiative process." That's probably true, but, as the court noted, the people of California are free to adopt a flawed system.

In the course of voting to uphold Proposition 8, the court made important -- and just -- findings. First, it found that the marriages of the 18,000 same-sex couples who acted before the proposition was approved remain valid. Second, it emphasized that other same-sex couples still enjoy the right to civil unions, allowing gays and lesbians to "choose one's life partner and enter with that person into a committed, officially recognized, and protected family relationship that enjoys all of the constitutionally based incidents of marriage." The measure, Chief Justice Ronald M. George said, "carves out a narrow and limited exception to these state constitutional rights, reserving the official designation of the term 'marriage' for the union of opposite-sex couples as a matter of state constitutional law, but leaving undisturbed all of the other extremely significant substantive aspects of a same-sex couple's" rights. In other words, the terminology is different for same-sex couples, but the rights remain the same as those of other married couples.

This is disappointing; words do matter. Yet we remain confident that the inexorable trend of history is to recognize equality for gay men and lesbians, allowing them to marry rather than relegating them to a separate-but-equal legal status. The recent moves in Vermont, Maine and Iowa in support of same-sex marriage, and efforts in that direction in New Hampshire, should offer some comfort to the disappointed citizens of California, gay and straight alike, that their state will before long undo this unfortunate proposition.

Tuesday, May 26, 2009

Exploding debt threatens America

Exploding debt threatens America. By John Taylor
FT, May 26 2009 20:48

Standard & Poor’s decision to downgrade its outlook for British sovereign debt from “stable” to “negative” should be a wake-up call for the US Congress and administration. Let us hope they wake up.

Under President Barack Obama’s budget plan, the federal debt is exploding. To be precise, it is rising – and will continue to rise – much faster than gross domestic product, a measure of America’s ability to service it. The federal debt was equivalent to 41 per cent of GDP at the end of 2008; the Congressional Budget Office projects it will increase to 82 per cent of GDP in 10 years. With no change in policy, it could hit 100 per cent of GDP in just another five years.

“A government debt burden of that [100 per cent] level, if sustained, would in Standard & Poor’s view be incompatible with a triple A rating,” as the risk rating agency stated last week.

I believe the risk posed by this debt is systemic and could do more damage to the economy than the recent financial crisis. To understand the size of the risk, take a look at the numbers that Standard & Poor’s considers. The deficit in 2019 is expected by the CBO to be $1,200bn (€859bn, £754bn). Income tax revenues are expected to be about $2,000bn that year, so a permanent 60 per cent across-the-board tax increase would be required to balance the budget. Clearly this will not and should not happen. So how else can debt service payments be brought down as a share of GDP?

Inflation will do it. But how much? To bring the debt-to-GDP ratio down to the same level as at the end of 2008 would take a doubling of prices. That 100 per cent increase would make nominal GDP twice as high and thus cut the debt-to-GDP ratio in half, back to 41 from 82 per cent. A 100 per cent increase in the price level means about 10 per cent inflation for 10 years. But it would not be that smooth – probably more like the great inflation of the late 1960s and 1970s with boom followed by bust and recession every three or four years, and a successively higher inflation rate after each recession.
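The key figures in the column reduce to simple arithmetic. The sketch below restates them; the inputs are Taylor's cited numbers, and the computation is mine.

```python
# Restating the arithmetic behind Taylor's claims (inputs are his figures).

deficit_2019 = 1200        # $bn, CBO-projected deficit in 2019
income_tax_revenue = 2000  # $bn, projected income tax receipts that year
print(deficit_2019 / income_tax_revenue)   # 0.6 -> the 60% across-the-board tax rise

debt_to_gdp = 0.82         # projected ratio a decade out
# Doubling the price level doubles nominal GDP while leaving the debt's
# nominal value unchanged, so the ratio is cut in half:
print(debt_to_gdp / 2)                     # 0.41, back to the end-2008 level

# Annual inflation rate that compounds to a doubling of prices in ten years:
print(2 ** (1 / 10) - 1)                   # ~0.072, i.e. about 7.2% a year
# (Without compounding, 10% a year for ten years also sums to 100%,
# which is where Taylor's round "about 10 per cent" figure comes from.)
```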

The fact that the Federal Reserve is now buying longer-term Treasuries in an effort to keep Treasury yields low adds credibility to this scary story, because it suggests that the debt will be monetised. That the Fed may have a difficult task reducing its own ballooning balance sheet to prevent inflation increases the risks considerably. And 100 per cent inflation would, of course, mean a 100 per cent depreciation of the dollar. Americans would have to pay $2.80 for a euro; the Japanese could buy a dollar for ¥50; and gold would be $2,000 per ounce. This is not a forecast, because policy can change; rather it is an indication of how much systemic risk the government is now creating.

Why might Washington sleep through this wake-up call? You can already hear the excuses.

“We have an unprecedented financial crisis and we must run unprecedented deficits.” While there is debate about whether a large deficit today provides economic stimulus, there is no economic theory or evidence that shows that deficits in five or 10 years will help to get us out of this recession. Such thinking is irresponsible. If you believe deficits are good in bad times, then the responsible policy is to try to balance the budget in good times. The CBO projects that the economy will be back to delivering on its potential growth by 2014. A responsible budget would lay out proposals for balancing the budget by then rather than aim for trillion-dollar deficits.

“But we will cut the deficit in half.” CBO analysts project that the deficit will be the same in 2019 as the administration estimates for 2010, a zero per cent cut.

“We inherited this mess.” The debt was 41 per cent of GDP at the end of 1988, President Ronald Reagan’s last year in office, the same as at the end of 2008, President George W. Bush’s last year in office. If one thinks policies from Reagan to Bush were mistakes does it make any sense to double down on those mistakes, as with the 80 per cent debt-to-GDP level projected when Mr Obama leaves office?

The time for such excuses is over. They paint a picture of a government that is not working, one that creates risks rather than reduces them. Good government should be a nonpartisan issue. I have written that government actions and interventions in the past several years caused, prolonged and worsened the financial crisis. The problem is that policy is getting worse not better. Top government officials, including the heads of the US Treasury, the Fed, the Federal Deposit Insurance Corporation and the Securities and Exchange Commission are calling for the creation of a powerful systemic risk regulator to rein in systemic risk in the private sector. But their government is now the most serious source of systemic risk.

The good news is that it is not too late. There is time to wake up, to make a mid-course correction, to get back on track. Many blame the rating agencies for not telling us about systemic risks in the private sector that led to this crisis. Let us not ignore them when they try to tell us about the risks in the government sector that will lead to the next one.

The writer, a professor of economics at Stanford and a senior fellow at the Hoover Institution, is the author of ‘Getting Off Track: How Government Actions and Interventions Caused, Prolonged, and Worsened the Financial Crisis’

GM's new owner (the Obama administration) should stop bullying the company's bondholders

Government Motors. WaPo Editorial
GM's new owner (the Obama administration) should stop bullying the company's bondholders.
WaPo, Tuesday, May 26, 2009

IN THEORY, a government bailout should provide a short-term infusion of cash to give a struggling company the chance to right itself. But in its aggressive dealings with U.S. automakers, most recently General Motors, the Obama administration is coming dangerously close to engaging in financial engineering that ignores basic principles of fairness and economic realities to further political goals.

It is now clear that there is no real difference between the government and the entity that identifies itself as GM. For all intents and purposes, the government, which is set to assume a 50 percent equity stake in the company, is GM, and it has been calling the shots in negotiations with creditors. While the Obama administration has been playing hardball with bondholders, it has been more than happy to play nice with the United Auto Workers. How else to explain why a retiree health-care fund controlled by the UAW is slated to get a 39 percent equity stake in GM for its remaining $10 billion in claims while bondholders are being pressured to take a 10 percent stake for their $27 billion? It's highly unlikely that the auto industry professionals at GM would have cut such a deal had the government not been standing over them -- or providing the steady stream of taxpayer dollars needed to keep the factory doors open.

GM is widely expected to file for bankruptcy before the end of this month. If this were a typical bankruptcy, the company would be allowed by law to tear up its UAW collective bargaining agreement and negotiate for drastically reduced wages and benefits. That's not going to happen. Phrased another way: The government won't let that happen. Still, the threat of a contract abrogation probably played a role in the union's agreement to cost-cutting measures last week. (The details of the deal have not been made public; union members are scheduled to vote on the proposal early this week.) It's never easy for unions to make concessions, but the sting of handing back money is being softened by the government's desire to give the union a huge ownership stake in GM. Might bondholders be more willing to agree to the kind of quick restructuring the government hopes for if they had been treated more fairly from the outset?

The administration argues that it could not risk alienating the union for fear of triggering a walkout that could permanently cripple GM. It also posits that it had to agree to protect suppliers and fund warranties in order to preserve jobs and reassure prospective buyers that their cars would be serviced. These are legitimate concerns. But it's too bad that the Obama administration has not thought more deeply about how its bullying of bondholders could convince future investors that the last thing they want to do is put money into any company that the government has -- or could -- become involved in.

Unilateral or Worldwide, Waxman-Markey Fails Standard Cost/Benefit Tests (CO2 “leakage” makes bad even worse)

Unilateral or Worldwide, Waxman-Markey Fails Standard Cost/Benefit Tests (CO2 “leakage” makes bad even worse). By Robert Murphy
Master Resource, May 26, 2009

Jim Manzi has a very good post introducing the analysis of costs and benefits of Waxman-Markey. Here I want to follow up on Manzi’s great start by showing that Chip Knappenberger’s estimate of the climate benefits of Waxman-Markey (W-M) actually erred on the side of optimism in its assumptions.

Specifically, Knappenberger very conservatively ignored the problem of “leakage”: he didn’t model the fact that unilateral U.S. carbon caps would actually increase the rate at which other countries’ own emissions grow. What’s worse, even if the entire world signed on to the aggressive emission schedule in W-M, the resulting environmental benefits would be achieved at a staggering cost in terms of lost economic output.

No matter how you slice it, whether the U.S. goes it alone or the rest of the world signs on too, the environmental benefits of W-M are swamped by its economic costs.

“Leakage”–An Important Variable. In a MasterResource post that has become a touchstone of the great climate debate, Chip Knappenberger used a standard model to assess the expected reductions in global mean temperatures if the U.S. faithfully adhered to the emission targets in W-M. Knappenberger found that by 2100, the projected global warming under two different “baseline” emission scenarios would be postponed by a handful of years.

The pro-interventionist scientists at RealClimate have conceded the basic validity of Chip’s analysis; they simply accuse him of rigging the game by considering unilateral U.S. action.

What is interesting is that Chip did not assume that the emissions of the rest of the world would grow more quickly because of (the stipulated) unilateral U.S. commitment to W-M. Yet this would surely happen, because of a phenomenon referred to in the climate economics literature as leakage. The intuition is quite simple: If the U.S. imposes a steep price on operations that emit carbon, then U.S. industries will produce fewer carbon-intensive goods and services. (That’s the whole point, after all.)

Yet because of the reduction in output of these sectors, the world price of these items will tend to rise, which in turn will call forth greater output from (carbon-intensive) sectors in the unregulated countries.

I caution readers that some skeptics of government action to limit climate change draw an unwarranted conclusion from this type of analysis. I have heard such critics say things like, “This is ridiculous! If the U.S. goes it alone, all we’ll do is ship all of our jobs to China, and we won’t affect global emissions one iota.”

Strictly speaking, that is taking it too far. For various reasons, it is not true that every cutback in carbon-intensive production in the U.S. would be perfectly offset by expanded production in an unregulated jurisdiction. However, even though there won’t be a one-for-one offset in terms of final goods produced, the relative carbon emissions are a different matter. This is because Chinese manufacturing operations emit more tons of carbon than American factories do in order to produce the same physical amount of goods.

Hence, the amount of “leakage” resulting from a unilateral U.S. emissions cap is ultimately an empirical matter, but it would probably be very significant. To repeat, Chip’s analysis (described above) did not take this effect into account. Chip merely took two standard IPCC baseline emission scenarios and then altered them by reducing the baseline growth in U.S. emissions to comply with the targets in W-M. He is consequently overestimating the environmental benefits of unilateral American adherence to the emission targets in W-M. In other words, the dotted lines in the chart accompanying his original post would be even closer to the solid lines once the model took into account the superior profitability of Chinese carbon-intensive operations after the U.S. government hobbled American operations.
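To see how much leakage can matter, consider a minimal back-of-the-envelope model. This is my own sketch, not Knappenberger's model or anyone's published estimate; the leakage fraction and intensity ratio are hypothetical parameters chosen purely for illustration.

```python
# Back-of-the-envelope leakage model (hypothetical parameters).
# "leakage" is the fraction of curtailed U.S. output that reappears abroad;
# "intensity_ratio" is tons of CO2 emitted abroad per unit of output,
# relative to the U.S. production it displaces.

def net_emissions_change(us_cut_tons, leakage, intensity_ratio):
    """Change in global emissions from a unilateral U.S. cut (negative = net fall)."""
    foreign_increase = us_cut_tons * leakage * intensity_ratio
    return foreign_increase - us_cut_tons

# A 100-unit U.S. cut with 40% of output migrating to producers 1.5x as dirty:
print(net_emissions_change(100, leakage=0.4, intensity_ratio=1.5))   # -40.0: only 40% of the cut survives
# With heavier leakage, the unilateral cap raises global emissions outright:
print(net_emissions_change(100, leakage=0.7, intensity_ratio=1.5))   # 5.0: emissions rise on net
```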

Leakage in the Context of the “Social Cost of Carbon.” Some of the more sophisticated critics of Chip’s analysis asked a reasonable question: Since plenty of economic models show a “social cost of carbon” from emissions, it doesn’t really matter what the rest of the world does, right? After all, if emitting an extra ton of carbon today translates into an expected increase in future climate change damages with a present discounted value of (say) $35, then it surely moves us in the direction of efficiency if the U.S. government slaps a penalty on domestic emitters, right?

There are two problems here. First, nobody is defending Waxman-Markey on the basis of cost/benefit analysis, because it can’t be done. There is a bit of a problem in comparing apples with apples (since the integrated assessment models gauging the impact of mitigation policies all assume concerted worldwide action), but it is safe to say that the emissions targets in W-M are far too aggressive if we are going to be guided by the “social cost of carbon.”

For example, Table 3.10 (page 229) of Working Group III’s contribution to the IPCC Fourth Assessment Report (.pdf) shows that of 177 scenarios surveyed from the peer-reviewed literature, only 6 scenarios assumed worldwide emissions reductions in the steepest category of 50% to 85% by the year 2050. (Recall that W-M imposes a reduction of 83% by 2050. But note that the IPCC reductions are relative to 2000 emissions, while W-M’s 83% target is relative to 2005 emissions.)

If we turn to the specific DICE model of William Nordhaus, a pioneer and leader in this field and a definite proponent of a carbon tax, we see that the aggressive emission cutbacks in W-M fail his cost/benefit test by a wide margin. The IPCC’s Table 3.10 and Nordhaus’s own results agree that capping emissions in 2050 at 83% below current levels would correspond to Nordhaus’s estimates of a policy of capping atmospheric concentrations at no more than 1.5x preindustrial levels. (See Nordhaus’s Table 5-5, p. 96 here [.pdf]. Note that we are being conservative with our choice, because the steep emission cuts in W-M are arguably closer to the “Gore proposal,” which the DICE model finds even more destructive than the policy that we have instead chosen as a surrogate for W-M.)

Yet according to Nordhaus’s DICE model, such an aggressive policy would do far more harm to the economy than it would yield in benefits of averted climate damage. Specifically, Nordhaus estimates that the policy corresponding to W-M targets would make the world some $15 trillion poorer relative to the business-as-usual baseline of no controls (see Table 5-1, page 82, here [.pdf]). Yes, worldwide commitment to the aggressive emission schedule in W-M would avert climate damages that would otherwise occur, which DICE values as a benefit of $12.6 trillion. But the draconian emission caps would require $27.24 trillion in abatement (compliance) costs. Thus the environmental benefits are swamped by the economic costs.
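The net figure follows directly from the two numbers just cited; this is a simple restatement of the arithmetic, not new data.

```python
# Nordhaus's DICE figures as cited above, in trillions of dollars:
averted_climate_damages = 12.6
abatement_costs = 27.24
print(averted_climate_damages - abatement_costs)   # -14.64, i.e. the ~$15 trillion net loss
```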

So we see that the standard “social cost of carbon” approach reveals that W-M imposes a far higher price on carbon emissions than is warranted by this Pigovian framework. That is why proponents of steep emission cuts must abandon standard cost/benefit analysis and instead recommend particular environmental targets (such as stabilizing atmospheric concentrations at a presumed “safe” level) and then try to find the least-cost method of attaining them.

As a final point, we should note how the problem of leakage also influences the “social cost of carbon” as computed in various models. When Nordhaus or other economists calculate the social cost of carbon (SCC), they are asking what happens to the present discounted value of future environmental damages, if someone emits an additional ton of carbon today, while holding the assumed trajectory of all future emissions constant.

Now we see the weakness in this metric, when trying to assess the net benefits of unilateral climate policy. Once we take leakage into account, we see that the standard measure of SCC overstates (possibly grossly so) the true costs to society from an additional unit of emissions. In reality, there are two things going on: When a U.S. manufacturer produces more units of a carbon-intensive good, it is true that he emits more carbon dioxide into the atmosphere. This is what the SCC looks at, and judges him accordingly.

However, the U.S. manufacturer also pushes down the world price of the good in question, and that tends to cause other producers to emit less CO2. Thus, there is a positive externality laid on top of the negative externality. The greater the scope for leakage, the greater the positive externality. In the extreme case where U.S. operations would be completely outsourced to China (in terms of carbon emissions, if not output of final goods), the correctly measured “social cost of carbon” for U.S. operations would be zero in the context of a unilateral U.S. cap.
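That logic can be written down compactly. The sketch below is my own formalization of the point, not a formula from Nordhaus or the leakage literature; the $35 SCC and the parameter values are hypothetical.

```python
# Effective social cost of carbon under a unilateral cap (my formalization;
# all parameter values are hypothetical).

def effective_scc(scc, leakage, intensity_ratio):
    """Standard SCC net of the offsetting foreign emissions induced when
    U.S. output is curtailed (the positive externality described above)."""
    return scc * (1 - leakage * intensity_ratio)

print(effective_scc(35, leakage=0.0, intensity_ratio=1.5))    # 35.0: no leakage, the full SCC applies
print(effective_scc(35, leakage=0.4, intensity_ratio=1.5))    # 14.0: leakage offsets much of the damage
print(effective_scc(35, leakage=2/3, intensity_ratio=1.5))    # 0.0: the complete-outsourcing extreme
```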


Conclusion. Here are the takeaway messages:

(A) If the U.S. implements Waxman-Markey unilaterally, the environmental benefits will be even less than indicated by Chip Knappenberger’s pessimistic analysis.
(B) If the whole world implements Waxman-Markey, then the loss to economic output will far exceed the reduction in expected environmental damages.
No matter how you slice it, Waxman-Markey fails standard cost/benefit tests. W-M advocates are certainly free to criticize standard cost/benefit tests, but they can’t stop there. They still need to justify quantitatively the steep emission targets in W-M. And to the extent that they invoke U.S. leadership in prodding the rest of the world to follow suit, proponents also need to come up with a plausible story showing the likelihood of worldwide action, with and without Waxman-Markey, versus some other possible U.S. approach.

Yet W-M proponents have done none of these things. Surely they could at least try -- even in an informal blog post -- to formalize their case before expecting the American people to sign on to a plan that could cost trillions of dollars in forfeited economic growth, and which on its face will do very little to alter the course of global warming.

Monday, May 25, 2009

Perspectives from India: North Korea thumbs its nuclear nose at Washington

Is Obama Another Jimmy Carter? By Bahukutumbi Raman
North Korea thumbs its nuclear nose at Washington.
Forbes, May 25, 2009, 11:35 AM EDT

During the U.S. Presidential primaries last year, I had expressed my misgivings that Barack Obama might turn out to be another Jimmy Carter, whose confused thinking and soft image paved the way for the success of the Islamic Revolution in Iran.

The subsequent Iranian defiance of the U.S. and Carter's inability to deal effectively with the crisis in which Iranian students raided the U.S. Embassy in Teheran and held a number of U.S. diplomats hostage led to disillusionment with him in sections of the U.S. and to his failure to get re-elected in 1980. The strong line taken by him against the invasion of Afghanistan by the Soviet troops towards the end of 1979 did not help him in wiping out the image of a soft and confused president.

The defiant action of North Korea in testing a long-range missile with military applications last month, and its latest act of defiance in reportedly carrying out an underground nuclear test on May 25, can be attributed--at least partly, if not fully--to its conviction that it will have nothing to fear from the Obama administration for its acts of defiance. It is true that even when George Bush was the president, North Korea had carried out its first underground nuclear test in October 2006. The supposedly strong policy of the Bush administration did not deter it from carrying out its first test.

After Obama assumed office in January, whatever hesitation existed in North Korea's policy-making circles regarding the likely response of the U.S. administration has disappeared, and its leadership now feels it can defy the U.S. and the international community with impunity.

A series of actions taken by the Obama administration has created an impression in Iran, the "Af-Pak" region, China and North Korea that Obama does not have the political will to respond decisively to acts that are detrimental to U.S. interests and to international peace and security.

Among such actions, one could cite: the soft policy toward Iran; the reluctance to articulate strongly U.S. determination to support the security interests of Israel; the ambivalent attitude toward Pakistan despite its continued support to anti-India terrorist groups and its ineffective action against the sanctuaries of Al-Qaida and the Taliban in Pakistani territory; its silence on the question of the violation of the human rights of the Burmese people and the continued illegal detention of Aung San Suu Kyi by the military regime in Myanmar; and its silence on the Tibetan issue.

Its over-keenness to court Beijing's support in dealing with the economic crisis, and its anxiety to ensure the continued flow of Chinese money into U.S. Treasury bonds, have also added to the soft image of the U.S.

President Obama cannot blame the problem-states of the world--Iran, Pakistan, Myanmar and North Korea--if they have come to the conclusion that they can take liberties with the present administration in Washington without having to fear any adverse consequences. North Korea's defiance is only the beginning. One has every reason to apprehend that Iran might be the next to follow.

Israel and India have been the most affected by the perceived soft policies of the Obama administration. Israel is legitimately concerned over the likely impact of this soft policy on the behavior of Iran. South Korea and Japan, which would have been equally concerned over the implications of the soft policy of the Obama administration, have no real option of their own because they lack independent means of acting against North Korea.

Israel will not stand by and watch helplessly if it concludes that Iran might follow the example of North Korea. It will not hesitate to act unilaterally against Iran if it apprehends that Teheran is on the verge of acquiring a military nuclear capability. It will prefer to act with the understanding of the U.S., but if there is no change in the soft policy of the Obama administration, it will not hesitate to act even without prior consultation with the U.S.

India, too, has been noting with concern the total confusion that seems to prevail in the corridors of the Obama administration over its Af-Pak policy. Some of the recent comments of U.S. Secretary of State Hillary Clinton about alleged past incoherence in U.S. policy toward Pakistan -- and about the part-responsibility of the U.S. for the state of affairs in the Af-Pak region -- have given comfort to the military-intelligence establishment and the political leaders in Pakistan.

Obama's new over-generosity to the Pakistani armed forces and his reluctance to hold them accountable for their sins of commission and omission in the war against terrorism have convinced the Pakistani leaders that they have no adverse consequences to fear from the Obama administration. India would be the first to feel the adverse consequences of this newly found confidence in Islamabad vis-a-vis its relations with the U.S.

Jimmy Carter took a little over three years to create the image of the U.S. as a confused and soft power. Obama is bidding fair to create that image even in his first year in office. The North Korean defiance is the first result of this perceived soft image. There will be more surprises for the U.S. and the international community to follow if Obama and his aides do not embark on corrective actions before it is too late.

Bahukutumbi Raman is a retired officer of the Indian intelligence service and director of the Institute For Topical Studies, in Chennai, India. He is also associated with the Chennai Centre For China Studies.

Remarks by the President on Memorial Day

THE WHITE HOUSE
Office of the Press Secretary
_________________________________________________________
For Immediate Release
May 25, 2009

REMARKS BY THE PRESIDENT ON MEMORIAL DAY
Memorial Amphitheater
Arlington National Cemetery

THE PRESIDENT: Thank you, Admiral Mullen, for that generous introduction and for your sterling service to our country. To members of our armed forces, to our veterans, to honored guests, and families of the fallen -- I am deeply honored to be with you on Memorial Day.

Thank you to the superintendent, John Metzler, Jr., who cares for these grounds just as his father did before him; to the Third Infantry Regiment who, regardless of weather or hour, guard the sanctity of this hallowed ground with the reverence it deserves -- we are grateful to you; to service members from every branch of the military who, each Memorial Day, place an American flag before every single stone in this cemetery -- we thank you as well. (Applause.) We are indebted -- we are indebted to all who tend to this sacred place.

Here lie Presidents and privates; Supreme Court justices and slaves; generals familiar to history, and unknown soldiers known only to God.

A few moments ago, I laid a wreath at their tomb to pay tribute to all who have given their lives for this country. As a nation, we have gathered here to repeat this ritual in moments of peace, when we pay our respects to the fallen and give thanks for their sacrifice. And we've gathered here in moments of war, when the somber notes of Taps echo through the trees, and fresh grief lingers in the air.

Today is one of those moments, where we pay tribute to those who forged our history, but hold closely the memory of those so recently lost. And even as we gather here this morning, all across America, people are pausing to remember, to mourn, and to pray.

Old soldiers are pulling themselves a little straighter to salute brothers lost a long time ago. Children are running their fingers over colorful ribbons that they know signify something of great consequence, even if they don't know exactly why. Mothers are re-reading final letters home and clutching photos of smiling sons or daughters, as youthful and vibrant as they always will be.

They, and we, are the legacies of an unbroken chain of proud men and women who served their country with honor; who waged war so that we might know peace; who braved hardship so that we might know opportunity; who paid the ultimate price so we might know freedom.

Those who rest in these fields fought in every American war. They overthrew an empire and gave birth to revolution. They strained to hold a young union together. They rolled back the creeping tide of tyranny, and stood post through a long twilight struggle. And they took on the terror and extremism that threatens our world's stability.

Their stories are the American story. More than seven generations of them are chronicled here at Arlington. They're etched into stone, recounted by family and friends, and silently observed by the mighty oaks that have stood over burial after burial.

To walk these grounds then is to walk through that history. Not far from here, appropriately just across a bridge connecting Lincoln to Lee, Union and Confederate soldiers share the same land in perpetuity.

Just down the sweeping hill behind me rest those we lost in World War II, fresh-faced GIs who rose to the moment by unleashing a fury that saved the world. Next week, I'll visit Normandy, the place where our fate hung on an operation unlike any ever attempted, where it will be my tremendous honor to address some of the brave men who stormed those beaches 65 years ago.

And tucked in a quiet corner to our north are thousands of those we lost in Vietnam. We know for many the casualties of that war endure -- right now, there are veterans suffering and families tracing their fingers over black granite not two miles from here. They are why we pledge anew to remember their service and revere their sacrifice, and honor them as they deserve.

This cemetery is in and of itself a testament to the price our nation has paid for freedom. A quarter of a million marble headstones dot these rolling hills in perfect military order, worthy of the dignity of those who rest here. It can seem overwhelming. But for the families of the fallen, just one stone stands out -- one stone that requires no map to find.

Today, some of those stones are found at the bottom of this hill in Section 60, where the fallen from Iraq and Afghanistan rest. The wounds of war are fresh in Section 60. A steady stream of visitors leaves reminders of life: photos, teddy bears, favorite magazines. Friends place small stones as a sign they stopped by. Combat units leave bottles of beer or stamp cigarettes into the ground as a salute to those they rode in battle with. Perfect strangers visit in their free time, compelled to tend to these heroes, to leave flowers, to read poetry -- to make sure they don't get lonely.

If the fallen could speak to us, what would they say? Would they console us? Perhaps they might say that while they could not know they'd be called upon to storm a beach through a hail of gunfire, they were willing to give up everything for the defense of our freedom; that while they could not know they'd be called upon to jump into the mountains of Afghanistan and seek an elusive enemy, they were willing to sacrifice all for their country; that while they couldn't possibly know they would be called to leave this world for another, they were willing to take that chance to save the lives of their brothers and sisters in arms.

What is this thing, this sense of duty? What tugs at a person until he or she says "Send me"? Why, in an age when so many have acted only in pursuit of the narrowest self-interest, have the soldiers, sailors, airmen and Marines of this generation volunteered all that they have on behalf of others? Why have they been willing to bear the heaviest burden?

Whatever it is, they felt some tug; they answered a call; they said "I'll go." That is why they are the best of America, and that is what separates them from those of us who have not served in uniform -- their extraordinary willingness to risk their lives for people they never met.

My grandfather served in Patton's Army in World War II. But I cannot know what it is like to walk into battle. I'm the father of two young girls -- but I can't imagine what it's like to lose a child. These are things I cannot know. But I do know this: I am humbled to be the Commander-in-Chief of the finest fighting force in the history of the world. (Applause.)

I know that there is nothing I will not do to keep our country safe, even as I face no harder decision than sending our men and women to war -- and no moment more difficult than writing a letter to the families of the fallen. And that's why as long as I am President, I will only send our troops into harm's way when it is absolutely necessary, and I will always provide them with the equipment and support they need to get the job done. (Applause.)

I know that military families sacrifice more than we can understand, and feel an absence greater than we can comprehend. And that's why Michelle and I are committed to easing their burden.

And I know what a grateful nation owes to those who serve under its proud flag. And that's why I promise all our servicemen and women that when the guns fall silent, and you do return home, it will be to an America that is forever here for you, just as you've been there for us. (Applause.)

With each death, we are heartbroken. With each death, we grow more determined. This bustling graveyard can be a restless place for the living, where solace sometimes comes only from meeting others who know similar grief. But it reminds us all of the meaning of valor; it reminds us all of our own obligations to one another; it recounts that most precious aspect of our history, and tells us that we will only rise or fall together.

So on this day of silent remembrance and solemn prayer I ask all Americans, wherever you are, whoever you're with, whatever you're doing, to pause in national unity at 3:00 this afternoon. I ask you to ring a bell, or offer a prayer, say a silent "thank you." And commit to give something back to this nation -- something lasting -- in their memory; to affirm in our own lives and advance around the world those enduring ideals of justice, equality, and opportunity for which they and so many generations of Americans have given that last full measure of devotion.

God bless you, God bless the fallen, and God bless the United States of America. (Applause.)

Leo Thorsness: Torture thoughts on Memorial Day

Leo Thorsness: Torture thoughts on Memorial Day. By Scott Johnson
Powerline blog, May 25, 2009 at 10:00 AM

Leo Thorsness is the Minnesota native who was awarded the Medal of Honor for unbelievable heroics in aerial combat over North Vietnam in April 1967. Within a few days of his heroics on his Medal of Honor mission, Col. Thorsness was shot down over North Vietnam and taken into captivity. In captivity he was tortured by the North Vietnamese for 18 straight days and periodically thereafter until his release in 1973.

Col. Thorsness recounts his experiences in Surviving Hell: A POW's Journey, about which I wrote at length here. In its own modest way, it is a great and timely book.

Thinking of Memorial Day in the context of current controversies, Col. Thorsness wrote the following column:

Think Memorial Day and veterans usually come to mind. Think veterans and our national debate about torture comes to mind.

Of the 350 "old timer" Vietnam POWs, the majority were severely tortured by the North Vietnamese. Ironically the Department of Defense did not formally study torture after the POWs were released in 1973. We provided our military an actual "torture database library" but to this day, the Pentagon has never tapped the resource to help clarify national debate about "what is torture."

I and many other Vietnam POWs were tortured severely - some were tortured to death. Several POWs wrote books after our release in 1973 describing the torture in detail. Mike McGrath's book had extensive drawings vividly depicting types of torture the North Vietnamese used. (A gallery of McGrath's drawings is accessible here.)

When I wrote Surviving Hell in 2008, initially I did not include discussions of torture, knowing that others had described it earlier. My editors encouraged me to add it; if our younger population reads only current books, they may perceive that the treatment at Abu Ghraib and Gitmo was real torture. I added my experience being tortured so that readers will know that there is abuse and humiliation, and there is torture.

If someone surveyed the surviving Vietnam POWs, we would likely not agree on one definition of torture. In fact, we wouldn't agree on whether waterboarding is torture. For example, John McCain, Bud Day and I were recently together. Bud is one of the toughest and most tortured Vietnam POWs. John thinks waterboarding is torture; Bud and I believe it is harsh treatment, but not torture. Other POWs would have varying opinions. I don't claim to be right; we just disagree. But as someone who has been severely tortured over an extended time, my firsthand view on torture is this:

Torture, when used by an expert, can produce useful, truthful information. I base that on my experience. I believe that during torture, there is a narrow "window of truth" as pain (often multiple kinds) is increased. Beyond that point, if torture increases, the person breaks, or dies if he continues to resist.

Everyone has a different physical and mental threshold of pain that he can tolerate. If the interrogator is well trained, he can identify when that point is reached - the point at which, if slightly more pain is inflicted, a person can no longer "hold out," giving just (following the Geneva Convention) name, rank, serial number and date of birth. At that precise point, a very narrow torture "window of truth" exists. At that moment a person may give useful or truthful information to stop the pain. As slightly more pain is applied, the person "loses it" and will say anything he thinks will stop the torture - any lie, any story, any random words or sounds.

This torture "window of truth" is theory to some. Having been there, it is fact to me. While in torture I had the sickening feeling deep within my soul that maybe I would tell the truth as that horrendous pain increased. It is unpleasant, but I can still dredge up the memory of that window of truth feeling as the pain level intensified.

Our world is not completely good or evil. To proclaim we will never use any form of enhanced interrogations causes our friends to think we are naïve and eases our enemies' recruitment of radical terrorists to plot attacks on innocent kids, men and women - or any infidel. If I were to catch a "mad bomber" running away from an explosive I would not hesitate a second to use "enhanced interrogation," including waterboarding, if it would save lives of innocent people.

Our naïveté does not impress radical terrorists like those who slit the throat of Daniel Pearl in 2002 simply because he was Jewish, and broadcast the sight and sound of his dying gurgling. Publicizing our enhanced interrogation techniques only emboldens those who will hurt us.

At the end of the second paragraph of his column, Col. Thorsness adds the following footnote: "Kepler Space University is beginning a study of Vietnam POW torture, headed by Professor Robert Krone, Col., USAF (ret.)." Thanks to Col. Thorsness for permission to post his column here today.

New Evidence Points to Hezbollah in Hariri Murder

New Evidence Points to Hezbollah in Hariri Murder. By Erich Follath
Der Spiegel, May 23, 2009

The United Nations special tribunal investigating the murder of former Lebanese Prime Minister Rafik al-Hariri has reached surprising new conclusions -- and it is keeping them secret. According to information obtained by SPIEGEL, investigators now believe Hezbollah was behind the Hariri murder.

A tribute to America's war heroes, past and present - Those Who Make Us Say 'Oh!'

Those Who Make Us Say 'Oh!'. By Peggy Noonan
A tribute to America's war heroes, past and present.
WSJ, May 25, 2009

More than most nations, America has been, from its start, a hero-loving place. Maybe part of the reason is that at our founding we were a Protestant nation and not a Catholic one, and so we made "saints" of civil and political figures. George Washington was our first national hero, known everywhere, famous to children. When he died, we had our first true national mourning, with cities and states re-enacting his funeral. There was the genius cluster that surrounded him, and invented us—Jefferson, Adams, Madison, Hamilton. Through much of the 20th century our famous heroes were in sports (Jack Dempsey, Joe Louis, the Babe, Joltin' Joe), the arts (Clark Gable, Robert Frost), business and philanthropy (from Andrew Carnegie to Bill Gates), and religion (Billy Graham). Nobody does fame like America, and they were famous.

The category of military hero—warrior—fell off a bit, in part because of the bad reputation of war. Some emerged of heroic size—Gens. Pershing and Patton, Eisenhower and Marshall. But somewhere in the 1960s I think we decided, or the makers of our culture decided, that to celebrate great warriors was to encourage war. And we always have too much of that. So they made a lot of movies depicting soldiers as victims and officers as brutish. This was especially true in the Vietnam era and the years that followed. Maybe a correction was in order: It's good to remember war is hell. But when we removed the warrior, we removed something intensely human, something ancestral and stirring, something celebrated naturally throughout the long history of man. Also it was ungrateful: They put themselves in harm's way for us.

For Memorial Day, then, three warriors, two previously celebrated but not so known now by the young.

Alvin York was born in 1887 into a Tennessee farming family that didn't have much, but nobody else did, so it wasn't so bad. He was the third of 11 children and had an average life for that time and place. Then World War I came. He experienced a crisis of conscience over whether to fight. His mother's Evangelical church tugged him toward more or less pacifist thinking, but he got a draft notice in 1917, joined the Army, went overseas, read and reread his Bible, and concluded that warfare was sometimes justified.

In the battle of the Argonne in October 1918, the allies were attempting to break German lines when York and his men came upon well-hidden machine guns on high ground. As he later put it, "The Germans got us, and they got us right smart . . . and I'm telling you they were shooting straight." American soldiers "just went down like the long grass before the mowing machine at home."

But Cpl. York and his men went behind the German lines, overran a unit, and captured the enemy. Suddenly there was new machine-gun fire from a ridge, and six Americans went down. York was in command, exposed but cool, and he began to shoot. "All I could do was touch the Germans off just as fast as I could. I was sharp shooting. . . . All the time I kept yelling at them to come down. I didn't want to kill any more than I had to." A German officer tried to empty his gun into York while York fired. He failed but York succeeded, the Germans surrendered, and York and his small band marched 132 German prisoners back to the American lines.

His Medal of Honor citation called him fearless, daring and heroic.

Warriors are funny people. They're often naturally peaceable, and often do great good when they return. York went home to Tennessee, married, founded an agricultural institute (it's still operating as an award-winning public high school) and a Bible school. They made a movie about him in 1941, the great Howard Hawks film "Sergeant York." If you are in Manhattan this week, you may walk down York Avenue on the Upper East Side. It was named for him. He died in Nashville in 1964 at 77.

Once, 25 years ago, my father (U.S. Army, replacement troops, Italy, 1945) visited Washington, a town he'd never been to. There was a lot to see: the White House, the Lincoln Memorial. But he just wanted to see one thing, Audie Murphy's grave.

Audie Leon Murphy was born in 1924 or 1926 (more on that in a moment), the sixth of 12 children of a Texas sharecropper. It was all hardscrabble for him: father left, mother died, no education, working in the fields from adolescence on. He was good with a hunting rifle: he said that when he wasn't, his family didn't eat, so yeah, he had to be good. He tried to join the Army after Pearl Harbor, was turned away as underage, came back the next year claiming to be 18 (he was probably 16) and went on to a busy war, seeing action as an infantryman in Sicily, Salerno and Anzio. Then came southern France, where the Germans made the mistake of shooting Audie Murphy's best friend, Lattie Tipton. Murphy wiped out the machine gun crew that did it.

On Jan. 26, 1945, Lt. Murphy was engaged in a battle in which his unit took heavy fire and he was wounded. He ordered his men back. From his Medal of Honor citation: "Behind him . . . one of our tank destroyers received a direct hit and began to burn. Its crew withdrew to the woods. 2d Lt. Murphy continued to direct artillery fire, which killed large numbers of the advancing enemy infantry. With the enemy tanks abreast of his position, 2d Lt. Murphy climbed on the burning tank destroyer, which was in danger of blowing up at any moment, and employed its .50 caliber machine gun against the enemy. He was alone and exposed to German fire from three sides, but his deadly fire killed dozens of Germans and caused their infantry attack to waver. The enemy tanks, losing infantry support, began to fall back."

Murphy returned to Texas a legend. He was also 5-foot-7, having grown two inches while away. He became an actor (44 films, mostly Westerns) and businessman. He died in a plane crash in 1971 and was buried with full honors at Arlington, but he did a warrior-like thing. He asked that the gold leaf normally put on the gravestone of a Medal of Honor recipient not be used. He wanted a plain GI headstone. Some worried this might make his grave harder to find. My father found it, and he was not alone. Audie Murphy's grave is the most visited site at Arlington with the exception of John F. Kennedy's eternal flame.

I thought of these two men the other night after I introduced at a dinner a retired Air Force general named Chuck Boyd. He runs Business Executives for National Security, a group whose members devote time and treasure to helping the government work through various 21st-century challenges. I mentioned that Chuck had been shot down over Vietnam on his 105th mission in April 1966 and was a POW for 2,488 days. He's the only former POW of the era to go on to become a four-star general.

When I said "2,488 days," a number of people in the audience went "Oh!" I heard it up on the podium. They didn't know because he doesn't talk about it, and when asked to, he treats it like nothing, a long night at a bad inn. Warriors always do that. They all deserve the "Oh!"

WSJ Editorial Page: Malaria, Politics and DDT - The U.N. bows to the anti-insecticide lobby

Malaria, Politics and DDT. WSJ Editorial
The U.N. bows to the anti-insecticide lobby.
WSJ, May 25, 2009

In 2006, after 25 years and 50 million preventable deaths, the World Health Organization reversed course and endorsed widespread use of the insecticide DDT to combat malaria. So much for that. Earlier this month, the U.N. agency quietly reverted to promoting less effective methods for attacking the disease. The result is a victory for politics over public health, and millions of the world's poor will suffer as a result.

The U.N. now plans to advocate for drastic reductions in the use of DDT, which kills or repels the mosquitoes that spread malaria. The aim "is to achieve a 30% cut in the application of DDT worldwide by 2014 and its total phase-out by the early 2020s, if not sooner," said WHO and the U.N. Environment Program in a statement on May 6.

Citing a five-year pilot program that reduced malaria cases in Mexico and South America by distributing antimalaria chloroquine pills to uninfected people, U.N. officials are ready to push for a "zero DDT world." Sounds nice, except for the facts. It's true that chloroquine has proven effective when used therapeutically, as in Brazil. But it's also true that scientists have questioned the safety of the drug as an oral prophylactic because it is toxic and has been shown to cause heart problems.

Most malarial deaths occur in sub-Saharan Africa, where chloroquine once worked but started failing in the 1970s as the parasite developed resistance. Even if the drugs were still effective in Africa, they're expensive and thus impractical for one of the world's poorest regions. That's not an argument against chloroquine, bed nets or other interventions. But it is an argument for continuing to make DDT spraying a key part of any effort to eradicate malaria, which kills about a million people -- mainly children -- every year. Nearly all of this spraying is done indoors, by the way, to block mosquito nesting at night. It is not sprayed willy-nilly in jungle habitat.

WHO is not saying that DDT shouldn't be used. But by revoking its stamp of approval, it sends a clear message to donors and afflicted countries that it prefers more politically correct interventions, even if they don't work as well. In recent years, countries like Uganda, Tanzania and Zambia have started or expanded DDT spraying, often with the help of outside aid groups. But these governments are also eager to remain in the U.N.'s good graces, and donors typically are less interested in funding interventions that WHO discourages.

"Sadly, WHO's about-face has nothing to do with science or health and everything to do with bending to the will of well-placed environmentalists," says Roger Bate of Africa Fighting Malaria. "Bed net manufacturers and sellers of less-effective insecticides also don't benefit when DDT is employed and therefore oppose it, often behind the scenes."

It's no coincidence that WHO officials were joined by the head of the U.N. Environment Program to announce the new policy. There's no evidence that spraying DDT in the amounts necessary to kill dangerous mosquitoes imperils crops, animals or human health. But that didn't stop green groups like the Pesticide Action Network from urging the public to celebrate World Malaria Day last month by telling "the U.S. to protect children and families from malaria without spraying pesticides like DDT inside people's homes."

"We must take a position based on the science and the data," said WHO's malaria chief, Arata Kochi, in 2006. "One of the best tools we have against malaria is indoor residual spraying. Of the dozen or so insecticides WHO has approved as safe for house spraying, the most effective is DDT." Mr. Kochi was right then, even if other WHO officials are now bowing to pressure to pretend otherwise.

The president of the Dallas Fed on inflation risk and central bank independence

Don't Monetize the Debt. By Mary Anastasia O'Grady
The president of the Dallas Fed on inflation risk and central bank independence.
WSJ, May 25, 2009

Dallas

From his perch high atop the palatial Dallas Federal Reserve Bank, overlooking what he calls "the most modern, efficient city in America," Richard Fisher says he is always on the lookout for rising prices. But that's not what's worrying the bank's president right now.

His bigger concern these days would seem to be what he calls "the perception of risk" that has been created by the Fed's purchases of Treasury bonds, mortgage-backed securities and Fannie Mae paper.

Mr. Fisher acknowledges that events in the financial markets last year required some unusual Fed action in the commercial lending market. But he says the longer-term debt, particularly the Treasurys, is making investors nervous. The looming challenge, he says, is to reassure markets that the Fed is not going to be "the handmaiden" to fiscal profligacy. "I think the trick here is to assist the functioning of the private markets without signaling in any way, shape or form that the Federal Reserve will be party to monetizing fiscal largess, deficits or the stimulus program."

The very fact that a Fed regional bank president has to raise this issue is not very comforting. It conjures up images of Argentina. And as Mr. Fisher explains, he's not the only one worrying about it. He has just returned from a trip to China, where "senior officials of the Chinese government grill[ed] me about whether or not we are going to monetize the actions of our legislature." He adds, "I must have been asked about that a hundred times in China."

A native of Los Angeles who grew up in Mexico, Mr. Fisher was educated at Harvard, Oxford and Stanford. He spent his earliest days in government at Jimmy Carter's Treasury. He says that taught him a life-long lesson about inflation. It was "inflation that destroyed that presidency," he says. He adds that he learned a lot from then Fed Chairman Paul Volcker, who had to "break [inflation's] back."

Mr. Fisher has led the Dallas Fed since 2005 and has developed a reputation as the Federal Open Market Committee's (FOMC) lead inflation worrywart. In September he told a New York audience that "rates held too low, for too long during the previous Fed regime were an accomplice to [the] reckless behavior" that brought about the economic troubles we are now living through. He also warned that the Treasury's $700 billion plan to buy toxic assets from financial institutions would be "one more straw on the back of the frightfully encumbered camel that is the federal government ledger."

In a speech at the Kennedy School of Government in February, he wrung his hands about "the very deep hole [our political leaders] have dug in incurring unfunded liabilities of retirement and health-care obligations" that "we at the Dallas Fed believe total over $99 trillion." In March, he is believed to have vociferously objected in closed-door FOMC meetings to the proposal to buy U.S. Treasury bonds. So with long-term Treasury yields moving up sharply despite Fed intentions to bring down mortgage rates, I've flown to Dallas to see what he's thinking now.

Regarding what caused the credit bubble, he repeats his assertion about the Fed's role: "It is human instinct when rates are low and the yield curve is flat to reach for greater risk and enhanced yield and returns." (Later, he adds that this is not to cast aspersions on former Fed Chairman Alan Greenspan and reminds me that these decisions are made by the FOMC.)

"The second thing is that the regulators didn't do their job, including the Federal Reserve." To this he adds what he calls unusual circumstances, including "the fruits and tailwinds of globalization, billions of people added to the labor supply, new factories and productivity coming from places it had never come from before." And finally, he says, there was the 'mathematization' of risk." Institutions were "building risk models" and relying heavily on "quant jocks" when "in the end there can be no substitute for good judgment."

What about another group of alleged culprits: the government-anointed rating agencies? Mr. Fisher doesn't mince words. "I served on corporate boards. The way rating agencies worked is that they were paid by the people they rated. I saw that from the inside." He says he also saw this "inherent conflict of interest" as a fund manager. "I never paid attention to the rating agencies. If you relied on them you got . . . you know," he says, sparing me the gory details. "You did your own analysis. What is clear is that rating agencies always change something after it is obvious to everyone else. That's why we never relied on them." That's a bit disconcerting since the Fed still uses these same agencies in managing its own portfolio.

I wonder whether the same bubble-producing Fed errors aren't being repeated now as Washington scrambles to avoid a sustained economic downturn.

He surprises me by siding with the deflation hawks. "I don't think that's the risk right now." Why? One factor influencing his view is the Dallas Fed's "trimmed mean" calculation, which looks at price changes of more than 180 items and excludes the extremes. Dallas researchers have found that "the price increases are less and less. Ex-energy, ex-food, ex-tobacco you've got some mild deflation here and no inflation in the [broader] headline index."
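For readers unfamiliar with the statistic, here is a minimal, equal-weight sketch of a trimmed-mean calculation (the actual Dallas Fed measure weights items by expenditure share and uses its own trim points; this toy version and its numbers are purely illustrative):

```python
# Illustrative equal-weight trimmed mean of item-level price changes.
def trimmed_mean(price_changes, trim_fraction=0.10):
    """Drop the largest and smallest `trim_fraction` of observations,
    then average what remains."""
    ordered = sorted(price_changes)
    k = int(len(ordered) * trim_fraction)
    kept = ordered[k:len(ordered) - k] if k > 0 else ordered
    return sum(kept) / len(kept)

# One extreme spike (say, energy) and one extreme drop get excluded,
# so the statistic reflects the quiet middle of the distribution.
changes = [-8.0, -0.5, -0.2, 0.0, 0.1, 0.3, 0.4, 0.6, 1.0, 12.0]
print(trimmed_mean(changes))  # averages the middle 8 items
```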

Mr. Fisher says he also has a group of about 50 CEOs around the U.S. and the world that he calls on, all off the record, before almost every FOMC meeting. "I don't impart any information, I just listen carefully to what they are seeing through their own eyes. And that gives me a sense of what's happening on the ground, you might say on Main Street as opposed to Wall Street."

It's good to know that a guy so obsessed with price stability doesn't see inflation on the horizon. But inflation and bubble trouble almost always get going before they are recognized. Moreover, the Fed has to pay attention to the 1978 Full Employment and Balanced Growth Act -- a.k.a. Humphrey-Hawkins -- and employment is a lagging indicator of economic activity. This could create a Fed bias in favor of inflating. So I push him again.

"I want to make sure that your readers understand that I don't know a single person on the FOMC who is rooting for inflation or who is tolerant of inflation." The committee knows very well, he assures me, that "you cannot have sustainable employment growth without price stability. And by price stability I mean that we cannot tolerate deflation or the ravages of inflation."

Mr. Fisher defends the Fed's actions that were designed to "stabilize the financial system as it literally fell apart and prevent the economy from imploding." Yet he admits that there is unfinished work. Policy makers have to be "always mindful that whatever you put in, you are going to have to take out at some point. And also be mindful that there are these perceptions [about the possibility of monetizing the debt], which is why I have been sensitive about the issue of purchasing Treasurys."

He returns to events on his recent trip to Asia, which besides China included stops in Japan, Hong Kong, Singapore and Korea. "I wasn't asked once about mortgage-backed securities. But I was asked at every single meeting about our purchase of Treasurys. That seemed to be the principal preoccupation of those that were invested with their surpluses mostly in the United States. That seems to be the issue people are most worried about."

As I listen I am reminded that it's not just the Asians who have expressed concern. In his Kennedy School speech, Mr. Fisher himself fretted about the U.S. fiscal picture. He acknowledges that he has raised the issue "ad nauseam" and doesn't apologize. "Throughout history," he says, "what the political class has done is they have turned to the central bank to print their way out of an unfunded liability. We can't let that happen. That's when you open the floodgates. So I hope and I pray that our political leaders will just have to take this bull by the horns at some point. You can't run away from it."

Voices like Mr. Fisher's can be a problem for the politicians, which may be why recently there have been rumblings in Washington about revoking the automatic FOMC membership that comes with being a regional bank president. Does Mr. Fisher have any thoughts about that?

This is nothing new, he points out, briefly reviewing the history of the political struggle over monetary policy in the U.S. "The reason why the banks were put in the mix by [President Woodrow] Wilson in 1913, the reason it was structured the way it was structured, was so that you could offset the political power of Washington and the money center in New York with the regional banks. They represented Main Street.

"Now we have this great populist fervor and the banks are arguing for Main Street, largely. I have heard these arguments before and studied the history. I am not losing a lot of sleep over it," he says with a defiant Texas twang that I had not previously detected. "I don't think that it'd be the best signal to send to the market right now that you want to totally politicize the process."

Speaking of which, Texas bankers don't have much good to say about the Troubled Asset Relief Program (TARP), according to Mr. Fisher. "It's been complicated by the politics because you have a special investigator, special prosecutor, and all I can tell you is that in my district here most of the people who wanted in on the TARP no longer want in on the TARP."

At heart, Mr. Fisher says he is an advocate for letting markets clear on their own. "You know that I am a big believer in Schumpeter's creative destruction," he says, referring to the term coined by the late Austrian economist. "The destructive part is always painful, politically messy, it hurts like hell but you hopefully will allow the adjustments to be made so that the creative part can take place." Texas went through that process in the 1980s, he says, and came back stronger.

This is doubtless why, with Washington taking on a larger role in the American economy every day, the worries linger. On the wall behind his desk is a 1907 gouache painting by Antonio De Simone of the American steam sailing vessel Varuna plowing through stormy seas. Just like most everything else on the walls, bookshelves and table tops around his office -- and even the dollar-sign cuff links he wears to work -- it represents something.

He says that he has had this painting behind his desk for the past 30 years as a reminder of the importance of purpose and duty in rough seas. "The ship," he explains, "has to maintain its integrity." What is more, "no mathematical model can steer you through the kind of seas in that picture there. In the end someone has the wheel." He adds: "On monetary policy it's the Federal Reserve."

Ms. O'Grady writes the Journal's Americas column.

Interrogations and Presidential Prerogative - The Executive and substantial discretionary powers

Interrogations and Presidential Prerogative. By Walter Berns
The Founders created an executive with substantial discretionary powers.
WSJ, May 25, 2009

Recently, an Episcopal church in Bethesda, Md., displayed a banner with the following words: "God bless everyone (no exceptions)." I confessed to the rector of my own church that, try as I might, I simply could not obey this injunction. Judging by what he had to say about "enhanced" interrogations, Sen. Lindsey Graham (R., S.C.) seems not to share my difficulty.

Mr. Graham believes that we're either a rule-of-law nation or we're not, and no exceptions. "I don't love the terrorists. I just love what Americans stand for," he said in an interview with Newsweek in 2006. His point was that our definitions of torture should not vary with the sort of person being questioned -- terrorists, for example, or merely prisoners of war.

Mr. Graham's position is similar to the one taken by Chief Justice Roger Brooke Taney during the Civil War. In 1861, Confederate sympathizers in Maryland were burning railroad bridges, tearing up their tracks, and attacking federal troops so as to prevent them from reaching the national capital. Since local officials did nothing about this, Abraham Lincoln did. He ordered the military to suspend the writ of habeas corpus, which led to the arrest and imprisonment of John Merryman, a leader of the sympathizers.

Chief Justice Taney ruled in Ex Parte Merryman (1861) that only Congress could suspend the writ of habeas corpus and ordered Merryman released. Lincoln disobeyed the order, believing that the executive must sometimes do things it would not do in ordinary times. Would he have done this if the issue had been the interrogation of terrorists? Does the law have something to say about this?

And would Taney and Graham find support for their views in the writings of our Founders or their philosophical mentors, particularly John Locke, the 17th century Englishman sometimes referred to as "America's philosopher"? Locke is the source of our attachment to the rule of law and the priority of the legislative power.

Locke argued in the Second Treatise of Civil Government that the "first and fundamental law is the establishment of the legislative power." And so it is that the first article of the U.S. Constitution is devoted to the legislative power. There is safety in law, he said; the law is "promulgated and known to the people," and everyone without exception is subject to it.

But Locke admitted that not everything can be done by law. Or, as he said, there are many things "which the law can by no means provide for." The law cannot "foresee" events, for example, nor can it act with dispatch or with the appropriate subtlety required when dealing with foreign powers. Nor, as we know very well indeed, can a legislative body preserve secrecy.

Such matters, Locke continued in the Second Treatise, should be left to "the discretion of him who has the executive power." It is in this context that he first spoke of the "prerogative": the "power to act according to discretion, for the public good without the prescription of the law, and sometimes even against it." He concluded by saying "prerogative is nothing but the power of doing public good without a rule" (italics in the original).

Did the Framers find a place in our Constitution for this extraordinary power? What, if anything, did they say on the subject or, perhaps more tellingly, what did they not say?

They said nothing about a prerogative or -- apart from the habeas corpus provision -- anything suggesting a need for it. But they provided for an executive significantly different from -- and significantly more powerful than -- the executives provided for in the early state constitutions of the revolutionary era. This new executive is, first of all, a single person, and, as the Constitution has it, "he shall be Commander in Chief of the Army and Navy." This is no mean power; Lincoln used it to imprison insurgents and to free the slaves.

The Framers seemed to be aware of what they were doing when they established the office. I draw this conclusion from their reaction when the office was first proposed.

According to the "Records of the Federal Convention of 1787," on June 1, a mere two weeks into the life of the convention, James Wilson "moved that the Executive consist in a single person." Charles Pinckney seconded the motion. Then, "a considerable pause" ensued, and the chairman asked if he should put the question. "Doc Franklin observed that it was a point of great importance and wished that the gentlemen would deliver their sentiments on it before the question was put and Mr. Rutledge animadverted on the shyness of gentlemen. . . ."

Why the silence? Why were they shy? Apparently because the proposal was so radically different from the executives provided in the state constitutions (and because there was no executive whatsoever under the Articles of Confederation). All of the state constitutions (except New York's), and especially those written in the years 1776-78, included "almost every conceivable provision for reducing the executive to a position of complete subordination," as Charles C. Thach Jr. noted in "The Creation of the Presidency, 1775-1789." The gentlemen were also shy because the provision for a single executive reminded them of George III and of what he had done.

This new, single executive is also required to take an oath to "preserve, protect and defend the Constitution of the United States." This was the provision of his oath President George W. Bush used to capture, hold and interrogate terrorists.

Questions arise: Was the Constitution or, better, the nation actually in jeopardy after 9/11? Was Mr. Bush entitled to imprison the terrorists in Guantanamo? Were the interrogations justified? Were they more severe than necessary? Did they prove useful in protecting the nation and its citizens? These are the sorts of questions Locke may have had in mind in his chapter on the prerogative. Who, he then asked, shall be judge whether "this power is made right use of?" Initially, of course, the executive but, ultimately, the people.

The executive in our case, at least to begin with, is represented by the three Justice Department officials who wrote the memos that Mr. Graham and many members of the Obama administration have found offensive. They have been accused of justifying torture, but they have not yet been given the opportunity in an official setting or forum to defend what they did.

That forum could be a committee of Congress or a "truth commission" -- so long as, in addition to the assistance of counsel, they would be judged by "an impartial jury," have the right to call witnesses in their favor, to call for the release of evidence including the CIA memos showing the success of enhanced interrogations, and the right to "confront the witnesses" against them as the Constitution's Fifth and Sixth Amendments provide. There is much to be said for a process that, among other things, would require Nancy Pelosi to testify under oath.

Mr. Berns is a resident scholar at the American Enterprise Institute.

Moral Hazard and the Meltdown: Everybody felt too big to fail

Moral Hazard and the Meltdown. By Scott Harrington
Everybody felt too big to fail.
WSJ, May 25, 2009