Showing posts with label state theory. Show all posts

Sunday, July 26, 2015

International Courts and the New Paternalism - African leaders are the targets because ambitious jurists consider them to be 'low-hanging fruit'

International Courts and the New Paternalism. By Jendayi Frazer
African leaders are the targets because ambitious jurists consider them to be ‘low-hanging fruit.’
http://www.wsj.com/articles/international-courts-and-the-new-paternalism-1437778048
WSJ, July 24, 2015 6:47 p.m. ET
Nairobi, Kenya

President Obama arrived in Kenya on Friday and will travel from here to Ethiopia, two crucial U.S. allies in East Africa. The region is not only emerging as an economic powerhouse, it is also an important front in the battle with al Qaeda, al-Shabaab, Islamic State and other Islamist radicals.

Yet grievances related to how the International Criminal Court’s universal jurisdiction is applied in Africa are interfering with U.S. and European relations on the continent. In Africa there are accusations of neocolonialism and even racism in ICC proceedings, and a growing consensus that Africans are being unjustly indicted by the court.

It wasn’t supposed to be this way. After the failure to prevent mass atrocities in Europe and Africa in the 1990s, a strong consensus emerged that combating impunity had to be an international priority. Ad hoc United Nations tribunals were convened to judge the masterminds of genocide and crimes against humanity in Yugoslavia, Rwanda and Sierra Leone. These courts were painfully slow and expensive. But their mandates were clear and limited, and they helped countries to turn the page and focus on rebuilding.

Soon universal jurisdiction was seen not only as a means to justice, but also as a tool for preventing atrocities in the first place. Several countries in Western Europe, including Spain, the United Kingdom, Belgium and France, empowered their national courts with universal jurisdiction. In 2002 the Rome Statute entered into force, establishing the International Criminal Court.

Africa and Europe were early adherents and today constitute the bulk of ICC membership. But India, China, Russia and most of the Middle East—representing well over half the world’s population—stayed out. So did the United States. Leaders in both parties worried that an unaccountable supranational court would become a venue for politicized show trials. The track record of the ICC and European courts acting under universal jurisdiction has amply borne out these concerns.

Only when U.S. Defense Secretary Donald Rumsfeld threatened to move NATO headquarters out of Brussels in 2003 did Belgium rein in efforts to indict former President George H.W. Bush, and Gens. Colin Powell and Tommy Franks, for alleged “war crimes” during the 1990-91 Gulf War. Spanish courts have indicted American military personnel in Iraq and investigated the U.S. detention facility in Guantanamo Bay.

But with powerful states able to shield themselves and their clients, Africa has borne the brunt of indictments. Far from pursuing justice for victims, these courts have become a venue for public-relations exercises by activist groups. Within African countries, they have been manipulated by one political faction to sideline another, often featuring in electoral politics.

The ICC’s recent indictments of top Kenyan officials are a prime example. In October 2014, Kenyan President Uhuru Kenyatta became the first sitting head of state to appear before the ICC, though he took the extraordinary step of temporarily transferring power to his deputy to avoid the precedent. ICC prosecutors indicted Mr. Kenyatta in connection with Kenya’s post-election ethnic violence of 2007-08, in which some 1,200 people were killed.

Last December the ICC withdrew all charges against Mr. Kenyatta, saying the evidence had “not improved to such an extent that Mr Kenyatta’s alleged criminal responsibility can be proven beyond reasonable doubt.” As U.S. assistant secretary of state for African affairs from 2005-09, and the point person during Kenya’s 2007-08 post-election violence, I knew the ICC indictments were purely political. The court’s decision to continue its case against Kenya’s deputy president, William Ruto, reflects a degree of indifference and even hostility to Kenya’s efforts to heal its political divisions.

The ICC’s indictments in Kenya began with former chief prosecutor Luis Moreno-Ocampo’s determination to prove the court’s relevance in Africa by going after what he reportedly called “low-hanging fruit”: African political and military leaders unable to resist ICC jurisdiction.

More recently, the arrest of Rwandan chief of intelligence Lt. Gen. Emmanuel Karenzi Karake in London last month drew a unanimous reproach from the African Union’s Peace and Security Council. The warrant dates to a 2008 Spanish indictment for alleged reprisal killings following the 1994 Rwandan genocide. At the time of the indictment, Mr. Karenzi Karake was deputy commander of the joint U.N.-African Union peacekeeping operation in Darfur. The Rwandan troops under his command were the backbone of the Unamid force, and his performance in Darfur was by all accounts exemplary.

Moreover, a U.S. government interagency review conducted in 2007-08, when I led the State Department’s Bureau of African Affairs, found that the Spanish allegations against Mr. Karenzi Karake were false and unsubstantiated. The U.S. fully backed his reappointment in 2008 as deputy commander of Unamid forces. It would be a travesty of justice if the U.K. were to extradite Mr. Karake to Spain to stand trial.

Sadly, the early hope of “universal jurisdiction” ending impunity for perpetrators of genocide and crimes against humanity has given way to cynicism, both in Africa and the West. In Africa it is believed that, in the rush to demonstrate their power, these courts and their defenders have been too willing to brush aside considerations of due process that they defend at home.

In the West, the cynicism is perhaps even more damaging because it calls into question the moral capabilities of Africans and their leaders, and revives the language of paternalism and barbarism of earlier generations.

Ms. Frazer, a former U.S. ambassador to South Africa (2004-05) and assistant secretary of state for African affairs (2005-09), is an adjunct senior fellow for Africa studies at the Council on Foreign Relations.

Saturday, May 30, 2015

Magna Carta: Eight Centuries of Liberty

June marks the 800th anniversary of Magna Carta, the ‘Great Charter’ that established the rule of law for the English-speaking world. Its revolutionary impact still resounds today, writes Daniel Hannan

http://www.wsj.com/articles/magna-carta-eight-centuries-of-liberty-1432912022 

King John, pressured by English barons, reluctantly signs Magna Carta, the ‘Great Charter,’ on the Thames riverbank, Runnymede, June 15, 1215, as rendered in James Doyle’s ‘A Chronicle of England.’ Photo: Mary Evans Picture Library/Everett Collection

Eight hundred years ago next month, on a reedy stretch of riverbank in southern England, the most important bargain in the history of the human race was struck. I realize that’s a big claim, but in this case, only superlatives will do. As Lord Denning, the most celebrated modern British jurist, put it, Magna Carta was “the greatest constitutional document of all time, the foundation of the freedom of the individual against the arbitrary authority of the despot.”

It was at Runnymede, on June 15, 1215, that the idea of the law standing above the government first took contractual form. King John accepted that he would no longer get to make the rules up as he went along. From that acceptance flowed, ultimately, all the rights and freedoms that we now take for granted: uncensored newspapers, security of property, equality before the law, habeas corpus, regular elections, sanctity of contract, jury trials.

Magna Carta is Latin for “Great Charter.” It was so named not because the men who drafted it foresaw its epochal power but because it was long. Yet, almost immediately, the document began to take on a political significance that justified the adjective in every sense.

The bishops and barons who had brought King John to the negotiating table understood that rights required an enforcement mechanism. The potency of a charter is not in its parchment but in the authority of its interpretation. The constitution of the U.S.S.R., to pluck an example more or less at random, promised all sorts of entitlements: free speech, free worship, free association. But as Soviet citizens learned, paper rights are worthless in the absence of mechanisms to hold rulers to account.

Magna Carta instituted a form of conciliar rule that was to develop directly into the Parliament that meets at Westminster today. As the great Victorian historian William Stubbs put it, “the whole constitutional history of England is little more than a commentary on Magna Carta.”

And not just England. Indeed, not even England in particular. Magna Carta has always been a bigger deal in the U.S. The meadow where the abominable King John put his royal seal to the parchment lies in my electoral district in the county of Surrey. It went unmarked until 1957, when a memorial stone was finally raised there—by the American Bar Association.

Only now, for the anniversary, is a British monument being erected at the place where freedom was born. After some frantic fundraising by me and a handful of local councilors, a large bronze statue of Queen Elizabeth II will gaze out across the slow, green waters of the Thames, marking 800 years of the Crown’s acceptance of the rule of law.

Eight hundred years is a long wait. We British have, by any measure, been slow to recognize what we have. Americans, by contrast, have always been keenly aware of the document, referring to it respectfully as the Magna Carta.

Why? Largely because of who the first Americans were. Magna Carta was reissued several times throughout the 14th and 15th centuries, as successive Parliaments asserted their prerogatives, but it receded from public consciousness under the Tudors, whose dynasty ended with the death of Elizabeth I in 1603.

In the early 17th century, members of Parliament revived Magna Carta as a weapon in their quarrels with the autocratic Stuart monarchs. Opposition to the Crown was led by the brilliant lawyer Edward Coke (pronounced Cook), who drafted the first Virginia Charter in 1606. Coke’s argument was that the king was sidelining Parliament, and so unbalancing the “ancient constitution” of which Magna Carta was the supreme expression.
United for the first time, the four surviving original Magna Carta manuscripts are prepared for display at the British Library, London, Feb. 1, 2015. Photo: UPPA/ZUMA PRESS

The early settlers arrived while these rows were at their height and carried the mania for Magna Carta to their new homes. As early as 1637, Maryland sought permission to incorporate Magna Carta into its basic law, and the first edition of the Great Charter was published on American soil in 1687 by William Penn, who explained that it was what made Englishmen unique: “In France, and other nations, the mere will of the Prince is Law, his word takes off any man’s head, imposeth taxes, or seizes any man’s estate, when, how and as often as he lists; But in England, each man hath a fixed Fundamental Right born with him, as to freedom of his person and property in his estate, which he cannot be deprived of, but either by his consent, or some crime, for which the law has imposed such a penalty or forfeiture.”

There was a divergence between English and American conceptions of Magna Carta. In the Old World, it was thought of, above all, as a guarantor of parliamentary supremacy; in the New World, it was already coming to be seen as something that stood above both Crown and Parliament. This difference was to have vast consequences in the 1770s.

The American Revolution is now remembered on both sides of the Atlantic as a national conflict—as, indeed, a “War of Independence.” But no one at the time thought of it that way—not, at any rate, until the French became involved in 1778. Loyalists and patriots alike saw it as a civil war within a single polity, a war that divided opinion every bit as much in Great Britain as in the colonies.

The American Revolutionaries weren’t rejecting their identity as Englishmen; they were asserting it. As they saw it, George III was violating the “ancient constitution” just as King John and the Stuarts had done. It was therefore not just their right but their duty to resist, in the words of the delegates to the first Continental Congress in 1774, “as Englishmen our ancestors in like cases have usually done.”

Nowhere, at this stage, do we find the slightest hint that the patriots were fighting for universal rights. On the contrary, they were very clear that they were fighting for the privileges bestowed on them by Magna Carta. The concept of “no taxation without representation” was not an abstract principle. It could be found, rather, in Article 12 of the Great Charter: “No scutage or aid is to be levied in our realm except by the common counsel of our realm.” In 1775, Massachusetts duly adopted as its state seal a patriot with a sword in one hand and a copy of Magna Carta in the other.

I recount these facts to make an important, if unfashionable, point. The rights we now take for granted—freedom of speech, religion, assembly and so on—are not the natural condition of an advanced society. They were developed overwhelmingly in the language in which you are reading these words.

When we call them universal rights, we are being polite. Suppose World War II or the Cold War had ended differently: There would have been nothing universal about them then. If they are universal rights today, it is because of a series of military victories by the English-speaking peoples.

Various early copies of Magna Carta survive, many of them in England’s cathedrals, tended like the relics that were removed during the Reformation. One hangs in the National Archives in Washington, D.C., next to the two documents it directly inspired: the Declaration of Independence and the Constitution. Another enriches the Australian Parliament in Canberra.

But there are only four 1215 originals. One of them, normally housed at Lincoln Cathedral, has recently been on an American tour, resting for some weeks at the Library of Congress. It wasn’t that copy’s first visit to the U.S. The same parchment was exhibited in New York at the 1939 World’s Fair, attracting an incredible 13 million visitors. World War II broke out while it was still on display, and it was transferred to Fort Knox for safekeeping until the end of the conflict.

Could there have been a more apt symbol of what the English-speaking peoples were fighting for in that conflagration? Think of the world as it stood in 1939. Constitutional liberty was more or less confined to the Anglosphere. Everywhere else, authoritarianism was on the rise. Our system, uniquely, elevated the individual over the state, the rules over the rulers.

When the 18th-century statesman Pitt the Elder described Magna Carta as England’s Bible, he was making a profound point. It is, so to speak, the Torah of the English-speaking peoples: the text that sets us apart while at the same time speaking truths to the rest of mankind.

The very success of Magna Carta makes it hard for us, 800 years on, to see how utterly revolutionary it must have appeared at the time. Magna Carta did not create democracy: Ancient Greeks had been casting differently colored pebbles into voting urns while the remote fathers of the English were grubbing about alongside pigs in the cold soil of northern Germany. Nor was it the first expression of the law: There were Sumerian and Egyptian law codes even before Moses descended from Sinai.

What Magna Carta initiated, rather, was constitutional government—or, as the terse inscription on the American Bar Association’s stone puts it, “freedom under law.”

It takes a real act of imagination to see how transformative this concept must have been. The law was no longer just an expression of the will of the biggest guy in the tribe. Above the king brooded something more powerful yet—something you couldn’t see or hear or touch or taste but that bound the sovereign as surely as it bound the poorest wretch in the kingdom. That something was what Magna Carta called “the law of the land.”

This phrase is commonplace in our language. But think of what it represents. The law is not determined by the people in government, nor yet by clergymen presuming to interpret a holy book. Rather, it is immanent in the land itself, the common inheritance of the people living there.

The idea of the law coming up from the people, rather than down from the government, is a peculiar feature of the Anglosphere. Common law is an anomaly, a beautiful, miraculous anomaly. In the rest of the world, laws are written down from first principles and then applied to specific disputes, but the common law grows like a coral, case by case, each judgment serving as the starting point for the next dispute. In consequence, it is an ally of freedom rather than an instrument of state control. It implicitly assumes residual rights.

And indeed, Magna Carta conceives rights in negative terms, as guarantees against state coercion. No one can put you in prison or seize your property or mistreat you other than by due process. This essentially negative conception of freedom is worth clinging to in an age that likes to redefine rights as entitlements—the right to affordable health care, the right to be forgotten and so on.

It is worth stressing, too, that Magna Carta conceived freedom and property as two expressions of the same principle. The whole document can be read as a lengthy promise that the goods of a free citizen will not be arbitrarily confiscated by someone higher up the social scale. Even the clauses that seem most remote from modern experience generally turn out, in reality, to be about security of ownership.

There are, for example, detailed passages about wardship. King John had been in the habit of marrying heiresses to royal favorites as a way to get his hands on their estates. The abstruse-sounding articles about inheritance rights are, in reality, simply one more expression of the general principle that the state may not expropriate without due process.

Those who stand awe-struck before the Great Charter expecting to find high-flown phrases about liberty are often surprised to see that a chunk of it is taken up with the placing of fish-traps on the Thames. Yet these passages, too, are about property, specifically the freedom of merchants to navigate inland waterways without having arbitrary tolls imposed on them by fish farmers.

Liberty and property: how naturally those words tripped, as a unitary concept, from the tongues of America’s Founders. These were men who had been shaped in the English tradition, and they saw parliamentary government not as an expression of majority rule but as a guarantor of individual freedom. How different was the Continental tradition, born 13 years later with the French Revolution, which saw elected assemblies as the embodiment of what Rousseau called the “general will” of the people.

In that difference, we may perhaps discern explanation of why the Anglosphere resisted the chronic bouts of authoritarianism to which most other Western countries were prone. We who speak this language have always seen the defense of freedom as the duty of our representatives and so, by implication, of those who elect them. Liberty and democracy, in our tradition, are not balanced against each other; they are yoked together.

In February, the four surviving original copies of Magna Carta were united, for just a few hours, at the British Library—something that had not happened in 800 years. As I stood reverentially before them, someone recognized me and posted a photograph on Twitter with the caption: “If Dan Hannan gets his hands on all four copies of Magna Carta, will he be like Sauron with the Rings?”

Yet the majesty of the document resides in the fact that it is, so to speak, a shield against Saurons. Most other countries have fallen for, or at least fallen to, dictators. Many, during the 20th century, had popular communist parties or fascist parties or both. The Anglosphere, unusually, retained a consensus behind liberal capitalism.

This is not because of any special property in our geography or our genes but because of our constitutional arrangements. Those constitutional arrangements can take root anywhere. They explain why Bermuda is not Haiti, why Hong Kong is not China, why Israel is not Syria.

They work because, starting with Magna Carta, they have made the defense of freedom everyone’s responsibility. Americans, like Britons, have inherited their freedoms from past generations and should not look to any external agent for their perpetuation. The defense of liberty is your job and mine. It is up to us to keep intact the freedoms we inherited from our parents and to pass them on securely to our children.

Mr. Hannan is a British member of the European Parliament for the Conservative Party, a columnist for the Washington Examiner and the author of “Inventing Freedom: How the English-speaking Peoples Made the Modern World.”

Friday, April 3, 2015

The Federal President would not stay in power if he did not talk human rights. So look at it as a political imperative.

Joe Biden on Human Rights
The Vice President tells China’s leaders to ignore the U.S.
WSJ, Apr 01, 2015

White House officials can be oddly candid in talking to their liberal friends at the New Yorker magazine. That’s where an unnamed official in 2011 boasted of “leading from behind,” and where last year President Obama dismissed Islamic State as a terrorist “jayvee team.” Now the U.S. Vice President has revealed the Administration line on human rights in China.

In the April 6 issue, Joe Biden recounts meeting Xi Jinping months before his 2012 ascent to be China’s supreme leader. Mr. Xi asked him why the U.S. put “so much emphasis on human rights.” The right answer is simple: No government has the right to deny its citizens basic freedoms, and those that do tend also to threaten peace overseas, so U.S. support for human rights is a matter of values and interests.

Instead, Mr. Biden downplayed U.S. human-rights rhetoric as little more than political posturing. “No president of the United States could represent the United States were he not committed to human rights,” he told Mr. Xi. “President Barack Obama would not be able to stay in power if he did not speak of it. So look at it as a political imperative.” Then Mr. Biden assured China’s leader: “It doesn’t make us better or worse. It’s who we are. You make your decisions. We’ll make ours.” [not the WSJ's emphasis.]

Mr. Xi took the advice. Since taking office he has detained more than 1,000 political prisoners, from anticorruption activist Xu Zhiyong to lawyer Pu Zhiqiang and journalist Gao Yu. He has cracked down on Uighurs in Xinjiang, banning more Muslim practices and jailing scholar-activist Ilham Tohti for life. Anti-Christian repression and Internet controls are tightening. Nobel Peace laureate Liu Xiaobo remains in prison, his wife Liu Xia under illegal house arrest for the fifth year. Lawyer Gao Zhisheng left prison in August but is blocked from receiving medical care overseas. Hong Kong, China’s most liberal city, is losing its press freedom and political autonomy.

Amid all of this Mr. Xi and his government have faced little challenge from Washington. That is consistent with Hillary Clinton’s 2009 statement that human rights can’t be allowed to “interfere” with diplomacy on issues such as the economy and the environment. Mr. Obama tried walking that back months later, telling the United Nations that democracy and human rights aren’t “afterthoughts.” But his Administration’s record—and now Mr. Biden’s testimony—prove otherwise.

Monday, October 28, 2013

When he was in power, he was unreasonable and arrogant and considered citizens' rights and the law to be nothing

Rejection of Bo Xilai's Appeal Concludes Chinese Drama. By Jeremy Page
'This Is the Final Verdict,' Court Says on Widely Expected Ruling
Wall Street Journal, Oct. 25, 2013 9:51 a.m. ET
http://online.wsj.com/news/articles/SB10001424052702304799404579157354280260862



Edited:

Mr. Bo burnished his political reputation there by presiding over a sweeping campaign against organized crime that many lawyers and rights activists say disregarded legal norms and [other things we won't mention in this blog.]

"When he was in power, he was unreasonable and arrogant and considered citizens' rights and the law to be nothing," wrote Zhou Yongkun, a professor at Suzhou University's law school, on his microblog.

"As soon as he became a prisoner, he realized the importance of rights, and that the law was his umbrella. But it was too late."

Wednesday, October 23, 2013

Hong Kong's Policies of Impoverishment - A poverty line is another step on Hong Kong's road to serfdom

Hong Kong's Policies of Impoverishment. WSJ Editorial
A poverty line is another step on Hong Kong's road to serfdom.
WSJ, Oct. 14, 2013 1:02 p.m. ET
http://online.wsj.com/news/articles/SB10001424052702304106704579134973249439240

Hong Kong's decision to create a poverty line puts us in mind of John Cowperthwaite, financial secretary from 1961-71 and one of the chief architects of the territory's free-market system. Sir John famously refused to collect basic economic data on the grounds that statistics only increased the temptation for government to meddle. An arbitrary measure of poverty is a perfect example, since it encourages policies that will undermine the social mobility and economic growth needed to reduce poverty.

Hong Kong's new poverty line was set at one half the median income, which means that 20% of the population is considered poor. The most obvious objection to such a cut-off is that the number of poor will remain relatively stable regardless of their real conditions. If the government gives out money, this will tend to raise the median income and hence the poverty line, necessitating yet more handouts.

Then there's the problem of using income to measure poverty, since many residents, especially the elderly, live on their savings. Those without savings may rely on help from family members. So while poverty is a real problem in Hong Kong that deserves attention, this poverty line is a crude attempt to quantify it.

Nevertheless, many politicians in both the pro-Beijing and pro-democracy camps are eager to expand Hong Kong's small welfare state, and they will no doubt use this new tool to lobby for more benefits. Also, in 2011 a minimum wage came into effect, with the reassurance that it was set low enough to minimize job losses. Now the poverty line is a talking point for raising the minimum wage.

Those in favor of tempering Hong Kong's capitalism with socialist institutions common in the West often argue that they will do less harm since the territory's population has a strong work ethic and the government budget is in surplus. They little consider that these are the results of Sir John's laissez faire framework.

Ironically, the Chinese Communist Party appreciates Hong Kong's capitalist strengths more than local leaders. In the 1990s, after the last British Governor Chris Patten increased social welfare spending 88% in five years, Chinese diplomats warned that "Eurosocialist" policies were like "putting people on a F1 racing car which runs so fast it crashes and kills all its passengers."

Zhou Nan, Beijing's representative in the territory, complained, "The price of the future Special Administrative Region government being forced to live beyond its means would be budgetary imbalance, tax hikes, reduced financial market liquidity which will result in eroded foreign investors' confidence." Sir John couldn't have said it better himself.

Mustafa Alani: "We are learning from our enemies now how to treat the United States."

Our Former Friends the Saudis. WSJ Editorial
So how is that vow to repair America's frayed alliances working out?
Oct. 22, 2013 7:13 p.m. ET
http://online.wsj.com/news/articles/SB10001424052702303902404579151573907253280

President Obama likes to boast that he has repaired U.S. alliances supposedly frayed and battered by the Bush Administration. He should try using that line with our former allies in Saudi Arabia.

As the Journal's Ellen Knickmeyer has reported from Riyadh in recent weeks, the Kingdom is no longer making any secret of its disgust with the Administration's policy drift in the Middle East. Last month, Prince Turki al Faisal, the former Saudi ambassador in Washington, offered his view on the deal Washington struck with Moscow over Syria's chemical weapons.

"The current charade of international control over Bashar's chemical arsenal," the Prince told a London audience, "would be funny if it were not so blatantly perfidious, and designed not only to give Mr. Obama an opportunity to back down, but also to help Assad butcher his people." It's a rare occasion when a Saudi royal has the moral standing to lecture an American President, but this was one of them.

On Monday, Ms. Knickmeyer reported that Saudi intelligence chief Prince Bandar has decided to downgrade ties with the CIA in training Syrian rebels, preferring instead to work with the French and Jordanians. It's a rare day, too, when those two countries make for better security partners than the U.S. But even French Socialists are made of sterner stuff than this Administration.

Bandar's decision means the Saudis will not be inclined to bow any longer to U.S. demands to limit the arms they provide the rebels, including surface-to-air missiles that could potentially be used by terrorists to bring down civilian planes. The Saudis have also told the U.S. they will no longer favor U.S. defense contractors in future arms deals—no minor matter coming from a country that in 2011 bought $33.4 billion of American weapons.

Riyadh's dismay has been building for some time. In the aborted build-up to a U.S. strike on Syria, the Saudis asked the U.S. to beef up its naval presence in the Persian Gulf against a potential Iranian counter-strike, only to be told the U.S. didn't have the ships. In last year's foreign policy debate with Mitt Romney, Mr. Obama was nonchalant about America's shrinking Navy, but this is one of the consequences of our diminishing military footprint: U.S. security guarantees are no longer credible.

Then there is Iran. Even more than Israel, the Saudis have been pressing the Administration to strike Iran's nuclear targets while there's still time. Now Riyadh is realizing that Mr. Obama's diplomacy is a journey with no destination, that there are no real red lines, and that any foreign adversary can call his bluff. Nobody should be surprised if the Saudis conclude they need nukes of their own—probably purchased from Pakistan—as pre-emptive deterrence against the inevitability of a nuclear Tehran.

The Saudis are hardly the first U.S. ally to be burned by an American President more eager to court enemies than reassure friends. The Poles and Czechs found that out when Mr. Obama withdrew ballistic-missile defense sites from their country in 2009 as a way of appeasing the Russians.

The Syrian people have learned the hard way that Mr. Obama does not mean what he says about punishing the use of chemical weapons or supplying moderate rebel factions with promised military equipment. And the Israelis are gradually realizing that their self-advertised "best friend" in the White House will jump into any diplomatic foxhole rather than act in time to stop an Iranian bomb.

Now the Saudis have figured it out, too, and at least they're not afraid to say it publicly. "They [the Americans] are going to be upset—and we can live with that," Saudi security analyst Mustafa Alani told Ms. Knickmeyer last month. "We are learning from our enemies now how to treat the United States."

Tuesday, August 27, 2013

Review of Thomas Healy's The Great Dissent

What Democracy Requires. By Joshua Hawley
Justice Holmes changed his mind about free speech—and rediscovered the original intent of the First Amendment.
The Wall Street Journal, August 23, 2013, on page C5
http://online.wsj.com/article/SB10001424127887324108204579022881137648134.html

In the working sections of the Supreme Court building in Washington, D.C., the quiet places where the justices have their chambers and the staffs go about their work, portraits of the former members of the court peer out from almost every room and hallway. I used to find myself, when I worked there some years ago, pausing beneath the past luminaries and wondering what they might have to say about the court's current cases.

I never got very far with Oliver Wendell Holmes (1841-1935). His portrait didn't invite inquiry. He sat straight-backed in his judicial robes, his lips pursed beneath a virile white mustache, eyes boring directly ahead. He conveyed simultaneously grandeur and skepticism, as if he might interrupt you at any moment to say, "That's nonsense." This is Holmes in his Solomonic pose, the man hailed as the "Master of Sentences," lionized in an early biography as the "Yankee from Olympus," his life made the subject of a 1950s Hollywood film. It was an image that Holmes spent nearly the whole of his adult life cultivating, driven on by his galloping ambition. "I should like to be admitted," he told a correspondent in 1912, "as the greatest jurist in the world."

Holmes would surely have approved of Thomas Healy's "The Great Dissent." The subtitle conveys the narrative's gist: "How Oliver Wendell Holmes Changed His Mind—and Changed the History of Free Speech in America." Mr. Healy recounts Holmes's emergence late in his career as a champion of free speech and tells the story of the coterie of young intellectuals, led by Felix Frankfurter and Harold Laski, who worked assiduously to shape Holmes's views. It is a fascinating tale—and a charming one, of an aging and childless Holmes befriended by a rising generation of legal thinkers, surrogate sons who persuade him over time to take up their cause.

Mr. Healy, a professor of law at Seton Hall, is at his best detailing the younger men's campaign to win Holmes to their view of the First Amendment. In March 1919, Holmes still believed that the government could punish "disloyal" speech and wrote an opinion supporting the 1917 Espionage Act, which made it illegal to criticize the draft or American involvement in World War I. In Debs v. United States, the Supreme Court unanimously upheld the prosecution of Socialist Party leader Eugene Debs for his critical statements about the war. Less than nine months later, Holmes had changed his mind, dramatically. In Abrams v. United States, he broke with his colleagues and with his own earlier views and argued that the Constitution didn't permit the government to punish speech unless it posed a "clear and present danger" of public harm. Laws penalizing any other type of public speech were unconstitutional. Holmes's Abrams opinion is the "great dissent" of Mr. Healy's title.

The youthful acolytes had made the difference. As Mr. Healy elaborates, Holmes had developed a knack for collecting young admirers in his years on the Supreme Court (1902-32). In 1919, Holmes's circle included Frankfurter, a junior professor at Harvard Law School serving in the Wilson administration, and the Englishman Harold Laski, just 25 and like Frankfurter a Jew and a teacher at Harvard. Both men would go on to illustrious careers—Frankfurter on the Supreme Court and Laski as a political theorist and chairman of the British Labour Party. Both admired Holmes for his modernist intellectual outlook: for his skepticism about moral absolutes and dislike of formal legal doctrine; and for what they believed (mistakenly) to be Holmes's progressive political views.

Even before the Debs case, Laski had been plying Holmes with arguments about free speech. After Holmes's disappointing opinion in that case, Laski redoubled his efforts, assisted by letters from Frankfurter and well-timed essays from the pair's allies at the New Republic magazine. As it happened, both Laski and Frankfurter suffered professionally in 1919 for their sometimes outspoken political views—both were briefly in danger of being dismissed from Harvard. Mr. Healy implies that their ordeal may have heightened Holmes's appreciation for free speech. But the more likely turning point came in the summer of 1919, when Laski forwarded to Holmes an article defending freedom of speech for its social value and then introduced Holmes to its author, another young Harvard law professor named Zechariah Chafee Jr.

Chafee, who was no sort of progressive and whose specialty was business law, argued that free speech advanced a vital social interest by promoting the discovery and spread of truth, which in turn allowed democracy to function. Holmes had never been much of a proponent of individual liberty, but he was profoundly committed to majoritarian democracy. Free speech as a social good was a rationale he could buy. And in his Abrams dissent a few months later, he did. He would eventually conclude that the First Amendment shielded speech from both federal and state interference.

Mr. Healy tells this conversion story well, bringing the reader into Holmes's confidence and into the uneasy, war-weary milieu of 1919 America. "The Great Dissent" is compelling, too, for the glimpses it gives of the human Holmes rather than the Olympian public figure. Here is Holmes standing at his writing desk to compose his court opinions, keeping them brief lest his legs tire; waxing rhapsodic each spring about the bloodroot flowers in Rock Creek Park. He was unfailingly decorous to his colleagues—even as he was indifferent to his wife—but quivered and fumed at the merest hint of criticism, unable to acknowledge that he had ever been mistaken about anything of importance.

All too often, however, Mr. Healy lapses into hagiography and an annoyingly Whiggish mode of storytelling, in which our modern free-speech doctrine—which protects the right of individuals and corporations to speak on most any topic at most any time—is portrayed as the Inevitable Truth toward which constitutional history has been marching all along. In this story, Holmes's embrace of free speech emerges as the very culmination of his life's work and its linchpin. "It was almost as if Holmes had been working toward this moment his entire career," Mr. Healy says triumphantly.

Not quite. Holmes's endorsement of free speech as a constitutional principle was far more ambivalent than Mr. Healy lets on and in considerable tension with the rest of his jurisprudence. This is precisely what makes it so interesting. Holmes's struggle to reconcile freedom of speech with his other legal ideas helped him to see connections that contemporary Americans are apt to miss.

Holmes made his name on the court as an advocate of judicial restraint. He thought courts should overturn the judgment of democratic legislatures in only the most extraordinary of circumstances. He was a skeptic. He believed law didn't have much to do with morality—"absolute truth is a mirage," he once said—or even logic. As he saw it, law was nothing more than "the dominant opinion of society." The Constitution placed no firm bounds on the right of the majority to do as it pleased. It was "made for people of fundamentally differing views," he said. The majority could choose the view and pursue the policies it wanted, for the reasons it wanted.

All this being true, the judiciary had no business substituting its views for those of the public. If law was based merely on opinion and raw preference, the people's preferences should count, not judges'.

How then did Holmes come to hold that the First Amendment could be used to strike down laws of Congress and even of the states? The answer is that Holmes came to see the principle of free speech as an essential part of majority rule; it was valuable because it helped majorities get their way.

Mr. Healy notes the influence on Holmes of Chafee's "social argument" for free speech but fails to explain just how central it was to his conversion experience. In his dissenting opinion in Abrams, Holmes wrote: "The best test of truth is the power of the thought to get itself accepted in the competition of the market." Truth was whatever the majority thought it was, but if the majority was going to make up its mind in a sensible way, it needed to have as many options before it as possible. Then too, majorities changed their minds, and protecting speech that was unpopular now preserved opinions that the majority might come to favor in the future. "The only meaning of free speech," Holmes wrote in 1925, is that every idea "be given a chance" to become in time the majority creed.

Such reasoning tethered free speech to majority rule, but it was less than perfectly consistent. Even as he valorized the right to speak, Holmes continued to insist that "the dominant forces in the community" must get what they wanted. Yet if free speech were to mean anything at all as a constitutional right, it would mean that majorities could not get their way in all circumstances. From time to time, Holmes recognized as much; in one of his last opinions he wrote that the "principle of free thought" means at bottom "freedom for the thought we hate." How forcing the majority to tolerate speech it hated facilitated that same majority's right to have its way is a formula Holmes never quite explained.

Mr. Healy suggests that with Holmes's dissent in Abrams, the modern era of First Amendment law had arrived. But Holmes's majoritarianism didn't prevail as the principal rationale for free speech at the Supreme Court, which has instead emphasized individuals' right to speak regardless of the social interests involved. Still, for all its internal tensions, Holmes's unfinished view—he continued to puzzle over the problem right through his retirement from the court in 1932—captures something that the contemporary adulation of free speech has hidden.

Holmes saw that the Constitution's commitment to freedom of speech is inextricably bound up with the project of self-government that the Constitution was designed to make possible. That project depends on an open exchange of ideas, on discussion between citizens and their representatives, on the ability of everyday Americans to talk and reason together.

This sort of government is a way of life, and the First Amendment helps makes it possible by prohibiting the state from censoring the organs of social communication. The government may not control newspapers or printing presses or stop citizens from stating their views. Government may not halt the dissemination of ideas.

In the past half-century, however, the Supreme Court has increasingly spoken of the right to free speech as a right to free expression. Under that rubric, it has expanded the First Amendment to cover all manner of things unconnected to public life, be it art or pornography or commercial advertising. This trend has been even more pronounced in popular culture, where the right to express oneself is now widely regarded as the essence of the freedom to speak.

And to be sure, individual expression is a valuable thing. The danger is in coming to think of free speech as merely expression. That reductionism encourages Americans to see freedom of speech, and freedom generally, as mainly about the pursuit of private aims. But in the end, such thinking represents a loss of confidence, or worse, a loss of interest in the way of living that is self-government—in the shared decisions and mutual persuasion that is how a free people makes a life together. Ours is a country saturated with talk and shouted opinions and personal exhibitionism but one less and less interested in the shared civil discourse that democracy requires.

Holmes wouldn't have described free speech or self-government in such elevated terms. He was too much the skeptic for that. But he came to understand, in his own way, the profound value of free speech to a free people. The story of this discovery is worth revisiting.

—Mr. Hawley, an associate professor of law at the University of Missouri and former judicial clerk to Chief Justice of the United States John G. Roberts Jr., is the author of "Theodore Roosevelt: Preacher of Righteousness" (2008).

Sunday, July 7, 2013

Lord Morris of Borth-y-Gest Memorial Lecture. By Michael Howard, MP. July 6, 2006

Lord Morris of Borth-y-Gest Memorial Lecture. By Michael Howard, MP
http://web.archive.org/web/20070505062753/http://www.michaelhowardmp.com/speeches/lampeter060706.htm
July 6, 2006


It is a great privilege to have been invited to give this lecture.

Lord Morris of Borth-y-Gest – or John Willie as I recall him being almost universally referred to – was one of the giants of the law when I studied it at Cambridge and during the years when I was making my way as a Junior Member of the Bar.

Superficially we had quite a few things in common. We were, of course, both Welsh. We were both members of the Inner Temple. We had both been Presidents of the Cambridge Union. And we both, and this may be particularly encouraging to some, took second-class degrees in law.

But there, I fear, the similarities come to an end. I could not hope even to begin to match the distinction of John Willie’s attainments at the Bar, on the Bench and as one of our great appeal judges. Nor, let’s be frank about this, could I aspire to his hallmarks of gentleness, patience and universal popularity.

He was a legend in the land. And not just, of course, for what he achieved in his legal career. At the outbreak of war in 1914, at the age of 17 he joined the Royal Welsh Fusiliers, saw service in France, reached the rank of Captain and was awarded the Military Cross. And it is said that, after being appointed a Law Lord in 1960 he walked down Whitehall to the House of Lords every day, lifting his hat as he passed the cenotaph.

Sadly I never had the honour of appearing before him. But I did meet him. When I was an undergraduate at Cambridge he came to see us to encourage us to go to the Bar.

I cannot pretend that this was a decisive influence on my own career because I had already made up my mind that that was what I wanted to do. So none of the blame for my subsequent career can be laid at John Willie’s door.

The Dictionary of National Biography, in describing his judicial characteristics, says that he was 'vigilant in protecting the freedom of the individual when threatened by the executive' and adds that 'he exhibited judicial valour consistently and in full measure.'

These statements are justified. But they must be interpreted in the spirit and context of their time. Thirty years ago judges were also conscious of the constraints which were imposed on their role.

Since then, that role has been greatly expanded, first as a consequence of the enlargement of judicial review, more recently as a result of the Human Rights Act. It is to that trend, its implications and its consequences that I intend to devote the rest of my remarks this evening.

Over thirty years ago, on a visit to Philadelphia, I fell into conversation with a woman who had recently been given a parking ticket. She had been incensed, so incensed that she decided to go to Court to challenge it.

When she appeared in Court she was rather surprised when the magistrate called all the defendants who were due to appear that day to the bar of the Court. He told them his name and asked them to remember it. Then he said, “All cases dismissed.”

The astonishment of my acquaintance at this development was tempered somewhat when she discovered that a few days later the regular election of magistrates in the city was due to take place. The magistrate before whom she had appeared, albeit rather briefly, was re-elected with the biggest majority in the history of the Philadelphia magistracy.

When I was told that story I reacted, I am sorry to say, with a rather superior disdain. “What can you expect” I asked, “if you elect magistrates and judges? We in Britain would never contemplate any such step.”

Thirty years on I am much less sure. The truth is that during that time the power of judges in this country was increased, is increasing and will increase further, if nothing is done to change things.

For the most part this increase in power has been at the expense of elected Governments and elected Parliaments. Our judges, of course, are unelected. They are unaccountable. They cannot be dismissed, save in the most extreme circumstances, and in practice never are.

Moreover they are appointed without regard to their political background and views, and without any public scrutiny, parliamentary or otherwise. I believe that this has, in the past, been one of the great strengths of our judiciary. But as they move, increasingly, to the centre of the political stage how long can this state of affairs continue?

It would be wrong to suggest that this shift in power is entirely new or that it is entirely due to the coming into force of the Human Rights Act.

The Courts have traditionally had the power to curb the illegal, arbitrary or irrational exercise of power by the Executive. But, traditionally this power was exercised with restraint.

The Courts would be careful not to quash decisions because they disagreed on the merits with the decisions under challenge.

There is common consent that during the last 50 years this restraint has been eroded. As the previous Lord Chancellor, Lord Irvine, put it in his 1995 Address to the Administrative Law Bar Association:
“The range of circumstances in which decisions may be struck down has been extended beyond recognition.”

That address was essentially a plea for judicial restraint. Indeed in it the future Lord Chancellor referred to what he described as the “constitutional imperative of judicial self-restraint.”

He gave three reasons for it. First he referred to the constitutional imperative – the fact that Parliament gives powers to various authorities, including Ministers, for good reasons and in reliance on the level of knowledge and experience which such authorities possess. Secondly, he referred to the lack of judicial expertise which, he said, made the Courts ill-equipped to take decisions in place of the designated authority. Thirdly, and most pertinently, he referred to what he called the democratic imperative – the fact that elected public authorities derive their authority in part from their electoral mandate.

It is worth quoting his words in full: “The electoral system,” he said, “also operates as an important safeguard against the unreasonable exercise of public powers, since elected authorities have to submit themselves, and their decision-making records, to the verdict of the electorate at regular intervals.”

With respect to Lord Irvine, I couldn’t have put it better myself.

Remarkably enough he even prayed in aid, as one of his arguments against judicial intervention, the fact that it would strengthen objections to the incorporation of the European Convention on Human Rights into our law – the very Human Rights Act which he did so much to introduce.

Rightly describing it as a step which would hugely enhance the role and significance of the judiciary in our society he said this:- “The traditional objection to incorporation has been that it would confer on unelected judges powers which naturally belong to Parliament. That objection, entertained by many across the political spectrum, can only be strengthened by fears of judicial supremacism.”

Lord Irvine was right. My essential objection to the Human Rights Act is that it does involve a very significant shift in power from elected representatives of the people to unelected judges. Members of Parliament, and Ministers are, except for Ministers in the House of Lords like the Lord Chancellor, answerable to their electorates. As I know only too well they can be summarily dismissed by the electorate. They are directly accountable. Judges, as I have already pointed out, are unelected, unaccountable and cannot be dismissed.

The reason why this difficulty arises in such acute form as a result of the Human Rights Act is because so many of the decisions which our judges now have to make under it are, essentially, political in nature.

Just this week, Charles Clarke, the former Home Secretary, complained that, and I quote:- “One of the consequences of the Human Rights Act is that our most senior judiciary are taking decisions of deep concern to the security of our society without any responsibility for that security.”

What on earth did he expect?

Of course that is one of the consequences of the Human Rights Act. It is an inevitable consequence. It is what the Human Rights Act obliges the senior judiciary to do. It is not the fault of the judges if they perform, as conscientiously as they can, duties which the Government has placed on them.

And it is not as though the Government were not warned.

To select a quote almost at random, Appeal Court Judge Sir Henry Brooke predicted that judges would be drawn into making “much more obviously political decisions.” He pointed out that under the Act “for the first time judges would have to decide whether government interference with a human right was ‘necessary in a democratic society’” – and that, of course, is clearly a political value judgement.

How does this arise? In a nutshell the Act requires our courts to apply the European Convention on Human Rights in every decision they make. The rights which the Convention seeks to protect are framed in very wide terms. The Convention was drawn up in the aftermath of the Second World War. Its authors saw it as a safeguard against any revival of Nazism or any other form of totalitarian tyranny. I suspect that many of them would turn in their graves if they were able to see the kind of cases which are being brought in reliance on it today.

None of these rights can be exercised in isolation. Any decision to uphold one right may well infringe someone else’s right. Or it may conflict with the rights of the community at large.

The example that has most recently hit the headlines well illustrates the difficulties that arise.

As David Cameron pointed out in his recent speech on this subject, life in the globalised twenty-first-century world presents two great challenges to governments. The first is to protect our security. The second is to protect our liberty.

We would, I suspect, all agree with his view that “it is vital that free societies do all they can to maintain people’s human rights and civil liberties, not least because a free society is, in the long term, one of the best protections against terrorism and crime.”

As he said, “The fundamental challenge is to strike the right balance between security and liberty.”

The fundamental question is who is ultimately responsible for striking that balance: elected members of Parliament or unelected judges?

In the cases on terrorism, Parliament twice, after much anxious consideration by both Houses, reached its view. It was not always a view with which I agreed. But it was the view of Parliament.

Yet twice the Judges have held that Parliament got the balance wrong. They thought the balance should be struck differently.

And in doing so they were not deliberately seeking to challenge the supremacy of Parliament. They were simply doing what Parliament has asked them to do.

There are countless other examples. In his recent speech on the subject David Cameron discussed the way in which the Human Rights Act has made the fight against crime harder.

He cited the example of the Assets Recovery Agency, which was set up to seize the assets of major criminals.

The agency has been forced to spend millions of pounds fighting legal challenges brought by criminals under the Human Rights Act.

This has bogged down cases for years, and the backlog in the courts has grown to 146 uncompleted claims.

The Director of the Agency has directly blamed the human rights “bandwagon” for thwarting its efforts.

He referred to the case of the convicted rapist, Anthony Rice, who was wrongly released on licence and then murdered Naomi Bryant.

The Bridges Report, set up to investigate the case, makes clear that one of the factors that influenced the thinking of officials in dealing with Rice was a concern that he might sue them under the Human Rights Act.

As David Cameron acknowledged there were other elements in the case that had no connection to human rights.

And it is true that any legal challenge by Rice might well have failed.

But it remains the case that officials sought to protect themselves rather than risk defeat in the courts.

The Rice case illustrates a wider trend.

Even without actual litigation, some public bodies are now so frightened of being sued under the Human Rights Act that they try to protect themselves by making decisions that are often absurd and occasionally dangerous.

We saw this recently when the police tried to recapture foreign ex-prisoners who should have been deported and had instead gone on the run.

The obvious thing to do would have been to issue “Wanted” posters but police forces across the country refused to do so on the grounds that it would breach the HRA.

The Association of Chief Police Officers says in its guidance to forces: “Article 8 of the Human Rights Act gives everyone the right to respect for their private and family life ... and publication of photographs could be a breach of that.”

According to ACPO, photographs should be released only in “exceptional circumstances”, where public safety needs to override the case for privacy.

These were criminals who had been convicted of very serious offences and who shouldn’t even have been in the UK.

Yet the Metropolitan Police said, “We will use all the tools in our tool box to try and find them without printing their identity – that’s the last recourse.”

Perhaps the most ludicrous recent example occurred a few weeks ago when a suspected car thief clambered onto the roof tops after a high speed chase and began pelting the police who had tried to follow him with roof tiles.

It ended with a siege that wasted the time of 50 police officers, closed the street until 9.40pm and culminated in the spectacle of the suspect being handed a bucket of KFC chicken, a two-litre bottle of Pepsi and a packet of cigarettes at taxpayers’ expense – all apparently to preserve his “human rights.”

Of course there are examples of cases where the Act has led to results most of us would applaud. But we have to ask whether those results could not have been achieved by effective lobbying of our elected Parliament or a change of Government following an Election.

The Human Rights Act requires the Courts to interpret legislation so that it complies with the Convention if that is at all possible. If in the Court’s view any secondary legislation – passed after due consideration by both Houses of Parliament – is incompatible with the Convention that legislation can be struck down by the Court.

If any primary legislation is held to be incompatible there is a fast-track procedure which would enable the Government to short-circuit the normal processes of parliamentary scrutiny in order to amend or repeal any such legislation.

This is surely a direct threat to the very democratic imperative on which the then Lord Chancellor waxed so eloquent 5 years ago.

One of the consequences of this is likely to be the increasing politicisation of judges.

How long, if the Act remains in force, will our present system of selection of judges survive? How long before the political backgrounds of candidates for judicial office become subject to Parliamentary scrutiny? How long before we see demands that these judges submit themselves for election?

The most common argument in favour of the Act is that it 'brings rights home.’ By that its supporters mean that since the Act could in any event be relied upon in an appeal from the English Courts to the European Court of Human Rights it is much better to allow English judges to apply it themselves. Indeed in presenting this argument the impression is sometimes given that the new jurisdiction of the English Courts will in some way replace the jurisdiction of the European Court of Human Rights. This is of course quite untrue. The right to appeal to the ECHR will remain.

I would concede that the previous situation was not ideal.

The ECHR does sometimes reach decisions which are very difficult to understand and sometimes cause considerable frustration.

But there is a remedy for this which the last Government was pursuing. The ECHR recognises the existence of what it calls a ‘margin of appreciation.’ By that it means that it will make some allowance, in applying the Convention, for the local circumstances and traditions of the country from which the appeal is brought. The last Government had embarked on a campaign to increase this margin of appreciation so that the Court would give greater leeway to countries to decide things for themselves.

Now the very future of the margin of appreciation is uncertain. Academic controversy rages as to whether our courts will apply it. And the ECHR is much less likely to apply it to decisions of our Courts than to decisions of administrative bodies.

It is in this context that David Cameron’s proposal for a British Bill of Rights should be considered.

As Mr Cameron expressly said the existence of a clear and codified British Bill of Rights will tend to lead the European Court of Human Rights to apply, and I would add to enhance, the “margin of appreciation.”

This seems to me to be the key to the continuing application and acceptance of the European Convention. It was intended to be a backstop to ensure that there was no repetition in Western Europe of Nazi atrocities and to minimise, as far as possible, the danger of future totalitarian outrages. It was not intended to strike down carefully considered judgements by democratically elected authorities of where the balance should be struck between legitimate but competing interests.

The route to this more limited role for the Convention and the Court which adjudicates on it lies through an enhanced margin of appreciation. A British Bill of Rights may well help us to reach this very desirable destination.

It is of course true, as Mr Cameron himself acknowledged, that the drafting of such a Bill would represent a formidable challenge. But this is true of all charters of this kind. If it helps us to achieve a workable solution to our relationship with the European Convention the effort will be well worth while.

And if it also enables us to scrap the discredited Human Rights Act it would be doubly welcome.

As the distinguished Scottish judge, Lord McCluskey predicted, the Act has become:- “A field day for crackpots, a pain in the neck for judges and a goldmine for lawyers.”

It is an experiment that has failed. It should go.

Friday, July 5, 2013

On Mr Lafe Solomon's, National Labor Relations Board's acting general counsel, letter to Cablevision

The Lord of U.S. Labor Policy. By Kimberley Strassel
Lafe Solomon, acting general counsel of the National Labor Relations Board, defies Congress and the courts on behalf of Big Labor.
The Wall Street Journal, July 4, 2013, on page A9
http://online.wsj.com/article/SB10001424127887323899704578583671862397166.html

For a true expression of the imperious and extralegal tendencies of the Obama administration, there is little that compares with the Wisdom of Solomon. Lafe Solomon, that is, the acting general counsel of the National Labor Relations Board.

Mr. Solomon's wisdom was on revealing display this week, in the form of a newly disclosed letter that the Obama appointee sent to Cablevision in May. The letter was tucked into Cablevision's petition asking the Supreme Court this week to grant an emergency stay of NLRB proceedings against it. The Supremes unfortunately denied that request, though the exercise may prove valuable for shining new light on the labor board's conceit.

A half-year has passed since the D.C. Circuit Court of Appeals ruled in Noel Canning that President Obama's appointments to the NLRB were unconstitutional, and thus that the board lacks a legal quorum. In May, the Third Circuit affirmed this ruling. Yet the NLRB—determined to keep churning out a union agenda—has openly defied both appeals courts by continuing to issue rulings and complaints.

Regional directors in April filed two such unfair-labor-practice complaints against Cablevision. The company requested that Mr. Solomon halt the proceedings, given the NLRB's invalid status. It is Mr. Solomon's refusal, dated May 28, that provides the fullest expression of the NLRB's insolence.

The acting general counsel begins his letter by explaining that the legitimacy of the board is really neither here nor there. Why? Because Mr. Solomon was himself "appointed by the President and confirmed by the Senate"—and therefore, apparently, is now sole and unchecked arbiter of all national labor policy.

This is astonishing on many levels, the least of which is that it is untrue. Mr. Solomon is the acting general counsel precisely because the Senate has refused to confirm him since he was first nominated in June 2011. Nor will it, ever, given his Boeing escapades.

Then there is the National Labor Relations Act, which created the NLRB. The law clearly says that the general counsel acts "on behalf of the Board"—a board that is today void, illegitimate, null, illegal. Mr. Solomon admits the "behalf" problem in his letter, though he says he's certain Congress nonetheless meant for him to be "independent" of the board. He says.

The acting general counsel naturally rushes to explain that—his omnipotence aside—the NLRB still has every right to ignore the courts. His argument runs thus: Because a decade ago the 11th Circuit issued an opinion that upholds recess appointments (though it didn't deal with Mr. Obama's breathtaking reading of that power), there exists a "split" in the circuit courts. The NLRB is therefore justified in ignoring any courts with which it disagrees until the Supreme Court has "resolved" the question.

What Mr. Solomon fails to note is the extremes the NLRB has gone to in order to suggest court confusion. The agency has deviated from past procedures, and it refused to ask either the D.C. Circuit or the Third Circuit to "stay" their opinions. Why? Because to do so—and to be rebuffed—would put the NLRB under enormous pressure to acknowledge that those courts have authority over its actions.

The board has likewise ignored the fact that the D.C. Circuit hears more NLRB decisions than any other, and is also the pre-eminent court for reviewing federal agency decisions. This ought to entitle that court, and its Noel Canning ruling, respectful deference from the labor board.

The most revealing part of Mr. Solomon's letter is the section cynically outlining why the NLRB continues to operate at a feverish pace. Mr. Solomon notes that this isn't the first time the board has operated without a quorum.

The NLRB issued 550 decisions with just two board members before the Supreme Court's 2010 ruling in New Process Steel that the NLRB must have a three-person board quorum to operate. Mr. Solomon brags that of these 550, only about 100 were "impacted" by the Supreme Court's ruling—which, he writes, proves that the NLRB is justified in continuing to operate even at times when its "authority" has been challenged.

Mr. Solomon is in fact celebrating that of the 550 outfits harassed by an illegal, two-member board, only about 100 later decided they had the money, time and wherewithal to spend years relitigating in front of the labor goon squad. The NLRB is counting on the same outcome in Cablevision and other recent actions.

The board will push through as many rulings and complaints against companies as it can before the Supreme Court rules on its legitimacy. And it will trust that the firms it has attacked and drained will be too weary to then try for reversals. This is why the Obama administration waited so long to petition the Supreme Court to reverse Noel Canning. The longer this process takes, the more damage the NLRB can inflict on behalf of its union taskmasters.

Right now, the NLRB is the only weapon the administration can wield on behalf of Big Labor. The need to placate that most powerful special interest was behind Mr. Obama's decision to install his illegal recess appointments in the first place, and it explains the NLRB's continuing defiance of courts and Congress. Mr. Solomon's wisdom is the Obama philosophy of raw power, in all its twisted glory.

Saturday, June 8, 2013

How America Lost Its Way. By Niall Ferguson

How America Lost Its Way. By Niall Ferguson
http://online.wsj.com/article/SB10001424127887324798904578527552326836118.html
The Wall Street Journal, June 8, 2013, on page C1
It is getting ever harder to do business in the United States, argues Niall Ferguson, and more stimulus won't help: Our institutions need fixing.

Not everyone is an entrepreneur. Still, everyone should try—if only once—to start a business. After all, it is small and medium enterprises that are the key to job creation. There is also something uniquely educational about sitting at the desk where the buck stops, in a dreary office you've just rented, working day and night with a handful of employees just to break even.

As an academic, I'm just an amateur capitalist. Still, over the past 15 years I've started small ventures in both the U.S. and the U.K. In the process I've learned something surprising: It's much easier to do in the U.K. There seemed to be much more regulation in the U.S., not least the headache of sorting out health insurance for my few employees. And there were certainly more billable hours from lawyers.


By the Numbers

    433: Total number of days it takes in the U.S. to start a business, register a property, pay taxes, get an import and export license and enforce a contract
    368: Total number of days it took to do the same in 2006
    7: U.S. ranking, out of 144 countries, on the World Economic Forum's 2012-2013 Global Competitiveness Index
    1: U.S. ranking on the 2008-2009 Global Competitiveness Index
    33: U.S. ranking for its legal system and property rights in 2010 on the Fraser Institute's Economic Freedom index, out of 144 countries
    9: U.S. ranking for its legal system and property rights in 2000

Sources: 'Doing Business'; World Economic Forum; Fraser Institute


This set me thinking. We are assured by vociferous economists that economic growth would be higher in the U.S. and unemployment lower if only the government would run even bigger deficits and/or the Fed would print even more money. But what if the difficulty lies elsewhere, in problems that no amount of fiscal or monetary stimulus can overcome?

Nearly all development economists agree that good institutions—legislatures, courts, administrative agencies—are crucial. When poor countries improve their institutions, economic growth soon accelerates. But what about rich countries? If poor countries can get rich by improving their institutions, is it not possible that rich countries can get poor by allowing their institutions to degenerate? I want to suggest that it is.

Consider the evidence from the annual "Doing Business" reports from the World Bank and International Finance Corporation. Since 2006 the report has published data for most of the world's countries on the total number of days it takes to start a business, get a construction permit, register a property, pay taxes, get an export or import license and enforce a contract. If one simply adds together the total number of days it would take to carry out all seven of these procedures sequentially, it is possible to construct a simple measure of how slowly—or fast—a country's bureaucracy moves.

Seven years of data suggest that most of the world's countries are successfully making it easier to do business: The total number of days it takes to carry out the seven procedures has come down, in some cases very substantially. In only around 20 countries has the total duration of dealing with "red tape" gone up. The sixth-worst case is none other than the U.S., where the total number of days has increased by 18% to 433. Other members of the bottom 10, using this metric, are Zimbabwe, Burundi and Yemen (though their absolute numbers are of course much higher).

Why is it getting harder to do business in America? Part of the answer is excessively complex legislation. A prime example is the 848-page Wall Street Reform and Consumer Protection Act of July 2010 (otherwise known as the Dodd-Frank Act), which, among other things, required that regulators create 243 rules, conduct 67 studies and issue 22 periodic reports. Comparable in its complexity is the Patient Protection and Affordable Care Act (906 pages), which is also in the process of spawning thousands of pages of regulation. You don't have to be opposed to tighter financial regulation or universal health care to recognize that something is wrong with laws so elaborate that almost no one affected has the time or the will to read them.


Who benefits from the growth of complex and cumbersome regulation? The answer is: lawyers, not forgetting lobbyists and compliance departments. For complexity is not the friend of the little man. It is the friend of the deep pocket. It is the friend of cronyism.

We used to have the rule of law. Now it is tempting to say we have the rule of lawyers, which is something different. For the lawyers can also make money even in the absence of complex legislation.

It has long been recognized that the U.S. tort system is exceptionally expensive. Indeed, tort reform is something few people will openly argue against. Yet the plague of class-action lawsuits continues unabated. Regular customers of Southwest Airlines recently received this email: "Did you receive a Southwest Airlines drink coupon through the purchase of a Business Select ticket prior to August 1, 2010, and never redeem it? If yes, a legal Settlement provides a Replacement Drink Voucher, entitling you to a free drink aboard a Southwest flight, for every such drink coupon you did not redeem."

This is not the product of the imagination of some modern-day Charles Dickens. It is a document arising from the class-action case, In re Southwest Airlines Voucher Litigation, No. 11-cv-8176, which came before Judge Matthew F. Kennelly of the District Court for the Northern District of Illinois. As the circular explains: "This Action arose out of Southwest's decision, effective August 1, 2010, to only accept drink coupons received by Business Select customers with the purchase of a Business Select ticket on the date of the ticketed travel. The Plaintiffs in this case allege Southwest, in making that decision, breached its contract with Class Members who previously received drink coupons," etc.

As often happens in such cases, Southwest decided to settle out of court. Recipients of the email will have been nonplused to learn that the settlement "will provide Replacement Drink Vouchers to Class Members who submit timely and valid Claim Forms." One wonders how many have bothered.

Cui bono? The answer is, of course, the lawyers representing the plaintiffs. Having initially pitched for "up to $7 million in fees, costs and expenses," these ingenious jurists settled for fees of $3 million "plus costs not to exceed $30,000" from Southwest.

Canada's Fraser Institute has been compiling an "Economic Freedom" index since 1980, one component of which is a measure of the quality of a country's legal system and property rights. In the light of a case like the one described above, there is nothing surprising about the recent decline in U.S. performance. In 2000 U.S. law scored 9.23 out of 10. The most recent score (for 2010) was 7.12.

Such indexes must be used with caution, but the Fraser index is not the only piece of evidence suggesting that the rule of law in the U.S. is not what it was. The World Justice Project uses a completely separate methodology to assess countries' legal systems. The latest WJP report ranks the U.S. 17th out of 97 countries for the extent to which the law limits the power of government, 18th for the absence of corruption, 19th for regulatory enforcement, 22nd for access to civil justice and the maintenance of order and security, 25th for fundamental rights, and 26th for the effectiveness of criminal justice. Of all the former British colonies in the report, the U.S. ranks behind New Zealand, Australia, Singapore, Canada, Hong Kong and the United Kingdom—though it does beat Botswana.

The decline of American institutions is no secret. Yet it is one of those strange "unknown knowns" that is well documented but largely ignored. Each year, the World Economic Forum publishes its Global Competitiveness Index. Since it introduced its current methodology in 2004, the U.S. score has declined by 6%. (In the same period China's score has improved by 12%.) An important component of the index is provided by 22 different measures of institutional quality, based on the WEF's Executive Opinion Survey. Typical questions are "How would you characterize corporate governance by investors and boards of directors in your country?" and "In your country, how common is diversion of public funds to companies, individuals, or groups due to corruption?" The startling thing about this exercise is how poorly the U.S. fares.

In only one category out of 22 is the U.S. ranked in the global top 20 (the strength of investor protection). In seven categories it does not even make the top 50. For example, the WEF ranks the U.S. 87th in terms of the costs imposed on business by "organized crime (mafia-oriented racketeering, extortion)." In every single category, Hong Kong does better.

At the same time, the U.S. has seen a marked deterioration in its Worldwide Governance Indicators. In terms of "voice and accountability," "government effectiveness," "regulatory quality" and especially "control of corruption," the U.S. scores have all gone down since the WGI project began in the mid-1990s. It would be tempting to say that America is turning Latin, were it not for the fact that a number of Latin American countries have been improving their governance scores over the same period.

What is the process at work here? Perhaps this is a victory from beyond the grave for classical Western political theory. Republics, after all, were regarded by most ancient political philosophers as condemned to decadence, or to imperial corruption. This was the lesson of Rome. Democracy was always likely to give way to oligarchy or tyranny. This was the lesson of the French Revolution. The late Mancur Olson had a modern version of such cyclical models, arguing that all political systems were bound to become the captives, over time, of special interests. The advantage enjoyed by West Germany and Japan after World War II, he suggested, was that all the rent-seeking elites of the pre-1945 period had been swept away by defeat. This was why Britain won the war but lost the peace.

Whatever the root causes of the deterioration of American institutions, smart people are starting to notice it. Last year Michael Porter of Harvard Business School published a report based on a large-scale survey of HBS alumni. Among the questions he asked was where the U.S. was "falling behind" relative to other countries. The top three lagging indicators named were: the effectiveness of the political system, the K-12 education system and the complexity of the tax code. Regulation came sixth, efficiency of the legal framework eighth.

Asked to name "the most problematic factors for doing business" in the U.S., respondents to the WEF's most recent Executive Opinion Survey put "inefficient government bureaucracy" at the top, followed by tax rates and tax regulations.

All this should not be interpreted as yet another prophecy of the imminent decline and fall of the U.S., however. There is some light in the gloom. According to the most recent United Nations projections, the share of the U.S. population that is over 65 will reach 25% only at the very end of this century. Japan has already passed that milestone; Germany will be next. By midcentury, both countries will have around a third of their population age 65 or older.

More imminently, a revolution in the extraction of shale gas and tight oil, via hydraulic fracturing, is transforming the U.S. from energy dependence to independence. Not only could the U.S., at least for a time, re-emerge as the world's biggest oil producer; the lower electricity costs resulting from the fossil-fuel boom are already triggering a revival of U.S. manufacturing in the Southeast and elsewhere.

In a functioning federal system, the pace of institutional degeneration is not uniform. America's four "growth corridors"—the Great Plains, the Gulf Coast, the Intermountain West and the Southeast—are growing not just because they have natural resources but also because state governments in those regions are significantly more friendly to business. There are already heartening signs of a great regeneration in states like Texas and North Dakota.

"In America you have a right to be stupid—if you want to be." Secretary of State John Kerry made that remark off the cuff in February, speaking to a group of students in Berlin. It is not a right the founding fathers felt they needed explicitly to enshrine. But it has always been there, and America's leaders have frequently been willing to exercise it.

Yes, we Americans have the right to be stupid if we want to be. We can carry on pretending that our economic problems can be solved with the help of yet more fiscal stimulus or quantitative easing. Or we can face up to the institutional impediments to growth I have described here.

Not many economists talk about them, it's true. But that's because not many economists run businesses.


Adapted from Mr. Ferguson's new book, "The Great Degeneration: How Institutions Decay and Economies Die," to be published by Penguin Press on Thursday.

Saturday, May 25, 2013

Reading Hayek in Beijing. Bret Stephens on Yang Jisheng

Reading Hayek in Beijing. By Bret Stephens
A chronicler of Mao's depredations finds much to worry about in modern China.
The Wall Street Journal, May 25, 2013, on page A11
http://online.wsj.com/article/SB10001424127887324659404578501492191072734.html

On Yang Jisheng

In the spring of 1959, Yang Jisheng, then an 18-year-old scholarship student at a boarding school in China's Hubei Province, got an unexpected visit from a childhood friend. "Your father is starving to death!" the friend told him. "Hurry back, and take some rice if you can."

Granted leave from his school, Mr. Yang rushed to his family farm. "The elm tree in front of our house had been reduced to a barkless trunk," he recalled, "and even its roots had been dug up." Entering his home, he found his father "half-reclined on his bed, his eyes sunken and lifeless, his face gaunt, the skin creased and flaccid . . . I was shocked with the realization that the term skin and bones referred to something so horrible and cruel."

Mr. Yang's father would die within three days. Yet it would take years before Mr. Yang learned that what happened to his father was not an isolated incident. His father was one of the 36 million Chinese who succumbed to famine between 1958 and 1962.

It would take years more for him to realize that the source of all the suffering was not nature: There were no major droughts or floods in China in the famine years. Rather, the cause was man, and one man in particular: Mao Zedong, the Great Helmsman, whose visage still stares down on Beijing's Tiananmen Square from atop the gates of the Forbidden City.

Mr. Yang went on to make his career, first as a journalist and senior editor with the Xinhua News Agency, then as a historian whose unflinching scholarship has brought him into increasing conflict with the Communist Party—of which he nonetheless remains a member. Now 72 and a resident of Beijing, he's in New York this month to receive the Manhattan Institute's Hayek Prize for "Tombstone," his painstakingly researched, definitive history of the famine. On a visit to the Journal's headquarters, his affinity for the prize's namesake becomes clear.

"This book had a huge impact on me," he says, holding up his dog-eared Chinese translation of Friedrich Hayek's "The Road to Serfdom." Hayek's book, he explains, was originally translated into Chinese in 1962 as "an 'internal reference' for top leaders," meaning it was forbidden fruit to everyone else. Only in 1997 was a redacted translation made publicly available, complete with an editor's preface denouncing Hayek as "not in line with the facts," and "conceptually mixed up."

Mr. Yang quickly saw that in Hayek's warnings about the dangers of economic centralization lay both the ultimate explanation for the tragedies of his youth—and the predicaments of China's present. "In a country where the sole employer is the state," Hayek had observed, "opposition means death by slow starvation."

So it was in 1958 as Mao initiated his Great Leap Forward, demanding huge increases in grain and steel production. Peasants were forced to work intolerable hours to meet impossible grain quotas, often employing disastrous agricultural methods inspired by the quack Soviet agronomist Trofim Lysenko. The grain that was produced was shipped to the cities, and even exported abroad, with no allowances made to feed the peasants adequately. Starving peasants were prevented from fleeing their districts to find food. Cannibalism, including parents eating their own children, became commonplace.

"Mao's powers expanded from the people's minds to their stomachs," Mr. Yang says. "Whatever the Chinese people's brains were thinking and what their stomachs were receiving were all under the control of Mao. . . . His powers extended to every inch of the field, and every factory, every workroom of a factory, every family in China."

All the while, sympathetic Western journalists—America's Edgar Snow and Britain's Felix Greene in particular—were invited on carefully orchestrated tours so they could "refute" rumors of mass starvation. To this day, few people realize that Mao's forced famine was the single greatest atrocity of the 20th century, exceeding by orders of magnitude the Rwandan genocide, the Cambodian Killing Fields and the Holocaust.

The power of Mr. Yang's book lies in its hauntingly precise descriptions of the cruelty of party officials, the suffering of the peasants, the pervasive dread of being called "a right deviationist" for telling the truth that quotas weren't being met and that millions were being starved to death, and the toadyism of Mao lieutenants.

Yet the book is more than a history of a uniquely cruel regime at a receding moment in time. It is also a warning of what lies at the end of the road for nations that replace individualism with any form of collectivism, no matter what the motives. Which brings Mr. Yang to the present day.

"China's economy is not what [Party leaders] claim as the 'socialist-market economy,' " he says. "It's a 'power-market' economy."

What does that mean?

"It means the market is controlled by the power. . . . For example, the land: Any permit to enter any sector, to do any business has to be approved by the government. Even local government, down to the county level. So every county operates like an enterprise, a company. The party secretary of the county is the CEO, the president."

Put another way, the conventional notion that the modern Chinese system combines political authoritarianism with economic liberalism is mistaken: A more accurate description of the recipe is dictatorship and cronyism, with the results showing up in rampant corruption, environmental degradation and wide inequalities between the politically well-connected and everyone else. "There are two major forms of hatred" in China today, Mr. Yang explains. "Hatred toward the rich; hatred toward the powerful, the officials." As often as not they are one and the same.

Yet isn't China a vastly freer place than it was in the days of Mr. Yang's youth? He allows that the party's top priority in the post-Mao era has been to improve the lot of the peasantry, "to deal with how to fill the stomach."

He also acknowledges that there's more intellectual freedom. "I would have been executed if I had this book published 40 years ago," he notes. "I would have been imprisoned if this book was out 30 years ago. Now the result is that I'm not allowed to get any articles published in the mainstream media." The Chinese-language version of "Tombstone" was published in Hong Kong but is banned on the mainland.

There is, of course, a rational reason why the regime tolerates Mr. Yang. To survive, the regime needs to censor vast amounts of information—what Mr. Yang calls "the ruling technique" of Chinese leaders across the centuries. Yet censorship isn't enough: It also needs a certain number of people who understand the full truth about the Maoist system so that the party will never repeat its mistakes, even as it keeps the cult of Mao alive in order to preserve its political legitimacy. That's especially true today as China is being swept by a wave of Maoist nostalgia among people who, Mr. Yang says, "abstract Mao as this symbol of social justice," and then use that abstraction to criticize the current regime.

"Ten million workers get laid off in the state-owned enterprise reforms," he explains. "So many people are dissatisfied with the reforms. Then they become nostalgic and think the Mao era was much better. Because they never experienced the Mao era!" One of the leaders of that revival, incidentally, was Bo Xilai, the powerful former Chongqing party chief, brought down in a murder scandal last year.

But there's a more sinister reason why Mr. Yang is tolerated. Put simply, the regime needs some people to have a degree of intellectual freedom, in order to more perfectly maintain its dictatorship over everyone else.

"Once I gave a lecture to leaders at a government bureau," Mr. Yang recalls. "I told them it's a dangerous job, you guys, being officials, because you have too much power. I said you guys have to be careful because those who want approval from you to get certain land and projects, who bribe you, these are like bullets, ammunition, coated in sugar, to fire at you. So today you may be a top official, tomorrow you may be a prisoner."

How did the officials react to that one?

"They said, 'Professor Yang, what you said, we should pay attention.' "

So they should. As Hayek wrote in his famous essay "The Use of Knowledge in Society," the fundamental problem of any planned system is that "knowledge of the circumstances of which we must make use never exists in concentrated or integrated form but solely as the dispersed bits of incomplete and frequently contradictory knowledge which all the separate individuals possess."

The Great Leap Forward was an extreme example of what happens when a coercive state, operating on the conceit of perfect knowledge, attempts to achieve some end. Even today the regime seems to think it's possible to know everything—one reason they devote so many resources to monitoring domestic websites and hacking into the servers of Western companies. But the problem of incomplete knowledge can't be solved in an authoritarian system that refuses to cede power to the separate people who possess that knowledge.

"For the last 20 years, the Chinese government has been saying they have to change the growth mode of the economy," Mr. Yang notes. "So they've been saying, rather than just merely expanding the economy they should do internal changes, meaning more value-added services and high tech. They've been shouting such slogans for 20 years, and not many results. Why haven't we seen many changes? Because it's the problem that lies in the very system, because it's a power-market economy. . . . If the politics isn't changed, the growth mode cannot be changed."

That suggests China will never become a mature power until it becomes a democratic one. As to whether that will happen anytime soon, Mr. Yang seems doubtful: The one opinion widely shared by rulers and ruled alike in China is that without the Communist Party's leadership, "China will be thrown into chaos."

Still, Mr. Yang hardly seems to have given up hope that he can play a role in raising his country's prospects. In particular, he's keen to reclaim two ideas at risk of being lost in today's China.

The first is the meaning of rights. A saying attributed to the philosopher Lao Tzu, he says, has it that a ruler should fill the people's stomachs and empty their heads. The gambit of China's current rulers is that they can stay in power forever by applying that maxim. Mr. Yang hopes they're wrong.

"People have more needs than just eating!" he insists. "In China, human rights means the right to survive, and I argue with these people. This is not human rights, it's animal rights. People have all sorts of needs. Spiritual needs, the need to be free, the freedoms."

The second is the obligation of memory. China today is a country galloping into a century many people believe it will define, one way or the other. Yet the past, Mr. Yang insists, also has its claims.

"If a people cannot face their history, these people won't have a future. That was one of the purposes for me to write this book. I wrote a lot of hard facts, tragedies. I wanted people to learn a lesson, so we can be far away from the darkness, far away from tragedies, and won't repeat them."

Hayek would have understood both points well.

Mr. Stephens writes "Global View," the Journal's foreign-affairs column.