Saturday, March 28, 2009

Nosophobia and Fear of Invisible Toxins

Nosophobia and Fear of Invisible Toxins. By Jack Dini
ACSH, March 27, 2009

“If it smells bad, it’s bad; if it smells good, it’s bad,” says Aileen Gagney, asthma and environmental health manager with the American Lung Association in Seattle. (1) Obviously then, the key to a healthy life is to have no smells around you. How unfortunate, since we are excellent smellers!

The tongue can detect sweetness at a dilution of one part in 200, saltiness at one in 400, sourness at one in 130,000, and bitterness at one in 2 million. (2) All of this pales when compared with our ability to detect extremely low levels of smells (i.e., in the range of 50 parts per trillion to 800 parts per billion). (3)

If you are inclined to agree with Ms. Gagney, perhaps you have nosophobia, the irrational fear of contracting a disease -- or perhaps I should say nose-ophobia, since we so often hear of people fearful of contracting a disease by smelling. Let’s talk about fragrances and smoking to provide some present-day examples.

Elizabeth Whelan reports, “Fragrances now join a growing list of allegedly harmful products -- plastic bottles, rubber duckies, shower curtains, Astroturf, traditional produce raised with agricultural chemicals, aspartame, acrylamide, etc. The list seems to be growing like…well, ‘toxic mold.’ Nosophobia is causing us to abandon safe, useful products of modern technology to avoid phantom risks while obscuring the real risks around us. While parents are fretting over BPA traces in baby bottles or phthalates in plastic toys, they may well be giving short shrift to the real threats to their children’s health, including failure to use seatbelts, bike helmets, smoke detectors, vaccinations, proper nutrition and exercise.” (4)

When University of Washington professor Anne Steinemann analyzed a variety of fragranced consumer products such as air fresheners, laundry supplies, personal care products and cleaners, she found 100 different volatile organic compounds (VOCs) at levels of 300 micrograms/m3 or more. (5)

Wow, you say! Sounds scary! But, let’s look at concentrations. VOCs were identified from GC/MS headspace analysis. Only those with a headspace concentration of greater than 300 micrograms/m3 were reported. Average headspace concentrations of VOCs for the six products tested ranged from 1,000 micrograms/m3 to 74,000 micrograms/m3. In Steinemann’s work, 10 of the 100 volatile organic compounds identified qualified under federal rules as toxic or hazardous, and two of those, 1,4-dioxane and acetaldehyde, are classified as Hazardous Air Pollutants (HAPs). (5)

What are the OSHA permissible exposure limits for 1,4-dioxane and acetaldehyde? The current OSHA permissible exposure limit for dioxane is 100 ppm (350 milligrams/m3) as an 8-hour time-weighted average. Note: THIS IS MORE THAN 1,000 TIMES the 300 micrograms/m3 reporting threshold in Steinemann’s paper. A similar case can be made for acetaldehyde: the OSHA 8-hour time-weighted average for acetaldehyde is 200 ppm (360 milligrams/m3). So for the fragrance scare, the amounts we’re talking about are considerably below allowable regulatory limits. Yet the comment is made: “Consumers are breathing these chemicals. No one is doing anything about it.” (1) Great scare tactics!
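The arithmetic behind that capitalized claim is just a unit conversion. A few lines of Python make it explicit; the numbers are those quoted above, and note the caveat that headspace concentrations inside a product container are not the same thing as what a consumer actually breathes:

```python
# Compare the OSHA permissible exposure limit (PEL) for 1,4-dioxane with
# the concentrations in Steinemann's headspace data, as quoted above.

pel_dioxane = 350 * 1000       # 350 mg/m3, converted to micrograms/m3
reporting_cutoff = 300         # micrograms/m3; only VOCs above this were reported
peak_headspace = 74_000        # micrograms/m3; highest average headspace level cited

print(pel_dioxane / reporting_cutoff)   # ~1167: the PEL is over 1,000x the cutoff
print(pel_dioxane / peak_headspace)     # ~4.7: the PEL exceeds even the highest average
```

Even against the worst-case headspace average, the 8-hour workplace limit is several times higher; against the reporting cutoff it is three orders of magnitude higher.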

These days scientists can find anything in anything, and as this example shows, that can lead to problems. The minute that something is found in food, in someone’s blood, in the air, etc., some folks get very concerned and start creating a lot of fuss. The very act of being able to measure something can give the impression that if it’s quantifiable, it’s dangerous. How unfortunate, since scientists are getting more clever all the time. Folks forget the old adage that ‘the dose makes the poison,’ and act on the principle that the mere fact that anything is found at all is cause for alarm.

Here’s another one for nosophobiasts. There is now such a thing as ‘third hand smoke.’ Dr. Jonathan Winickoff, lead author of a recent paper in Pediatrics, says the following, “If the smell of the cigarettes lingers, so does the danger. Your nose isn’t lying. If your body detects it, then it’s there,” he says. Sounds like the opening comments in this article from Aileen Gagney.

What was the scientific study, which incidentally was given great TV coverage by Dr. Nancy Snyderman of the Today Show? Dr. Winickoff and his colleagues surveyed 1,500 households across the US and asked folks if they agreed with the statements:

• Breathing air in a car today where people smoked yesterday can harm the health of babies and children.
• Breathing air in a room where people smoked yesterday can harm the health of babies and children.

Those surveyed who stated they agreed or agreed strongly were categorized as believing third hand smoke harmed the health of babies and children. This is the “evidence” that third hand smoke had been “identified” as a health danger. (6)

Think about this for a moment. You could have been called as part of this survey and had your chance to play ‘scientist’ and provide data for this analysis. You may not like third hand smoke, but is a telephone solicitation of non-scientists really scientific evidence that it is bad?

There turned out to be much more to this story. What consumers didn’t hear from reporters was that the survey was conducted by the National Social Climate Survey of Tobacco Control, a special interest group working to legislate bans on tobacco. The Tobacco Consortium, which sets the group’s agenda, is chaired by Dr. Winickoff of Harvard, the lead author of this paper. Makes one wonder about the review process for papers that appear in the journal Pediatrics.

For that matter, a fair number of scientists debate whether second hand smoke exposure during childhood is harmful to long-term health. Sandy Szwarc reports, “The world’s largest study ever done to examine the association between exposure to environmental tobacco smoke (ETS) and lung cancer was conducted by the International Agency for Research on Cancer in Lyon, France, and published in 1998 in the Journal of the National Cancer Institute. (7) It included lung cancer patients up to 74 years of age, and a control group, in 12 centers from seven European countries, looking at cases of lung cancer and exposures to ETS. They found ‘no association between childhood exposure to ETS and lung cancer risk.’” (8)

The authors of the Pediatrics article suggested dangers at exposure levels far below the levels of second hand smoke -- third hand smoke exposures -- even without regard to ventilation, the number of cigarettes parents smoked, or length of exposure.

Szwarc adds, “Any exposure at all, and at levels barely detectable with modern instrumentation, is now being suggested as able to cause cancer and brain damage in children. This turns everything that science knows about toxicology on its head and denies the most fundamental law: that the dose makes the poison. In other words, there is no credible medical evidence to support the suggestion that trace exposures lingering in the air or on clothing are harming children.” (8)

In an entertaining and insightful opinion piece for the Daily Mail, Tom Utley, a smoker, described how scare tactics like these, such obvious hoaxes, actually undermine the effectiveness of efforts to reduce smoking and children’s exposure. No one is arguing that smoking is a healthful habit or would encourage young people to take up smoking, but Utley ‘most vehemently challenged Dr. Winickoff’s right to dress up this insulting, scaremongering, palpable drivel as science.’ He adds, “The very last thing I want is to encourage anybody to take up my disgusting and ruinously expensive habit, which I’m sure will be the death of me. But then I hear the latest hysterical rubbish from the anti-smoking lobby and my determination to remain silent goes the same way as my annual New Year’s resolution to give up the vile weed. Why, when there are so many excellent reasons to quit, do these fanatics feel obliged to keep on inventing new and obviously bogus ones?” (9)

Szwarc concludes, “This is one of the most egregious examples of the increasingly common and unethical practice of politicizing science and using a ‘study’ to advance an agenda of a biased media failing to do its job. Neither is in the interest of the facts or truth. The healthiest thing for all of us might be a helpful dose of common sense and respect for other people’s choices. Otherwise, we may next hear about the dangers of fourth hand smoke, a term coined by John Boston of the Santa Clarita Valley Signal: someone sitting next to someone who is thinking about someone else smoking. And someone will believe it and report it as news.” (8)


References

1. Lisa Stiffler, “Fresh scent may hide toxic secret,” seattlepi.com, July 23, 2008
2. Peter Farb and George Armelagos, Consuming Passions, (Boston, Houghton Mifflin Company, 1980), 22
3. Herbert S. Rosenkranz and Albert R. Cunningham, “Environmental odors and health hazards,” The Science of the Total Environment, 313, 15, 2003
4. Elizabeth M. Whelan, “Toxics (from Seattle Post-Intelligencer)”, American Council on Science and Health, July 25, 2008
5. Anne C. Steinemann, “Fragranced consumer products and undisclosed ingredients,” Environ. Impact Assess. Rev., (2008), doi:10.1016/j.eiar.2008.05.002
6. Jonathan P Winickoff et al., “Beliefs About the Health Effects of ‘Third hand’ Smoke and Home Smoking Bans,” Pediatrics, 123, e74, January 1, 2009
7. P. Boffetta, A. Agudo, et al., “Multicenter case-control study of exposure to environmental tobacco smoke and lung cancer in Europe,” J. Natl. Cancer Inst., 90, 1440, October 7, 1998
8. Sandy Szwarc, “Third hand smoke and chemtrials -- invisible toxin fears,” junkfoodscience.com, January 10, 2009
9. Tom Utley, dailymail.co.uk, January 9, 2009

Jack Dini is a scientist and science writer living in Livermore, CA.

Three Mile Island and Chernobyl: What Went Wrong and Why Today’s Reactors Are Safe

Three Mile Island and Chernobyl: What Went Wrong and Why Today’s Reactors Are Safe. By Jack Spencer and Nicolas Loris
Heritage, WebMemo #2367, March 27, 2009

This Saturday marks the 30th anniversary of the partial meltdown of the Three Mile Island (TMI) nuclear reactor. This occasion is a good time to consider the advances in nuclear power safety since that time and discuss the misinformation about this incident and the 1986 nuclear accident in Chernobyl, Ukraine, which is often associated with TMI.


Three Mile Island: What Happened

On March 28, 1979, a cooling circuit pump in the non-nuclear section of Three Mile Island's second station (TMI-2) malfunctioned, causing the reactor's primary coolant to heat and internal pressure to rise. Within seconds, the automated response mechanism thrust control rods into the reactor and shut down the core. An escape valve opened for 10 seconds to vent steam into a pressurizer, as it was supposed to, but it failed to close. Control room operators only saw that a "close" command was sent to the relief valve, but nothing displayed the valve's actual position.[1] With the valve open, too much steam escaped into the pressurizer, sending misinformation to operators that there was too much pressure in the coolant system. Operators then shut down the water pumps to relieve the "pressure."

Operators allowed coolant levels inside the reactor to fall, leaving the uranium core exposed, dry, and intensely hot. Even though inserting control rods halted the fission process, the TMI-2 reactor core continued to generate about 160 megawatts of "decay" heat, declining over the next three hours to 20 megawatts.[2] Approximately one-third of the TMI-2 reactor was exposed and began to melt.

By the time operators discovered what was happening, superheated and partially radioactive steam had built up in auxiliary tanks, which operators then moved to waste tanks through compressors and pipes. The compressors leaked. The steam leakage released a radiation dose equivalent to that of a chest X-ray scan, about one-third of the radiation humans absorb in one year from naturally occurring background radiation.[3] No damage to any person, animal, or plant was ever found.[4]


The Outcome

The local population of 2 million people received an average estimated dose of about 1 millirem--minuscule compared to the 100-125 millirems that each person receives annually from naturally occurring background radiation in the area. Nationally, the average person receives 360 millirems per year.[5]
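A trivial calculation puts those doses on one scale, using only the figures quoted above (the 112.5 is just the midpoint of the 100-125 mrem local range):

```python
# The dose figures quoted above, side by side (all in millirem).
tmi_dose = 1.0              # average one-time dose to the local population
local_background = 112.5    # midpoint of the 100-125 mrem/yr local range
national_average = 360.0    # average annual U.S. background dose

print(tmi_dose / local_background * 100)   # TMI added under 1% of a year's local background
print(tmi_dose / national_average * 100)   # ~0.3% of the national annual average
```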

No significant radiation effects on humans, animals, or plants were found. In fact, thorough investigation and sample testing of air, water, milk, vegetation, and soil found that there were negligible effects and concluded that the radiation was safely contained.[6] The most recent and comprehensive study was a 13-year evaluation of 32,000 people living in the area that found no adverse health effects or links to cancer.[7]


Technological Improvements and Lessons Learned

A number of technological and procedural changes have been implemented by industry and the Nuclear Regulatory Commission (NRC) to considerably reduce the risk of a meltdown since the 1979 incident. These include:


  • Plant design and equipment upgrades, including fire protection, auxiliary feedwater systems, containment building isolation, and automatic plant shut down capabilities;
  • Enhanced emergency preparedness, including closer coordination between federal, state, and local agencies;
  • Integration of NRC observations, findings, and conclusions about plant performance and management into public reports;
  • Regular plant performance analysis by senior NRC managers who identify plants that require additional regulatory attention;
  • Expansion of NRC's resident inspector program, whereby at least two inspectors live nearby and work exclusively at each plant;
  • Expanded performance- and safety-oriented inspections;
  • Additional accident safety equipment to mitigate and monitor conditions during accidents; and[8]
  • Establishment of the Institute for Nuclear Power Operators, an industry-created non-profit organization that evaluates plants, promotes training and information sharing, and helps individual plants overcome technical issues.

Chernobyl: What Happened

Seven years after the incident at Three Mile Island, on April 25, 1986, a crew of engineers with little background in reactor physics began an experiment at the Chernobyl nuclear station. They sought to determine how long the plant's turbines' inertia could provide power if the main electrical supply to the station was cut. Operators chose to deactivate automatic shutdown mechanisms to carry out their experiment.[9]

The four Chernobyl reactors were known to become unstable at low power settings,[10] and the engineers’ experiment made the reactor exactly that. When the operators cut power and switched to the energy from turbine inertia, the coolant pump system failed, causing heat and extreme steam pressure to build inside the reactor core. The reactor experienced a power surge and exploded, blowing off the cover lid of the reactor building and spewing radioactive gases and flames for nine days.

The episode was exacerbated by a second design flaw: The Chernobyl reactors lacked fully enclosed containment buildings, a basic safety installation for commercial reactors in the U.S.[11]


The Outcome

Chernobyl was the result of human error and poor design. Of the approximately 50 fatalities, most were rescue workers who entered contaminated areas without being informed of the danger.

The World Health Organization says that up to 4,000 fatalities could ultimately result from Chernobyl-related cancers. Though these could still emerge, as yet, they have not. The primary health effect was a spike in thyroid cancer among children, with 4,000-5,000 children diagnosed with the cancer between 1992 and 2002. Of these, 15 children unfortunately died. Though these deaths were unquestionably tragic, no clear evidence indicates any increase in other cancers among the most heavily affected populations.

Interestingly, the World Health Organization has also identified a condition called "paralyzing fatalism," which is caused by "persistent myths and misperceptions about the threat of radiation."[12] In other words, the propagation of ignorance by anti-nuclear activists has caused more harm to the affected populations than has the radioactive fallout from the actual accident. Residents of the area assumed a role of "chronic dependency" and developed an entitlement mentality because of the meltdown.[13]


Technology Improvements and Lessons Learned

Comparing the technology of the nuclear reactor at Chernobyl to U.S. reactors is not fair. First, the graphite-moderated, water-cooled reactor at Chernobyl maintained a high positive void coefficient. While the scientific explanation[14] of this characteristic is not important, its real-life application is. Essentially, it means that under certain conditions, coolant inefficiency can cause heightened reactivity. In other words, its reactivity can rapidly increase as its coolant heats (or is lost) resulting in more fissions, higher temperatures, and ultimately meltdown.[15]

This is in direct contrast to the light-water reactors used in the United States, which would shut down under such conditions. U.S. reactors use water to both cool and moderate the reactor. The coolant keeps the temperature from rising too much, and the moderator is used to sustain the nuclear reaction. As the nuclear reaction occurs, the water heats up and becomes a less efficient moderator (cool water facilitates fission better than hot water), thus causing the reaction to slow down and the reactor to cool. This characteristic makes light water reactors inherently safe and is why a Chernobyl-like reactor could never be licensed in the U.S.
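The sign-of-feedback argument above can be illustrated with a deliberately crude toy model (my own sketch for illustration, not reactor physics): let power respond to reactivity, let temperature track power, and make reactivity proportional to the temperature excursion with a coefficient `alpha`. A positive coefficient (the Chernobyl-style case) turns a small disturbance into a runaway; a negative one (the light-water case) damps it out.

```python
def simulate(alpha, steps=600, dt=0.01, t0=1.0):
    """Toy feedback model: reactivity = alpha * (temp - t0).

    alpha > 0 mimics a positive void coefficient (disturbances grow);
    alpha < 0 mimics light-water behavior (disturbances die out).
    """
    power = 1.0
    temp = t0 + 0.1                        # small initial temperature disturbance
    for _ in range(steps):
        reactivity = alpha * (temp - t0)
        power += power * reactivity * dt   # power grows or shrinks with reactivity
        temp += (power - temp) * dt        # temperature relaxes toward the power level
        if power > 100:                    # call it a runaway and stop
            break
    return power

print(simulate(+5.0))   # runs away: only the 100x cutoff stops it
print(simulate(-5.0))   # settles back near the starting power of 1.0
```

The only difference between the two runs is the sign of `alpha`, which is the point of the comparison: the same disturbance either snowballs or self-corrects depending on that sign alone.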

Given the inherent problems with the Chernobyl reactor design, many technological changes and safety regulations were put in place to prevent another Chernobyl-like meltdown from occurring. Designers renovated the reactor to make it more stable at lower power, to have the automatic shutdown operations activate more quickly, and to install additional automated and other safety mechanisms.[16]

Chernobyl also led to the formation of a number of international efforts to promote nuclear power plant safety through better training, coordination, and implementation of best practices. The World Association of Nuclear Operators is one such organization and includes every entity in the world that operates a nuclear power plant.


Myths Persist

The circumstances, causes, and conditions of the Chernobyl meltdown are far removed from the American experience. Important lessons should be taken from both accidents. Thankfully, many improvements in the technology and regulatory safety of nuclear reactors are among them.

Jack Spencer is Research Fellow in Nuclear Energy and Nicolas D. Loris is a Research Assistant in the Thomas A. Roe Institute for Economic Policy Studies at The Heritage Foundation.


Notes

[1]World Nuclear Association, "Three Mile Island: 1979," March 2001, at http://www.world-nuclear.org/info/inf36.html (March 26, 2009).
[2]Smithsonian Institution, National Museum of American History, "Three Mile Island: The Inside Story," at http://americanhistory.si.edu/tmi/tmi03.htm (March 26, 2009).
[3]American Nuclear Society, "What Happened and What Didn't in the TMI-2 Accident," at http://www.ans.org/pi/resources/sptopics/tmi/whathappened.html (March 26, 2009).
[4]U.S. Nuclear Regulatory Commission, "Fact Sheet on the Three Mile Island Accident," at http://www.nrc.gov/reading-rm/doc-collections/fact-sheets/3mile-isle.html (March 26, 2009).
[5]United States Department of Energy, Office of Civilian Radioactive Waste Management, "Facts About Radiation," OCRWM Fact Sheet, January 2005, at http://www.ocrwm.doe.gov/factsheets/doeymp0403.shtml (November 6, 2008).
[6]Nuclear Regulatory Commission, "Fact Sheet on the Three Mile Island Accident."
[7]World Nuclear Association, "Three Mile Island: 1979."
[8]Nuclear Regulatory Commission, "Fact Sheet on the Three Mile Island Accident," at http://www.nrc.gov/reading-rm/doc-collections/fact-sheets/3mile-isle.html (June 24, 2008).
[9]For full description of what caused the accident at Chernobyl, see Richard Rhodes, Nuclear Renewal (New York: Penguin Books, 1993), ch. 5.
[10]World Nuclear Association, "Chernobyl Accident," May 2008, at http://www.world-nuclear.org/info/chernobyl/inf07.html (March 26, 2009).
[11]Simon Rippon et al., "The Chernobyl Accident," Nuclear News, June 1986, at http://www.ans.org/pi/resources/sptopics/chernobyl/docs/nn-1986-6-chernobyl-lores.pdf (March 26, 2009).
[12]Press release, "Chernobyl: The True Scale of the Accident," World Health Organization, International Atomic Energy Agency, and U.N. Development Programme, September 5, 2005, at http://www.who.int/mediacentre/news/releases/2005/pr38/en/print.html (November 6, 2008).
[13]World Nuclear Association, "Chernobyl Accident."
[14]"Neutron Kinetics of the Chernobyl Accident," ENS News, Summer 2006, at http://www.euronuclear.org/e-news/e-news-13/neutron-kinetics.htm (March 27, 2009).
[15]International Atomic Energy Agency, "The Chernobyl Accident: Updating of INSAG-1," 1992, at http://www-pub.iaea.org/MTCD/publications/PDF/Pub913e_web.pdf (August 27, 2008).

New Global Currency Proposal: Good Diplomatic Theater but Bad Policy

New Global Currency Proposal: Good Diplomatic Theater but Bad Policy. By Ambassador Terry Miller
Heritage WebMemo #2364, March 26, 2009

Recently, both China and Russia have called for the replacement of the dollar as the international reserve currency of choice, suggesting use of IMF Special Drawing Rights (SDRs) instead. Don't rush to sell your greenbacks, however: The proposal has far more to do with the theater of international diplomacy than the workings of the world economy.

Even as he made the proposal, Chinese central banker Zhou Xiaochuan acknowledged that it would require "extraordinary political vision and courage." That is diplomatic speak for "We know this is impossible." The fact that Zhou and his Russian counterpart in proposing the idea, Finance Minister Alexei Kudrin, placed the timeline for the change far in the future--30 years in the case of Kudrin--offers an additional strong clue that the proposal is politically motivated rather than intended to address a real and pressing economic problem.

For both Zhou and Kudrin, an attack on the dollar just before the G-20 economic summit is a great theatrical device with which to express displeasure at U.S. dominance of the international financial system. It is also a marker of their unhappiness with the ineffective U.S. approach to restoring world growth and protecting international trade and financial flows, sending a clear signal that they have no intention of rubber-stamping U.S. proposals at the summit.


The Dollar Works for Everybody

The U.S. dollar is currently the principal international reserve currency, a role it assumed following the collapse of the gold standard and the dissolution of the British Empire in the second half of the 20th century. The U.S. gains significant advantages from the use by others of the dollar as a reserve currency: Worldwide demand for dollars helps keep the value of the dollar high, meaning imported goods--including basic commodities like oil--are cheaper for Americans. The willingness of foreigners to hold dollar stocks (or dollar-denominated assets such as U.S. Treasury securities) makes possible the long-running U.S. trade deficit, to the benefit of American consumers.

Some assert that there is a corresponding cost to U.S. producers, who lose export opportunities or even jobs to foreign producers because of the high value of the dollar--a logical idea that holds true for countries whose currency is not held as a reserve. In most countries, if its currency appreciates in value, its exports become more expensive and demand for them decreases. However, demand for U.S. exports has consistently risen, even as the value of the dollar remained high. The reason is that the value to other countries of the dollar as a reserve currency creates additional demand for the dollar over and above what would be generated by normal trade flows. Thus people are willing to sell Americans more goods than they otherwise would in order to get extra dollars to hold as reserves. But they still want to buy U.S. goods and services, too.

The benefits for other countries are substantial as well. The utility of the reserve currency function--having a stable and readily convertible commodity (the dollar in this case) in which to hold wealth--is self-evident. The dollar provides a readily available medium of exchange; it can be used to pay for almost anything. Furthermore, consumers can also be confident that the dollar will buy about as much tomorrow as it buys today. It is hard to imagine international trade without some medium of exchange like the dollar.

Even more important, however, is the additional demand created in the United States for exports from other countries--a byproduct of strong global desire for dollars. This U.S. consumer demand has served as the primary engine of growth for the world economy. Worldwide, that growth has lifted millions of people out of poverty over the last decade. For China, this export-led growth has fueled job creation that promotes social stability and raises incomes, at least in certain sectors and geographic areas that have been permitted by the authoritarian Chinese government to link to the global economy.


An Unrealistic Solution to a Non-Existent Problem

The creation of a new international reserve currency to replace the dollar is a solution looking for a problem. So far, the dollar is working just fine as a reserve currency. The continued strength of the dollar testifies to its continued utility as a reserve currency and the confidence of the markets in its future value. That confidence extends, by the way, to the Russian and Chinese governments, both of which continue to hold large stocks of dollars.

In contrast, there are many reasons why moving to an SDR makes no sense:

  • The SDR has no intrinsic value. The SDR is backed by nothing other than the good faith and credit, if you will, of the IMF. It has no intrinsic value and, at the moment at least, can't be used to purchase anything. It is true that people like to say that the dollar is backed only by the good faith and credit of the U.S. government. In reality, however, the dollar is backed by the goods and services produced by the American people and their willingness to trade those goods and services for dollars. With this willingness to trade real things for dollars extending to people around the world, the value of the dollar becomes backed not just by the U.S. people or the U.S. government but by literally all the world's producers and consumers interconnected through global supply chains: Arab oil traders, Bangladeshi textile producers, Japanese and Korean auto manufacturers, and, yes, even Russian finance ministers and Chinese central bankers. The IMF, by contrast, produces nothing.
  • A one-size-fits-all international currency will not meet diverse world needs. Countries growing at different rates have different monetary policy needs. Faster-growing countries need a more rapidly increasing supply of money. Slower-growing countries must have less in order to prevent inflation. The SDR could not accommodate these differing needs. Its value is set by the policies of the IMF, which in turn are subject to the competing political and economic interests of international diplomats.
  • Embracing the SDR will result in a loss of transparency. The process of setting the supply and value of the dollar is highly transparent. People around the world know exactly what the Federal Reserve is doing as it adds dollars to the system or adjusts interest rates. Even the rationale for the changes is quickly apparent from Fed statements and the open grilling to which the U.S. subjects Fed governors. The IMF is far more opaque: Each country's representative would likely tell a different story about what was done and why.
  • The SDR will create new financial complexities and opportunities for corruption. Use of the SDR would add an additional step to every international transaction. Buyers and sellers would have to convert their local currency into SDRs. Some would have to change first into a convertible currency and then into SDRs. This would create new opportunities for arbitrage and corruption or, at the very least, make the process more expensive with an additional transaction fee. Fans of derivatives will love the SDR: Like derivatives, SDRs are an additional layer away from anything of real value. They will provide wonderful opportunities for manipulation and skimming of value by currency traders and financial speculators. They will also be less transparent and harder for normal people to understand. And they will be controlled by an international organization that has little if any democratic legitimacy or accountability.

A Silly Idea from Serious People

Why would the Russians and Chinese propose such a thing? They are serious people, and serious people do not usually propose silly ideas. The easy answer would be resentment toward the U.S. for its unique and dominant role in the system. This may be an element, especially for the Russians. Far more likely, however, is that the Chinese and Russians are motivated instead by fear that the U.S. is not doing a very good job of managing its economy and its international economic role right now. Floating an unrealistic but provocative proposal is precisely the way to diplomatically express that concern without actually threatening the current system on which they, like the rest of the world, depend.

One would have to guess that the U.S. response--categorical defense of the dollar and its role as a reserve currency by both Treasury Secretary Geithner and Fed Chairman Bernanke[1]--was exactly what the Chinese and Russians were trying to inspire. It is a diplomatic coup for them as well as a warning to the U.S. that what this nation does right and what it does wrong has major implications for other nations as well as itself. America should not expect other countries to remain idle if U.S. policies begin hurting their economic growth as well as its own.

Ambassador Terry Miller
is Director of the Center for International Trade and Economics at The Heritage Foundation.


Notes

[1]Geithner, in either a huge mistake or (hopefully) a simple misunderstanding of a reporter's question, initially gave a positive response to the idea of expanding the role of SDRs, but he, the President, and Fed Chairman Bernanke quickly set things straight. See "Geithner and Bernanke Reject New Global Currency Idea," Reuters, March 24, 2009, at http://uk.news.yahoo.com/22/20090324/tpl-uk-forex-usa-geithner-sb-d1a0d5d.html (March 26, 2009).

Let's Put Bylines on Our 'National' Intelligence Estimates. Anonymity leads to mediocrity and politicization

Let's Put Bylines on Our 'National' Intelligence Estimates. By Reuel Marc Gerecht
Anonymity leads to mediocrity and politicization.
WSJ, Mar 28, 2009

Charles Freeman's withdrawal from his appointment as the chairman of the National Intelligence Council (NIC) offers an opportunity to assess whether personal views should have any role in intelligence analysis. Mr. Freeman's opinions on Israel, the Middle East and China proved too strong for critics. Yet the NIC's National Intelligence Estimates (NIEs) are often politicized and debased precisely because their anonymous authors need take no personal responsibility for their opinions.

No one knows if the upcoming new Iran estimate will be as politicized as its 2007 predecessor, which damaged the diplomacy of both the U.S. and our European allies. In any case, the Obama administration will likely one day have a politically convulsive NIE that makes the president wonder why these estimates are ever drafted.

Anonymity is the byword of the intelligence profession. In operations, it is usually mandatory. In analysis, however, a collective effort that hides individual authorship is a dubious approach.

In theory, anonymity gives analysts protection from political pressure. And a collective judgment -- NIEs are the most consensual intelligence products that the executive branch puts out -- is supposed to be more reliable and convincing than the views of a particular analyst or agency.

In practice, however, this anonymous, collective approach has guaranteed that mediocre analysis goes unchallenged, and that analysts and senior officials within the NIC go unscarred when they're wrong. No one would want to invest money with a stockbroker who consistently gives bad advice. No one would want to retain a coach who loses more than he wins. Yet obtaining an analytical score card on analysts and their managers within the NIC, the CIA's Directorate of Intelligence or the office of the Director of National Intelligence is nearly impossible.

NIEs rarely offer insights not available elsewhere at far less cost. They have often been egregiously wrong -- my personal favorites were the NIEs written before Iran's Islamic Revolution that predicted the Pahlavi dynasty's survival into the 21st century -- and when right, often unspectacularly so (seeing enmity among the Serbs, Croats and Muslim Bosnians in post-Cold War Yugoslavia wasn't particularly perspicacious). Yet where once NIEs attracted little attention, they now can become political footballs even before they are finished.

In part, this is because the nature of Washington has changed. Estimates were once either closely guarded or easily forgotten -- the secrecy of estimates was better kept and no one expected presidents or members of Congress to accept them as guides for foreign policy. Today, Americans have unprecedented access to secret information. And within the State Department and Congress, where partisan policy battles are fierce, members feel no hesitation in using NIEs as political battering rams. At dizzying speeds, politicians and their allied journalists can denigrate an NIE for its "group think," as was the case with the 2003 report on Iraqi WMD. Or they can applaud another for its supposed willingness to speak truth to power -- as we heard with the Iran "no-nuke" NIE of 2007. With the system we have, this isn't going to change.

President George W. Bush missed an enormous opportunity to reassess the CIA's operational and analytical branches -- the vital center of the American intelligence community -- after 9/11. He embraced the status quo, putting it on steroids by increasing its funding, adding more personnel, and canning no one.

Identifying the primary drafters of NIEs -- or any major analytical report requested by Congress -- could significantly improve the quality of these analyses and diminish the potential for politicians, the media and the intelligence community to politically exploit the reports. Senior managers at the CIA, the NIC or in any of America's other intelligence agencies should have their names appended to an assessment if they've had any substantive role in writing its conclusions. Although everyone in the intelligence community likes to get their fingers into an NIE, there are usually just a small number of individuals who do the lion's share of the work. They should all be known, and should be expected to defend their conclusions in front of Congress and senior executive-branch officials.

Contrary to what some journalists suggested about the prelude to the Iraq war, good analysts live to be questioned by senior officials. Intrepid analysts want to get out of their cubicles.

Think tankers can generally run circles around government analysts and managers on substance, and especially in "strategic" vision, because they operate in more open, competitive and intellectually rigorous environments. Anonymous collective official analysis tends to smother talent and parrot the current zeitgeist in Washington. Liberating first-rate analysts from the bureaucratic disciplining and expectations of their own agencies by allowing them to build public reputations is probably the most efficient and inexpensive way to introduce "contrarian views," an oft-stated reason for Mr. Freeman's appointment.

Mr. Freeman's strongly held personal views proved to be his appointment's stumbling block. If Dennis Blair, the director of National Intelligence, believes that views such as those held by Mr. Freeman would help the intellectual mix at the NIC, then he should allow these views to be heard and argued in an open environment.

In an environment where analysts have publicly tracked reputations, we are likely to see people take their tasks more seriously instead of hiding behind their agencies. Those who are confident in their assessments won't fear change. But those who are fence-sitters, more concerned about melding their views into what is bureaucratically palatable and politically acceptable, will likely drift to the rear as they grope for what is accepted orthodoxy.

A more open system still may not make the intelligence community's product competitive with the best from the outside on the big issues ("Whither China, Russia and Iran?"). But it will certainly make estimates more interesting to read than what we have now. Denied the imprimatur of saying "the intelligence community believes," estimates will come down to earth. And from that angle, it will be much harder for anyone again to use an NIE to damage their political opponents.

Mr. Gerecht, a former Central Intelligence Agency officer, is a senior fellow at the Foundation for Defense of Democracies.

President Obama's plan for Afghanistan and Pakistan is ambitious and expensive. It is also hard-headed.

The Price of Realism. WaPo Editorial
President Obama's plan for Afghanistan and Pakistan is ambitious and expensive. It is also hard-headed.
Saturday, March 28, 2009; page A12

THE STRATEGY for Afghanistan and Pakistan announced by President Obama yesterday is conservative as well as bold. It is conservative because Mr. Obama chose to embrace many of the recommendations of U.S. military commanders and the Bush administration, based on the hard lessons of seven years of war. Yet it is bold -- and politically brave -- because, at a time of economic crisis and war-weariness at home, Mr. Obama is ordering not just a major increase in U.S. troops, but also an ambitious effort at nation-building in both Afghanistan and Pakistan. He is right to do it.

Few Americans would dispute Mr. Obama's description yesterday of the continuing threat from Afghanistan and Pakistan's tribal areas. "Multiple intelligence estimates have warned that al-Qaeda is actively planning attacks on the U.S. homeland from its safe haven in Pakistan," he said. "And if the Afghan government falls to the Taliban -- or allows al-Qaeda to go unchallenged -- that country will again be a base for terrorists who want to kill as many of our people as they possibly can." The goal he stated was similarly simple and clear: "to disrupt, dismantle and defeat al-Qaeda in Pakistan and Afghanistan, and to prevent their return to either country in the future."

What distinguishes the president's plan -- and opens him to criticism from some liberals as well as conservatives -- is its recognition that U.S. goals cannot be achieved without a major effort to strengthen the economies and political institutions of Pakistan and Afghanistan. The Bush administration tried to combat the al-Qaeda threat with limited numbers of U.S. and NATO troops, targeted strikes against militants, and broad, mostly ineffective, aid programs. It provided large sums of money to the Pakistani army, with few strings attached, in the hope that action would be taken against terrorist camps near the Afghan border. The strategy failed: The Taliban has only grown stronger, and both the Afghan and Pakistani governments are dangerously weak.

The lesson is that only a strategy that aims at protecting and winning over the populations where the enemy operates, and at strengthening the armies, judiciaries, and police and political institutions of Afghanistan, can reverse the momentum of the war and, eventually, allow a safe and honorable exit for U.S. and NATO troops. This means more soldiers, more civilian experts and much higher costs in the short term: Mr. Obama has approved a total of 21,000 more U.S. troops and several hundred additional civilians for Afghanistan, and yesterday he endorsed two pieces of legislation that would provide Pakistan with billions of dollars in nonmilitary aid as well as trade incentives for investment in the border areas. More is likely to be needed: U.S. commanders in Afghanistan hope to obtain another brigade of troops and a division headquarters in 2010, and to double the Afghan army again after the expansion now underway is completed in 2011. Mr. Obama should support those plans.

Such initiatives are not the product of starry-eyed idealism or an attempt to convert either country into "the 51st state" but of a realistic appreciation of what has worked -- and failed -- during the past seven years. As Mr. Obama put it, "It's far cheaper to train a policeman to secure his or her own village or to help a farmer seed a crop than it is to send our troops to fight tour after tour of duty with no transition to Afghan responsibility." That effort will be expensive and will require years of steadiness. But it offers the best chance for minimizing the threat of Islamic jihadism -- to this country and to the world.

Friday, March 27, 2009

Did the Fed Cause the Housing Bubble? WSJ Symposium

Did the Fed Cause the Housing Bubble? Symposium
WSJ, Mar 27, 2009

Don't Blame Greenspan. By David Henderson

It's become conventional wisdom that Alan Greenspan's Federal Reserve was responsible for the housing crisis. Virtually every commentator who blames Mr. Greenspan points to the low interest rates during his last few years at the Fed.

The link seems obvious. Everyone knows that the Fed can drive interest rates lower by pumping more money into the economy, right? Well, yes. But it doesn't follow that that's why interest rates were so low in the early 2000s. Other factors affect interest rates too. In particular, a sudden increase in savings will drive down interest rates. And such a shift did occur. As Mr. Greenspan pointed out on this page on March 11, there was a surge in savings from other countries. Although he names only China, some of the Middle Eastern oil-producing countries were also responsible for much of this new saving. Shift the supply curve to the right and, wonder of wonders, the price falls. In this case, the price of saving and lending is the interest rate.

But how do we know that it was an increase in saving, not an increase in the money supply, that caused interest rates to fall? Look at the money supply.

Since 2001, the year-over-year growth rate of MZM (money of zero maturity, which is M2 minus small time deposits plus institutional money market fund shares) fell from over 20% to nearly 0% by 2006. Over the same period, M2 (which is M1 plus time deposits) growth fell from over 10% to around 2%, and M1 (which is currency plus demand deposits) growth fell from over 10% to negative rates.
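The nesting of the aggregates the author describes can be sketched as simple arithmetic. The component figures below are purely illustrative placeholders, not actual Federal Reserve data; only the nesting relationships come from the text:

```python
# Illustrative components in $billions (hypothetical figures, not Fed data).
currency = 700
demand_deposits = 600
small_time_deposits = 1200
other_m2_components = 4500        # savings deposits, retail money funds, etc.
institutional_mm_fund_shares = 1900

# The aggregates nest as the author describes:
m1 = currency + demand_deposits                                  # narrow money
m2 = m1 + small_time_deposits + other_m2_components              # broader money
mzm = m2 - small_time_deposits + institutional_mm_fund_shares    # money of zero maturity

print(m1, m2, mzm)  # 1300 7000 7700
```

The point of the definitions is that MZM strips out the one M2 component with a fixed maturity and adds back instantly redeemable institutional balances, so its growth rate can diverge sharply from M1 and M2.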

The annual growth rate of the monetary base, the magnitude over which the Fed has the most control, fell from 10% in 2001 to below 5% in 2006. Moreover, nearly all of the growth in the monetary base went into currency, an increasing proportion of which is held abroad.

And if the Fed were the culprit, why was the housing bubble world-wide? Do Mr. Greenspan's critics seriously contend that the Fed was responsible for high housing prices in, say, Spain?

This is not to say that the Greenspan Fed was blameless. Particularly disturbing is the way the lender-of-last-resort function has increased moral hazard, a trend to which Mr. Greenspan contributed and which current Fed Chairman Ben Bernanke has put on steroids.

But to the extent that the federal government is to blame, the main federal culprits are the beefed-up Community Reinvestment Act and the run-amok Fannie Mae and Freddie Mac. All played a key role in loosening lending standards.

I'm not claiming that we should have a Federal Reserve. We simply can't depend on getting another good chairman like Mr. Greenspan, and are more likely to get another Arthur Burns or Ben Bernanke. Serious work by economists Lawrence H. White of the University of Missouri, St. Louis, and George Selgin of West Virginia University makes a persuasive case that abolishing the Fed and deregulating money would improve the macroeconomy. I'm making a more modest claim: Mr. Greenspan was not to blame for the housing bubble.

Mr. Henderson is a research fellow with the Hoover Institution, an economics professor at the Naval Postgraduate School, and editor of "The Concise Encyclopedia of Economics" (Liberty Fund, 2008).

What Savings Glut? By Gerald P. O'Driscoll Jr.

Alan Greenspan responded to his critics on these pages on March 11. He singled out an op-ed by John Taylor a month earlier, "How Government Created the Financial Crisis" (Feb. 9), for special criticism. Mr. Greenspan's argument defending his policy is two-fold: (1) the Fed controls overnight interest rates, but not "long-term interest rates and the home-mortgage rates driven by them"; and (2) a global excess of savings was "the presumptive cause of the world-wide decline in long-term rates."

Neither argument stands up to scrutiny. First, Mr. Greenspan writes as if mortgages were of the 30-year variety, financed by 30-year money. Would that it were so! We would not be in the present mess. But the post-2002 period was characterized by one-year adjustable-rate mortgages (ARMs), teaser rates that reset in two or three years, etc. Five-year ARMs became "long-term" money.

The Fed determines only the overnight federal-funds rate, but movements in that rate substantially influence the rates on such mortgages. Additionally, maturity mismatches abounded and were the source of much of the current financial stress. Short-dated commercial paper funded investment banks and other entities dealing in mortgage-backed securities.

Second, Mr. Greenspan offers conjecture, not evidence, for his claim of a global savings excess. Mr. Taylor has cited evidence from the IMF to the contrary, however. Global savings and investment as a share of world GDP have been declining since the 1970s. The data is in Mr. Taylor's new book, "Getting Off Track."

The former Fed chairman also cautions against excessive regulation as a policy response to the crisis. On this point I concur. He does not directly address, however, the Fed's policy response. From the beginning, the Fed diagnosed the problem as lack of liquidity and employed every means at its disposal to supply liquidity to credit markets. It has been to little avail and, in the process, the Fed has loaded up its balance sheet with dubious assets.

The credit crunch continues because many banks are capital-impaired, not illiquid. Treasury's policy shifts and inconsistencies under both administrations have sidelined potential private capital. Treasury became the capital provider of last resort. It was late to recognize the hole in banks' balance sheets and consistently underestimated its size. The need to provide second- and even third-round capital injections proves that.

In summary, Fed policy did help cause the bubble. Subsequent policy responses by that institution have suffered from sins of commission and omission. As Mr. Taylor argued, the government (including the Fed) caused, prolonged, and worsened the crisis. It continues doing so.

Mr. O'Driscoll is a senior fellow at the Cato Institute. He was formerly a vice president at the Federal Reserve Bank of Dallas.

Low Rates Led to ARMs. By Todd J. Zywicki

Alan Greenspan's argument that the Federal Reserve's policies on short-term interest rates had no impact on long-term mortgage interest rates overlooks the way in which its policies changed consumer behavior.

A simple yet powerful pattern emerges from survey data of the past 25 years collected by HSH Associates (the financial publishers): The spread between fixed-rate mortgages (FRMs) and ARMs typically hovers between 100 and 150 basis points, representing the premium that a borrower has to pay to induce the lender to bear the risk of interest-rate fluctuations. At times, however, the spread between FRMs and ARMs breaks out of this band and becomes either larger or smaller than average, leading marginal consumers to prefer one to the other. Sometimes the adjustment in the market share of ARMs lags behind changes in the size of the spread, but over time when the spread widens, the percentage of ARMs increases and vice versa.
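The spread dynamic Mr. Zywicki describes can be sketched with a small helper. The 100-150 basis-point band is his; the sample rates are hypothetical, chosen only to reproduce the 230-point spread he cites for mid-2004:

```python
def frm_arm_spread_bps(frm_rate: float, arm_rate: float) -> int:
    """Spread between fixed- and adjustable-rate mortgage rates (in %), in basis points."""
    return round((frm_rate - arm_rate) * 100)

def arm_demand_signal(spread_bps: int, low: int = 100, high: int = 150) -> str:
    """Classify a spread against the typical 100-150 bps band the author describes."""
    if spread_bps > high:
        return "wide: expect ARM share to rise"
    if spread_bps < low:
        return "narrow: expect ARM share to fall"
    return "typical band"

# Hypothetical rates: a 6.5% FRM vs. a 4.2% ARM gives the 230 bps
# spread the author reports for mid-2004.
s = frm_arm_spread_bps(6.5, 4.2)
print(s, "->", arm_demand_signal(s))  # 230 -> wide: expect ARM share to rise
```

The classification mirrors the article's claim: when the premium for rate certainty gets unusually expensive, marginal borrowers substitute toward ARMs, and vice versa when it gets cheap.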

In 1987, before subprime lending was even a gleam in Angelo Mozilo's eye, the spread rose to 300 basis points and the share of ARMs eventually rose to almost 70%, according to the Federal Housing Finance Board. When the spread shrank to near 100 basis points in the late 1990s, the percentage of ARMs fell into the single digits. Other periods show similar dynamics.

In the latest cycle, the spread rose from under 50 basis points at the end of 2000 to 230 basis points in mid-2004, and the percentage of ARMs rose from 10% to 40%. The Fed's subsequent increases in short-term rates caused short- and long-term rates to converge, squeezing the spread to about 50 points by 2007 and reducing ARMs to less than 10% of the market.

Record-low ARM interest rates kept housing generally affordable even as buyers could stretch to pay higher prices. Low short-term interest rates, combined with tax and other policies, also drew speculative, short-term home-flippers into certain markets. As the Fed increased short-term rates in 2005-07, interest rate resets raised monthly payments, triggering the initial round of defaults and falling home prices. Foreclosure rates initially soared on both prime and subprime ARMs much more than for FRMs.

Why did the ARM substitution result in a wave of foreclosures this time, unlike prior times? During previous periods with high percentages of ARMs, the dip in short-term interest rates was a leading indicator of an eventual decline in long-term rates, reflecting the general downward trend in rates of the past 25 years. By contrast, during this housing bubble the interest rates on ARMs were artificially low and eventually rose back to the level of FRMs. There were other factors that exacerbated the problem -- most notably increased risk-layering and a decline in underwriting standards -- but the Fed's artificial lowering of short-term interest rates and the resulting substitution by consumers to ARMs triggered the bubble and subsequent crisis.

Mr. Zywicki is a professor of law at George Mason University School of Law and a senior scholar at the university's Mercatus Center. He is writing a book on consumer bankruptcy and consumer credit.

The Fed Provided the Fuel. By David Malpass

The blame for the current crisis extends well beyond the Fed -- to banks, regulators, bond raters, mortgage fraud, the Bush administration's weak-dollar policy and Lehman bankruptcy decisions, and Congress's reckless housing policies through Fannie Mae and Freddie Mac and the Community Reinvestment Act.

But the Fed provided the key fuel with its 1% interest rate choice in 2003 and 2004 and "measured" (meaning inadequate) rate hikes in 2004-2006. It ignored inflationary dollar weakness, higher interest rate choices abroad, the Taylor Rule, and the booming performance of the U.S. and global economies.

Even by the Fed's own backward-looking inflation metrics, the core consumption deflator exceeded the Fed's 2% limit for 18 quarters in a row beginning with the second quarter of 2004, while 12-month Consumer Price Index (CPI) inflation hit 4.7% in September 2005 and 5.4% in July 2008. This despite the Fed's constant assurances that inflation would moderate (unlikely given the crashing dollar).

Despite its role as regulator and rate-setter, the Fed claimed that it could not identify asset bubbles until they popped (see my rebuttal on this page "The Fed's Moment of Weakness," Sept. 25, 2002). It is clear that the Fed's interest rate policies cause wide swings in the value of the dollar and huge momentum-based capital flows. These bring predictable -- and avoidable -- deflations, inflations and asset bubbles.

Beginning in 2003, the Fed filled the liquidity punch bowl. Low rates and the weakening dollar created a monumental carry trade (borrow dollars, buy anything). This transmitted the Fed's monetary excess abroad and into commodities. As the punch bowl overflowed, even global bonds bubbled (prices rose, yields fell), contributing to the global housing boom. Alan Greenspan singled out this correlation in his March 11 op-ed on this page, "The Fed Didn't Cause the Housing Bubble."

Given this power, the Fed should itself stop the current deflation and the economic freefall. It has to add enough liquidity to offset frozen credit markets, the collapse in the velocity of money, and bank deleveraging (which has reversed the normal money multiplier).

The Fed was on the right track in late November when it committed to purchasing $600 billion in longer-term, government-guaranteed securities. Equities rose globally, and some credit markets thawed, including a decline in mortgage rates and corporate bond spreads. However, the Fed reversed course in January, delaying its asset purchases and shrinking its balance sheet. Growth in the money supply stopped. Since then, the Fed increased the amount of assets it intends to purchase, but lengthened the time period rather than accelerating the pace of purchases.

Given the magnitude of the crisis and the stakes, the Fed should be buying safe assets fast, not parceling out a few billion. Confidence and money velocity would also increase if the Fed committed itself to dollar stability, not instability, to avoid causing future inflations and deflations.

Mr. Malpass is president of Encima Global LLC.

Loose Money and the Derivative Bubble. By Judy Shelton

The Fed owns this crisis. The buck stops there -- but it didn't.

Too many dollars were churned out, year after year, for the economy to absorb; more credit was created than could be fruitfully utilized. Some of it went into subprime mortgages, yes, but the monetary excess that fueled the most threatening "systemic risk" bubble went into highly speculative financial derivatives that rode atop packaged, mortgage-backed securities until they dropped from exhaustion.

The whole point of having a central bank is to calibrate the money supply to the genuine needs of an economy -- to purchase goods and services, to fund productive investment -- with the aim of achieving maximum sustainable long-term growth. Since price stability is a key factor toward that end, central bankers attempt to finesse the amount of money and credit in the system; if interest rates are kept too low too long, it causes an unwarranted expansion of credit. As the money supply increases relative to real economic production, the spillage of excess purchasing power results in higher prices for goods and services.

But not always. Sometimes the monetary excess finds its way into a narrow sector of the economy -- such as real estate, or equities, or rare art. This time it was the financial derivatives market.

In the last six years, according to the Bank for International Settlements, the derivatives market exploded as a global haven for speculative investment, its aggregate notional value rising more than fivefold to $684 trillion in 2008 from $127 trillion in 2002. Financial obligations amounting to 12 times the value of the entire world's gross domestic product were written and traded and retraded among financial institutions -- playing off every instance of market turbulence, every gyration in exchange rates, every nuanced statement uttered by a central banker in Washington or Frankfurt -- like so many tulip contracts.
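The figures in this paragraph are internally consistent, as a quick arithmetic check shows (the numbers are the BIS aggregates and the 12x GDP ratio quoted in the text):

```python
notional_2002 = 127e12   # BIS aggregate notional value of derivatives, 2002
notional_2008 = 684e12   # BIS aggregate notional value of derivatives, 2008
multiple_of_world_gdp = 12  # stated ratio of notional value to world GDP

growth = notional_2008 / notional_2002
print(round(growth, 2))             # 5.39 -> "more than fivefold", as stated

implied_world_gdp = notional_2008 / multiple_of_world_gdp
print(implied_world_gdp / 1e12)     # 57.0 -> implied 2008 world GDP, $ trillions
```

The implied world GDP of roughly $57 trillion is in line with standard 2008 estimates, so the "12 times" figure squares with the notional totals.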

The sheer enormity of this speculative bubble, let alone the speed at which it inflated, testifies to inordinately loose monetary policy from the Fed, keeper of the world's predominant currency. The fact that Fannie Mae and Freddie Mac provided the "underlying security" for many of the derivative contracts merely compounds the error of government intervention in the private sector. Politicians altered normal credit risk parameters, while the Fed distorted housing prices through perpetual inflation.

At this point, dickering over whether Alan Greenspan should have formulated monetary policy in strict accordance with an econometrically determined "rule," or whether the Fed even has the power to influence long-term rates, raises a more fundamental question: Why do we need a central bank?

"There are numbers of us, myself included, who strongly believe that we did very well in the 1870 to 1914 period with an international gold standard." That was Mr. Greenspan, speaking 17 months ago on the Fox Business Network.

In the rules-versus-discretion debate over how best to achieve sound money, that is the ultimate answer.

Ms. Shelton, an economist, is author of "Money Meltdown" (Free Press, 1994).

To Change Policy, Change The Law. By Vincent Reinhart

Anyone seeking an application of the principle that fame is fleeting need look no further than the assessment of Federal Reserve policy from 2002 to 2005.

At the beginning, capital spending was anemic, and considerable wealth had been destroyed by the equity crash. The recovery from the 1990-91 recession was "jobless," and the current one was following the same script. Moreover, inflation was so distinctly pointed down that deflation seemed a palpable threat.

Keeping the federal-funds rate low for a long time was viewed as appropriately balancing the risks to the Fed's dual objectives of maximum employment and price stability. Indeed, the Fed was seen as extending the stable economic performance since 1983 that had been dubbed the "Great Moderation."

Over the period 2002-2005, the federal-funds rate ran below the recommendation of the policy rule made famous by Stanford Professor John Taylor. No doubt, the Taylor Rule provides important guidance on how that rate should change in response to changes in the two mandated goals of policy. First, it should move up or down by more than any change in inflation. Second, the Fed should respond to changes in resource slack. That is, caring about unemployment is not a sign of weakness in a central banker but rather a sign of strength that helps achieve better results.

The Taylor Rule is less helpful to practitioners of policy in anchoring the level of the federal-funds rate. The rule is fit to experience based on a notion of the rate that should prevail if inflation were at its goal and resources fully employed, which is known as the equilibrium funds rate. That is an important technicality. Using a faulty estimate of the equilibrium funds rate is like flying a plane that is otherwise perfect except for an unreliable altimeter. That flaw looms large when flying over a mountainous region.
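The rule the passage refers to can be written out explicitly. This is the common Taylor (1993) form with the standard 0.5 coefficients, not necessarily the variant any particular Fed staff model used; the `r_star` parameter is the equilibrium funds rate the author compares to an unreliable altimeter:

```python
def taylor_rule(inflation: float, inflation_target: float,
                output_gap: float, r_star: float = 2.0) -> float:
    """Taylor (1993) rule: recommended federal-funds rate, in percent.

    The effective coefficient on inflation is 1 + 0.5, so the recommended
    rate moves by more than any change in inflation, as the text notes;
    the 0.5 on the output gap is the response to resource slack.
    """
    return r_star + inflation + 0.5 * (inflation - inflation_target) + 0.5 * output_gap

# With inflation at a 2% target and no slack, the rule returns r_star + target:
print(taylor_rule(2.0, 2.0, 0.0))                # 4.0
# A lower estimate of r_star shifts the entire recommended path down --
# which is how a "faulty altimeter" can justify a persistently low funds rate:
print(taylor_rule(2.0, 2.0, 0.0, r_star=1.0))    # 3.0
```

The second call illustrates the author's point: the dispute over 2002-2005 policy is less about the rule's slopes than about the level of `r_star` officials believed in at the time.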

From 2002 to 2005, the economic landscape appeared especially changeable, with the contours shaped by lower wealth, lingering job losses, and looming disinflation. To Fed officials at the time, this indicated that the equilibrium funds rate was unusually low. Simply, the only way to provide lift to an economy in which resource use was slack and inflation pointed down was to keep policy accommodative relative to longer-term standards.

That was then. Now, policy during the period is seen as fueling a housing bubble.

The Fed is guilty as charged in setting policy to achieve the goals mandated in the law. Fed policy makers cannot be held responsible for the fuel to speculative fires provided by foreign saving and the thin compensation for risk that satisfied global investors. Nor can the chain of subsequent mistakes that drove a downturn into a debacle be laid at the feet of the Federal Open Market Committee of 2002 to 2005. If the results seem less than desirable in retrospect, change the law those policy makers were following, but do not blame them for following prevailing law.

Mr. Reinhart is a resident scholar at the American Enterprise Institute. From August 2001 to June 2007, he was the secretary and economist of the Federal Open Market Committee.

How Korea Solved Its Banking Crisis

How Korea Solved Its Banking Crisis. By LEE MYUNG-BAK
The world can learn from our experience in the late '90s.
WSJ, Mar 27, 2009

When world leaders met at the G-20 summit in Washington, D.C., last November, our hope was that by the first quarter of this year we would have largely overcome the financial crisis. At that time, leaders were primarily concerned with macroeconomic stimulus -- chiefly fiscal stimulus -- to shorten the severe global recession.

Unfortunately, we are still struggling to deal with the financial turmoil, and financial institutions have yet to regain investors' confidence. The U.S. government recently announced its expanded plan to buy troubled assets that have been burdening banks. While I join others in hoping for the success of this plan, I believe that a true recovery requires all countries to do everything they can to stabilize the economy. If world leaders fail to come up with creative ways to deal with the current difficulties, credit will not flow.

For this reason, when the G-20 leaders meet in London next week, solving the financial meltdown -- with a special focus on removing impaired assets from the balance sheets of financial institutions -- must be our priority.

In the late 1990s, Korea was hit by a financial crisis, and having successfully overcome it, we have valuable lessons to offer. By committing to the following basic principles based on the Korean experience, world leaders will be well prepared as they create a plan to remove impaired assets.

First, bold and decisive measures, rather than incremental ones, are required to regain market confidence. Korea's successful experience illustrates this point. The Korean government tapped various sources to raise a public fund of $127.6 billion (159 trillion KRW) during the period from 1997 to 2002 -- equivalent to 32.4% of Korea's GDP in 1997 -- to resolve impaired assets and recapitalize financial institutions. Given the magnitude of the current challenges, the world cannot afford a minimalist approach.

Second, our experience suggests that bank recapitalization and creating a "bad bank" are not mutually exclusive options; the simultaneous application of both can have a positive effect. Korea established a specialized independent agency, the Korea Asset Management Corporation (Kamco), as a bad bank, while at the same time the Korea Deposit Insurance Corporation was involved in recapitalizing financial institutions. Kamco purchased the impaired assets and settled the gains or losses with the financial institutions involved once the assets recovered their value. It acquired impaired assets for $30.9 billion -- the book value of which amounted to $85.1 billion by 2002 -- and recovered $33.9 billion by 2008 by reselling to private investors through various methods, including public auctions, direct sales, international tenders, securitization and debt-equity swaps.
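The Kamco figures in this paragraph imply two recovery rates worth making explicit; this is simple arithmetic on the numbers President Lee cites, not additional data:

```python
purchase_price = 30.9   # $ billions Kamco paid for the impaired assets
book_value = 85.1       # $ billions, face (book) value of those assets by 2002
recovered = 33.9        # $ billions recovered through resales by 2008

discount_to_book = purchase_price / book_value    # how deeply assets were marked down
gain_on_purchase = recovered / purchase_price - 1 # return on the public funds deployed

print(f"{discount_to_book:.0%} of book value paid")   # 36% of book value paid
print(f"{gain_on_purchase:.1%} gain on resale")       # 9.7% gain on resale
```

In other words, the bad bank bought at roughly a 64% discount to face value and ultimately recovered more than it paid, which is the basis for the "settle the gains or losses afterward" scheme recommended later in the piece.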

Third, it is critical to ensure that the implemented measures are made politically acceptable and that moral hazard is minimized. A special mechanism should be devised for shareholders, managers, workers and asset holders to bear their fair share of the burden. In the case of Korea, capital injections were limited to financial institutions that were systemically important and deemed to be viable after recapitalization.

Fourth, these measures should have built-in exit strategies with clear time frames. There should be a plan for shares of entities that are held by the government to be turned over to the private sector. Additionally, nationalization of banks shouldn't be a goal, but a temporary measure.

Fifth, although government will take the lead in such plans, private capital should be encouraged to fully participate in the process. Obviously, the process itself must be transparent. Korea's experience suggests that it would be useful, on a temporary basis, for governments to purchase impaired assets at a price agreed to with the troubled financial institutions, and then settle the gains or losses with the financial institutions after reselling. The problem of impaired assets today may be of a different nature, since they arise from off-balance sheet bundled derivatives. But this difficulty makes the ex post settlement scheme all the more useful.

Sixth, all forms of financial protectionism should be rejected in the process. Ideally, countries would have a common method for dealing with impaired assets. However, since countries have different financial realities, we should leave it up to each country to craft its own policy. And a coordinated effort is needed to ensure that regular cross-border capital flows are not interrupted.

To that effect, I welcome the G-20 finance ministers' agreement called "Restoring Lending: A Framework for Financial Repair and Recovery," which reflects Korea's proposal. Unless these principles are abided by, macroeconomic stimulus measures will not do much good in alleviating the severe global economic recession.

Mr. Lee is president of the Republic of Korea.

Recent Demographic Trends in Metropolitan America

Recent Demographic Trends in Metropolitan America. By William H. Frey, Alan Berube, Audrey Singer, & Jill H. Wilson
The Brookings Institution, Mar 27, 2009

The new administration taking shape in Washington inherits not only an economic crisis, but also a mammoth apparatus of agencies and programs, many of which were developed a generation or more ago. In view of that, a president and Congress striving to "build a smarter government" should develop new policies or retool old programs with the latest population trends in mind, especially those shaping and re-shaping metropolitan areas—our nation's engines of economic growth and opportunity. These include:
  • Migration across states and metro areas has slowed considerably in the past two years due to the housing crisis and looming recession. About 4.7 million people moved across state lines in 2007-2008, down from a historic high of 8.4 million people at the turn of the decade. Population growth has dropped in Sun Belt migration magnets such as Las Vegas, NV, and Riverside, CA, and the state of Florida actually experienced a net loss of domestic migrants from 2007 to 2008. Meanwhile, out-migration has slowed in older regions such as Chicago and New York. Many Midwestern and Northeastern cities experienced greater annual population gains, or reduced population losses, in the past year.
  • The sources and destinations of U.S. immigrants continue their long-run shifts. About 80 percent of the nation’s foreign-born population in 2007 hailed from Latin America and Asia, up from just 20 percent in 1970. The Southeast, traditionally an area that immigrants avoided, has become the fastest-growing destination for the foreign-born, with metro areas such as Raleigh, NC; Nashville, TN; Atlanta, GA; and Orlando, FL ranking among those with the highest recent growth rates. As they arrived in these new destinations, immigrants also began to move away from traditional communities in the urban core. Today, more than half of the nation’s foreign-born residents live in major metropolitan suburbs, while one-third live in large cities.
  • Racial and ethnic minorities are driving the nation’s population growth and increasing diversity among its younger residents. Hispanics have accounted for roughly half the nation’s population growth since 2000. Already, racial and ethnic minorities represent 44 percent of U.S. residents under the age of 15, and make up a majority of that age group in 31 of the nation’s 100 largest metro areas (and a majority of the entire population in 15). Hispanic populations are growing most rapidly in the Southeast; Asian populations are rising in a variety of Sun Belt and high-tech centers; and the black population continues its move toward large Southern metro areas like Atlanta, Houston, and Washington, D.C.
  • The next decade promises massive growth of the senior population, especially in suburbs unaccustomed to housing older people. As the first wave of baby boomers reaches age 65 in less than two years, the senior population is poised to grow by 36 percent from 2010 to 2020. Their numbers will grow fastest in the Intermountain West, the Southeast, and Texas, particularly in metro areas such as Raleigh, NC; Austin, TX; Atlanta, GA; and Boise, ID that already have large pre-senior populations (age 55 to 64). Because the boomers were the nation’s first fully “suburban generation,” their aging in place will cause many major metropolitan suburbs—such as those outside New York and Los Angeles—to “gray” faster than their urban counterparts.
  • Amid rising educational attainment overall, the U.S. exhibits wide regional and racial/ethnic disparities. While 56 percent and 38 percent of Asian and white adults, respectively, held post-secondary degrees in 2007, the same was true of only 25 percent and 18 percent of blacks and Hispanics. These deep divides by race and ethnicity coincide with growing disparities across metropolitan areas owing to economic and demographic change. In knowledge-economy areas such as Boston, MA; Washington, D.C.; and San Francisco, CA, more than 40 percent of adults hold a bachelor’s degree. Meanwhile, in metro areas that have attracted large influxes of immigrants, such as Houston, TX; Greenville, NC; and most of California’s Central Valley, more than 20 percent of adults lack a high school diploma. And some Sun Belt metro areas, such as Las Vegas, NV, and Riverside, CA, have fast-growing populations at both ends of the attainment spectrum.
  • Even before the onset of the current recession, poverty rose during the 2000s, and spread rapidly to suburban locations. Both the overall number of people living in poverty and the poverty rate rose from 2000 to 2007; today, working-age Americans account for a larger share of the poor than at any time in the last 30 years. After diverging in the 1970s and 1980s, the gap between central-city and suburban poverty rates has narrowed somewhat. More notably, the suburban poor surpassed the central-city poor in number during this decade, and now outnumber them by more than 1.5 million. The suburban poor have spread well beyond older, inner-ring suburbs, which in 2005-2007 housed less than 40 percent of all poor suburban dwellers. Yet even as poverty spreads throughout the metropolis, the concentration of poverty in highly distressed communities—after dropping in the 1990s—appears to be rising once again in the 2000s.
Even as the nation enters an extended period of economic uncertainty, the continued demographic dynamism of our metropolitan areas raises key policy and program issues for the new government in Washington. Steps to implement the recovery package wisely, pursue immigrant integration alongside immigration reform, close educational achievement and attainment gaps, combine the planning of transportation and housing, and provide needed support to low-income workers and families should take account of our constantly evolving and changing metropolitan populations.

Download the full report » (PDF)

Job Losses From Obama Green Stimulus Foreseen in Spanish Study

Job Losses From Obama Green Stimulus Foreseen in Spanish Study. By Gianluca Baratti
Bloomberg, Mar 27, 2009

Subsidizing renewable energy in the U.S. may destroy two jobs for every one created if Spain’s experience with windmills and solar farms is any guide.

For every new position that depends on energy price supports, at least 2.2 jobs in other industries will disappear, according to a study from King Juan Carlos University in Madrid.

U.S. President Barack Obama’s 2010 budget proposal contains about $20 billion in tax incentives for clean-energy programs. In Spain, where wind turbines provided 11 percent of power demand last year, generators earn rates as much as 11 times more for renewable energy compared with burning fossil fuels.

The premiums paid for solar, biomass, wave and wind power -- which are charged to consumers in their bills -- translated into a $774,000 cost for each Spanish “green job” created since 2000, said Gabriel Calzada, an economics professor at the university and author of the report.

“The loss of jobs could be greater if you account for the amount of lost industry that moves out of the country due to higher energy prices,” he said in an interview.

Spain’s Acerinox SA, the nation’s largest stainless-steel producer, blamed domestic energy costs for deciding to expand in South Africa and the U.S., according to the study.

“Microsoft and Google moved their servers up to the Canadian border because they benefited from cheaper energy there,” said the professor of applied environmental economics.

The Government's Influence on the Stock Market

The Government's Influence on the Stock Market, by Alan Reynolds
Forbes, March 25, 2009.

Treasury Secretary Tim Geithner seemed to go from zero to hero in one day when the stock market soared on March 23, ostensibly because of his latest plan to help banks unload illiquid securities of uncertain worth. The Wall Street Journal headline shouted, "Toxic-Asset Plan Sends Stocks Soaring."

But homebuilder stocks jumped as much as bank stocks, suggesting the same day's news about a 5.1% jump in existing-home sales deserves much of the credit. Any remaining credit should go to Fed Chairman Ben Bernanke, not Geithner.

The rally in financial stocks began after Ben Bernanke's March 11 speech to the Council on Foreign Relations. He came out strongly against the nationalization of banks and admitted that the bookkeeping problems of many banks are largely an artifact of foolish federal regulations. Bernanke said, "capital standards, accounting rules and other regulations have made the financial sector excessively procyclical."

Together with similar comments from Barney Frank and the Financial Accounting Standards Board (FASB) on March 16, Bernanke's comments hinted that regulators might avoid imposing crippling capital requirements and loan loss write-offs on the basis of dubious mark-to-market accounting for illiquid assets. He also made it clear for the first time that Treasury's stress test for banks "is not pass-fail." Many had assumed the stress test was just that--a way to decide which banks were worth saving and which would be allowed to fail.

The S&P 500 index for financial stocks rose 30% from 96.82 on March 10 to 126.01 on March 18. But the index suddenly sank to 109.71 by Friday, March 20, on worries that Geithner's remarks the following Monday would be poorly received. Despite Monday's rally, financials closed at 120.75 on the following day--4% lower than they had been on March 18.
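The swings quoted here are simple percentage changes on the index closes. A minimal check, using the closing values cited above (the date labels are inferred from the text, not given by the source):

```python
def pct_change(old, new):
    """Percentage change from one index close to another."""
    return (new - old) / old * 100.0

# S&P 500 financials closes cited in the text
mar10, mar18, mar20, mar24 = 96.82, 126.01, 109.71, 120.75

print(round(pct_change(mar10, mar18), 1))  # the rally after Bernanke's March 11 speech
print(round(pct_change(mar18, mar20), 1))  # the pre-Geithner slide into Friday
print(round(pct_change(mar18, mar24), 1))  # still below the March 18 peak after the rally
```

Measuring from the March 18 peak rather than from Monday's close is what supports the article's point: a one-day pop can coexist with a net weekly loss.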

Until recent headlines gave Geithner undue credit for a one-day rally, administration officials had correctly insisted such daily moves could be misleading. Weekly stock market moves, however, are not so easy to brush off. The administration must learn to respect sustained market reactions to its policy proposals because the loss of stockholder wealth has had a devastating effect on consumers, banks and businesses.

This table compares the S&P 500 index for financial stocks on the day when federal policies were announced to the level of that index one week later.

[table]

A 31% drop in the market capital of financial stocks followed nationalization of Fannie Mae and Freddie Mac on Sept. 7, 2008, in just seven days. It suddenly became very dangerous to hold shares in any bank or insurance company large enough to be suddenly expropriated without shareholder approval.

On Oct. 3, 2008, the House reluctantly authorized the $700 billion Paulson-Geithner-Bernanke TARP slush fund. Many votes switched from "no" to "yes" because the S&P 500 stock index had dropped to 1,100 the day before. A week after the law was enacted, however, financial stocks were 23% lower.

On Oct. 14, 2008, Treasury pulled a "bait and switch" trick with its capital purchase plan. Nine banks were compelled to sell preferred shares to the government despite protests from Wells Fargo and Morgan Stanley. The teaser rate was a 5% dividend, later raised to 8%-9%.

Although the new plan was described as "injecting capital," it is a thinly disguised loan. Few noticed this deception except Binyamin Appelbaum of the Washington Post, who explained on Feb. 25, 2009, that "The Bush administration went so far last year as to rewrite the regulatory definition of capital to include the federal aid, which comes in the form of preferred shares. So far, investors have not been swayed. To them, the government aid doesn't look like reliable capital. It looks a lot like a loan that the government wants back."

Faced with new risks from dilution, warrants and federal caps on dividends, investors fled from TARP-impaired stocks. As a result, those costly "capital injections" probably added zero capital, on balance. They just replaced high-quality equity capital with bogus public capital (i.e., more debt). Financial stocks collapsed after the government began this incremental quasi-nationalization, falling 46% in five weeks.

A few days after the presidential election, FDIC Chairman Sheila Bair began pushing a plan to persuade or compel mortgage lenders to modify loans to slash monthly payments. If regulators or bankruptcy judges could do that, the added risk of devaluation of the assets behind mortgage-backed securities must make those assets more toxic. So, financial stocks fell 24%.

A week before the Inauguration, President-elect Obama asked Congress for the second $350 billion of TARP funds. Financials fell another 25%. The day after the Inauguration, the Wall Street Journal's front page featured this perceptive headline: "Banks Hit by Nationalization Fears: Financials Plunge as U.S. Considers New Rescue Options."

On Feb. 10, 2009, Treasury Secretary Geithner announced his "Plan A," which confirmed those fears of creeping nationalization. Geithner implicitly acknowledged that government preferred stock was not the sort of genuine capital needed to pass his stress test. Yet his new plan to convert such shares into common stock made it even more obvious that private shareholders would be first to be sacrificed. A Feb. 19 Wall Street Journal report by Peter Eavis declared, "Government capital injections sit like ill-disguised Trojan horses in the nation's largest banks."

Although Geithner's Feb. 10 presentation did prompt a sell-off of bank stocks (Citi shares fell 50% by Feb. 20), its impact is difficult to disentangle from the Feb. 13 Congressional agreement on a "stimulus package." That so-called stimulus bill was, of course, a $787 billion deferred tax increase, plus interest. Although investors could not be sure when the tax bill would be presented, Obama warned us that the first installment would come from higher tax rates on dividends, capital gains, stock options and the profits from foreign affiliates of multinational corporations. S&P financials were at 133.13 the day before Geithner spoke and 123.17 the day before Congress hiked future taxes by $787 billion. The index then fell to 96.18 on Feb. 23 and to 81.74 by March 6. If anything was stimulated, it was short-selling.

Since last September, the federal government has justified numerous costly and heavy-handed programs by claiming each new intervention would help the banks and restore confidence to financial markets. The only objective measure of success or failure is the market value of financial stocks. By that standard, government solutions have been the biggest problems.

Alan Reynolds is a senior fellow with the Cato Institute.

US Gov't Sponsors Technology That Enhances Recovery of Natural Gas

DOE-Sponsored Technology Enhances Recovery of Natural Gas in Wyoming
Researchers Seek Patent for Isotopic Ratio to Evaluate Water in Coalbeds
US Energy Dept, March 26, 2009

Washington —Research sponsored by the U.S. Department of Energy (DOE) Oil and Natural Gas Program has found a way to distinguish between groundwater and the water co-produced with coalbed natural gas, thereby boosting opportunities to tap into the vast supply of natural gas in Wyoming as well as Montana.

In a recently completed project, researchers at the University of Wyoming used the isotopic carbon-13 to carbon-12 ratio to address environmental issues associated with water co-produced with coalbed natural gas. The research resulted in a patent application for this unique use of the ratio. An added benefit of the project, which was managed by the National Energy Technology Laboratory for the DOE Office of Fossil Energy, was the creation of 27 jobs over the project’s 2+ years.

"The co-mingling of groundwater and coalbed natural gas co-produced water has placed environmental limits on recovering natural gas and limited the Nation’s ability to make full use of its domestic energy resources," said Dr. Victor K. Der, Acting Assistant Secretary for Fossil Energy. "The University of Wyoming’s success provides a technical opportunity to drill new wells in Wyoming and Montana, while monitoring the quantity and quality of water at the well sites and protecting freshwater resources."

Dealing with co-produced water has been one of the most difficult issues for researchers involved in finding the best, most environmentally sound methods for recovering natural gas in Wyoming’s Powder River Basin. That issue is significant for the nation’s energy and environmental health because the number of coalbed methane wells in the basin increased from 18,077 total wells in December 2004 to 27,280 in November 2008—an increase of more than 50 percent.

To produce gas from coalbed natural gas wells, operators must first pump out some of the water that is naturally contained in the gas-bearing coal seams. The large volume of these waters presents a major challenge and has led researchers to examine potential impacts and beneficial uses for the water.

The University of Wyoming researchers used stable isotopic tracers, along with available water quality data, to look at three separate issues in the Powder River Basin. They monitored the infiltration and dispersion of coalbed methane co-produced water into shallower subsurface areas, and then determined those locations where coal seams are isolated from adjacent aquifers and co-produced water was limited to coal. Finally, the researchers evaluated the information provided by isotopic analyses of carbon, oxygen, and hydrogen in the co-produced waters.

The research indicated that the concentration of dissolved inorganic carbon and the isotopic carbon-13 to carbon-12 ratio are effective tracers in distinguishing groundwater from co-produced water. This discovery holds promise that different concentrations of dissolved inorganic carbon and isotopic ratios can be used to monitor the infiltration of co-produced water into streams and groundwater over a long period of time. The method can also be used to reduce the amount of co-produced water.
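The press release does not spell out how the carbon-13 to carbon-12 ratio is expressed; in standard isotope geochemistry such ratios are reported in per-mil delta notation against the VPDB reference standard, and it is the distinctive delta value of the co-produced water's dissolved inorganic carbon that would set it apart from groundwater. A minimal sketch of the notation, with an illustrative sample value not taken from the report:

```python
VPDB = 0.0112372  # 13C/12C ratio of the Vienna Pee Dee Belemnite reference standard

def delta_13c(ratio):
    """Express a measured 13C/12C ratio in per-mil delta notation vs. VPDB."""
    return (ratio / VPDB - 1.0) * 1000.0

# A sample whose 13C/12C ratio sits 1% above the standard reads as +10 per mil;
# a sample matching the standard reads as 0 per mil by construction.
print(round(delta_13c(VPDB * 1.01), 2))
print(delta_13c(VPDB))
```

Reporting the ratio as a delta value amplifies the tiny natural variations in 13C abundance into numbers that are easy to compare across water samples, which is what makes the ratio usable as a tracer.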

NETL's Oil and Natural Gas Environmental Solutions Program supports development of new technologies to encourage efficient oil and natural gas recovery and to ensure adequate, secure oil and gas supplies. Multiple projects aim to resolve the challenges associated with producing, handling, and treating co-produced water and to transform it into an asset in areas of the country where water is much needed.

Robbing the Pentagon to Pay Foggy Bottom

Robbing the Pentagon to Pay Foggy Bottom, by Jamie Fly
The Weekly Standard Blog, March 23, 2009 12:16 PM

After initially falling for the administration’s rhetoric that the $533 billion they were allocating for the Defense Department in the Fiscal Year 2010 budget represented an increase over the Bush administration’s final budget (an argument skillfully deconstructed by Tom Donnelly), the press now seems to have woken up to the fact that the administration has something different in mind. Recent press reports provide a rather disturbing preview of the administration’s plans for the defense budget and preparations to use Secretary Gates as a human shield of sorts to counter expected criticism from defense contractors and Republicans in Congress. It seems Gates’ role in the ongoing Pentagon review is so key that he has decided to skip the NATO summit in early April so he can devote himself to his “efforts to strategically rebalance the department’s budget.”

The administration’s plan to get us to European levels of defense spending by 2016 comes at the same time as reports that the Chinese government stated that its official defense budget will jump by 14.9 percent this year (who knows how high the actual amounts are) and an announcement by Russian President Medvedev that Russia will undertake a “large-scale rearming” by 2011 given that “there are a range of regions where there remains serious potential for conflicts.”

There are a multitude of reasons that cutting defense at this juncture is a bad idea. The systems reportedly on the chopping block include a new Navy destroyer, the Air Force’s F-22, Army ground-combat vehicles, and possibly aircraft carriers, all integral elements of our ability to project power and defend our allies. Missile defense, now out of favor because it is unpopular in Moscow, is expected to take a significant hit as well. One can debate the merits of each of these systems or programs, but these cuts will result in thousands of job losses -- an inconvenient fact that may wake up some Democratic members of Congress who seem willing to throw money at failing banks and insurance companies but not at the Pentagon.

But beyond the programs and systems, the administration’s budget offers insight into its mentality about national security. Listen to some Democratic budget experts and you hear phrases like a “return to normalcy” used to describe the administration’s plans for defense spending. In this “normal” world, there is no mention of 9/11, or of the fact that countries such as Russia and China still threaten our allies and our interests. Military commitments in the Middle East or South Asia five to ten years from now? What military commitments?

The difficulty for Republicans who wish to oppose these cuts is that for the last four years, key figures in the Bush administration helped set the stage for the Obama defense budget we now have in front of us. It became a fad during the second Bush term to deliver speeches in front of the foreign policy establishment or deliver testimony and warn about the “militarization” of U.S. foreign policy. Secretaries Rice and Gates and eventually Chairman Mullen all participated in this group therapy session played out in public. Many on the left pounced on these comments and today use them to justify cuts to defense to help pay for an increase in funding for a supposedly struggling State Department.

It is true that in recent years the U.S. military has all too often been asked to conduct missions it has not been trained for and probably should not be doing, but that is not a reason to cut defense spending. Nor is it a reason to expand the Foreign Service at the cost of the F-22 or missile defense. Our allies in Central and Eastern Europe and in Asia want American boots on the ground as a sign of our commitment to their defense, not more diplomats in pinstripes and wingtips.

While Secretary Gates is meeting with the green eyeshade crowd at the Pentagon in early April instead of enjoying a glass of Riesling in Strasbourg-Kehl at the NATO Summit, he may want to think long and hard about whether he wants to become the chief advocate for a defense budget that will undermine, not strengthen, our long-term security.

DLC: On behalf of the nation’s children, Obama is prepared to take on members of his own party and the special interests

Education reform is a defining issue. By Harold Ford Jr
Politico, Mar 26, 2009 @4:36 AM EDT

President Barack Obama’s recent speech on education reform demonstrates that he is willing to put the full weight of his office behind fixing our failing schools. He called for higher standards, more charter schools, merit pay and eliminating bad teachers. When many of our urban school districts are graduating only 25 percent to 50 percent of their students, he knows that the failed methods and orthodoxies must be jettisoned for what will work.

The brave new world of the 21st century demands much more from our children. Obama’s ambitious and sweeping agenda will help educate and equip them to make the most of the opportunities created by an integrated global economy.

While there is a broad national consensus for education reform, Obama expects that special interests will oppose his reform agenda. Those who do will fight vigorously to hold onto the failed schools that shame us as a nation.

But their actions will put them against the best interests of our children and on the wrong side of history. Teachers unions and education groups have expressed opposition in the past to ideas like merit pay and charter schools. They are strongly opposed to a successful voucher program in Washington, D.C., which tragically was killed by Senate Democrats in the omnibus spending bill that passed the Senate last week.

On behalf of the nation’s children, Obama is prepared to take on members of his own party and the special interests. Along with turning around the economy, education reform could become the defining issue of his presidency.

Toward that end, the president and Secretary of Education Arne Duncan should consider hosting an education reform summit at the White House. The focus could be on what is working in public schools around the country. This list of “best practices” should be studied, evaluated and shared with principals and teachers — especially in schools that are underperforming.

He could invite education groups, teachers unions, principals, teachers and education leaders who have a proven record of reform and inform them how they could qualify for federal funding for programs that comply with the policy ideas of the Obama administration.

The genius of America is that we have always been able to overcome the challenges we face. Acknowledging our failures and focusing on methods and programs that have succeeded in educating our children are the best place to start.

It is also time we wake the sleeping giant: the parents who have children attending public schools. Alexis de Tocqueville said that people in a democracy “reign supreme.” The parents of public schoolchildren have never fully realized the power they have to bring change to underperforming schools.

With the financial support of the nation’s leading charitable foundations, Parent Teacher Associations around the country could be transformed into a national grass-roots effort to advocate for reform of our schools. Patterned after the missionary zeal and political sophistication of the Children’s Defense Fund, PTAs could be organized in school districts nationwide. Parents — motivated by wanting a world-class education for their children and being highly informed and organized — could bring persistent pressure to bear on members of Congress to adopt an agenda of change to fix our failing schools.

What is at stake is nothing less than the American dream. To pass it on to our children and generations to come, we must restore quality and innovation to all our schools. President Obama knows that our legacy of excellence in education must be redeemed and, with his speech a couple of weeks ago, he has set us on a course to give our children the knowledge and skills they need to compete in this new and changing world.

As Americans, it’s time we think of our obligations to each other. It’s time we take seriously our collective responsibility for future generations. Providing our children — regardless of race, class or religion — with a world-class education is what binds us together and will make our country stronger.

President Obama’s plan to reform our schools will help our children live up to their God-given potential. We don’t have a moment to lose. Congress should enact his education reform proposal this year.

Former Rep. Harold Ford Jr. (D-Tenn.) is chairman of the Democratic Leadership Council.