Showing posts with label junk science.

Thursday, May 28, 2009

Autism Exploitation

Autism Exploitation. By Marvin Schissel
ACSH, May 27, 2009

The rate of diagnosed autism in this country has increased from 1 in 10,000 in 1995 to 1 in 150 today. However, this likely reflects increased information and awareness about autism, the expansion of diagnostic criteria, more thorough and accurate diagnoses, and the classification of many cases as autism that would previously have been recorded as mental retardation. Autism is a lifelong condition that has a devastating effect on individuals and on their families. It is understandable that those involved with the autism spectrum can be desperate for help, for any hope of help. And this desperation makes them ready prey for charlatans.

There are therapies, ABA (Applied Behavior Analysis) and CBT (Cognitive Behavioral Therapy), that are considered helpful. But they are not widely available, can be expensive and time-consuming, and may offer only limited progress. This sets the stage for sham artists who eagerly hurl themselves into the breach with information and treatment that is false, ineffective and harmful.

The false assertion that connects autism with vaccines and mercury has led to lower vaccination rates and a marked increase in those major diseases that vaccination protected against; should the trend continue we may be faced with an epidemic. But promoters of this deceit are cashing in with books, lectures, and, worst of all, scientifically unsupported treatments that are not only ineffective but can be dangerous. Such treatments include chelation, hyperbaric oxygen, Lupron, and a wide variety of other questionable therapies, diets, and ineffective behavioral regimens.

The vaccine quackery started in 1998 with a published paper by Andrew Wakefield and twelve co-authors that suggested a link between the MMR vaccine and autism. The study was criticized as flawed, and ten of Wakefield's twelve co-authors have since disassociated themselves from its assertions.

Subsequently it was revealed that, prior to the study, Wakefield had received well over half a million dollars from lawyers hoping to sue vaccine companies. Recently it was claimed that Wakefield falsified his data. Worse yet, it has been discovered that Wakefield, before his publication, had applied for a patent for a new measles vaccine: if he could prove the old vaccine was dangerous, a new vaccine would be very profitable. But the bottom line is that in all the studies that have been done worldwide, involving over half a million children, no association between autism and vaccines has ever been demonstrated, and this counterfeit controversy has been scientifically laid to rest. Unfortunately, this phony issue still rages among the scientifically ignorant public.

In the news recently have been the activities of Dr. Mark Geier and his son David, longtime campaigners in the arena of dubious autism activity. Mark Geier has appeared as an "expert" witness in over a hundred cases, although he has been criticized by courts for being intellectually dishonest and for lacking appropriate training, expertise and experience. Reputable scientists have repeatedly dismissed the Geiers' autism research as seriously flawed. A front-page article in the NY Times actually made fun of the pretensions of the Geiers and their amateurish lab facilities. But they are not deterred by criticism, and their latest venture is opening clinics around the country offering autism treatment with the dangerous drug Lupron. Lupron alters levels of testosterone and is sometimes used to chemically castrate sex offenders; no scientific support exists for using it to treat autism. Its use for autism has been called irresponsible.

Wakefield and the Geiers are by no means the only offenders in the autism world. They are just two examples of the probably thousands of impostors exploiting the desperation of the autism community. The only solution will be a better understanding on the part of the public of the principles of science, and a clearer, louder response from the legitimate scientific community.

Dr. Marvin J. Schissel is a dentist and an advisor to the American Council on Science and Health, the National Council Against Health Fraud, and the Committee for Scientific Investigation of Claims of the Paranormal and has a son with autism.

Monday, May 25, 2009

WSJ Editorial Page: Malaria, Politics and DDT - The U.N. bows to the anti-insecticide lobby

Malaria, Politics and DDT. WSJ Editorial
The U.N. bows to the anti-insecticide lobby.
WSJ, May 25, 2009

In 2006, after 25 years and 50 million preventable deaths, the World Health Organization reversed course and endorsed widespread use of the insecticide DDT to combat malaria. So much for that. Earlier this month, the U.N. agency quietly reverted to promoting less effective methods for attacking the disease. The result is a victory for politics over public health, and millions of the world's poor will suffer as a result.

The U.N. now plans to advocate for drastic reductions in the use of DDT, which kills or repels the mosquitoes that spread malaria. The aim "is to achieve a 30% cut in the application of DDT worldwide by 2014 and its total phase-out by the early 2020s, if not sooner," said WHO and the U.N. Environment Program in a statement on May 6.

Citing a five-year pilot program that reduced malaria cases in Mexico and South America by distributing antimalaria chloroquine pills to uninfected people, U.N. officials are ready to push for a "zero DDT world." Sounds nice, except for the facts. It's true that chloroquine has proven effective when used therapeutically, as in Brazil. But it's also true that scientists have questioned the safety of the drug as an oral prophylactic because it is toxic and has been shown to cause heart problems.

Most malarial deaths occur in sub-Saharan Africa, where chloroquine once worked but started failing in the 1970s as the parasite developed resistance. Even if the drugs were still effective in Africa, they're expensive and thus impractical for one of the world's poorest regions. That's not an argument against chloroquine, bed nets or other interventions. But it is an argument for continuing to make DDT spraying a key part of any effort to eradicate malaria, which kills about a million people -- mainly children -- every year. Nearly all of this spraying is done indoors, by the way, targeting the mosquitoes that rest inside at night. It is not sprayed willy-nilly in jungle habitat.

WHO is not saying that DDT shouldn't be used. But by revoking its stamp of approval, it sends a clear message to donors and afflicted countries that it prefers more politically correct interventions, even if they don't work as well. In recent years, countries like Uganda, Tanzania and Zambia have started or expanded DDT spraying, often with the help of outside aid groups. But these governments are also eager to remain in the U.N.'s good graces, and donors typically are less interested in funding interventions that WHO discourages.

"Sadly, WHO's about-face has nothing to do with science or health and everything to do with bending to the will of well-placed environmentalists," says Roger Bate of Africa Fighting Malaria. "Bed net manufacturers and sellers of less-effective insecticides also don't benefit when DDT is employed and therefore oppose it, often behind the scenes."

It's no coincidence that WHO officials were joined by the head of the U.N. Environment Program to announce the new policy. There's no evidence that spraying DDT in the amounts necessary to kill dangerous mosquitoes imperils crops, animals or human health. But that didn't stop green groups like the Pesticide Action Network from urging the public to celebrate World Malaria Day last month by telling "the U.S. to protect children and families from malaria without spraying pesticides like DDT inside people's homes."

"We must take a position based on the science and the data," said WHO's malaria chief, Arata Kochi, in 2006. "One of the best tools we have against malaria is indoor residual spraying. Of the dozen or so insecticides WHO has approved as safe for house spraying, the most effective is DDT." Mr. Kochi was right then, even if other WHO officials are now bowing to pressure to pretend otherwise.

Tuesday, May 5, 2009

Famine-monger Lester Brown still gets it wrong after all these years

Never Right, But Never in Doubt. By Ronald Bailey
Famine-monger Lester Brown still gets it wrong after all these years
Reason, May 5, 2009

"Could food shortages bring down civilization?," asks environmental activist Lester Brown in the current issue of Scientific American. Not surprisingly, Brown's answer is an emphatic yes. He claims that for years he has "resisted the idea that food shortages could bring down not only individual governments but also our global civilization." Now, however, Brown says, "I can no longer ignore that risk." Balderdash. Brown, head of the Earth Policy Institute, has been a prominent and perennial predictor of imminent global famine for more than 45 years. Why should we believe him now?

For instance, back in 1965, when Brown was a young bureaucrat in the U.S. Department of Agriculture, he declared, "the food problem emerging in the less-developed regions may be one of the most nearly insoluble problems facing man over the next few decades." In 1974, Brown maintained that farmers "can no longer keep up with rising demand; thus the outlook is for chronic scarcities and rising prices." In 1981, Brown stated that "global food insecurity is increasing," and further claimed that "the slim excess of growth in food production over population is narrowing." In 1989, Brown contended that "population growth is exceeding the farmer's ability to keep up," concluding that, "our oldest enemy, hunger, is again at the door." In 1995, Brown starkly warned, "Humanity's greatest challenge may soon be just making it to the next harvest." In 1997, Brown again proclaimed, "Food scarcity will be the defining issue of the new era now unfolding."

But this time it's different, right? After all, Brown claims that "when the 2008 harvest began, world carryover stocks of grain (the amount in the bin when the new harvest begins) were at 62 days of consumption, a near record low." But Brown has played this game before with world grain stocks. As the folks at the pro-life Population Research Institute (PRI) report, Brown claimed in 1974 that there were only 26 days of grain reserves left, but later he upped that number to 61 days. In 1976, reserves were supposed to have fallen to just 31 days, but again Brown raised that number in 1988 to 79 days. In 1980, only a 40-day supply was allegedly on hand, but a few years later he changed that estimate to 71 days. The PRI analysts noted that Brown has repeatedly issued differing figures for 1974: 26 or 27 days (1974); 33 days (1975); 40 days (1981); 43 days (1987); and 61 days (1988). In 2004, Brown claimed that the world's grain reserves had fallen to only 59 days of consumption, the lowest level in 30 years.
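
"Days of consumption" is simple arithmetic: carryover stock divided by average daily use, which means the figure moves whenever either the stock estimate or the consumption estimate is revised. A minimal sketch of the calculation (the tonnage figures below are hypothetical, chosen only to illustrate how later revisions can turn 26 days into 61):

```python
def days_of_consumption(carryover_tonnes, annual_use_tonnes):
    """Express carryover grain stocks as days of world consumption."""
    return carryover_tonnes / (annual_use_tonnes / 365.0)

# Hypothetical figures: the same physical stock reads very differently
# depending on the consumption estimate used in the denominator.
stock = 100e6  # 100 million tonnes of carryover grain (assumed)
print(f"{days_of_consumption(stock, 1.40e9):.0f} days")  # ~26 days
print(f"{days_of_consumption(stock, 0.60e9):.0f} days")  # ~61 days
```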

In any case, Brown must know that the world's farmers produced a bumper crop last year. Stocks of wheat are at a six-year high and increases in other stocks of grains are not far off. This jump in reserves is not at all surprising considering the steep run-up in grain prices last year, which encouraged farmers around the world to plant more crops. By citing pre-2008 harvest reserves, Brown evidently hopes to frighten gullible Scientific American readers into thinking that the world's food situation is really desperate this time.

Brown argues that the world's food economy is being undermined by a troika of growing environmental calamities: falling water tables, eroding soils, and rising temperatures. He acknowledges that the application of scientific agriculture produced vast increases in crop yields in the 1960s and 1970s, but insists that "the technological fix" won't work this time. But Brown is wrong, again.

It is true that water tables are falling in many parts of the world as farmers drain aquifers in India, China, and the United States. Part of the problem is that water for irrigation is often subsidized by governments that encourage farmers to waste it. However, the proper pricing of water would rectify that by encouraging farmers to transition to drip irrigation and to switch from thirsty crops like rice to dryland ones like wheat, and by giving crop breeders an incentive to develop more drought-tolerant crop varieties. In addition, crop biotechnologists are now seeking to transfer the C4 photosynthetic pathway into rice, which currently uses the less efficient C3 pathway. This could boost rice yields by 50 percent while reducing water use.

To support his claims about the dangers of soil erosion, Brown cites studies in impoverished Haiti and Lesotho. To be sure, soil erosion is a problem for poor countries whose subsistence farmers have no secure property rights. However, one 1995 study concluded that soil erosion would reduce U.S. agriculture production by 3 percent over the next 100 years. Such a reduction would be swamped by annual crop productivity increases of 1 to 2 percent per year—which has been the average rate for many decades. A 2007 study by European researchers found "it highly unlikely that erosion may pose a serious threat to food production in modern societies within the coming centuries." In addition, modern biotech herbicide-resistant crops make it possible for farmers to practice no-till agriculture, thus dramatically reducing soil erosion.
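
The arithmetic behind that comparison is worth making explicit: a one-time 3 percent loss spread over a century is swamped by gains that compound every year. A quick sketch using the rates cited above (the point is the comparison, not the precise figures):

```python
# A 3% total yield loss from erosion over 100 years, set against
# productivity gains compounding at 1-2% per year over the same span.
erosion_factor = 1 - 0.03  # one-time 3% loss over the century
for annual_gain in (0.01, 0.02):
    gross = (1 + annual_gain) ** 100
    net = gross * erosion_factor
    print(f"{annual_gain:.0%}/yr gains: x{gross:.1f} gross, x{net:.1f} net")
# 1%/yr: ~2.7x gross, ~2.6x net; 2%/yr: ~7.2x gross, ~7.0x net.
```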

Brown's final fear centers on the effects of man-made global warming on agriculture. There is an ongoing debate among experts on this topic. For example, University of California, Santa Barbara economist Olivier Deschenes and Massachusetts Institute of Technology economist Michael Greenstone calculated that global warming would increase the profits of U.S. farmers by 4 percent, concluding that "large negative or positive effects are unlikely." Other researchers have recently disputed Deschenes' and Greenstone's findings, arguing that the impact of global warming on U.S. agriculture is "likely to be strongly negative." Fortunately, biotechnology research—the very technology fix dismissed by Brown—is already finding new ways to make crops more heat and drought tolerant.

On the other hand, Brown is right about two things in his Scientific American article: the U.S. should stop subsidizing bioethanol production (turning food into fuel) and countries everywhere should stop banning food exports in a misguided effort to lower local prices. Of course these policy prescriptions have been made by far more knowledgeable and trustworthy commentators than Brown.

Given the fact that Brown's dismal record as a prognosticator of doom is so well-known, it is just plain sad to see a respectable publication like Scientific American lending its credibility to this old charlatan.

Ronald Bailey is Reason magazine's science correspondent. His book Liberation Biology: The Scientific and Moral Case for the Biotech Revolution is now available from Prometheus Books.

Monday, May 4, 2009

Study: Declining Great Lakes Levels Entirely Natural

Study: Declining Great Lakes Levels Entirely Natural. By Henry Payne

Detroit, Mich. — Like polar bears, hurricanes, and arctic ice caps, recent drops in Great Lake water levels have been a poster child for green activists’ claims that the global warming crisis is upon us. A sampling:

April, 2003, Detroit News: “A group of scientists predicted that global warming will wreak havoc on the Great Lakes region . . . the largest single concentration of fresh water in the world.”

October, 2003, Detroit Free Press: “The idea that warming has benefits may be a particularly tough sell to Michiganders already disturbed by what happens when the Great Lakes drop near historic lows.”

April, 2007, Detroit News: “Data from a new United Nations report on climate change . . . strengthens scientific opinion that Michigan will see other dramatic effects in the coming decades: lower Great Lakes water levels, a dramatically receding Lake St. Clair. . . . ”

May, 2008, Detroit News: “A report released by an environmental group warns that unless Congress acts to curb global warming, Great Lakes water levels will drop up to 3 feet; beaches will close more often, and fish and animal populations will decline.”

Never mind.

In a comprehensive, two-year study of Great Lakes water levels, Canadian and American researchers working for the International Joint Commission this week found Mother Nature was to blame. “It’s not ongoing. It has definitely stabilized,” said Ted Yuzyk, the Canadian co-chair of the study, who added the changes have reversed in the last two years anyway. “And it’s not human driven. This is more natural.”

“Record high levels were seen in the early 1950s, in 1973, and again in 1985-1986,” reads The International Upper Great Lakes Study. “In the late 1990s, a nearly 30-year period of above-average water level conditions in the upper Great Lakes ended. Since then, Lake Michigan-Huron and Lake Superior have experienced lower than average lake level conditions.”

Among the natural factors that explain the lakes' cyclical rise and fall, reported the Detroit News, “were changing climate patterns, including greater rain and snow” and “shifts in the earth's crust, called glacial isostatic adjustment, that are the result of the planet's rebound from the melting of glaciers 10,000 years ago.”

Green groups were not amused. Facts are such inconvenient things.

Tuesday, April 28, 2009

The war on mining: Fighting back

The war on mining: Fighting back. By Silvia Santacruz
The Financial Post, April 22, 2009

Gold has become a safe haven as jittery investors move away from weakened stock markets, and currencies are threatened by inflation. But the allure of gold goes well beyond its future value or price per ounce.

The demand for the precious metal has propelled economic growth in the developing world as investment in exploration has led to significant job creation and improvements in health. Despite this, the industry is under attack by environmental NGOs, which accuse it of bringing poverty and pollution to the regions where it operates.

The war on mining is global. In 2007, Newmont CEO Richard Ness was cleared in a 21-month Indonesian criminal trial over the firm’s alleged pollution of Buyat Bay. National Geographic criticized the same operation on the grounds that the mine’s benefits—$391-million in local royalties and taxes, 8,000 jobs and $3-million in welfare projects—accrue to only five of the nearest communities.

In Ecuador, NGOs sow alarm among poor communities with claims that if large-scale mining were to start near them, their rivers would be contaminated, their animals and crops would die and their children would fall ill. To prove their point, environmentalists play videos of the damage that mercury, cyanide and arsenic can cause, blithely ignoring the fact that new techniques no longer use those chemicals and cause little environmental impact.

Poverty, not the natural resources industry, is the biggest enemy of people. So what would the anti-mining activists’ success mean for the communities where they are concentrating their efforts?

In Africa and Indonesia, the world’s four largest gold producers—Barrick, Gold Fields, Newmont and Anglo-Gold Ashanti—are engaged in the fight against HIV/AIDS, tuberculosis and malaria, which kill thousands in the developing world every year. The industry works in partnership with nonprofits like International SOS, IFC Against AIDS, the African Medical and Research Foundation (AMREF) and the Global Business Coalition on HIV/AIDS, among others.

“Gold mining companies are particularly affected by the triple disease threat of HIV/AIDS, tuberculosis and malaria,” explains Maureen Upton, a World Gold Council official, in a 2008 study. “It is difficult to think of what other industry faces a situation where in certain locations 30% of its employees are infected with a fatal disease such as HIV, or where a similar percentage is likely to be infected with malaria.”

In Ghana, AngloGold Ashanti hired a worldwide authority on insecticide resistance, Professor Richard Hunt, who found that the dominant mosquito species were completely or partially resistant to three standard insecticides but susceptible to another one not being provided by the World Health Organization. The company responded by initiating a program that reduced malaria infections by 73% in scarcely two years.

Also in Ghana, Gold Fields launched the Bowoho Ban (“Protect Yourself”) weekly radio program to educate people about HIV/AIDS. In South Africa, where AngloGold Ashanti’s workforce has an HIV infection rate of 30%—which, while high, is still lower than South Africa’s national average of 44%—the firm hired AIDS Peer Educators who persuade mine workers and community members to undergo HIV testing and counselling. The response among mine workers during 2007-2008 was 100%, up from 40% during 2006-2007.

Newmont is fighting malaria in Indonesia by distributing bed nets, clearing larvae and talking to residents about malaria prevention. The incidence of malaria among children in the area of Newmont’s project declined from 47% in 1999 to 13% in 2000 (the project’s first year) to 1.5% in 2007.

If mining companies were to pull out in the wake of government or activist pressure, many poor rural communities in developing countries would be left with no job opportunities, hope for development or health programs. Mining companies invest in these programs to keep a healthy and productive workforce, which, in turn, benefits underdeveloped towns.

To take that away would be a crime.

Silvia Santacruz is the Warren T. Brookes Journalism Fellow at the Competitive Enterprise Institute, writer-editor at Ecuador Mining News and a contributor to Openmarket.org.

Long-Term Storm Predictors Post a Sorry Track Record but Vow to Improve

For Early Hurricane Forecasts, Consult a Telepath. By Carl Bialik
Long-Term Storm Predictors Post a Sorry Track Record but Vow to Improve; One Certainty Is Their Guesses Get a Lot of Press
WSJ, Apr 29, 2009

If analysts did no better than predicting stock prices would equal the average of the last five years, one would hope they'd find a different career -- or at least take their work private while they refined their techniques.

That's the sorry track record of climatologists who each year predict the number of hurricanes that will threaten the Caribbean and Southeastern U.S. before the storm season begins on June 1. Yet their seasonal forecasts continue to garner headlines in the spring as reliably as groundhogs and their shadows.

In early 2005, predictions ranged from 11 to 14 tropical storms -- compared with an average of 14 in the prior five years -- with seven or eight hurricanes, compared with a five-year average of seven. The storm season instead brought Katrina, Rita and 13 other hurricanes among the 27 named storms.

The forecasts' flaws were evident before that big miss and have continued since then. The next two years they overshot; last year, at last, they were right in predicting a typical year. This year, most forecasters are calling for below-average activity.

"It's as if they're presenting their data in the middle of a study, before they reach their conclusions," says Robert S. Young, director of the program for the study of developed shorelines at Western Carolina University. "They should keep doing what they're doing, and they shouldn't tell anyone about it until they've figured it out."

Yet even as academics, government agencies and private industry crowd into the forecasting arena, they're bumping up against obstacles that may render accurate forecasting so far ahead of time impossible. Some forecasts are based on past years with similar patterns, but the climatology record doesn't go back far enough to lend much confidence. And it's hard to even detect these weather patterns far in advance -- even giant patterns that determine the intensity of a season. El Niño, or warming of Pacific Ocean waters, tends to suppress hurricanes; La Niña, unusually cold Pacific waters, tends to increase storm activity. Yet neither of these seasonal effects can be predicted with much reliability before the late spring.

"Until you really get into the spring and the weather patterns start to set up, it's really hard to get any kind of decent forecast as to what's going to go on in the summer and fall," says Chuck Watson, who works on forecasts of damage from hurricanes. Anytime before spring, "You might as well throw a dart."

Or hire a gibbon and a trance medium to compete with the dart thrower. That was the stunt dreamed up by reporter Bo Petersen of the Post & Courier of Charleston, S.C., in 2007, after several years of more straightforward reporting of professionals' ultimately errant forecasts. The trance medium beat out Mr. Petersen, the gibbon, the dart thrower -- and the pros. This comedic contest was borne out of a serious problem, according to Mr. Petersen: "The sense we got from emergency-management people here is that the forecasts had been so wrong that they were hearing from the public, 'Why should we pay any attention to this stuff?' "

Some forecasters update their predictions once the season has begun. And those forecasts do well. But none of the major forecasts that come out before June has improved significantly on a simple prediction scheme that calls for the same number of named storms and hurricanes as the average of the five prior years. And some do much worse.
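
That baseline is easy to state precisely: predict this season's count as the mean of the previous five seasons. A minimal sketch of how such a scheme is built and scored (the storm counts below are made up for illustration; they are not historical data):

```python
# Naive baseline: forecast each season's named-storm count as the mean
# of the five prior seasons, then score it with mean absolute error.
def five_year_baseline(counts):
    return [sum(counts[i - 5:i]) / 5.0 for i in range(5, len(counts))]

counts = [14, 12, 15, 16, 15, 27, 10, 15, 16, 9]  # illustrative only
predictions = five_year_baseline(counts)
actuals = counts[5:]
mae = sum(abs(p - a) for p, a in zip(predictions, actuals)) / len(actuals)
print(predictions)  # first entry: (14+12+15+16+15)/5 = 14.4
print(f"mean absolute error: {mae:.1f} storms")
```

Any forecast issued before June 1 has to beat that error to claim real skill.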

Forecasters often are open about their failings. Philip Klotzbach, who works on the Colorado State University forecast, and others post analyses of their accuracy, which is more than, say, political pundits do. And they say it's good scientific practice to publish their work in progress.

But why publish press releases and even, in some cases, hold press conferences? "Part of the reason we even do our press conference and release our data is, well, everyone else is," Mr. Watson says. He adds that research funders generally encourage the publicizing of the fruits of their grant money: "From a funding and research standpoint, you've almost got to release it," Mr. Watson says. "It's part of that game."

Even if the forecasts were dead-on, they wouldn't do emergency managers much good. The number of named tropical storms and hurricanes can have little to do with the damage they create: Hurricane Andrew struck in 1992, a year of below-average storm counts. "The total number of storms is a red herring," says Joe Bastardi, chief long-range and hurricane forecaster for AccuWeather.com. "It's a joke." Many forecasts include more useful measures such as the number of storms that hit land or the accumulated cyclone energy, which quantifies total storm intensity. But news reports often focus on the more-accessible predictions of storm counts.

The industries most affected by hurricanes focus more closely on the short-term forecasting of individual storms, an endeavor with much higher accuracy.

"The insurance industry is always interested to hear the long-range hurricane predictions, but they don't directly influence what companies do," says Loretta L. Worters, a spokeswoman for the Insurance Information Institute. More important is whether the storms hit U.S. soil, she says.

Forecasters say their task is complicated by the subjectivity involved in determining which storms are named, a reflection of intensity and greater likelihood they'll make landfall. The National Hurricane Center, an arm of the National Oceanic and Atmospheric Administration, decides when to append a name. "I don't understand why some storms are getting named and others are not getting named," Dr. Bastardi says.

Perhaps one or two more storms are being named each year than would have been a few decades ago, thanks to improvements in technology and climate science, according to Christopher Landsea of the hurricane center. To some climatologists, the naming standards have gotten too lax. NOAA named 13 storms in 2007, prompting a press release from the Weather Research Center in Houston saying its forecast of seven named storms was dead-on -- after subtracting the six storms it deemed unworthy of naming, because they only briefly featured the levels of wind and pressure characteristic of tropical storms.

Several forecasters question NOAA's dual role as forecaster, through its Climate Prediction Center, and as forecast arbiter, via the National Hurricane Center. The concern is that those scientists deciding whether to name storms late in the season might feel pressure to base their decision in part on how it would reflect on their colleagues' predicted counts. "In some sense, they hold the cards," James Elsner, a professor of geography at Florida State University, says of NOAA scientists.

Gerry Bell, NOAA's lead seasonal hurricane forecaster, says, "There is absolutely no conflict," pointing to the separation between the two arms of the government agency that forecast storms and name storms.

Climatologists are making a new forecast -- this time about their ability to get predictions right. "It's inevitable, with increasing computing power and an increase in the understanding of the dynamics of El Niño, that skill will climb in the long term," says Adam Lea, who works on forecasts from University College London's Hazard Research Centre. "We will get better."


Wednesday, April 22, 2009

Why don't environmentalists celebrate modern farming on Earth Day?

Yielding to Ideology Over Science. By Ronald Bailey
Why don't environmentalists celebrate modern farming on Earth Day?
Reason, April 21, 2009

One might think that environmentalists would celebrate the accomplishments of modern farming on Earth Day. After all, the biggest way humanity disturbs the natural world is in how we produce food. Agriculture uses up more land and water than any other human activity. To the extent that we want to preserve biodiversity and protect natural areas, boosting agricultural productivity is the most vital thing that we can do.

Since 1960 global crop yields have more than doubled, with the benefit that the area of land devoted to producing food has not increased very much. If farmers were still producing food at 1960 levels of productivity, agriculture would have had to expand from 38 percent of the earth's land to 82 percent to feed the world's current population. This enormous increase in yields is the result of applying more artificial fertilizers, breeding higher yielding crops, a wider use of pesticides and herbicides, and expanding irrigation. More recently, advances in modern biotechnology have also contributed to boosting yields. However, last week, the Union of Concerned Scientists (UCS) released a new report, Failure to Yield: Evaluating the Performance of Genetically Engineered Crops, by its senior scientist Doug Gurian-Sherman that tries to make the case that modern crop biotechnology should be largely abandoned because it has failed to increase agricultural yields.

Failure to Yield begins by noting that, in the United States, 90 percent of soybeans and 63 percent of the corn crop are biotech varieties. Genes have been inserted in these varieties (called transgenic or genetically engineered by the report) to confer pest and herbicide resistance on the crops. The UCS study distinguishes between intrinsic yield, the highest yield possible under ideal conditions, and operational yield, the yield obtainable in the field taking into account factors like pests and environmental stresses. The study then asserts, "No currently available transgenic varieties enhance the intrinsic yield of any crops."

In addition, Gurian-Sherman claims that biotech crops have only marginally increased operational yields for corn (largely through insect resistance traits) and not at all for soybeans in the United States.

First, keep in mind that farmers are not stupid, and especially not poor farmers in developing countries. The UCS report acknowledges that American farmers have widely adopted biotech crops in the past 13 years. Why? "The fact that the herbicide-tolerant soybeans have been so widely adopted suggests that factors such as lower energy costs and convenience of GE soybeans also influence farmer choices." Indeed. Surely saving fossil fuels that emit greenhouse gases should be viewed by a UCS advocacy scientist as an environmental good. And what does Gurian-Sherman mean by "convenience"? Later, he admits that biotech herbicide resistant crops save costs and time for farmers. Herbicide resistance is also a key technology for expanding soil-saving no-till agriculture which, according to a report in 2003, saved 1 billion tons of topsoil from eroding annually. In addition, no-till farming significantly reduces the run-off of fertilizers into streams and rivers.

The UCS report correctly observes, "It is also important to keep in mind where increased food production is most needed—in developing countries, especially in Africa, rather than in the developed world." Which is exactly what is happening with biotech crops in poor countries. Currently, 13.3 million farmers around the world are planting biotech crops. Notably, 90 percent of the world's biotech farmers, that is, 12.3 million, are small and resource-poor farmers in developing countries like China, India, and South Africa. Gurian-Sherman is right that biotech contributions to yields in developed countries are relatively modest. Farmers here already have access and can afford modern agricultural technologies so improvements are going to be at the margins. Nevertheless, it is instructive to compare the rate of increase in corn yields between the biotech-friendly U.S. and biotech-hostile France and Italy over the past ten years. University of Georgia crop scientist Wayne Parrott notes, "In marked contrast to yield increases in the U.S., yields in France and Italy have leveled off."

The yield story is very different in poor countries. For example, a 2006 study found that biotech insect resistant cotton varieties boosted the yields for India's cotton farmers by 45 to 63 percent. Amusingly, some anti-biotech activists counter that these are not really yield increases, merely the prevention of crop losses. Of course, another way to look at it is that these are increases in operational yields. Whether due to yield increase or crop loss prevention, in 2008 this success led to nearly 70 percent of India's cotton fields being planted with biotech varieties. Similarly, biotech insect resistant corn varieties increased yields (or prevented losses) by 24 percent in the Philippines.

The UCS report also declares, "We must not simply produce more food at the expense of clean air, water, soil, and a stable climate, which future generations will also require." Biotech varieties are already helping farmers to achieve those environmental benefits.

Gurian-Sherman notes that crops typically use only 30 to 50 percent of nitrogen fertilizers they receive. Nitrogen fertilizer contributes to water pollution and is the primary source of anthropogenic nitrous oxide, a greenhouse gas that is 300 times more potent than carbon dioxide. Agriculture contributes up to 12 percent of man-made global warming emissions. So one would think that a new biotech variety of rice created by Arcadia Biosciences, which needs 50 to 60 percent less nitrogen fertilizer than conventional varieties, would be welcomed by the UCS. But it isn't. The really good news is that research into transferring this same set of fertilizer-thrifty genes into other crops is moving rapidly forward.
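
To see why a 50 to 60 percent cut in nitrogen use matters for the climate arithmetic, a back-of-the-envelope conversion helps. The sketch below uses the 300x potency figure cited above; the application rate and the fraction of nitrogen escaping as nitrous oxide are labeled assumptions for illustration, not numbers from the report:

```python
# Back-of-the-envelope CO2-equivalent math for nitrogen fertilizer.
# ASSUMPTIONS (illustrative only): 100 kg of nitrogen applied per
# hectare, 1% of it eventually emitted as N2O-nitrogen.
GWP_N2O = 300            # N2O ~300x as potent as CO2 (per the text)
N2O_PER_N = 44.0 / 28.0  # convert mass of N to mass of N2O (molar masses)

def co2e_per_hectare(n_applied_kg, emission_frac=0.01):
    n2o_kg = n_applied_kg * emission_frac * N2O_PER_N
    return n2o_kg * GWP_N2O

baseline = co2e_per_hectare(100.0)        # conventional variety
reduced = co2e_per_hectare(100.0 * 0.45)  # ~55% less fertilizer applied
print(f"{baseline:.0f} vs {reduced:.0f} kg CO2e per hectare")
# The fertilizer cut flows straight through to this emissions term.
```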

Another promising area of research involves using genetic engineering to transfer the C4 photosynthetic pathway into rice, which currently uses the less efficient C3 pathway. This could boost rice yields tremendously, perhaps by as much as 50 percent, while reducing water use. In addition, researchers are pursuing all manner of other ways to boost crop production including salt, heat, and drought tolerance, along with viral, fungal, and bacterial disease resistance. All of these biotech techniques could improve crop productivity and thus reduce agriculture's toll on land, water, and air resources.

"To the extent to which groups like UCS have advocated prohibitive and disproportional regulations, they are responsible for the lack of even greater achievements in operational yield and perhaps even in intrinsic yield," notes Parrott. "In fact UCS is on the record as opposing engineered stress tolerance in crops. Such a stance by UCS is untenable and contradictory—yield losses caused by adverse growing conditions defeats the purpose of having a higher intrinsic yield—that is why it is so important to increase operational yield, and increasing operational yield is done with resistance to biotic and abiotic stresses—i.e., adverse growing conditions."

Increasing crop yields to meet humanity's growing demand for healthful food while protecting the natural world will require deploying the full scientific armamentarium. This includes advances in crop breeding, improvements in cultivation practices, the safer deployment of fertilizers, pesticides, and herbicides—and, yes, genetic engineering. It is odd that while the UCS accepts the scientific consensus on man-made global warming, it refuses to accept the scientific consensus on the safety, usefulness, and environmental benefits of biotech crops.

"In the end, after helping prevent scientific advances with genetically modified crops," notes Parrott, "the UCS is not in a good position to be calling genetically modified crops a failure because their scientific advances have not been greater."

Ronald Bailey is Reason magazine's science correspondent. His book Liberation Biology: The Scientific and Moral Case for the Biotech Revolution is now available from Prometheus Books.

Rebuttal re Erroneous Analysis on Transgenic Insecticidal Crops (Lövei et al.)

Rebuttal re Erroneous Analysis on Transgenic Insecticidal Crops.
Crop Biotech Update/ISAAA, Apr 17, 2009

An article by Lövei et al. (Transgenic insecticidal crops and natural enemies: a detailed review of laboratory studies, Environmental Entomology 38(2): 293-306 (2009)) claims that insect-protected crops based on the Cry proteins of Bacillus thuringiensis may have substantial negative impacts on non-target organisms.

A group of experts in this area strongly disagreed with the April 2009 publication and felt that a rapid response was required, but the production schedule of the bimonthly journal could not accommodate one. Thus, A. M. Shelton and 14 colleagues published their Letter to the Editor in Transgenic Research (Setting the Record Straight: A Rebuttal to an Erroneous Analysis on Transgenic Insecticidal Crops and Natural Enemies).

Among the many concerns Shelton and colleagues describe in their rebuttal are the inappropriate and unsound methods for risk assessment that led Lövei et al. to reach conclusions that are in conflict with those of several comprehensive reviews and meta-analyses. Shelton summarized the concerns of the 15 authors by stating, "The Lövei et al. article advocates inappropriate summarization and statistical methods, a negatively biased and incorrect interpretation of the published data on non-target effects, and fails to place any putative effect into a meaningful ecological context." What was also troubling to this international group of 15 experts is the potential for the Lövei et al. article to be accepted at face value and impact some regulatory agencies.

Their rebuttal can be accessed at the following link: http://www.springerlink.com/content/q7hk642137241733/. The article is open access (DOI: 10.1007/s11248-009-9260-5) and will appear in print in the June issue of Transgenic Research.

Friday, April 17, 2009

Small Cars Are Dangerous Cars - Fuel economy zealots can kill you

Small Cars Are Dangerous Cars. By Sam Kazman
Fuel economy zealots can kill you.
CEI, Apr 17, 2009

The super-high efficiency minicar has become the Holy Grail for many environmentalists. But on Tuesday, a new study on minicar safety tossed some cold water on the dream. The Insurance Institute for Highway Safety (IIHS) reported that in a series of test crashes between minicars and midsize models, minis such as the Smart car provided significantly less protection for their passengers.

The tests did not involve the much ballyhooed mismatches between subcompacts and Hummers, but measured the effect of relatively modest differences in size and weight. Even though the Smart car and other minis such as the Honda Fit and the Toyota Yaris have fared relatively well in single-car crash tests, they performed poorly in these two-car frontal offset collisions. In the words of IIHS president Adrian Lund, "though much safer than they were a few years ago, minicars as a group do a comparatively poor job of protecting people in crashes, simply because they're smaller and lighter."

That difference is reflected in the real world. The death rate in minis in multi-vehicle crashes is almost twice as high as that of large cars. And in single-vehicle crashes, where there's no oversized second vehicle to blame, the difference is even greater: Passengers in minis suffered three times as many deaths as in large cars.

Given the nonstop pronouncements we've been hearing about the green promise of high-efficiency cars, these results were shocking to some. But not to IIHS. The Institute has long been reporting similar results from other tests, and its publications candidly advise that, when it comes to safety, larger and heavier cars are generally better.

That's not what advocates of higher fuel-economy standards want to hear. Greater weight may increase crashworthiness, but it also decreases miles per gallon, so there's an inevitable trade-off between safety and efficiency. A 2002 National Research Council study found that the federal Corporate Average Fuel Economy (CAFE) standards contributed to about 2,000 deaths per year through their restrictions on car size and weight. But amazingly, with the exception of IIHS, there's practically no one else providing information on the size-safety issue:

- Not the National Highway Traffic Safety Administration, which has a highly dubious track record on CAFE. In a 1992 lawsuit filed by the Competitive Enterprise Institute and Consumer Alert, a federal appeals court found the agency guilty of using "mumbo jumbo" and "legerdemain" to conceal CAFE's lethal effects.
- Not the Environmental Protection Agency, which is about to become a major partner in setting CAFE standards. EPA is often fixated on minute risks, such as radon in drinking water, but don't expect it to admit to CAFE's dangers. Its official mission may be "to protect human health and the environment," but its operating philosophy seems to be "not necessarily in that order."
- Not Ralph Nader and his allied traffic safety groups, which are often CAFE's most energetic cheerleaders. Decades ago, Mr. Nader and his colleagues repeatedly warned of the hazards of small cars. The Center for Auto Safety's 1972 book "Small -- On Safety," noted "the inherent limitations" that "small size and light weight" impose on crashworthiness. But in the 1990s both Mr. Nader and the Center reversed their position. Why? Because CAFE presented them with a stark choice between more government power and more safety. They went for more power.
- Not Consumer Reports, which has consistently failed to mention the importance of size and weight in discussing how to choose a safer car. Though it is regarded as the information bible by many car buyers, not a single one of its annual auto issues in the last five years has touched on this topic.

As the National Research Council reported, the current CAFE program -- 27.5 mpg for passenger cars -- contributed to about 2,000 deaths per year. But driving is going to get even more lethal over the next decade: CAFE standards will be raised to a 35 mpg combined average for cars and light trucks. And with the notable exception of IIHS, information about those risks may be hard to come by.

Mr. Kazman is general counsel of the Competitive Enterprise Institute.

Wednesday, April 8, 2009

Now banned on campus: bottled water

Now banned on campus: bottled water. By Angela Logomasini
Originally published in The Union Leader. April 6, 2009

There is a new “sin” industry on college campuses. It’s not beer, fast food or tobacco. It’s water! Universities around the nation have begun to deny students the option to drink bottled water, removing it from vending machines and campus stores.

Why? They are following the advice of environmental activist groups that say students should “drink responsibly” — which to them means tap water. Drinking bottled water is supposedly wasteful because you get basically the same thing from a tap. Yet their claims don’t hold water, and surely don’t warrant this silly prohibition.

At the extreme is Washington University in St. Louis, MO. As part of its “Tap It” campaign, the school took a symbolic step in promoting sustainability, according to student body representative Kady McFadden. This “step” basically banned bottled water from campus stores and vending machines, except where sales must continue until bottled water contracts expire.

These actions ignore the important reasons why some people choose bottled water. Among them is predictable quality. Tap water, on the other hand, periodically experiences quality problems that cause governments to issue health alerts.

In the spring of 2008, Penn State — a campus considering prohibitions on bottled water — issued a tap water health advisory, calling on students to boil water or drink bottled water. Fortunately, it was eventually determined that the water was OK. Such incidents reveal why overreliance on tap water doesn’t make sense and why people appreciate other options.

Even places that claim to have exceptional tap water — such as New York City — experience problems. New York’s Columbia/New York Presbyterian Hospital has provided bottled water to its patients for drinking and brushing teeth since 2005, after two patients died from Legionnaires’ disease transmitted via city tap water. Because tap water must travel through pipes, it can develop such quality problems along the way.

In addition to safety issues, piped water can suffer flavor defects from contaminants found in pipes, disinfectants, or from the water source. Some sources, such as the Potomac River next to Washington D.C., are home to species of algae that periodically impact tap water flavor.

This is not to suggest that most tap water isn’t generally pretty safe. The United States has some of the best quality tap water in the world. However, it is misleading for environmentalists to deny the unique challenges and quality variations of tap water. Nor is it fair to deny students and other consumers the option to pick a product with fewer such issues or one they simply like better.

In fact, bottled water delivers consistent results. Seventy-five percent of bottled water is drawn from non-municipal sources, such as springs and aquifers, which provide water on a sustainable long-term basis. Many of these sources have supplied quality water for decades. Other distributors purify municipal water, providing a higher-quality product than simply opening the tap, and the packaging ensures the quality is maintained during delivery.

Still, opponents of bottled water argue that plastic bottles are a source of excessive waste. Yet the bottles contribute less than 0.3 percent of solid waste, which is managed safely via recycling and landfilling.

This debate over bottled water has taken calls for “dry” campuses to a whole new level! Many people expect their water to taste just as sweet or crisp as the last time they bought it. And why not? There is no good reason why anyone should deprive them of access to those products—on campus or anywhere else.

Charles Huang is a student at the University of California, Berkeley, and Angela Logomasini, Ph.D., is director of risk and environmental policy at the Competitive Enterprise Institute.

The Advantages of Incremental Innovation in Drug Development

Pharmaceutical Evolution. By Albert I. Wertheimer and Thomas M. Santella
The Advantages of Incremental Innovation in Drug Development
CEI, April 7, 2009

Innovation is the lifeblood of the pharmaceutical industry. Over the last century, that industry has been responsible for thousands of new drugs, based on hundreds of thousands of smaller incremental innovations. The breakthrough “blockbuster” drugs taken by millions of patients today were not produced from thin air. Most represent the combined weight of seemingly small improvements achieved over time. The advantages of incremental improvements on existing drugs are paramount to overall increases in the quality of health care. As the pharmaceutical industry developed, classes of drugs—those with similar chemical composition and which treat similar conditions—have grown to provide physicians with the tools they need to treat diverse patient groups.

Still, critics have been highly condescending about what they call “Me-too” drugs—drugs within the same chemical class as one or more others already on the market—which they claim add little or no therapeutic value and are nothing more than an opportunity for pharmaceutical companies to fleece unsuspecting consumers. While some claim that there are too many similar drugs, and that pharmaceutical industry research and development could be more profitably directed toward developing entirely new classes of medicines, drugs based on incremental improvements generally represent advances in safety and efficacy. They also provide new formulations and dosing options that significantly increase patient compliance—both of which lead to improved health outcomes. From an economic standpoint, adding new drugs to a class of medicines also offers the possibility of lower drug prices as competition between manufacturers increases. Additionally, pharmaceutical companies depend on incremental innovations to provide the revenue that will support development of the riskier, capital- and research-intensive blockbuster drugs.

When critics refer to Me-too drugs, they do not mean exact generic copies of already existing drugs, or illegal counterfeits. Instead, Me-toos have a similar chemical composition to one or more others on the market, and have similar biological effects. But, in order to be approved, Me-too drugs must undergo the same extensive clinical testing as other new drugs to determine their safety and efficacy because they are chemically different. In addition, these differences, even if small, typically must represent a medical advancement—such as fewer side effects or improved efficacy for patient sub-populations—in order to attract a portion of the market away from the first approved drug in the class. Nevertheless, many drug industry critics have called for federal policies to inhibit the development and marketing of such incrementally improved medicines. But policies that curb incremental innovation will ultimately lead to a reduction in the overall quality of existing drug classes and could arrest the creation of truly novel drugs.

Research in any industry is a building process. Few scientists develop groundbreaking drugs from no prior research. Most work within, and respond to, existing knowledge—reading the same medical literature, and reacting to new technological breakthroughs at the same time. It is not hard to imagine, therefore, that many different companies would be working on similar drugs. In fact, it is often the case that the only reason why one drug is called novel and another a Me-too analogue is the speed at which each moves through the regulatory process.

Like other technological and value-added industries, the pharmaceutical industry depends on small steps for the creation of blockbuster drugs, which often result from a long series of small innovations. It also depends on these steps for the creation of drugs that provide slight, incremental improvements on existing drugs—thereby adding to a drug class, increasing competition among drugs, and incentivizing further innovation. As the National Research Council has observed, “the cumulative effect of numerous minor incremental innovations can sometimes be more transforming and have more economic impact than a few radical innovations or ‘technological breakthroughs’.” The net effect of increasing the number of drugs through innovation leads to advances in safety, efficacy, selectivity, and utility of drugs within a specific class.

Importantly, providing physicians with a variety of prescription options within a given therapeutic class is paramount to the provision of optimal health care. This is especially true for some drug classes, such as those relating to the central nervous system, for which overall response rates can be as low as 50 percent. For unknown reasons, certain patients respond differently to different drugs within a single class. If physicians have many options at their disposal, they can calibrate their prescribing patterns to better address the needs of specific patients. The existence of multiple similar molecular agents also provides backup in situations where the novel drug in a class is found to have unacceptable side effects and is thus removed from the market. As patients come to depend on a particular class of drugs, it is essential to make sure that they do not lose access to needed medication as a result of regulatory action.

One of the most vehement criticisms made against Me-too drugs is that they siphon money away from research that could be devoted to the creation of novel breakthrough drugs. This assumption is incorrect for a host of reasons, the most important of which is the fact that the pharmaceutical industry depends on selling the products of incremental innovations to provide the revenue for research and development of breakthrough drugs. Additionally, while it is unrealistic to presume that every incremental innovation leads to cost savings, the sum of all drug innovations can result in cost savings by reducing overall treatment costs, shortening or obviating hospital stays, increasing worker productivity and reducing absenteeism, and lowering drug costs through increased competition among manufacturers.

Ideally, every new drug would represent an unprecedented breakthrough and lead to the creation of a completely novel treatment. This, however, is not the reality of the pharmaceutical industry, or of any other development-based industry. Creating drugs based on incremental innovations provides pharmaceutical companies with a secure stream of revenue, which can be directed to higher-risk, potential blockbuster-yielding research. Policies aimed at reducing the industry’s ability to obtain revenues from incremental innovations could be self-defeating, as those industries will then have less revenue to reinvest in R&D for new drugs. Put simply, limiting incremental drug innovation is analogous to limiting competition. The ultimate result could have devastating consequences for the future of the pharmaceutical industry and for the millions of patients who depend on it.

The authors and CEI would like to thank the International Policy Network in London, which published an earlier version of this paper.

Full paper: Wertheimer and Santella - Pharmaceutical Evolution.pdf

Tuesday, April 7, 2009

WSJ Editorial: The Silicosis Abdication

The Silicosis Abdication. WSJ Editorial
A scam that deserves as much scrutiny as Lerach and Scruggs.
WSJ, Apr 07, 2009

It is going on four years since a Texas judge blew the whistle on widespread silicosis fraud, exposing a ring of doctors and lawyers who ginned up phony litigation to reap jackpot payouts. So where's the enforcement follow-up?

That's an especially apt question given news that New York's State Board for Professional Medical Conduct has finally revoked the license of Dr. Ray Harron. He was among the doctors who Texas Judge Janis Graham Jack showed had fraudulently diagnosed thousands of plaintiffs with silicosis, a rare lung disease. These doctors were later called to testify in Congress, where many, including Dr. Harron, took the Fifth Amendment.

Dr. Harron has since lost his medical licenses in California, New Mexico, Texas, Florida, North Carolina and Mississippi. This is progress, though hardly sufficient. Among the questions Congress asked state departments of health during the silicosis hearings were why those bodies hadn't moved to shut down these doctors and their mobile X-ray vans at the time they were committing medical malpractice.

New York is belatedly joining the queue, and its order stripping Dr. Harron of his license is particularly noteworthy. After outlining his unethical actions, and citing other medical boards that had denied him a new license, it summarized: "[Dr. Harron] was part of an operation to find plaintiffs with silicosis whether or not they really had silicosis. This is perpetrating a fraud on the courts."

Precisely. The question is what anybody else is doing about it. Judge Jack's findings inspired U.S. attorneys in the Southern District of New York to convene a grand jury investigation into silicosis fraud. The criminal division of the Texas state attorney general also went this route. We know both juries subpoenaed doctors and documents involved in the Jack case. While these physicians bear responsibility for negligent medical practices, the Jack trial and Congressional hearings made clear that many were taking orders from the trial bar. Dr. Harron has stated in court that he "capitulated" to attorney demands that he include inaccurate language in his silicosis reports.

So far, however, these grand juries have not resulted in prosecutions. The feds and Texas aside, it would seem incumbent upon New York State Attorney General Andrew Cuomo to follow up on his own state medical board's determination of fraud. A follow-up is especially important given that, prior to their silicosis escapade, these doctors made millions working for trial attorneys on asbestos. According to the Johns Manville Bankruptcy Trust, six of the doctors at the center of the silicosis fraud were also responsible for at least 140,000 asbestos-lawsuit diagnoses. Dr. Harron alone diagnosed an astonishing 51,048 people with asbestos-related disease.

The silicosis litigation machine broke down after Judge Jack's ruling, yet hundreds of thousands of phony asbestos-related diagnoses continue to clog courts. An Ohio state court in 2006 dismissed all cases that relied solely on Dr. Harron, and a federal court in Philadelphia recently did the same. But with medical boards now admitting these doctors were at the center of silicosis schemes to defraud courts, prosecutors ought to pursue their role in asbestos too.

The silicosis and asbestos scams are as corrosive to justice in their way as the cases that resulted in convictions for Bill Lerach, Dickie Scruggs and Melvyn Weiss for kickbacks or bribery. The difference is that these asbestos cases are still in court.

Tobacco cessation therapies, cell phone towers, &c.

ACSH Dispatches Round-Up: Tobacco cessation therapies, cell phone towers, &c. By Elizabeth Wade
ACSH, Apr 04, 2009


April 3, 2009

Congress's Pro-Smoking Bill and Anti-Book Law, plus Radiation Hysteria

Quitting smoking just got harder

A story about a study concluding that smokers who use nicotine replacement therapy are twice as likely to quit for six months as those given placebos reminds us of the dire straits we are in with regard to tobacco cessation therapy. What isn't reported until the end of the news story is that only 6.75% of the smokers given nicotine replacement therapy managed to quit for six months -- and only half of them are expected to remain smoke-free in the future.

"These abysmal quit rates show that our current smoking cessation therapies are almost never effective," says ACSH's Dr. Gilbert Ross. "People who close their eyes to alternative cessation therapies, such as smokeless tobacco as harm reduction, are being ostriches at the expense of the over 40 million addicted smokers in this country -- and who knows how many millions around the world."

Unfortunately, the U.S. took a step in the wrong direction yesterday when the House of Representatives passed the Kennedy-Waxman bill giving the FDA regulatory control of tobacco and defeated Rep. Steven Buyer's (R-IN) harm reduction amendment in a 284-142 vote. Lawmakers expect a tighter vote in the Senate, and ACSH looks forward to offering our science-based perspective to the continuing debate. We wholeheartedly agree with Rep. Buyer when he says, "Effectively giving an FDA stamp of approval on cigarettes will improperly lead people to believe that these products are safe, and they really aren't. We want to move people from smoking down the continuum of risk to eventually quitting."

The current battle over the e-cigarette illustrates the problems with our country's approach to tobacco policy. The e-cigarette delivers a hit of nicotine vapor when a person "smokes" it, so smokers who are trying to quit can satisfy their craving without inhaling the harmful products of combustion produced by cigarettes. But because the FDA has yet to approve the e-cigarette as a nicotine-delivery device, this new technology could be banned until it undergoes the approval process. "You can't blame the FDA for enforcing the law, but we have bad laws about tobacco that lead to bad public policy outcomes," says ACSH's Jeff Stier.


Attack of the cell phone towers!

A group of Staten Island parents and lawmakers are up in arms about the possibility of cell phone towers sending low-level radiation into a nearby school. "When they traced the source of radiation, which was above average but not dangerous, they found that it wasn't connected to the cell phone towers at all," Dr. Ross remarks. "But they are still trying to get them removed!"

Dr. Whelan adds, "When people believe scares like this they become totally irrational. When they really believe that these towers are emitting dangerous levels of radiation, how do you convince them otherwise?"

Dr. Ross jokes, "The only way to appease them seems to be getting all the students at this school metal helmets." ACSH debunked the wrongheaded notion that cell phones cause brain cancer in our Top 10 Unfounded Health Scares of 2008. For more information, see our publication The Health Effects of Low-Level Radiation.


CPSIA remains intact, Prop 65 grows even more ridiculous

Unfortunately, the Senate rejected Senator Jim DeMint's (R-SC) amendment to the stimulus bill that would have reformed the Consumer Product Safety Improvement Act (CPSIA). "The Senate failed to take a breath of fresh air," Stier says. As we have written before, the CPSIA places an impossible burden on many small businesses by banning certain types of phthalates and requiring that every children's product be tested for minuscule levels of lead. If a business can't afford the expensive testing, it must throw out the products -- even all-terrain vehicles (ATVs) and children's books!

As summarized so succinctly in today's Wall Street Journal editorial, "With one stroke of the regulatory pen, an estimated $100 million of inventory can't be sold, and the industry loss may reach $1 billion."

In California, we see the results of another absurdly stringent "public health" measure, Proposition 65, which requires warnings to accompany any product that contains "toxic chemicals." To avoid lawsuits, businesses in the state have taken to posting frivolous "warning signs" about products that do not pose any danger to consumers' health.

"Prop 65 is not based on health or science, but rather perceptions and politics," Dr. Ross says. For more on the consequences of the misguided law, check out Stier's op-ed "Perils of Global Warnings" from the Washington Times.

Friday, April 3, 2009

Children's toys, the Consumer Product Safety Improvement Act and the lawmakers' intentions

Toys R Congress. WSJ Editorial
Ruining the kids motorcycle business
WSJ, Apr 03, 2009

Last year's Consumer Product Safety Improvement Act was supposed to make children safer by reducing the risk of lead poisoning in toys. Instead, the new law has become a case study in how hastily written regulation can club the economy and reduce consumer safety.

This bill was passed by wide margins in Congress and signed into law by President Bush in the aftermath of the controversy over lead paint in imported toys from China. The new law, which took effect in February, establishes strict limits on lead levels in products for children. Never mind that in 2008 only one American child was injured by lead poisoning from toys.

What few on Capitol Hill anticipated was how the new law would devastate the domestic toy industry. According to the American Toy Association, the new rules will cost retailers and toy makers an estimated $2 billion for compliance and removing children's products from the shelves even though they pose no real health threat. Even old children's books are being cleared from stores and libraries.

The multibillion-dollar children's motorcycle and all-terrain vehicle industry has been clobbered. Kids' motocross racing has boomed in recent years in rural and Western states. And the regulators at the Consumer Product Safety Commission (CPSC) have decided that virtually all of these youth vehicles violate the new standards because of lead in the brakes, tire valves and gears. They've ordered motorcycle dealers to stop selling them, putting hundreds of dealers and the entire motocross industry into a depression. With one stroke of the regulatory pen, an estimated $100 million of inventory can't be sold, and the industry loss may reach $1 billion.

While safety concerns need to be paramount, there is virtually zero threat of lead poisoning from riding a motorcycle. One study by Dr. Barbara Beck of Harvard finds that a youth's intake of lead from riding a motorcycle is less than the amount from drinking water. Even the CPSC admits in a letter to Congress that the lead-intake risk from youth motorcycles is "remote at best."

The introduction in recent years of smaller cycles for kids under 12 has increased safety by replacing heavier cycles more prone to accident and more severe injury. According to a study by the Motorcycle Industry Council, "90% of the youth fatalities and injuries on motorcycles occur when kids ride adult vehicles." Those are what kids will ride if the CPSC ban stays in effect. Ken Luttrell, a Democratic state house member from Oklahoma, says, "With these new regulations, Washington has only succeeded in making biking much more dangerous for kids."

The inane regulations are leading to a backlash against Congress and the CPSC. A resolution calling for a one-year delay in implementing the new law, so the industry has time to adjust, passed the Oklahoma legislature 101-0 last week. The Missouri and Nevada legislatures have passed similar resolutions. California's burgeoning cycle community is so enraged that some motorcycle dealers are openly defying the sales ban. On Wednesday a coalition of toy users and manufacturers held a rally in Washington to "stop the toy ban."

But so far the folks in Washington aren't interested in what families or employers think. Henry Waxman, a scourge of private business and ally of Speaker Nancy Pelosi, refuses even to hold hearings. Meanwhile, the Obama Administration has called for a major increase in the CPSC budget. Don't you feel safer already?

Fantasizing about capped 350 ppm CO2

Conference of the Century! (Fantasizing about capped 350 ppm CO2). By Marlo Lewis
Master Resource, March 30, 2009

Well, how else should we describe a conference addressing “The Greatest Challenge in History”? That’s what the 350 Climate Conference, to be held May 2 at Columbia University, calls global warming, which it also asserts is “likely the greatest threat humanity has ever faced.”

The number “350” refers to the “safe upper limit” of carbon dioxide (CO2) concentrations in the atmosphere, 350 parts per million (ppm), according to NASA scientist and Columbia University professor James Hansen, who will keynote the conference. Atmospheric CO2 levels today are roughly 385 ppm.

The online conference flyer explains:

While the exact limit–whether it be 550, 450, 350, or even lower–is subject to debate, the need for proactive strategies to climate change is clear. Vital issues directly relating to climate change, such as alternative energy and carbon sequestration, are likely to drive domestic and international policies for the decades and centuries to come. This conference will discuss the scientific, political, social and economic challenges and opportunities associated [with] reducing emissions and lowering atmospheric carbon levels.

Notice what’s missing from the program. There are “challenges and opportunities” associated with reducing emissions and lowering CO2 levels, but, apparently, no risks, no perils, no threats to humanity. That’s dishonest, daffy, or both.

For several years, the UN, the European Union, and numerous environmental groups have said that the world must reduce CO2 emissions 50% below 1990 levels by 2050 in order to “stabilize” atmospheric CO2 concentrations at 450 ppm by 2100.

Newsweek science reporter Sharon Begley (no skeptic she) interviewed Cal Tech chemist Nathan Lewis (no skeptic either) on what it would take just to keep atmospheric CO2 levels from reaching 450 ppm:

Lewis’s numbers show the enormous challenge we face. The world used 14 trillion watts (14 terawatts) of power in 2006. Assuming minimal population growth (to 9 billion people), slow economic growth (1.6 percent a year, practically recession level) and—this is key—unprecedented energy efficiency (improvements of 500 percent relative to current U.S. levels, worldwide), it will use 28 terawatts in 2050. (In a business-as-usual scenario, we would need 45 terawatts.) Simple physics shows that in order to keep CO2 to 450 ppm, 26.5 of those terawatts must be zero-carbon. That’s a lot of solar, wind, hydro, biofuels and nuclear, especially since renewables kicked in a measly 0.2 terawatts in 2006 and nuclear provided 0.9 terawatts. Are you a fan of nuclear? To get 10 terawatts, less than half of what we’ll need in 2050, Lewis calculates, we’d have to build 10,000 reactors, or one every other day starting now. Do you like wind? If you use every single breeze that blows on land, you’ll get 10 or 15 terawatts. Since it’s impossible to capture all the wind, a more realistic number is 3 terawatts, or 1 million state-of-the art turbines, and even that requires storing the energy—something we don’t know how to do—for when the wind doesn’t blow. Solar? To get 10 terawatts by 2050, Lewis calculates, we’d need to cover 1 million roofs with panels every day from now until then. “It would take an army,” he says. Obama promised green jobs, but still.*

The point? In Begley’s words, “We can’t get there from here: Political will and a price on CO2 won’t be enough” to stabilize emissions at 450 ppm. The UN/EU emission reduction target is unattainable absent “Nobel caliber breakthroughs.” Meeting the target will require “revolutionary changes in the technology of energy production, distribution, storage, and conversion,” as one group of energy experts wrote back in 2002.
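Lewis’s arithmetic is easy to verify. Here is a minimal Python sketch of the numbers in the quoted passage (every figure comes from the Begley excerpt above; the 2009 start date is our assumption for “starting now”):

# Checking the arithmetic in the quoted Begley/Lewis passage.
world_power_2050_tw = 28.0     # projected 2050 demand, assuming heroic efficiency gains
zero_carbon_needed_tw = 26.5   # zero-carbon power required to hold CO2 to 450 ppm

# Nuclear: Lewis says 10 TW would take 10,000 reactors.
reactors_needed = 10_000
days_until_2050 = (2050 - 2009) * 365
print(f"One reactor every {days_until_2050 / reactors_needed:.1f} days")  # ~1.5, i.e. one every other day

# Even if the entire zero-carbon build-out succeeded, fossil fuels could
# supply only a sliver of projected 2050 demand:
fossil_share = (world_power_2050_tw - zero_carbon_needed_tw) / world_power_2050_tw
print(f"Fossil share of 2050 demand: {fossil_share:.0%}")  # ~5%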

Now, if those breakthroughs do not occur, then the only way to bring the world into compliance with the UN/EU goal envisioned for Kyoto II would be to deny large segments of humanity the blessings of affordable energy. As I observed in an earlier post, there is nothing quite like economic collapse to cut emissions.

Now recall that the emission stabilization goal of the 350 Climate Conference is 100 ppm lower than the EU/UN goal. In a paper on his Web page, Lewis says that achieving 350 ppm by mid-century would require world CO2 emissions to drop to zero by that date.

There is no known way to get there except draconian cutbacks in economic output, population, or both. Poverty is of course a perennial source of conflict within and among nations, as well as the leading cause of preventable disease and premature death. Moreover, climate policies punitive enough to induce negative economic and population growth are likely to meet with resistance and promote conflict rather than peace.

Will any of the invited speakers at the 350 Conference address these risks in a serious way? Not unless he (or she) is brave enough to be the skunk at the garden party and endure abuse from those who denounce dissent as villainy and treason.

* See also my colleague Iain Murray’s blog on Begley’s column.

Thursday, April 2, 2009

Libertarian: FDA Regulation Threatens Cigarette Alternatives

FDA Regulation Threatens Cigarette Alternatives. By Jacob Sullum
Reason, April 1, 2009, 1:14pm

This evening the House of Representatives is expected to approve a bill authored by Rep. Henry Waxman (D-Calif.) that would let the Food and Drug Administration regulate tobacco products. The bill, which is supported by Philip Morris but opposed by its smaller competitors, is also supported by the leading anti-smoking groups but opposed by some of their smaller competitors. Recently the dissenters in the anti-smoking movement have been highlighting one of the bill's major flaws: It would grandfather in all current cigarettes (except for those with politically incorrect flavors) while making it virtually impossible to introduce and promote safer alternatives.

One of those alternatives is snus, Swedish-style oral snuff, the health risks of which are negligible compared to those of cigarettes. The Waxman bill would not ban snus, but it would prohibit manufacturers from informing consumers about oral snuff's dramatic safety advantages. Another cigarette alternative, one that probably would be kept off the market altogether under the bill's regulatory standards, is electronic cigarettes, battery-powered devices that deliver odorless nicotine vapor instead of smoke, avoiding all the hazards associated with tobacco combustion products. Sen. Frank Lautenberg (D-N.J.) wants the FDA to take electronic cigarettes off the market "until they are proven safe." Even if the FDA does not ban e-cigarettes under its existing drug authority, their manufacturers probably would not be able to meet the test established by the Waxman bill for products that compete with cigarettes.

One anti-smoking group that supports snus, e-cigarettes, and other harm-reducing alternatives to standard cigarettes is the American Association of Public Health Physicians (AAPHP), which says (PDF):

A variety of non-pharmaceutical alternative nicotine delivery products are already on the market or in various stages of development and market testing. These include sticks, strips, orbs, lozenges and e-cigarettes. The information available suggests risk and benefit profiles similar to widely accepted pharmaceutical nicotine replacement products.

Holding the snus and alternative nicotine delivery to the research standards of pharmaceutical products would cost the manufacturers millions of dollars per product and would deny current smokers the benefits of these products for a decade or more. Furthermore, such studies probably could not be conducted at current American academic centers because Institutional Review Board (IRB) guidelines would likely prohibit case/control studies on products with no therapeutic benefit. Thus, the seemingly reasonable research standards in the Waxman bill would likely result in a de-facto ban on all such products. AAPHP therefore favors the research guidelines from the Buyer bill [alternative legislation introduced by Rep. Steve Buyer (R-Ind.)].

Since both the Waxman and the Buyer bills would approve currently marketed cigarettes, the most hazardous of all tobacco products, the standard for lower risk products for use by current smokers should be the hazard posed by cigarettes, not a pharmaceutical safety standard.

Bill Godshall of Smokefree Pennsylvania (who alerted me to the AAPHP statement), tobacco policy blogger Michael Siegel (who clued me in to the e-cigarette controversy), and the American Council on Science and Health also worry that FDA regulation could stifle the market for cigarette alternatives. I explain why the Waxman bill is bad for smokers here, here, and here. I discuss snus here, here, and here.

Wednesday, April 1, 2009

Making sense of the “killer meat” study

Making sense of the “killer meat” study. By Rebecca Goldin Ph.D and Trevor Butterworth
Modest risk suggests meat in moderation, but cancer researchers warn that too much is being made of the link between diet and cancer at the expense of smoking and obesity.
stats.org, March 30, 2009

Hundreds of news stories last week warned people that eating red meat raised their risk for cancer and death. The headline in the Los Angeles Times health section was succinct: “Killer meat,” and the opening paragraph warned:

“Before you dig into another hamburger, consider this: Americans who ate the most red meat boosted their overall risk of death by 30% during a 10-year period compared to those who ate the least, according to a new study. And before you switch to cold cuts instead, keep in mind that people who consumed the most processed meat raised their overall risk of death by at least 16%.”

Actually, the study didn’t quite say this. While this large prospective study did find a modest association between eating meat and dying, the risks cited were not due to one hamburger. “Meat Intake and Mortality: A Prospective Study of Over Half a Million People,” published in the Archives of Internal Medicine, didn’t, as many other studies on diet have done, pool numerous smaller studies to achieve a high number of participants. It tracked over half a million Americans aged 50 to 71 from eight states over ten years, starting with a common baseline evaluation of diet that was then tracked through questionnaires. Self-reporting always raises questions as to whether participants are capable of complete fidelity and recall, but the researchers appear to have conducted spot checks, as well as adjusting for confounders like smoking.

The researchers compared high levels of red and processed meat consumption (meaning those people in the top 20 percent for meat consumption as a proportion of their calories) with low levels (those in the bottom 20 percent). To give a sense of the difference between the two groups: people with the highest red meat consumption ate almost seven times as much meat as those in the lowest group. For a man, that amounted to 68.1g/1000kcal of meat per day, which is almost a 1/3 lb burger a day (based on the 2,116-calorie diet these men typically ate). Those in the lowest quintile ate on average 9.3g/1000kcal, which comes out to approximately the same burger once a week. So before you panic, consider how your red meat intake compares to the people in the study.
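For those who want to check the unit conversion, here is a minimal Python sketch using the figures above (the ~151 g weight of a 1/3 lb patty is ordinary arithmetic, not a study figure):

# Converting the study's intake units (grams per 1,000 kcal) into grams
# per day, using the ~2,116 kcal/day typical male diet cited above.
daily_kcal = 2116
top_quintile_g_per_1000kcal = 68.1
bottom_quintile_g_per_1000kcal = 9.3

top_g_per_day = top_quintile_g_per_1000kcal * daily_kcal / 1000        # ~144 g/day
bottom_g_per_day = bottom_quintile_g_per_1000kcal * daily_kcal / 1000  # ~20 g/day

third_lb_burger_g = 453.6 / 3   # one pound is 453.6 g, so a 1/3 lb patty is ~151 g

print(f"Top quintile: {top_g_per_day:.0f} g/day, "
      f"~{top_g_per_day / third_lb_burger_g:.2f} burgers per day")
print(f"Bottom quintile: {bottom_g_per_day * 7:.0f} g/week, "
      f"~{bottom_g_per_day * 7 / third_lb_burger_g:.2f} burgers per week")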

On the other hand, there was some good news for meat lovers as well: high levels of white meat consumption seem to lower the chance of death. Those in the highest quintile of white meat consumption (which includes poultry and fish) had an approximately eight percent lower chance of death over the ten years of the study, for both men and women. But a curious feature that might temper the benefits for nonsmokers is that high levels of white meat consumption seem to raise their risk of cardiovascular disease rather significantly. You’re in luck if you’re a smoker, however; for this group, white meat intake seemed to have no relationship to cardiovascular disease.

These were the results driving interest in the study, although the strangest association of all was between high red meat consumption in men (versus low consumption) and mortality due to “injuries and sudden death.”

That result – a hazard ratio of 1.26, meaning 26 percent more likely – was buried by the media. The category included death from unintentional injury, adverse effects, suicide, self-inflicted injury, homicide, and legal intervention. The authors note that the number of deaths was low, and the mechanism is not clear. The finding is a reminder that mining epidemiological data can produce strange relationships. In particular, since it seems difficult to argue for causality, it suggests that red meat consumption may be linked to other behaviors that were not controlled for by the study. Are male red meat eaters likelier to take risks? Are suicidal old men more likely to eat red meat?

While the study rather convincingly links high levels of red meat consumption to increased mortality, the purported risk increase is much lower than that between, for example, smoking or obesity and cancer. Inevitably, this means that the causal link is weaker. As with any observational study, there are limitations to drawing a causal line between red meat and cancer mortality. The study attempted to control for confounding factors, but it is impossible to control for everything. There is also no way to discern from this study whether eating less meat would deliver a direct benefit of the magnitude the study suggests. One can only assume that the people who reported high levels of meat consumption had been eating that amount of meat for their entire lives.

Wider problems in nutrition research

The other, wider problem is that while red meat has provided figurative red meat for nutrition researchers, there has been increasing criticism from actual cancer researchers of the dramatic claims being made for the nutritional basis of cancer. Many of the news stories said the study supported the World Cancer Research Fund’s claims linking red meat and cancer. For example, Forbes noted:

“Though nutrition experts frequently recommend eating less meat, Mozaffarian says research linking red and processed meat consumption and mortality weren't consistent. But last year, when the World Cancer Research Fund International reviewed the scientific literature on red meat intake and cancer, researchers determined a link between the two.”

Reuters quoted Ian Olver, Chief Executive Officer of Cancer Council Australia, saying that:

“This large study provides further evidence to support the recommendations by groups such as the World Cancer Research Fund in demonstrating an association between a high consumption of red and processed meats and an increased risk of death from cancer.”

But as STATS previously noted, the World Cancer Research Fund only managed to achieve this link by excluding the largest study ever to examine the association, a study whose publication had been delayed for three years after the results were initially made known. Those results did not show a link between cancer and meat consumption. The Harvard Pooling Project, which conducted that meta-analysis, and other recent research have thrown a wrench into the conventional scientific wisdom about nutrition and health, and the exclusion of some of its key studies from the World Cancer Research Fund report has left some cancer researchers troubled.

A recent editorial in the Journal of Oncology written by the director of the International Agency for Research on Cancer (Boyle et al., Oct 2008) warned that smoking and obesity were being minimized as significant causes of cancer in the face of weak evidence for diet.

"In presenting its summary and recommendations, the [World Cancer Research Fund] report implicitly downplays the key importance of tobacco smoking in cancer causation. Contrary to that stated in the press release (the best advice for cancer prevention is to avoid weight gain), avoiding tobacco smoking and use of tobacco in other forms is the single best advice to reducing cancer risk as one-third of cancer deaths in high-income countries is attributable to tobacco use. Failure to include ‘stop smoking’ and ‘avoid exposure and exposing others to second-hand smoke’ among the 10 key recommendations undermines the most important message in cancer control. The ‘best advice’ also fails to mention the importance of a variety of established cancer risk factors including sun behaviour, occupational exposures, chronic infections and use of exogenous hormones."

At the same time, the evidence presented by the WCRF for diet’s role in cancer had gotten weaker:

"‘We think we know’ or, more accurately, ‘we thought we knew’ that a high-fat diet and low consumption of fruits, vegetables and fibres were associated with increased risks of common cancers. However, faith in the cancer prevention properties of fruits and vegetables began to crack when all the available evidence was critically reviewed by an International Agency for Research on Cancer (IARC) Working Group. Subsequently, it has crumbled as major analyses of prospective studies have continued to demonstrate consistently a lack of association between intake of fruits and vegetables and risk of several cancers. This major change in classification of one the few agents classified by WCRF in the category of strongest evidence in 1997 casts doubt on the rationale to classify ‘convincing’ to the evidence linking high meat intake to colorectal cancer risk in the current report. This also raises questions about the evaluation process and about the robustness of the classification system."

But the IARC noted:

"The substantial review of the evidence in the WCRF report demonstrates that there is no discernible association between many forms of cancer and specific dietary practices. There are still some very interesting hypotheses to pursue, such as the value of an approach on the basis of the food patterns (e.g. the Mediterranean diet score) rather than individual foods and nutrients, but the cupboard is remarkably bare."

The failure of science to come up with robust conclusions about diet and cancer is one of the emerging "inconvenient truths" in public health (the other is that diets don't really work), and both are at odds with giving the public clear, comprehensible guidelines for diet. This new study has been hailed for building on existing evidence that red meat consumption is linked to cancer, but good reporting would include the naysayers as well as the yaysayers; scientific consensus is never built with one study alone.

How efficient are the solar panels that were inspected by President Obama?

How efficient are the solar panels that were inspected by President Obama? By Todd Shepherd
The Denver Museum of Nature and Science isn't telling. But you are helping to foot the bill for the solar array that won't pay for itself until the year 2118.
The Independence Institute, Mar 31, 2009

Before signing the $787 billion stimulus package into law on February 17, 2009, President Barack Obama and Vice President Joe Biden toured an array of solar panels on top of the Denver Museum of Nature and Science. The photo-op allowed the President to once again extol the virtues of the coming “green” economy.

According to the Denver Post's article on the event, “The sun generates enough energy on the museum rooftop to power about 30 homes.” However, that claim cannot be verified at this time, and in fact, seems to be belied by the scant information provided by the museum and other sources.[1] Laura Holtman, Public Relations Manager for the Museum said in an email, “Because the array generates less than 5 percent of the Museum's power, [the purchased energy] is not a particularly large bill.”

The Independence Institute asked the Denver Museum of Nature and Science to provide certain statistical information regarding the now-famous solar array. Specifically, the Institute asked for:

1) Two years' worth of electric bills prior to the installation of the solar array,
2) All electric bills following the completion of the installation.

The Museum denied those requests.

The solar array is not owned by the Museum, however, but by Hybrid Energy Group, LLC. HEG sells the electricity to the Museum and receives tax incentives from the state and federal governments, while also collecting “rebates” from Xcel Energy. The rebates are funded by a surcharge on the monthly bill of every Colorado Xcel customer.

A 2008 article in the Denver Business Journal sheds further light on the subject. The article notes the total price of the solar array was $720,000. And Dave Noel, VP of operations and chief technology officer for the Museum, was quoted as saying, “We looked at first installing [the solar array] ourselves, and without any of the incentive programs, it was a 110-year payout.” Noel went on to say that the Museum did not purchase the solar array because it did not “make sense financially.”

Additionally, most solar panels have an expected life-span of 20 to 25 years.
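Those two figures make the economics easy to quantify. Here is a minimal Python sketch of the simple, undiscounted payback arithmetic (ignoring maintenance, panel degradation, and the time value of money):

# Simple (undiscounted) payback check using the figures above.
array_cost = 720_000    # total price per the Denver Business Journal
payback_years = 110     # unsubsidized payout cited by the Museum's VP of operations

implied_annual_savings = array_cost / payback_years
print(f"Implied electricity savings: ~${implied_annual_savings:,.0f} per year")  # ~$6,545

# With a 20-25 year life-span, an unsubsidized array could never repay itself:
panel_life_years = 25
print(f"Cost recovered over {panel_life_years} years: {panel_life_years / payback_years:.0%}")  # ~23%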

So how can Hybrid Energy Group afford to own a solar array that not even the museum would buy? In part, HEG gets “rebates” from Xcel's “Solar Rewards” program. The Solar Rewards program is a response to Colorado voters passing Amendment 37 in 2004. The Amendment mandated that Colorado utilities procure a certain percentage of their power generation from renewable resources like wind and solar.

“Amendment 37 really should have been called a tax,” said Independence Institute President Jon Caldara. “And it would have been interesting to see whether it would have passed if the ballot language had started off with the phrase, 'shall there be an increase in energy taxes?' For those of you who are Xcel customers, look at your bill and find the line that says 'Renew. Energy Std. Adj.' Then realize that you are paying this “adjustment” to buy solar panels which the museum has admitted that without any government subsidization wouldn't pay for themselves until the year 2118.”

[table]

HEG also uses state and federal tax “incentives” in order to be able to own a $720,000 solar array that produces such a minute cash flow, compared to the rest of the Museum's monthly power expenses.

The fact that solar energy may currently only be viable due to engineering of the tax code means that citizens may not have all the information when weighing the costs of “green” projects, says Barry Poulson, Professor of Economics at the University of Colorado, and Senior Fellow at the Independence Institute.

“Colorado citizens need to know that these policies will result in a significant dislocation of our industries, a fall in income and employment, and rising costs to consumers. These burdens will fall primarily on low income families. Nowhere in these proposals for a 'new Energy Economy' is there any discussion of the costs that these policies will impose on Colorado citizens.”


Notes

[1] Additionally, the claim in the Post article that “The sun generates enough energy on the museum rooftop to power about 30 homes” regrettably lacks a crucial time context. Does the power for 30 homes last one hour, one day, one week, one month?
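The underlying confusion is the classic one between power (a rate, measured in kilowatts) and energy (an amount, measured in kilowatt-hours). A minimal Python sketch of why the time basis matters; since the Museum has not released the array's rating, every number below is a hypothetical illustration, not a reported value:

# Power (kW) vs. energy (kWh), with purely hypothetical numbers.
array_rating_kw = 100.0   # HYPOTHETICAL peak output of a rooftop array
capacity_factor = 0.20    # HYPOTHETICAL fraction of time the sun effectively shines
home_avg_draw_kw = 1.2    # HYPOTHETICAL average household power draw

# "Powers about 30 homes" means very different things depending on the time basis:
homes_at_peak = array_rating_kw / home_avg_draw_kw                       # full sun, instantaneous
homes_annual_avg = array_rating_kw * capacity_factor / home_avg_draw_kw  # averaged over a year

print(f"Homes served at peak output: ~{homes_at_peak:.0f}")                 # ~83
print(f"Homes served on an annual-average basis: ~{homes_annual_avg:.0f}")  # ~17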