Showing posts with label junk science. Show all posts

Thursday, June 22, 2017

Evaluation of a proposal for reliable low-cost grid power with 100% wind, water, and solar

Evaluation of a proposal for reliable low-cost grid power with 100% wind, water, and solar. By Christopher T. M. Clack, Staffan A. Qvist, Jay Apt, Morgan Bazilian, Adam R. Brandt, Ken Caldeira, Steven J. Davis, Victor Diakov, Mark A. Handschy, Paul D. H. Hines, Paulina Jaramillo, Daniel M. Kammen, Jane C. S. Long, M. Granger Morgan, Adam Reed, Varun Sivaram, James Sweeney, George R. Tynan, David G. Victor, John P. Weyant, and Jay F. Whitacre. Proceedings of the National Academy of Sciences.

Significance: Previous analyses have found that the most feasible route to a low-carbon energy future is one that adopts a diverse portfolio of technologies. In contrast, Jacobson et al. (2015) consider whether the future primary energy sources for the United States could be narrowed to almost exclusively wind, solar, and hydroelectric power and suggest that this can be done at “low-cost” in a way that supplies all power with a probability of loss of load “that exceeds electric-utility-industry standards for reliability”. We find that their analysis involves errors, inappropriate methods, and implausible assumptions. Their study does not provide credible evidence for rejecting the conclusions of previous analyses that point to the benefits of considering a broad portfolio of energy system options. A policy prescription that overpromises on the benefits of relying on a narrower portfolio of technologies could be counterproductive, seriously impeding the move to a cost-effective decarbonized energy system.

Abstract: A number of analyses, meta-analyses, and assessments, including those performed by the Intergovernmental Panel on Climate Change, the National Oceanic and Atmospheric Administration, the National Renewable Energy Laboratory, and the International Energy Agency, have concluded that deployment of a diverse portfolio of clean energy technologies makes a transition to a low-carbon-emission energy system both more feasible and less costly than other pathways. In contrast, Jacobson et al. [Jacobson MZ, Delucchi MA, Cameron MA, Frew BA (2015) Proc Natl Acad Sci USA 112(49):15060–15065] argue that it is feasible to provide “low-cost solutions to the grid reliability problem with 100% penetration of WWS [wind, water and solar power] across all energy sectors in the continental United States between 2050 and 2055”, with only electricity and hydrogen as energy carriers. In this paper, we evaluate that study and find significant shortcomings in the analysis. In particular, we point out that this work used invalid modeling tools, contained modeling errors, and made implausible and inadequately supported assumptions. Policy makers should treat with caution any visions of a rapid, reliable, and low-cost transition to entire energy systems that relies almost exclusively on wind, solar, and hydroelectric power.

Monday, June 5, 2017

Healthy offspring from freeze-dried mouse spermatozoa held on the International Space Station for 9 months

Healthy offspring from freeze-dried mouse spermatozoa held on the International Space Station for 9 months
Proceedings of the National Academy of Sciences of the United States of America

Significance: Radiation on the International Space Station (ISS) is more than 100 times stronger than at the Earth’s surface, and at levels that can cause DNA damage in somatic cell nuclei. The damage to offspring caused by this irradiation in germ cells has not been examined, however. Here we preserved mouse spermatozoa on the ISS for 9 mo. Although sperm DNA was slightly damaged during space preservation, it could be repaired by the oocyte cytoplasm and did not impair the birth rate or normality of the offspring. Our results demonstrate that generating human or domestic animal offspring from space-preserved spermatozoa is a possibility, which should be useful when the “space age” arrives.

Abstract: If humans ever start to live permanently in space, assisted reproductive technology using preserved spermatozoa will be important for producing offspring; however, radiation on the International Space Station (ISS) is more than 100 times stronger than that on Earth, and irradiation causes DNA damage in cells and gametes. Here we examined the effect of space radiation on freeze-dried mouse spermatozoa held on the ISS for 9 mo at –95 °C, with launch and recovery at room temperature. DNA damage to the spermatozoa and male pronuclei was slightly increased, but the fertilization and birth rates were similar to those of controls. Next-generation sequencing showed only minor genomic differences between offspring derived from space-preserved spermatozoa and controls, and all offspring grew to adulthood and had normal fertility. Thus, we demonstrate that although space radiation can damage sperm DNA, it does not affect the production of viable offspring after at least 9 mo of storage on the ISS.

Saturday, June 3, 2017

Do Globalization and Free Markets Drive Obesity among Children and Youth? An Empirical Analysis, 1990–2013

Do Globalization and Free Markets Drive Obesity among Children and Youth? An Empirical Analysis, 1990–2013

ABSTRACT: Scholars of public health identify globalization as a major cause of obesity. Free markets are blamed for spreading high calorie, nutrient-poor diets, and sedentary lifestyles across the globe. Global trade and investment agreements apparently curtail government action in the interest of public health. Globalization is also blamed for raising income inequality and social insecurities, which contribute to “obesogenic” environments. Contrary to recent empirical studies, this study demonstrates that globalization and several component parts, such as trade openness, FDI flows, and an index of economic freedom, reduce weight gain and obesity among children and youth, the most likely age cohort to be affected by the past three decades of globalization and attendant lifestyle changes. The results suggest strongly that local-level factors possibly matter much more than do global-level factors for explaining why some people remain thin and others put on weight. The proposition that globalization is homogenizing cultures across the globe in terms of diets and lifestyles is possibly exaggerated. The results support the proposition that globalized countries prioritize health because of the importance of labor productivity and human capital due to heightened market competition, ceteris paribus, even if rising incomes might drive high consumption.

KEYWORDS: Globalization, obesity, trade and FDI, economic freedom

Monday, December 26, 2016

What scientists think of themselves, other scientists and the population at large

Who Believes in the Storybook Image of the Scientist? 
Dec 2016

Abstract: Do lay people and scientists themselves recognize that scientists are human and therefore prone to human fallibilities such as error, bias, and even dishonesty? In a series of three experimental studies and one correlational study (total N = 3,278) we found that the ‘storybook image of the scientist’ is pervasive: American lay people and scientists from over 60 countries attributed considerably more objectivity, rationality, open-mindedness, intelligence, integrity, and communality to scientists than to other highly educated people. Moreover, scientists perceived even larger differences than lay people did. Some groups of scientists also differentiated between different categories of scientists: established scientists attributed higher levels of the scientific traits to established scientists than to early-career scientists and PhD students, and higher levels to PhD students than to early-career scientists. Female scientists attributed considerably higher levels of the scientific traits to female scientists than to male scientists. A strong belief in the storybook image and the (human) tendency to attribute higher levels of desirable traits to people in one’s own group than to people in other groups may decrease scientists’ willingness to adopt recently proposed practices to reduce error, bias and dishonesty in science.

Tuesday, December 6, 2016

My Unhappy Life as a Climate Heretic. By Roger Pielke Jr.

My Unhappy Life as a Climate Heretic. By Roger Pielke Jr.
My research was attacked by thought police in journalism, activist groups funded by billionaires and even the White House.
Updated Dec. 2, 2016 7:04 p.m. ET

Much to my surprise, I showed up in the WikiLeaks releases before the election. In a 2014 email, a staffer at the Center for American Progress, founded by John Podesta in 2003, took credit for a campaign to have me eliminated as a writer for Nate Silver’s FiveThirtyEight website. In the email, the editor of the think tank’s climate blog bragged to one of its billionaire donors, Tom Steyer: “I think it’s fair [to] say that, without Climate Progress, Pielke would still be writing on climate change for 538.”

WikiLeaks provides a window into a world I’ve seen up close for decades: the debate over what to do about climate change, and the role of science in that argument. Although it is too soon to tell how the Trump administration will engage the scientific community, my long experience shows what can happen when politicians and media turn against inconvenient research—which we’ve seen under Republican and Democratic presidents.

I understand why Mr. Podesta—most recently Hillary Clinton’s campaign chairman—wanted to drive me out of the climate-change discussion. When substantively countering an academic’s research proves difficult, other techniques are needed to banish it. That is how politics sometimes works, and professors need to understand this if we want to participate in that arena.

More troubling is the degree to which journalists and other academics joined the campaign against me. What sort of responsibility do scientists and the media have to defend the ability to share research, on any subject, that might be inconvenient to political interests—even our own?

I believe climate change is real and that human emissions of greenhouse gases risk justifying action, including a carbon tax. But my research led me to a conclusion that many climate campaigners find unacceptable: There is scant evidence to indicate that hurricanes, floods, tornadoes or drought have become more frequent or intense in the U.S. or globally. In fact we are in an era of good fortune when it comes to extreme weather. This is a topic I’ve studied and published on as much as anyone over two decades. My conclusion might be wrong, but I think I’ve earned the right to share this research without risk to my career.

Instead, my research was under constant attack for years by activists, journalists and politicians. In 2011 writers in the journal Foreign Policy signaled that some accused me of being a “climate-change denier.” I earned the title, the authors explained, by “questioning certain graphs presented in IPCC reports.” That an academic who raised questions about the Intergovernmental Panel on Climate Change in an area of his expertise was tarred as a denier reveals the groupthink at work.

Yet I was right to question the IPCC’s 2007 report, which included a graph purporting to show that disaster costs were rising due to global temperature increases. The graph was later revealed to have been based on invented and inaccurate information, as I documented in my book “The Climate Fix.” The insurance industry scientist Robert Muir-Wood of Risk Management Solutions had smuggled the graph into the IPCC report. He explained in a public debate with me in London in 2010 that he had included the graph and misreferenced it because he expected future research to show a relationship between increasing disaster costs and rising temperatures.

When his research was eventually published in 2008, well after the IPCC report, it concluded the opposite: “We find insufficient evidence to claim a statistical relationship between global temperature increase and normalized catastrophe losses.” Whoops.

The IPCC never acknowledged the snafu, but subsequent reports got the science right: There is not a strong basis for connecting weather disasters with human-caused climate change.

Yes, storms and other extremes still occur, with devastating human consequences, but history shows they could be far worse. No Category 3, 4 or 5 hurricane has made landfall in the U.S. since Hurricane Wilma in 2005, by far the longest such period on record. This means that cumulative economic damage from hurricanes over the past decade is some $70 billion less than the long-term average would lead us to expect, based on my research with colleagues. This is good news, and it should be OK to say so. Yet in today’s hyper-partisan climate debate, every instance of extreme weather becomes a political talking point.

For a time I called out politicians and reporters who went beyond what science can support, but some journalists won’t hear of this. In 2011 and 2012, I pointed out on my blog and social media that the lead climate reporter at the New York Times, Justin Gillis, had mischaracterized the relationship of climate change and food shortages, and the relationship of climate change and disasters. His reporting wasn’t consistent with most expert views, or the evidence. In response he promptly blocked me from his Twitter feed. Other reporters did the same.

In August this year on Twitter, I criticized poor reporting on the website Mashable about a supposed coming hurricane apocalypse—including a bad misquote of me in the cartoon role of climate skeptic. (The misquote was later removed.) The publication’s lead science editor, Andrew Freedman, helpfully explained via Twitter that this sort of behavior “is why you’re on many reporters’ ‘do not call’ lists despite your expertise.”

I didn’t know reporters had such lists. But I get it. No one likes being told that he misreported scientific research, especially on climate change. Some believe that connecting extreme weather with greenhouse gases helps to advance the cause of climate policy. Plus, bad news gets clicks.

Yet more is going on here than thin-skinned reporters responding petulantly to a vocal professor. In 2015 I was quoted in the Los Angeles Times, by Pulitzer Prize-winning reporter Paige St. John, making the rather obvious point that politicians use the weather-of-the-moment to make the case for action on climate change, even if the scientific basis is thin or contested.

Ms. St. John was pilloried by her peers in the media. Shortly thereafter, she emailed me what she had learned: “You should come with a warning label: Quoting Roger Pielke will bring a hailstorm down on your work from the London Guardian, Mother Jones, and Media Matters.”

Or look at the journalists who helped push me out of FiveThirtyEight. My first article there, in 2014, was based on the consensus of the IPCC and peer-reviewed research. I pointed out that the global cost of disasters was increasing at a rate slower than GDP growth, which is very good news. Disasters still occur, but their economic and human effect is smaller than in the past. It’s not terribly complicated.

That article prompted an intense media campaign to have me fired. Writers at Slate, Salon, the New Republic, the New York Times, the Guardian and others piled on.

In March of 2014, FiveThirtyEight editor Mike Wilson demoted me from staff writer to freelancer. A few months later I chose to leave the site after it became clear it wouldn’t publish me. The mob celebrated. ClimateTruth.org, founded by former Center for American Progress staffer Brad Johnson, and advised by Penn State’s Michael Mann, called my departure a “victory for climate truth.” The Center for American Progress promised its donor Mr. Steyer more of the same.

Yet the climate thought police still weren’t done. In 2013 committees in the House and Senate invited me to several hearings to summarize the science on disasters and climate change. As a professor at a public university, I was happy to do so. My testimony was strong, and it was well aligned with the conclusions of the IPCC and the U.S. government’s climate-science program. Those conclusions indicate no overall increasing trend in hurricanes, floods, tornadoes or droughts—in the U.S. or globally.

In early 2014, not long after I appeared before Congress, President Obama’s science adviser John Holdren testified before the same Senate Environment and Public Works Committee. He was asked about his public statements that appeared to contradict the scientific consensus on extreme weather events that I had earlier presented. Mr. Holdren responded with the all-too-common approach of attacking the messenger, telling the senators incorrectly that my views were “not representative of the mainstream scientific opinion.” Mr. Holdren followed up by posting a strange essay, of nearly 3,000 words, on the White House website under the heading, “An Analysis of Statements by Roger Pielke Jr.,” where it remains today.

I suppose it is a distinction of a sort to be singled out in this manner by the president’s science adviser. Yet Mr. Holdren’s screed reads more like a dashed-off blog post from the nutty wings of the online climate debate, chock-full of errors and misstatements.

But when the White House puts a target on your back on its website, people notice. Almost a year later Mr. Holdren’s missive was the basis for an investigation of me by Arizona Rep. Raul Grijalva, the ranking Democrat on the House Natural Resources Committee. Rep. Grijalva explained in a letter to my university’s president that I was being investigated because Mr. Holdren had “highlighted what he believes were serious misstatements by Prof. Pielke of the scientific consensus on climate change.” He made the letter public.

The “investigation” turned out to be a farce. In the letter, Rep. Grijalva suggested that I—and six other academics with apparently heretical views—might be on the payroll of Exxon Mobil (or perhaps the Illuminati, I forget). He asked for records detailing my research funding, emails and so on. After some well-deserved criticism from the American Meteorological Society and the American Geophysical Union, Rep. Grijalva deleted the letter from his website. The University of Colorado complied with Rep. Grijalva’s request and responded that I have never received funding from fossil-fuel companies. My heretical views can be traced to research support from the U.S. government.

But the damage to my reputation had been done, and perhaps that was the point. Studying and engaging on climate change had become decidedly less fun. So I started researching and teaching other topics and have found the change in direction refreshing. Don’t worry about me: I have tenure and supportive campus leaders and regents. No one is trying to get me fired for my new scholarly pursuits.

But the lesson is that a lone academic is no match for billionaires, well-funded advocacy groups, the media, Congress and the White House. If academics—in any subject—are to play a meaningful role in public debate, the country will have to do a better job supporting good-faith researchers, even when their results are unwelcome. This goes for Republicans and Democrats alike, and to the administration of President-elect Trump.

Academics and the media in particular should support viewpoint diversity instead of serving as the handmaidens of political expediency by trying to exclude voices or damage reputations and careers. If academics and the media won’t support open debate, who will?

Mr. Pielke is a professor and director of the Sports Governance Center at the University of Colorado, Boulder. His most recent book is “The Edge: The Wars Against Cheating and Corruption in the Cutthroat World of Elite Sports” (Roaring Forties Press, 2016).

Saturday, May 10, 2014

China moves to free-market pricing for pharmaceuticals, after price controls led to quality problems & shortages

China Scraps Price Caps on Low-Cost Drugs. By Laurie Burkitt
Move Comes After Some Manufacturers Cut Corners on Production
Wall Street Journal, May 8, 2014 1:15 a.m.


China will scrap caps on retail prices for low-cost medicine and is moving toward free-market pricing for pharmaceuticals, after price controls led to drug quality problems and shortages in the country.

The move could be a welcome one for global pharmaceutical companies, which have been under scrutiny in China since last year for their sales and marketing practices.

The world's most populous country is the third-largest pharmaceutical market, behind the U.S. and Japan, according to data from consulting firm McKinsey & Co., but Beijing has used price caps and other measures to keep medical care affordable.

Price caps will be lifted for 280 medicines made by Western drug companies and 250 Chinese patent drugs, the National Development and Reform Commission, China's economic planning body, said Thursday. The move will affect prices on drugs such as antibiotics, painkillers and vitamins, it said.

The statement said local governments will have until July 1 to unveil details of the plan. In China, local authorities have broad oversight over how drugs are distributed to local hospitals.

Aiming to keep prices low, some manufacturers cut corners on production, exposing consumers to safety risks, said Helen Chen, a Shanghai-based partner and director of L.E.K. Consulting. Many also closed production, creating shortages of low-cost drugs such as thyroid medication.

"It means the [commission] recognizes that forcing prices down and focusing purely on price does sacrifice drug safety, quality and availability," said Ms. Chen.

Several drug makers, including GlaxoSmithKline PLC, didn't immediately respond to requests for comment. Spokeswomen for Sanofi and Pfizer Inc. said that because implementation of the new policy is unclear, it is too early to understand how it will affect their business in China.

The industry was dealt a blow last summer when Chinese authorities accused Glaxo of bribing doctors, hospitals and local officials to increase sales of their drugs. The U.K. company has said some of its employees may have violated Chinese law.

The central government, which began overhauling the country's health-care system in 2009, has until now largely favored pricing caps and has encouraged provincial governments to cut health-care costs and prices. Five years ago, regulators phased out premium pricing for a list of "essential drugs" to be available in hospitals.

Chinese leaders want health care to be more accessible and affordable, but there have been unintended consequences in attempting to ensure the lowest prices on drugs. For instance, many pharmaceutical companies registered to sell the thyroid medication Tapazole have halted production in recent years after pricing restrictions squeezed out profits, experts say, creating a shortage. Chinese patients with hyperthyroidism struggled to find the drug and many suffered with increased anxiety, muscle weakness and sleep disorder, according to local media reports.

In 2012, some drug-capsule manufacturers were found to be using industrial gelatin to cut production costs. The industrial gelatin contained the chemical chromium, which can be carcinogenic with frequent exposure, according to the U.S. Centers for Disease Control and Prevention.

"Manufacturers have attempted to save costs, and doing that has meant using lower-quality ingredients," said Ms. Chen.

The pricing reversal won't necessarily alleviate pricing pressure for these drugs, experts say. To get drugs into hospitals, companies must compete in a tendering process at the provincial level, said Justin Wang, also a partner at L.E.K. "It's still unclear how the provinces will react to this new national list," Mr. Wang said.

If provinces don't change their current system, price will remain a key competitive factor for drug makers, said Franck Le Deu, a partner at McKinsey's China division.

"The bottom line is that there may be more safety and more pricing transparency, but the focus intensifies on creating more innovative drugs," Mr. Le Deu said.

  —Liyan Qi contributed to this article.

Saturday, December 28, 2013

MRSA Infections, swine effluent lagoons, and farm consolidations

Answering some comments on a book review, 'In Meat We Trust,' by Maureen Ogle, WSJ, Dec. 17, 2013 6:36 p.m. ET:

A recent paper* in a FAO publication summarizes advances in hog manure management. Obviously, the cases mentioned are small in comparison with the great consolidated farms, but even so, there are multiple ways to manage better the effluents and some useful ways to profit from the lagoons/catchments are shown here.

@Mr Evangelista: I got access to the paper** you mentioned. If interested, you may ask for it. I'd like, though, to put things in perspective. As another paper*** published at the same time, likely the one Mr Blumenthal mentioned, puts it:
"In 2011, we estimated the overall number of invasive MRSA infections was 80,461; 31% lower than when estimates were first available in 2005"

The reasons are not well understood (several explanations are offered), but that is not relevant here. The important point is that despite increasing consolidation of farm operations and a growing population (from approximately 295 million in 2005 to approximately 311 million in 2011), there were 31% fewer invasive MRSA infections.
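The per-capita comparison implicit in this point can be made explicit. A minimal back-of-the-envelope sketch (the 2005 count is back-calculated from the reported 31% decline; the population figures are the approximations given above):

```python
# Back-of-the-envelope check of the per-capita MRSA decline.
# Reported: 80,461 invasive MRSA infections in 2011, 31% lower than in 2005.
infections_2011 = 80_461
infections_2005 = infections_2011 / (1 - 0.31)  # implied 2005 count, ~116,600

pop_2005 = 295e6  # approximate U.S. population, 2005
pop_2011 = 311e6  # approximate U.S. population, 2011

rate_2005 = infections_2005 / pop_2005 * 100_000  # infections per 100,000 people
rate_2011 = infections_2011 / pop_2011 * 100_000

decline = 1 - rate_2011 / rate_2005
print(f"2005: {rate_2005:.1f} per 100k; 2011: {rate_2011:.1f} per 100k")
print(f"per-capita decline: {decline:.1%}")
```

Because the population grew over the period, the per-capita decline (roughly 35%) is somewhat larger than the 31% drop in raw counts, which only strengthens the point.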


* Intensive and Integrated Farm Systems using Fermentation of Swine Effluent in Brazil. By I. Bergier, E. Soriano, G. Wiedman and A. Kososki. In Biotechnologies at Work for Smallholders: Case Studies from Developing Countries in Crops, Livestock and Fish. Edited by J. Ruane, J.D. Dargie, C. Mba, P. Boettcher, H.P.S. Makkar, D.M. Bartley and A. Sonnino. Food and Agriculture Organization of the United Nations, 2013.

** High-Density Livestock Operations, Crop Field Application of Manure, and Risk of Community-Associated Methicillin-Resistant Staphylococcus aureus Infection in Pennsylvania. By Joan A. Casey, MA; Frank C. Curriero, PhD, MA; Sara E. Cosgrove, MD, MS; Keeve E. Nachman, PhD, MHS; Brian S. Schwartz, MD, MS. JAMA Intern Med. Vol 173, No. 21, doi:10.1001/jamainternmed.2013.10408

*** National Burden of Invasive Methicillin-Resistant Staphylococcus aureus Infections, United States, 2011. By Raymund Dantes, MD, MPH; Yi Mu, PhD; Ruth Belflower, RN, MPH; Deborah Aragon, MSPH; Ghinwa Dumyati, MD; Lee H. Harrison, MD; Fernanda C. Lessa, MD; Ruth Lynfield, MD; Joelle Nadle, MPH; Susan Petit, MPH; Susan M. Ray, MD; William Schaffner, MD; John Townes, MD; Scott Fridkin, MD; for the Emerging Infections Program–Active Bacterial Core Surveillance MRSA Surveillance Investigators. JAMA Intern Med. Vol 173, No. 21, doi:10.1001/jamainternmed.2013.10423

Tuesday, July 16, 2013

Trevor Butterworth's Fad Food Nation

Fad Food Nation. By Trevor Butterworth
A skeptical survey of the claims being made about food, health and the environment.
The Wall Street Journal, July 16, 2013, on page A13  


Not so long ago, I spoke to a chef who ministers to children attending some of the most elite and expensive schools in America. Why, I asked him, was his company's website larded with almost comical warnings about the lethality of eating genetically modified (GM) food? Did he actually believe this as scientific fact or was he catering to his clientele's spiritual fears? It was simply for the mothers, he said, candidly. They ate it up—or, rather, they had swallowed so many apocalyptic warnings about genetically modified food that he had no choice but to echo their terror. How could they entrust their children to him otherwise? The downside of such dogma, he explained, was cost. Many of the mothers wouldn't agree to their children eating anything less than 100% organic, even if organic food required flying in, as he put it, "apples from Cuba."

Mr. Butterworth is a contributor at Newsweek and editor at large for

Wednesday, September 19, 2012

New Report Aims to Improve the Science Behind Regulatory Decision-Making

New Report Aims to Improve the Science Behind Regulatory Decision-Making

WASHINGTON, D.C. (September 18, 2012) – Scientists and policy experts from industry, government, and nonprofit sectors reached consensus on ways to improve the rigor and transparency of regulatory decision-making in a report being released today. The Research Integrity Roundtable, a cross-sector working group convened and facilitated by The Keystone Center, an independent public policy organization, is releasing the new report to improve the scientific analysis and independent expert reviews which underpin many important regulatory decisions. The report, Model Practices and Procedures for Improving the Use of Science in Regulatory Decision-Making, builds on the work of the Bipartisan Policy Center (BPC) in its 2009 report Science for Policy Project: Improving the Use of Science in Regulatory Policy.

"Americans need to have confidence in a U.S. regulatory system that encourages rational, science-based decision-making," said Mike Walls, Vice President of Regulatory and Technical Affairs for the American Chemistry Council (ACC), one of the sponsors of the Keystone Roundtable. "For this report, a broad spectrum of stakeholders came together to identify and help resolve some of the more troubling inconsistencies and roadblocks at the intersection of science and regulatory policy."

Controversies surrounding a regulatory decision often arise over the composition and transparency of scientific advisory panels and the scientific analysis used to support such decisions. The Roundtable's report is the product of 18 months of deliberations among experts from advocacy groups, professional associations and industry, as well as liaisons from several key Federal agencies. The report centers on two main public policy challenges that lead to controversy in the regulatory process: appointments of scientific experts, and the conduct of systematic scientific reviews.

The Roundtable's recommendations aim to improve the selection process for scientists on federal advisory panels and the scientific analysis used to draw conclusions that inform policy. The report seeks to maximize transparency and objectivity at every step in the regulatory decision-making process by informing the formation of scientific advisory committees and use of systematic reviews. The Roundtable's report offers specific recommendations for improving expert panel selection by better addressing potential conflicts of interest and bias. In addition, the report recommends ways to improve systematic reviews of scientific studies by outlining a step-by-step process, and by calling for clearer criteria to determine the relevance and credibility of studies.

"Conflicted experts and poor scientific assessments threaten the scientific integrity of agency decision making as well as the public's faith in agencies to protect their health and safety," said Francesca Grifo, Senior Scientist and Science Policy Fellow for the Union of Concerned Scientists. "Given the abundance of inflamed partisan dialogue around regulatory issues, it was refreshing to be a part of a rational and respectful roundtable. If adopted by agencies, the changes recommended in the report have the potential to reduce the ability of narrow interests to weaken regulations' power to protect the public good."

The Keystone Center and members of the Research Integrity Roundtable welcome additional conversations and dialogue on the matters explored in and recommendations presented in this report.

For more information, access the Roundtable's website at:

Tuesday, May 15, 2012

Changes in U.S. water use and implications for the future

It is interesting to see some data in Water Reuse: Expanding the Nation's Water Supply Through Reuse of Municipal Wastewater, a National Research Council publication.

See for example figure 1-6, p 17, changes in U.S. water use and implications for the future:

Friday, October 21, 2011

The Case Against Global-Warming Skepticism

The Case Against Global-Warming Skepticism. By Richard A. Muller
There were good reasons for doubt, until now.
WSJ, Oct 21, 2011

Are you a global warming skeptic? There are plenty of good reasons why you might be.

As many as 757 stations in the United States recorded net surface-temperature cooling over the past century. Many are concentrated in the southeast, where some people attribute tornadoes and hurricanes to warming.

The temperature-station quality is largely awful. The most important stations in the U.S. are included in the Department of Energy's Historical Climatology Network. A careful survey of these stations by a team led by meteorologist Anthony Watts showed that 70% of these stations have such poor siting that, by the U.S. government's own measure, they result in temperature uncertainties of between two and five degrees Celsius or more. We do not know how much worse the stations in the developing world are.

Using data from all these poor stations, the U.N.'s Intergovernmental Panel on Climate Change estimates an average global 0.64ºC temperature rise in the past 50 years, "most" of which the IPCC says is due to humans. Yet the margin of error for the stations is at least three times larger than the estimated warming.
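As a quick arithmetic check on the figures quoted above (0.64ºC of estimated warming against 2–5ºC of per-station siting uncertainty), the ratio can be computed directly:

```python
# Signal vs. noise check for the numbers cited in the article.
warming_estimate_c = 0.64                         # IPCC 50-year global rise
uncertainty_low_c, uncertainty_high_c = 2.0, 5.0  # per-station siting uncertainty

ratio_low = uncertainty_low_c / warming_estimate_c
ratio_high = uncertainty_high_c / warming_estimate_c
print(f"Per-station error is {ratio_low:.1f}x to {ratio_high:.1f}x the estimated warming")
```

Note this compares single-station uncertainty to the trend; roughly independent station errors shrink when averaged over many stations, which is part of why the later analysis in the article can still recover a trend from noisy records.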

We know that cities show anomalous warming, caused by energy use and building materials; asphalt, for instance, absorbs more sunlight than do trees. Tokyo's temperature rose about 2ºC in the last 50 years. Could that rise, and increases in other urban areas, have been unreasonably included in the global estimates? That warming may be real, but it has nothing to do with the greenhouse effect and can't be addressed by carbon dioxide reduction.

Moreover, the three major temperature analysis groups (the U.S.'s NASA and National Oceanic and Atmospheric Administration, and the U.K.'s Met Office and Climatic Research Unit) analyze only a small fraction of the available data, primarily from stations that have long records. There's a logic to that practice, but it could lead to selection bias. For instance, older stations were often built outside of cities but today are surrounded by buildings. These groups today use data from about 2,000 stations, down from roughly 6,000 in 1970, raising even more questions about their selections.

On top of that, stations have moved, instruments have changed and local environments have evolved. Analysis groups try to compensate for all this by homogenizing the data, though there are plenty of arguments to be had over how best to homogenize long-running data taken from around the world in varying conditions. These adjustments often result in corrections of several tenths of one degree Celsius, significant fractions of the warming attributed to humans.

And that's just the surface-temperature record. What about the rest? The number of named hurricanes has been on the rise for years, but that's in part a result of better detection technologies (satellites and buoys) that find storms in remote regions. The number of hurricanes hitting the U.S., even the more intense Category 4 and 5 storms, has been gradually decreasing since 1850. The number of detected tornadoes has been increasing, possibly because radar technology has improved, but the number that touch down and cause damage has been decreasing. Meanwhile, the short-term variability in U.S. surface temperatures has been decreasing since 1800, suggesting a more stable climate.

Without good answers to all these complaints, global-warming skepticism seems sensible. But now let me explain why you should not be a skeptic, at least not any longer.

Over the last two years, the Berkeley Earth Surface Temperature Project has looked deeply at all the issues raised above. I chaired our group, which just submitted four detailed papers on our results to peer-reviewed journals. We have now posted these papers online at to solicit even more scrutiny.

Our work covers only land temperature—not the oceans—but that's where warming appears to be the greatest. Robert Rohde, our chief scientist, obtained more than 1.6 billion measurements from more than 39,000 temperature stations around the world. Many of the records were short in duration, and to use them Mr. Rohde and a team of esteemed scientists and statisticians developed a new analytical approach that let us incorporate fragments of records. By using data from virtually all the available stations, we avoided data-selection bias. Rather than try to correct for the discontinuities in the records, we simply sliced the records where the data cut off, thereby creating two records from one.

We discovered that about one-third of the world's temperature stations have recorded cooling temperatures, and about two-thirds have recorded warming. The two-to-one ratio reflects global warming. The changes at the locations that showed warming were typically between 1-2ºC, much greater than the IPCC's average of 0.64ºC.
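A rough way to see why a two-to-one split is meaningful: under a no-trend null hypothesis, warming and cooling stations would be about equally likely. The station counts below are illustrative round numbers scaled from the "one-third / two-thirds" split the article reports, not Berkeley Earth's actual figures, and the real analysis is far more involved:

```python
# Sketch: how far a 2:1 warming/cooling split sits from a 50/50 null.
from math import sqrt

n_stations = 30_000   # assumed round number of usable stations
n_warming = 20_000    # two-thirds showing warming

# Under the null, warming-station count is ~Binomial(n, 0.5).
mean = n_stations * 0.5
sd = sqrt(n_stations * 0.25)
z = (n_warming - mean) / sd
print(f"z-score of the observed split: {z:.0f} standard deviations")
```

The caveat is that nearby stations are spatially correlated, so they are not independent trials; the true significance is smaller than this naive z-score suggests, though the qualitative point stands.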

To study urban-heating bias in temperature records, we used satellite determinations that subdivided the world into urban and rural areas. We then conducted a temperature analysis based solely on "very rural" locations, distant from urban ones. The result showed a temperature increase similar to that found by other groups. Only 0.5% of the globe is urbanized, so it makes sense that even a 2ºC rise in urban regions would contribute negligibly to the global average.
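The "contribute negligibly" claim is easy to bound with back-of-envelope arithmetic, taking the article's 0.5% urbanized fraction at face value:

```python
# Upper bound on urban-heating bias in the global average:
# even if every urban area warmed 2 C spuriously, weight it by area.
urban_fraction = 0.005            # 0.5% of the globe is urbanized, per the article
urban_spurious_warming_c = 2.0    # worst-case spurious urban rise

contribution_c = urban_fraction * urban_spurious_warming_c
print(f"Maximum urban contribution to the global average: {contribution_c:.2f} C")
```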

What about poor station quality? Again, our statistical methods allowed us to analyze the U.S. temperature record separately for stations with good or acceptable rankings, and those with poor rankings (the U.S. is the only place in the world that ranks its temperature stations). Remarkably, the poorly ranked stations showed no greater temperature increases than the better ones. The most likely explanation is that while low-quality stations may give incorrect absolute temperatures, they still accurately track temperature changes.

When we began our study, we felt that skeptics had raised legitimate issues, and we didn't know what we'd find. Our results turned out to be close to those published by prior groups. We think that means that those groups had truly been very careful in their work, despite their inability to convince some skeptics of that. They managed to avoid bias in their data selection, homogenization and other corrections.

Global warming is real. Perhaps our results will help cool this portion of the climate debate. How much of the warming is due to humans and what will be the likely effects? We made no independent assessment of that.

Mr. Muller is a professor of physics at the University of California, Berkeley, and the author of "Physics for Future Presidents" (W.W. Norton & Co., 2008).

Tuesday, October 4, 2011

White House: Now is Not the Time to Wave the White Flag on Clean Energy Jobs

Now is Not the Time to Wave the White Flag on Clean Energy Jobs. Blog post from Dan Pfeiffer, White House Communications Director
October 04, 2011

This morning, Chairman Cliff Stearns, who leads the House Energy and Commerce Subcommittee on Oversight and Investigations, told NPR that "We can't compete with China to make solar panels and wind turbines."

This comment reflects exactly the sort of counterproductive defeatism that Energy Secretary Steven Chu warned against this weekend when he spoke to a group of America’s most promising young solar innovators:

“The United States faces a choice today: Will we sit on the sidelines and fall behind or will we play to win the clean energy race? Some say this is a race America can’t win. They’re ready to wave the white flag and declare defeat… Others say this is a race America shouldn’t even be in. They say we can’t afford to invest in clean energy. I say we can’t afford not to.

“It’s not enough for our country to invent clean energy technologies – we have to make them and use them too. Invented in America, made in America, and sold around the world – that’s how we’ll create good jobs and lead in the 21st century.”

The race for clean energy jobs and industries is on – and it is a race well worth winning. The International Energy Agency projects that in the coming decades, solar power could grow to more than 20 percent of the world’s electricity. Conservatively, this means that there is an economic opportunity worth trillions of dollars for whichever countries claim the lead. The global market for wind turbines is also growing exponentially.

But it’s not just the vast potential of jobs tomorrow – these industries employ a growing number of Americans today. In fact, business groups estimate that America’s solar industry accounts for about 100,000 jobs and the wind industry employs 75,000. Should we simply tell those workers that we’ve given up on them?

A study released last month showed that, in spite of the intense global competition, the U.S. remains a net global exporter of solar technology – with $5.6 billion in exports and an overall positive trade balance of $1.8 billion.

It is certainly true that China is playing to win. Last year alone, China offered its solar manufacturers $30 billion in government financing, vastly exceeding the U.S. investment. And China has overtaken the United States market share in solar power – a technology we invented.

Chairman Stearns and other members of his party in Congress believe that America cannot, or should not, try to compete for jobs in a cutting edge and rapidly growing industry. We simply disagree: the answer to this challenge is not to wave the white flag and give up on American workers. America has never declared defeat after a single setback – and we shouldn’t start now.

America’s entrepreneurs and innovators are still the very best in the world. Our workers are second to none – and we have never been afraid of a challenge. It’s time to do what we’ve always done in the face of a tough competitor: roll up our sleeves and recapture the lead.

Friday, September 30, 2011

EPA Inspector General Statement on Greenhouse Gases Endangerment Finding Report - Data Quality Processes

EPA Inspector General Statement on Greenhouse Gases Endangerment Finding Report - Data Quality Processes

Press Statement - U.S. Environmental Protection Agency
For Immediate Release
Office of Inspector General
Washington, D.C., September 28, 2011. Contact: John Manibusan, Phone: (202) 566-2391

WASHINGTON, D.C. – Statement of Inspector General Arthur A. Elkins, Jr., on the Office of Inspector General (OIG) report Procedural Review of EPA’s Greenhouse Gases Endangerment Finding Data Quality Processes:
“The OIG evaluated EPA’s compliance with established policy and procedures in the development of the endangerment finding, including processes for ensuring information quality. We concluded that the technical support document that accompanied EPA’s endangerment finding is a highly influential scientific assessment and thus required a more rigorous EPA peer review than occurred. EPA did not certify whether it complied with OMB’s or its own peer review policies in either the proposed or final endangerment findings as required. While it may be debatable what impact, if any, this had on EPA’s finding, it is clear that EPA did not follow all required steps for a highly influential scientific assessment. We also noted that documentation of events and analyses could be improved.

We made no determination regarding the impact that EPA’s information quality control systems may have had on the scientific information used to support the finding. We did not test the validity of the scientific or technical information used to support the endangerment finding, nor did we evaluate the merit of EPA’s conclusions or analyses.

We make recommendations that we think will strengthen EPA’s control over data quality processes. EPA disagreed with our conclusions and did not agree to take any corrective actions in response to this report. All the report’s recommendations are unresolved.”


Monday, May 3, 2010

Drilling in Deep Water - A ban on offshore production won't mean fewer oil spills

Drilling in Deep Water. WSJ Editorial
A ban on offshore production won't mean fewer oil spills. WSJ, May 04, 2010

It could be months before we know what caused the explosion and oil spill below the drilling rig Deepwater Horizon. But as we add up the economic costs and environmental damage (and mourn the 11 oil workers who died), we should also put the disaster in some perspective.

Washington is, as usual, showing no such restraint. As the oil in the Gulf of Mexico moves toward the Louisiana and Florida coasts, the left is already demanding that President Obama reverse his baby steps toward more offshore drilling. The Administration has partly obliged, declaring a moratorium pending an investigation. The President has raised the political temperature himself, declaring yesterday that the spill is a "massive and potentially unprecedented environmental disaster."

The harm will be considerable, which is why it is fortunate that such spills are so rare. The most recent spill of this magnitude was the Exxon Valdez tanker accident in 1989. The largest before that was the Santa Barbara offshore oil well leak in 1969.

The infrequency of big spills is extraordinary considering the size of the offshore oil industry that provides Americans with affordable energy. According to the Interior Department's most recent data, in 2002 the Outer Continental Shelf had 4,000 oil and gas facilities, 80,000 workers in offshore and support activities, and 33,000 miles of pipeline. Between 1985 and 2001, these offshore facilities produced seven billion barrels of oil. The spill rate was a minuscule 0.001%.
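What the quoted 0.001% spill rate implies in absolute terms can be worked out from the production figure in the same paragraph:

```python
# Implied spill volume from the editorial's own numbers.
produced_barrels = 7_000_000_000   # OCS production, 1985-2001, per the editorial
spill_rate = 0.001 / 100           # 0.001% expressed as a fraction

spilled_barrels = produced_barrels * spill_rate
gallons = spilled_barrels * 42     # 42 US gallons per barrel
print(f"Implied spill volume: {spilled_barrels:,.0f} barrels (~{gallons / 1e6:.1f}M gallons)")
```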

According to the National Academy of Sciences—which in 2002 completed the third version of its "Oil in the Sea" report—only 1% of oil discharges in North America are related to petroleum extraction. Some 62% of oil in U.S. waters is due to natural seepage from the ocean floor, which puts 47 million gallons of crude oil into North American waters every year. The Gulf well is estimated to have leaked between two million and three million gallons in two weeks.

Such accidents are still unacceptable, which is why the drilling industry has invested heavily to prevent them. The BP well had a blowout preventer, which contains several mechanisms designed to seal pipes in the event of a problem. These protections have worked in the past, and the reason for the failure this time is unknown. This was no routine safety failure but a surprising first.

One reason the industry has a good track record is precisely because of the financial consequences of accidents. The Exxon Valdez dumped 260,000 barrels of oil, and Exxon spent $3.14 billion on cleanup. Do the math, and Exxon spent nearly 600 times more on cleanup and litigation than what the oil was worth at that time.
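The "nearly 600 times" figure can be reproduced, though it depends on the assumed 1989 crude price, which the editorial does not state; roughly $20 per barrel is used below as an assumption:

```python
# Rough check of the "nearly 600 times" claim.
spilled_barrels = 260_000
cleanup_cost_usd = 3.14e9
assumed_price_per_barrel = 20.0    # assumed approximate late-1980s crude price

oil_value = spilled_barrels * assumed_price_per_barrel
ratio = cleanup_cost_usd / oil_value
print(f"Cleanup cost was ~{ratio:.0f}x the market value of the spilled oil")
```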

As for the environmental damage in the Gulf, much will depend on the weather that has made it more difficult to plug the leak and contain the spill before it reaches shore. The winds could push oil over the emergency containment barriers, or they could keep the oil swirling offshore, where it may sink and thus do less damage.

It is worth noting that this could have been worse. The Exxon Valdez caused so much damage in part because the state of Alaska dithered over an emergency spill response. Congress then passed the 1990 Oil Pollution Act that mandated more safety measures, and it gave the Coast Guard new powers during spill emergencies. We have seen the benefits in the last two weeks as the Coast Guard has deployed several containment techniques—from burning and chemical dispersants to physical barriers. America sometimes learns from its mistakes.

On the other hand, Washington's aversion to drilling closer to shore has pushed the industry into deeper, more difficult, waters farther out to sea. BP's well is 5,000 feet down, at a depth and pressure that test the most advanced engineering and technology. The depth complicates containment efforts when there is a disaster.

As for a drilling moratorium, it is no guarantee against oil spills. It may even lead to more of them. Political fantasies about ending our oil addiction notwithstanding, the U.S. economy will need oil and other fossil fuels for decades to come. If we don't drill for it at home, the oil will have to arrive by tanker and barges. Tankers are responsible for more spills than offshore wells, and those spills tend to be bigger and closer to shore—which usually means more environmental harm.

The larger reality is that energy production is never going to be accident free. No difficult human endeavor is, whether space travel or using giant cranes to build skyscrapers. The rest of the world is working to exploit its offshore oil and gas reserves despite the risk of spills. We need to be mindful of such risks, and to include prevention and clean up in the cost of doing business, but a modern economy can't run without oil.

Sunday, May 2, 2010

The US EPA opens a re-re-evaluation of atrazine

The War on a Weed Killer. WSJ Editorial
The EPA opens a re-re-evaluation of a safe chemical. WSJ, May 03, 2010

With the headlines full of oil spills and immigration, the Obama Administration's regulatory agenda is getting little attention. That's a mistake. Consider the Environmental Protection Agency's effort to revive an assault on atrazine, one of the oldest, most well-established agricultural chemicals on the market. Just this past week, the EPA held its third "re-evaluation" hearing on atrazine.

Atrazine is the nation's second-most common herbicide. For 50 years it has been the farm industry's primary crop protector. In the U.S., the weed killer is used in the production of 60% of corn, 75% of sorghum and 90% of sugarcane.

Since atrazine's debut in 1959, 10 Administrations have endorsed its use. The EPA in 2006 completed a 12-year review involving 6,000 studies and 80,000 public comments. In re-registering the product, the agency concluded the cumulative risks posed "no harm that would result to the general U.S. population, infant, children or other . . . consumers." The World Health Organization has found no health concerns.

None of this has stopped the most politicized environmental groups, which oppose both chemicals and the idea of industrial farming itself. Organizations such as the Natural Resources Defense Council have spent years ginning up claims that atrazine in groundwater causes cancer, birth defects and other maladies. Manufacturers such as Syngenta have been required to conduct millions of dollars worth of studies investigating these alarmist claims. EPA staff routinely review the studies in atrazine's favor.

But now the Obama Administration has begun to fill such agencies with hires who are either sympathetic to, or even hail from, these activist groups. Consider the EPA's new head for toxic substances, Stephen Owens. As director of Arizona's Department of Environmental Quality, he so aggressively imposed an activist's climate agenda that the state legislature voted to strip his department of authority to enact greenhouse gas rules.

In August, the NRDC and the Pesticide Action Network began a new campaign against atrazine. In October, the EPA announced it would begin a re-re-evaluation of atrazine with a series of scientific panel meetings, and those are underway. The goal seems to be to lay the groundwork to ban atrazine.

Among the environmental lobby's new lines of attack is that some U.S. water systems occasionally show "spikes" in the chemical. This ignores that the EPA's drinking water standard for atrazine—three parts per billion—has a built-in, 1,000-fold safety factor. It ignores EPA findings that atrazine isn't likely to be carcinogenic to humans.

Also re-energized by the EPA's sudden interest in atrazine is, you guessed it, the plaintiffs bar. Tort kingpin Stephen Tillery, joined by Baron & Budd, filed a class action in 2004 against atrazine makers in tort-friendly Madison County, Illinois, but they've struggled even there. The EPA's re-re-evaluation is already helping the lawyers sign up more water-district plaintiffs—Mr. Tillery has filed a new federal class action—and it surely will provide ammunition in court.

There is an agenda here far more ambitious than getting one chemical. The environmental lobby wants more farmland retired to "nature," and one way to do that is to make farming more expensive. The EPA notes that eliminating atrazine would cost $2 billion annually in lost crop yields and substituting more expensive herbicides. Some farmers would go out of business or ask the federal government for more subsidies.

The environmental lobby also figures that if it can take down atrazine with its long record of clean health, it can get the EPA to prohibit anything. Sounds plausible. Between this and its determination to regulate greenhouse gases, the Obama EPA is proving itself a regulatory fundamentalist, with scant regard for good science or economics.

Friday, April 23, 2010

China and the US, Two Energy Giants: A Contrast In Approach

Two Energy Giants: A contrast in approach
IER, Apr 22, 2010

China’s economy is growing with dizzying speed, and the government is fueling the growth with plentiful energy. In fact, China’s electrification program and its ability to secure future oil supplies are second to none. By contrast, the U.S. economy is growing more slowly and its energy strategy is limiting that growth. The United States has slowed its electrification, adding only select forms of generating capacity, and has taken steps to reduce its flexibility in securing safe oil supplies.

China Setting Records: China Oil Demand, Coal Production and Vehicle Sales Up in 2010

During January, February, and March of this year, China was again setting records with huge year-over-year increases in oil demand.  In February, China’s oil demand rose 19.4 percent over a year earlier, the second fastest rise on record. According to Reuters, China is the world’s second largest oil user (second to the United States) and consumed 8.65 million barrels of oil per day in February, an increase of 9.4 percent or 604,000 barrels per day over January’s consumption.[i] Oil imports were up 13.8 percent in March over February, reaching 4.95 million barrels per day, according to preliminary data from China’s General Administration of Customs.[ii] In part, these large oil increases are fueling China’s passenger car fleet. New passenger car sales rose 55 percent in February from a year earlier, following a 116 percent increase in January, most likely aided by the extension of government incentives to boost purchases of smaller vehicles and spur rural demand for cars.  [iii]

China has spent nearly $200 billion on oil deals during the past few years, joining with more than 19 countries —including Russia, Turkmenistan, Kuwait, Yemen, Libya, Angola, Venezuela and Brazil— and paying for exploration, production, infrastructure construction, as well as “loans for energy” deals.[iv] Recently, China’s Sinopec International Petroleum Exploration and Production Company agreed to buy, for $4.65 billion, the 9 percent interest that ConocoPhillips holds in Syncrude,[v] a Canadian business involved in the production of oil sands (an asphalt-like heavy oil).[vi] Approval from the Canadian and Chinese governments is expected in the third quarter of this year.

Along with China’s Canadian oil pursuits, long thought to be a safe and secure supply for U.S. oil demand, the state-owned China Development Bank has promised to lend $20 billion to Venezuela to build new power plants, highways, and other projects, which will be repaid with Venezuelan crude oil. Venezuela’s President Hugo Chavez has long complained about the United States’ standing as the largest buyer of Venezuelan oil, and so he is more than pleased to offer his country’s oil to China instead.[vii] Both the Canadian crude and the Venezuelan crude are heavy oils, and the United States owns most of the refineries that can process heavy crude oils. So, to prepare itself for future heavy oil supplies, China has approved plans for construction of such a refinery. As the United States loses neighboring oil supplies to China, one wonders how the U.S. will meet future oil demand, especially as the Obama Administration has been slow to open new offshore areas to oil development (claiming further study is needed) but speedy at advocating climate legislation and a low-carbon fuel standard, both policies aimed at reducing the demand for fossil fuels without providing comparable energy substitutes.

china oil demand

Oil resources are not the only target on China’s energy wish-list. It also plans to increase its consumption of natural gas; last year, its liquefied natural gas imports rose by two-thirds, to 5.53 million tons or 7.7 billion cubic meters.[viii] China also continues to consume large quantities of its primary fuel, coal, in its industrial and electric generation sectors. According to China’s National Bureau of Statistics, the country’s coal output grew more than 28 percent, to well over 751 million tons in the first quarter of 2010. A report by China’s National Coal Association estimates China’s total coal production capacity exceeds 3.6 billion tons.[ix] This is in sharp contrast to coal mining in the United States, where the Environmental Protection Agency (EPA) has issued a new policy aimed at curbing mountain top removal mining[x] and is scrutinizing surface coal mine permits.  EPA is revoking or blocking Clean Water Act permits for mountain top mining citing irreversible damage to the environment. Some of the permits were awarded years ago.[xi]

Seventy percent of China’s energy comes from coal,[xii] the most carbon-intensive fossil fuel. China already consumes more than twice as much coal as the United States, and by 2030 it is expected to consume 3.7 times as much.[xiii] As a result, China emits more carbon dioxide than any other country in the world, including the United States, and by 2030 it is expected to release 82 percent more carbon dioxide than the United States.[xiv]

china co2 emissions

China’s Race to Electrification; U.S. Stagnation

Between 2004 and 2008, China added 346 gigawatts of generating capacity, of which 272 gigawatts were conventional thermal power (mostly coal) and 66 gigawatts were hydroelectric power. This compares to a total installed US hydroelectric capacity of 77 gigawatts.  China is estimated to have added an additional 85 gigawatts in 2009, reaching a total of 874 gigawatts,[xv] about 15 percent less than the total capacity in the United States. Of the 85 gigawatts added in 2009, 51 gigawatts were conventional thermal, again mostly coal, 25 gigawatts were hydroelectric, and 9 gigawatts were wind power.[xvi] Many of China’s wind turbines were funded by the U.N.’s Clean Development Mechanism,   under which wealthy countries fund projects in developing countries and receive carbon credits so long as those projects would not have been accomplished otherwise.[xvii]

In contrast, the United States added only 47 gigawatts of generating capacity from 2004 to 2008 (14 percent of the capacity China added), of which 26 gigawatts were natural gas-fired units and 18 gigawatts were wind turbines. New coal-fired capacity additions are practically non-existent in the United States primarily owing to objections regarding emissions of carbon dioxide. Coal-fired projects in the United States have either been cancelled or delayed because of permitting problems, reviews and re-reviews by EPA and resulting financing problems. While the United States has more coal than any other country in the world, with over 200 years of reserves at current usage rates, coal’s share of new U.S. generating markets has been replaced by natural gas and renewable units that are  more politically in vogue.
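The 14 percent comparison above follows directly from the capacity figures quoted in these two paragraphs:

```python
# Generating-capacity additions, 2004-2008, in gigawatts (figures from the article).
china_added_gw = 346
us_added_gw = 47

share = us_added_gw / china_added_gw
print(f"U.S. additions were {share:.0%} of China's over the same period")
```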

china electricity generating capacity
us electricity generating capacity

China’s Economic Growth and Export Market

China’s economy, the second-largest in the world in terms of purchasing power, is currently about half the size of the U.S. gross domestic product. According to China’s central bank, the country’s economy grew at an annual rate of 10.7 percent in the fourth quarter of 2009,[xviii] a rate almost twice the U.S. rate of 5.6 percent for the same time period.[xix] And in the first quarter of 2010, China’s economy grew by 11.9 percent. Forecasters predict that China’s economy will exceed that of the United States in 10 to 15 years.[xx]

China became the world’s largest exporter last year, edging out Germany and the United States. Despite a decline in total world trade, China’s exports fell less than those of other big powers. A report by the World Trade Organization calculates that the total value of merchandise exports fell by 23 percent in 2009. Among the top ten exporters, Japan’s shipments were the worst affected, falling by 26 percent. Because China’s exports fell by only 16 percent, it is now the single largest exporter. The World Trade Organization expects trade to rebound by nearly 10 percent this year.[xxi]

leading exporters world

Lessons to Be Learned

Many environmentalists and politicians seem to believe that China is winning the green energy race, but nothing could be further from reality.[xxii] China is in a race for energy—all forms of energy—to fuel its growing economy. The size and scope of its investments in conventional forms of energy dwarf its commitment to “green energy.” It is providing loans around the world to invest in future oil projects, and it does not mind that the oil is far from the lightest and sweetest: Canadian oil sands and Venezuelan heavy crude are perfectly fine. China is building a coal-fired generating plant each and every week on average, and increasing its coal mining capacity to fuel them. This belies any stated concerns about increasing its carbon dioxide emissions, already the highest of any country in the world. China is building wind turbines too, but if wealthy countries are willing to pay—why not? It matters not at all that the transmission capacity is not yet there to operate almost a third of these wind turbines. And China’s large-scale hydroelectric projects are engineering feats par excellence, built regardless of environmental concerns.
China is ensuring energy supplies will be available to fuel its growing economy. The United States should take note.

[i] Reuters, China oil demand rise second fastest, inventories drag, March 22, 2010,
[ii] Reuters, Oil falls as demand, inventories weigh, April 12, 2010,
[iii] Reuters, China oil demand rise second fastest, inventories drag, March 22, 2010,
[iv] Politico, To compete with China, U.S. must tap natural gas, April 13, 2010,
[v] Reuters, China bags oil sands stake, not finished yet, April 13, 2010, and
[vi] Syncrude,
[vii] The Wall Street Journal, China’s $20 Billion Bolsters Chavez, April 18, 2010,
[viii] Reuters, China bags oil sands stake, not finished yet, April 13, 2010,
[ix] China Daily, China’s coal output up 28.1% in Q1, April 15, 2010,
[x] Environmental Protection Agency, News Releases, EPA issues comprehensive guidance to protect Appalachian communities from harmful environmental impacts of mountaintop mining, April 1, 2010,!OpenDocument
[xi] Associated Press, Arch Coal sues EPA over veto of W.Va. mine permit, April 2, 2010,
[xii] Energy Information Administration, China,
[xiii] Energy Information Administration, International Energy Outlook 2009,
[xiv] Energy Information Administration, International Energy Outlook 2009,
[xvi] China’s power generation goes greener with total capacity up 10%, January 7, 2010,
[xviii] Politico, To compete with China, U.S. must tap natural gas, April 13, 2010,
[xx] Energy Information Administration, International Energy Outlook 2009,
[xxi] China overtakes Germany to become the biggest exporter of all, March 31, 2010,

Monday, February 8, 2010

GMO Panel deliberations on the paper by de Vendômois et al. (2009, A Comparison of the Effects of Three GM Corn Varieties on Mammalian Health, International Journal of Biological Sciences, 5: 706-726)

EFSA: Adopted part of the minutes of the 55th plenary meeting of the Scientific Panel on Genetically Modified Organisms, held on 27-28 January 2010

GMO Panel deliberations on the paper by de Vendômois et al. (2009, A Comparison of the Effects of Three GM Corn Varieties on Mammalian Health, International Journal of Biological Sciences, 5: 706-726)
The EFSA GMO Panel has considered the paper by de Vendômois et al. (2009, A Comparison of the Effects of Three GM Corn Varieties on Mammalian Health, International Journal of Biological Sciences, 5: 706-726), a statistical reanalysis of data from three 90-day rat feeding studies already assessed by the GMO Panel (EFSA, 2003a,b; EFSA 2004a,b; EFSA 2009b,c). The GMO Panel concludes that the authors’ claims, regarding new side effects indicating kidney and liver toxicity, are not supported by the data provided in their paper. There is no new information that would lead it to reconsider its previous opinions on the three maize events MON810, MON863 and NK603, which concluded that there were no indications of adverse effects for human and animal health or for the environment.

The GMO Panel notes that several of its fundamental statistical criticisms (EFSA, 2007a,b) of the authors' earlier study (Seralini et al., 2007) of maize MON863 are also applicable to the new paper by de Vendômois et al. In the GMO Panel's extensive evaluation of Seralini et al. (2007), reasons for the apparent excess of significant differences found for MON863 (8%) were given and it was shown that this raised no safety concerns. The percentages of tested variables reported as significant by de Vendômois et al. for NK603 (9%) and MON810 (6%) were of similar magnitude to that for MON863.

The GMO Panel considers that de Vendômois et al.: (1) make erroneous statements concerning the use of reference varieties to provide estimates of variability that allow equivalence testing to place statistically significant results into biological context as advocated by EFSA (2008, 2009a); (2) do not use the available information concerning normal background variability between animals fed with different diets, to place observed differences into biological context; (3) do not present results using their False Discovery Rate methodology in a meaningful way; (4) give no evidence to relate well-known gender differences in response to diet to claims of effects due to the respective GMOs; (5) estimate statistical power based on inappropriate analyses and magnitudes of difference.
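The False Discovery Rate methodology referred to in point (3) is a family of multiple-testing corrections, the best known being the Benjamini-Hochberg procedure. A minimal sketch (not the authors' actual analysis) shows why it matters here: when hundreds of endpoints are each tested at the 5% level, a 5-9% rate of "significant" differences is close to what chance alone produces, and FDR control screens most of those out.

```python
# Benjamini-Hochberg false-discovery-rate (FDR) control: given raw p-values
# from many endpoint comparisons, keep only those still significant after
# accounting for the number of tests performed. Generic illustrative sketch.

def benjamini_hochberg(p_values, alpha=0.05):
    """Return the (sorted) indices of hypotheses rejected at FDR level alpha."""
    m = len(p_values)
    # Sort p-values in ascending order, remembering original positions.
    order = sorted(range(m), key=lambda i: p_values[i])
    # Find the largest rank k with p_(k) <= (k/m) * alpha.
    k_max = 0
    for rank, idx in enumerate(order, start=1):
        if p_values[idx] <= (rank / m) * alpha:
            k_max = rank
    return sorted(order[:k_max])

# With 500 endpoints and no true effects, roughly 25 uncorrected "hits"
# at alpha = 0.05 are expected by chance; after FDR control, usually none.
import random
random.seed(1)
null_ps = [random.random() for _ in range(500)]
print(len([p for p in null_ps if p < 0.05]))   # uncorrected hits
print(len(benjamini_hochberg(null_ps)))        # hits after FDR control
```

This is only an illustration of the class of correction being discussed; the Panel's complaint is that the paper applied such a method but did not report its results in a usable form.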

The significant differences highlighted by de Vendômois et al. have all been considered previously by the GMO Panel in its opinions on the three maize events MON810, MON863 and NK603. The study by de Vendômois et al. provides no new evidence of toxic effects. The approach used by de Vendômois et al. does not allow a proper assessment of the toxicological relevance of the differences claimed between the GMOs and their respective counterparts because: (1) results are presented exclusively as percentage differences for each variable, rather than in their actual measured units; (2) the calculated values of the toxicological parameters tested are not related to the normal range for the species concerned; (3) those values are not compared with the ranges of variation found in test animals fed diets containing different reference varieties; (4) the statistically significant differences did not show consistent patterns across endpoint variables and doses; (5) the inconsistencies between the purely statistical arguments of de Vendômois et al. and the results of these three animal feeding studies relating to organ pathology, histopathology and histochemistry are not addressed. Regarding the claims made by de Vendômois et al. concerning the inadequacy of the experimental design of these three animal feeding studies, the GMO Panel notes that they were all carried out to agreed internationally defined standards consistent with OECD protocols.


-  EFSA, 2003a. Opinion of the Scientific Panel on genetically modified organisms (GMO) on a request from the Commission related to the safety of foods and food ingredients derived from herbicide-tolerant genetically modified maize NK603, for which a request for placing on the market was submitted under Article 4 of the Novel Food Regulation (EC) No 258/97 by Monsanto.
-  EFSA, 2003b. Opinion of the Scientific Panel on genetically modified organisms (GMO) on a request from the Commission related to the Notification (Reference CE/ES/00/01) for the placing on the market of herbicide-tolerant genetically modified maize NK603, for import and processing, under Part C of Directive 2001/18/EC from Monsanto.
-  EFSA, 2004a. Opinion of the Scientific Panel on genetically modified organisms (GMO) on a request from the Commission related to the Notification (Reference C/DE/02/9) for the placing on the market of insect-protected genetically modified maize MON 863 and MON 863 x MON 810, for import and processing, under Part C of Directive 2001/18/EC from Monsanto.
-  EFSA, 2004b. Opinion of the Scientific Panel on genetically modified organisms (GMO) on a request from the Commission related to the safety of foods and food ingredients derived from insect-protected genetically modified maize MON 863 and MON 863 x MON 810, for which a request for placing on the market was submitted under Article 4 of the Novel Food Regulation (EC) No 258/97 by Monsanto.
-  EFSA, 2007a. EFSA review of statistical analyses conducted for the assessment of the MON 863 90-day rat feeding study.
-  EFSA, 2007b. Statement on the analysis of data from a 90-day rat feeding study with MON 863 maize by the Scientific Panel on genetically modified organisms (GMO).
-  EFSA, 2008. Updated guidance document for the risk assessment of genetically modified plants and derived food and feed. Annex A.
-  EFSA, 2009a. Statistical considerations for the safety evaluation of GMOs.
-  EFSA, 2009b. Applications (references EFSA-GMO-NL-2005-22, EFSA-GMO-RX-NK603) for the placing on the market of the genetically modified glyphosate tolerant maize NK603 for cultivation, food and feed uses, import and processing and for renewal of the authorisation of maize NK603 as existing products, both under Regulation (EC) No 1829/2003 from Monsanto.
-  EFSA, 2009c. Applications (EFSA-GMO-RX-MON810) for renewal of authorisation for the continued marketing of (1) existing food and food ingredients produced from genetically modified insect resistant maize MON810; (2) feed consisting of and/or containing maize MON810, including the use of seed for cultivation; and of (3) food and feed additives, and feed materials produced from maize MON810, all under Regulation (EC) No 1829/2003 from Monsanto.
-  Seralini, G.E., Cellier, D., de Vendômois, J.S. 2007. New analysis of a rat feeding study with a genetically modified maize reveals signs of hepatorenal toxicity. Arch. Environ. Contam. Toxicol., 52: 596-602.

Saturday, January 16, 2010

FDA Decision on Chemical BPA Gets Mixed Review: "ACSH scientists are glad a ban was avoided but remain disappointed"

FDA Decision on Chemical BPA Gets Mixed Review: "ACSH scientists are glad a ban was avoided but remain disappointed."
ACSH, January 15, 2010

New York NY -- January 15th, 2010. The American Council on Science and Health applauds today's decision by the Food and Drug Administration (FDA) not to ban the plastic hardener bisphenol-A (BPA). Despite heavy pressure from various activist "environmental" groups, the FDA has not placed any restrictions on the chemical's use in consumer products but rather decided to "support" industry's decisions to reduce exposure to BPA in food-related products aimed at infants and children. FDA is also "facilitating" the development of alternatives to BPA in infant formula cans.

FDA stopped well short of a ban on this common and useful chemical, which has been in safe use in a wide spectrum of consumer products for over 50 years. ACSH scientists are pleased but remain disappointed that the FDA review and recommendations deviated at all from sound science -- by showing concern for hypothetical and non-existent health risks. ACSH's medical director, Dr. Gilbert Ross, said: "BPA has been among the most well-studied substances known to man, and repeated evaluation by respected scientific bodies worldwide has without fail deemed BPA safe as typically used. Our publication on BPA remains quite relevant today: we found that BPA is safe for all ages, including infants and children."

Another key fact is that since BPA became commonplace in the lining of canned goods, foodborne illness from canned foods -- including botulism -- has virtually disappeared. Any possible new replacement could not have the same record of testing and safety as has been shown for BPA.

ACSH's president, Dr. Elizabeth M. Whelan, added, "The fear campaign against BPA promoted by a few activist groups has been based solely on flimsy animal research. Recently, lacking real science to support their alarmist claims, some labs have tried 'novel approaches to test for subtle effects,' as the FDA report states. This is not how human risk assessment should be carried out. If there were any real adverse health effects from exposure to BPA, such effects would have become manifest long ago and would not have required bizarre tests in a few advocates' labs."

ACSH's associate director, Jeff Stier, pointed out: "This finding should put the matter to rest. The current FDA is very cautionary. After taking all this extra time to re-study the issue, the fact that they are keeping BPA on the market speaks volumes about the safety of the product. If BPA were endangering children, they'd have never left it on the market."

See also: ACSH's earlier official statement on The Facts About Bisphenol A.

For media contact, including interviews, please call:

Dr. Elizabeth Whelan (WhelanE[at] 917-439-8043
Dr. Gilbert Ross (RossG[at] 516-581-8400
Jeff Stier (StierJ[at] 646-245-1443

Tuesday, October 20, 2009

Industry views: The U.S. doubles down on solar subsidies while Europe retreats

The U.S. doubles down on solar subsidies while Europe retreats
IER, Oct 19, 2009

The cap and trade bills circulating in Congress (such as H.R. 2454, the Waxman-Markey bill) not only “tax” the people of the nation for the right to reduce greenhouse gas emissions in this country, but they contain additional energy-related “tax” provisions.[i] One of these is a Renewable Portfolio Standard (RPS) that requires 20 percent of electricity generation to come from qualified renewable technologies by 2020.[ii] This is a “tax” because it requires those utilities unable to meet the required percentage to purchase renewable credits from those that can exceed the targeted amount. The higher generating costs incurred from constructing and operating the renewable technologies, or buying renewable credits, will be passed on to the users of the electricity. These “taxes” are in addition to the generous tax-funded subsidies already provided to many qualified renewables.

The concept of an RPS is not new. Twenty-nine states and the District of Columbia currently have some form of RPS[iii], but few states are meeting their mandates,[iv] and these states have often defined their “qualified renewables” liberally, to suit what makes sense for their area. Texas, which has met its mandates mainly with wind-generated power, the least-cost qualified renewable, is now considering expanding into more costly renewables, such as solar power. Houston, for example, is considering using solar to generate 1.5 percent of its municipal government’s electricity needs from a 10-megawatt plant to be built by NRG and operating by July 2010. When the sun is not shining, the plant will be backed up by the city’s natural gas-fired generating units.

The proposed 10-megawatt Houston plant is estimated to cost $40 million,[v] or $4,000 per kilowatt, a lower figure than many other solar project estimates and most probably optimistic. Even so, $4,000 per kilowatt is far more costly than other generating technologies, which are more reliable to boot. For example, the Energy Information Administration (EIA), an independent agency within the U.S. Department of Energy, estimates the cost to build a coal-fired plant at about half the Houston estimate, just over $2,000 per kilowatt, and a natural gas-fired plant at less than a quarter of it, below $1,000 per kilowatt.[vi] EIA’s estimate for a photovoltaic plant, the type being proposed in Houston, is just over $6,000 per kilowatt, 50 percent higher than the NRG cost estimate.[vii] In fact, photovoltaic solar is the highest-cost generating technology of EIA’s slate of 20 potential technologies for generating this country’s future electricity needs.[viii]
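The per-kilowatt comparisons above are simple capital-cost arithmetic. A back-of-the-envelope sketch, using only the dollar figures quoted in the text:

```python
# Capital cost per kilowatt of capacity: total cost / (MW * 1000).
# The helper is illustrative; all figures come from the article above.

def cost_per_kw(total_cost_dollars, capacity_mw):
    """Overnight capital cost in dollars per kilowatt of capacity."""
    return total_cost_dollars / (capacity_mw * 1000)

# NRG's proposed 10-MW Houston plant at $40 million:
houston_solar = cost_per_kw(40_000_000, 10)
print(houston_solar)  # 4000.0

# EIA's rough per-kW estimates quoted in the text:
coal_kw, gas_kw, eia_pv_kw = 2000, 1000, 6000
print(houston_solar / coal_kw, houston_solar / gas_kw)  # 2.0 4.0
```

Even at NRG's own (low) estimate, the solar plant costs about twice as much per kilowatt as coal and four times as much as natural gas, before accounting for reliability.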

European Experience

However, we do not have to use EIA’s cost figures to know that solar is non-competitive with conventional grid generation. Several countries in Europe, including Spain, Germany, and Denmark, have already implemented RPS-type programs with hefty subsidies funded by their taxpayers. For example, in Alvarado, Spain, the energy firm Acciona inaugurated a 50-MW concentrating solar power plant in late July. The cost was €236 million, about $350 million U.S., or about $7,000 per kilowatt.[ix] Construction of the plant began in February 2008, with an average of 350 people working throughout the 18-month construction period. The plant will be run by a 31-person operation and maintenance team. This is the second solar plant of this type built in Spain; its predecessor has been operating since June 2007.[x]

Spain ranks second in the world in installed solar capacity, behind only Germany.[xi] To achieve that ranking, Spain enacted legislation requiring 20 percent of its electricity generation to come from renewable energy by 2010. To make renewable energy attractive to investors, Spain also subsidized its renewable technologies. In 2008, for instance, when solar power generated less than 1 percent of Spain’s electricity, its cost was over 7 times the average electricity price. Due to feed-in tariffs, utility companies were forced to buy the renewable power at that higher cost. And solar power is not just more expensive: jobs that could have been created elsewhere in the Spanish economy were foregone to meet the government’s renewable mandates. A Spanish researcher found that while solar energy employs many workers in plant construction, it consumes a great amount of capital that would have created many more jobs in other parts of the economy. In fact, for each megawatt of solar energy installed in Spain, 12.7 jobs were lost elsewhere in the Spanish economy.[xii] Recently, the Spanish government decided to slash subsidies to solar power: Spain will subsidize just 500 megawatts of solar projects this year, down sharply from 2,400 megawatts last year.[xiii]

Germany—the world’s highest-ranking country for installed solar capacity and the largest market for solar products—is also slashing its subsidies for solar power in order to ease costs for electricity users. Owners of solar panels receive as much as 43 euro cents (64 U.S. cents) per kilowatt hour of power they generate.[xiv] The Energy Information Administration calculates the levelized cost of electricity[xv] from solar photovoltaic power at 39.57 cents per kilowatt hour (2007 dollars) in 2016,[xvi] far less than the German subsidy. According to some German researchers, the feed-in tariff for solar, at 43 euro cents per kilowatt hour (kWh), makes solar electricity by far the most subsidized technology among all forms of renewable energy: more than eight times the electricity price at the power exchange and more than four times the feed-in tariff paid for electricity produced by on-shore wind turbines. Because of solar power’s low capacity factor, solar generated only 0.6 percent of Germany’s electricity in 2008.[xvii] Since the sun does not always shine, solar power cannot compete with more mature generating technologies. The EIA estimates the capacity factor for solar in 2008 at 17 percent.[xviii]
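The capacity-factor point can be made concrete: annual energy delivered is capacity times capacity factor times hours in the year, so a large installed solar base still yields a small generation share. A short sketch, using the EIA's 17 percent solar capacity factor from the text and an assumed 85 percent capacity factor for a baseload plant:

```python
# Installed capacity says little about delivered energy.
# annual energy (MWh) = capacity (MW) * capacity factor * hours per year.

HOURS_PER_YEAR = 8760

def annual_generation_mwh(capacity_mw, capacity_factor):
    """Expected annual energy output of a plant, in megawatt hours."""
    return capacity_mw * capacity_factor * HOURS_PER_YEAR

solar = annual_generation_mwh(100, 0.17)     # 100-MW solar farm, EIA's 17% CF
baseload = annual_generation_mwh(100, 0.85)  # same nameplate, assumed 85% CF
print(solar / baseload)  # the solar plant delivers about a fifth of the energy
```

The 85 percent figure is an illustrative assumption, not a number from the article; the comparison only shows why identical nameplate capacities produce very different amounts of electricity.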

U.S. Subsidies

While the U.S. does not have feed-in tariffs at this time, it does subsidize solar power through an investment tax credit of 30 percent, in effect through 2016. Solar also benefits from a permanent 10 percent investment tax credit in the U.S., and a 5-year accelerated depreciation write-off. The Energy Information Administration estimates that total federal subsidies for electricity production from solar power in fiscal year 2007 were $24.34 per megawatt hour, compared with 25 cents per megawatt hour for natural gas and petroleum fueled technologies—roughly 97 times as much.[xix] Yet, even with these subsidies, solar generated only 0.02 percent of U.S. electricity in 2008.[xx] That is because solar, at around 40 cents per kilowatt hour, is more than four times as expensive on a levelized-cost basis as its fossil competitors. (EIA estimates 2016 levelized costs of 9.46 cents per kilowatt hour for conventional coal and 8.39 cents per kilowatt hour for natural gas combined cycle, in 2007 dollars.[xxi])
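The levelized-cost figures quoted here follow the definition in footnote [xv]: the present value of the total cost of building and operating a plant, divided by the present value of its lifetime generation. A minimal sketch of that calculation, with hypothetical round-number inputs (not EIA's actual model data):

```python
# Levelized cost of electricity (LCOE), per footnote [xv]'s definition:
# PV of lifetime costs / PV of lifetime generation. Inputs are hypothetical.

def lcoe_dollars_per_mwh(capital, annual_cost, annual_mwh, rate=0.07, years=30):
    """Levelized cost of electricity in dollars per MWh."""
    discount = [(1 + rate) ** -t for t in range(1, years + 1)]
    pv_costs = capital + annual_cost * sum(discount)
    pv_energy = annual_mwh * sum(discount)
    return pv_costs / pv_energy

# High-capital, low-output plant (solar-like: $60M for ~15,000 MWh/yr) versus
# a low-capital plant with fuel costs (gas-like). Figures are illustrative.
print(lcoe_dollars_per_mwh(capital=60_000_000, annual_cost=500_000, annual_mwh=15_000))
print(lcoe_dollars_per_mwh(capital=10_000_000, annual_cost=4_000_000, annual_mwh=70_000))
```

The point of the structure is that a high up-front capital cost spread over a small annual output drives the per-MWh figure up, which is why low-capacity-factor solar lands at the top of EIA's cost rankings.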

Of course, the U.S. has been slow to learn from Europe’s experience. On October 12, 2009, California Governor Arnold Schwarzenegger signed into law S.B. 32, a feed-in tariff that requires California utilities to buy all renewable generation under 3 megawatts within their service territories, until they hit a state-wide cap of 750 megawatts.[xxii] How California will administer this program remains to be seen; the state has yet to meet the renewable generation mandates of its existing RPS program.[xxiii]


Solar power has its place in certain applications. As always, individual citizens and companies should be able to choose whether solar works for their energy needs. But using solar power to generate electricity for the grid is very expensive. Requiring ratepayers to buy solar power, whether through renewable energy mandates or through feed-in tariffs, will only increase the price of electricity. The last thing the economy needs is higher energy prices, but that is exactly what solar energy’s supporters are promoting.


[i] Robert J. Michaels, The Other Half of Waxman-Markey: An Examination of the Non-Cap-and-Trade Provisions
[ii] H.R. 2454, section 101
[iii] Database of State Incentives for Renewables and Efficiency (DSIRE), North Carolina State University
[iv] Traci Watson, States not meeting renewable energy goals, USA Today, Oct. 8, 2009
[v] “Solar forecast: expensive”, Loren Steffy, Houston Chronicle, September 29, 2009
[vi] Energy Information Administration, Assumptions to the Annual Energy Outlook 2009, Table 8.2
[vii] Ibid.
[viii] Ibid.
[ix] Sonal Patel, Power Digest, Power Magazine, Sept. 2009
[x] Sonal Patel, Interest in Solar Tower Technology Rising, Power Magazine
[xi] Solar Energy Industries Association
[xii] Study of the effects on employment of public aid to renewable energy sources, Universidad Rey Juan Carlos, March 2009
[xiii] The Wall Street Journal, “Darker Times for Solar-Power Industry”, May 11, 2009
[xiv] “Merkel’s Coalition to ‘Definitely’ Cut German Solar Subsidies”, Brian Parker and Nicholas Comfort, Bloomberg, October 12, 2009
[xv] The levelized cost of a generating technology is the present value of the total cost of building and operating the generating plant over its financial life.
[xvi] “Levelized Cost of New Electricity Generating Technologies”, Institute for Energy Research, May 12, 2009
[xvii] “Economic impacts from the promotion of renewable energies”, Rheinisch-Westfälisches Institut für Wirtschaftsforschung
[xviii] “Solar forecast: expensive”, Loren Steffy, Houston Chronicle, September 29, 2009
[xix] Energy Information Administration, Federal Financial Interventions and Subsidies in Energy Markets 2007
[xx] Energy Information Administration, Monthly Energy Review, Table 7.2a
[xxi] “Levelized Cost of New Electricity Generating Technologies”, Institute for Energy Research, May 12, 2009
[xxii] Greenwire, “California: Schwarzenegger signs feed-in tariff, spate of enviro bills”, October 12, 2009
[xxiii] Robert J. Michaels, “A National Renewable Portfolio Standard: Politically Correct, Economically Suspect,” Electricity Journal 21 (April 2008)