
Wednesday, May 20, 2009

Golden rice an effective source of vitamin A

Golden rice an effective source of vitamin A
Eurekalert, May 13, 2009

HOUSTON – (May 13, 2009) – The beta-carotene in so-called "Golden Rice" converts to vitamin A in humans, according to researchers at Baylor College of Medicine (www.bcm.edu) and Tufts University in an article that appears in the current issue of the American Journal of Clinical Nutrition.

Golden Rice was developed in the early 1990s with a grant from the Rockefeller Foundation with the goal of creating rice that had beta-carotene -- a vitamin A precursor -- in the rice grain. In its current form, Golden Rice contains 35 micrograms of beta-carotene per gram.

"We found that four units of beta-carotene from Golden Rice convert to one unit of vitamin A in humans," said Dr. Michael Grusak (http://www.bcm.edu/cnrc/faculty/?PMID=9536), associate professor of pediatrics at the USDA/ARS Children's Nutrition Research Center (http://www.bcm.edu/cnrc/?PMID=0) at BCM and Texas Children's Hospital.

They determined this by feeding five healthy adults a specific amount of specially labeled Golden Rice and measuring the amount of retinol, a form of vitamin A, in the blood.

Vitamin A deficiency is prevalent in many parts of the world where poorer community members rely on rice as their major food source. People who lack adequate amounts of this vitamin can have vision problems or even blindness as a result.

"By incorporating vitamin A into the major crop that is consumed, we would be able to make it accessible to the majority of people in the area," said Grusak.

Additional research is necessary before Golden Rice is made commercially available. The next steps of the research include incorporating this technology into the rice grains found in various regions and continuing testing the conversion rates in humans.

###

Others who participated in this study include Guangwen Tang, Jian Qin, Gregory G. Dolnikowski and Robert M Russell, all of the Jean Mayer US Department of Agriculture Human Nutrition Research Center on Aging at Tufts University.

Funding for this study came from the National Institute of Diabetes and Digestive and Kidney Diseases, a part of the National Institutes of Health.

The study can be found at http://www.ajcn.org/cgi/rapidpdf/ajcn.2008.27119v1.

For more information on basic science at Baylor College of Medicine, please go to www.bcm.edu/news or www.bcm.edu/fromthelab.

Wednesday, March 18, 2009

Challenging Ultra-skeptic Climate Claims on CO2 Emissions

Part I in an Occasional Series Challenging Ultra-skeptic Climate Claims. By Chip Knappenberger
Master Resource, March 18, 2009

In the realm of climate science, as in most topics, there exists a range of ideas as to what is going on, and what it means for the future. At the risk of generalizing, the gamut looks something like this: Ultra-alarmists think that human greenhouse-gas-producing activities will vastly change the face of the planet and make the earth inhospitable for humans; they therefore demand large and immediate action to curtail greenhouse gas emissions. Alarmists understand that human activities are changing the earth’s climate and think that the potential changes are sufficient to warrant some pre-emptive action to try to mitigate them. Skeptics think that human activities are changing the earth’s climate but, by and large, they think that the changes, on net, are not likely to be terribly disruptive and that drastic action to curtail greenhouse gas emissions is unnecessary, difficult, and ineffective. Ultra-skeptics think that human greenhouse gas-producing activities are impacting the earth’s climate in no way whatsoever.

Most of my energy tends to be directed at countering alarmist claims about impending climate catastrophe, but the scientist in me gets just as bent out of shape about some of the contentions made by the ultra-skeptics, which are simply unsupported by virtually any scientific evidence. Primary among these claims is that human activities are not responsible for the observed build-up of atmospheric carbon dioxide. This is just plain wrong.

We have good measurements of how much carbon dioxide is building up in the atmosphere each year, and we have good estimates of how much carbon dioxide is being emitted from human activities each year. It turns out that there are more than enough anthropogenic emissions to account for how much the atmosphere is accumulating. In fact, the great mystery concerns the “missing carbon,” that is, where exactly is the extra carbon dioxide going that is emitted by humans but that doesn’t end up staying in the atmosphere. (Only about half of the human CO2 emissions end up accumulating in the atmosphere; the rest end up somewhere else—in the oceans, in the terrestrial biosphere, etc.) In my opinion, it would be much more useful for folks interested in the carbon cycle to try to better understand the behavior of the CO2 sinks and how that behavior may change in the future (if at all), rather than trying to come up with sources of CO2 other than human activities to explain the atmospheric concentration growth—as it is, we already have too much source, not too little.
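
To make that bookkeeping concrete, here is a minimal sketch using round, illustrative figures (roughly 8 GtC per year of fossil emissions, about 2 ppm per year of observed CO2 growth, and the standard approximation of about 2.1 GtC per ppm); these are assumptions for illustration, not numbers taken from the article.

```python
# Rough carbon bookkeeping with illustrative round numbers (assumptions, not article data).
GTC_PER_PPM = 2.1                 # approximate mass of carbon per ppm of atmospheric CO2

emissions_gtc_per_yr = 8.0        # illustrative human emissions (GtC/yr)
observed_growth_ppm_per_yr = 2.0  # illustrative atmospheric CO2 growth (ppm/yr)

growth_gtc_per_yr = observed_growth_ppm_per_yr * GTC_PER_PPM
airborne_fraction = growth_gtc_per_yr / emissions_gtc_per_yr
taken_up_by_sinks = emissions_gtc_per_yr - growth_gtc_per_yr

print(f"Atmospheric increase: {growth_gtc_per_yr:.1f} GtC/yr")
print(f"Airborne fraction:    {airborne_fraction:.0%}")   # roughly half, as the text notes
print(f"Taken up by sinks:    {taken_up_by_sinks:.1f} GtC/yr (oceans, biosphere, etc.)")
```

Human emissions more than cover the observed atmospheric increase, which is exactly why the research question is where the other half goes, not where the increase comes from.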

What this means is: The argument that the increase in atmospheric CO2 results from a natural temperature recovery from the depths of the Little Ice Age in the mid-to-late 1800s just doesn’t work.

In fact, all lines of evidence are against it.

This argument has its foundation in the carbon-dioxide and temperature trends of the past 400 to 600 thousand years, which we know from air bubbles trapped in ice that has been extracted from ice cores taken in Antarctica and Greenland. Basically, the data from the ice cores show that periods when the earth’s climate has been warm are also periods when there have been relatively higher CO2 concentrations (Figure 1). Al Gore uses this to say that the higher CO2 caused the higher temperatures; ultra-skeptics counter by pointing out that, if you look closely enough, you’ll see that the temperature rises before the CO2 rises, so rising temperatures cause rising CO2, not vice versa. The fact is that both interpretations are correct—rising temperatures led to rising CO2, which led to more rising temperatures. But the only relevance that this has to the current situation is that this natural positive feedback between temperature variations and CO2 variations didn’t run away in the past, and so we shouldn’t expect it to run away now. It carries no relevance as to what is causing the ongoing increase in atmospheric CO2 concentrations.

[Figure 1. Variability in atmospheric CO2 (red, left-hand axis) and temperatures (blue, right-hand axis) derived from ice core air samples extending back more than 400,000 years.]

But anyone who looks at the data (shown in Figure 1) will see that no matter which caused the other, the changes in temperature from ice-age cold to interglacial warmth are about 10ºC while the change in CO2 is about 100 ppm. Since the late 1800s, the temperature has warmed a bit less than 1ºC, while the CO2 concentration has increased by a bit less than 100 ppm. In other words, the natural, historical relationship between CO2 and temperature is about 10 times weaker than that observed over the past 100 or so years. Thus, there is no way that the temperature rise from the Little Ice Age to the present can be the cause of an atmospheric CO2 increase of nearly 100 ppm—the reasonable expectation would be about 10 ppm. This line of reasoning is off by an order of magnitude.
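
The order-of-magnitude point can be checked with one line of arithmetic; the sketch below simply reuses the round numbers quoted in the paragraph above.

```python
# Scaling the ice-core relationship to the modern warming, using the
# round numbers quoted in the text (illustrative, not a model).
glacial_delta_t_c  = 10.0   # ice age to interglacial temperature swing (deg C)
glacial_delta_co2  = 100.0  # accompanying CO2 swing (ppm)
modern_delta_t_c   = 1.0    # warming since the late 1800s (a bit less than this)

ppm_per_degree = glacial_delta_co2 / glacial_delta_t_c   # ~10 ppm per deg C
expected_rise  = modern_delta_t_c * ppm_per_degree       # ~10 ppm
observed_rise  = 100.0                                   # ppm, roughly, since the late 1800s

print(f"CO2 rise expected from warming alone: ~{expected_rise:.0f} ppm")
print(f"CO2 rise actually observed:           ~{observed_rise:.0f} ppm")
# The observed rise is about ten times what outgassing from warming alone
# would suggest, which is the author's order-of-magnitude point.
```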

And where do ultra-skeptics think the CO2 building up in the atmosphere is coming from, if not from humans? Their answer is typically “the oceans”—as the oceans warm, they outgas carbon dioxide. While this is certainly true, an opposite effect is also ongoing—a greater concentration of CO2 in the air drives more CO2 into the oceans. One way of determining how much CO2 is dissolved in the oceans is to observe the pH of the ocean waters. Long-term trends show a gradual decline in ocean pH (the source of the ocean “acidification” scare—the subject of a future MasterResource article). This means that the ocean is gaining more CO2 than it is losing. So, it can’t be the source of the large CO2 increase observed in the atmosphere.

Another way to figure out where the extra CO2 that is now part of the annual flux is coming from is through an isotopic analysis. CO2 that is released from fossil fuels carries a different (lighter) isotopic signature than the CO2 that is usually part of the annual flux among land, oceans, and atmosphere. CO2 released by fossil fuels has a lower 13C/12C ratio than does most other CO2, and long-term records show that the overall 13C/12C ratio in the atmosphere has been declining—an indication that an increasing proportion of atmospheric CO2 is coming from fossil-fuel sources.
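
A toy mass-balance calculation illustrates why adding isotopically light fossil carbon drags the atmospheric 13C/12C ratio down. The pool size and delta-13C values below are illustrative round numbers (fossil carbon is commonly quoted near -28 per mil, the modern atmosphere near -7 to -8 per mil), not figures taken from the article.

```python
# Toy two-pool mixing: adding isotopically light fossil CO2 lowers the
# atmosphere's overall delta-13C. Values are illustrative round numbers.
atm_carbon_gtc   = 750.0   # approximate atmospheric carbon pool (GtC), illustrative
atm_delta13c     = -7.0    # per mil, illustrative
fossil_added_gtc = 8.0     # one year of fossil emissions (GtC), illustrative
fossil_delta13c  = -28.0   # per mil, typical of fossil carbon, illustrative

mixed_delta13c = (
    (atm_carbon_gtc * atm_delta13c + fossil_added_gtc * fossil_delta13c)
    / (atm_carbon_gtc + fossil_added_gtc)
)
print(f"delta-13C after one year of mixing: {mixed_delta13c:.2f} per mil")
# The ratio drifts lower year after year -- the isotopic fingerprint described
# above (in reality, exchange with oceans and plants damps the drift).
```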

So there are (at least) three independent methods of determining the source of the extra CO2 that is building-up in the atmosphere, and all three of them finger fossil-fuel combustion as the primary culprit.

Yet, despite the overwhelming scientific evidence, the ultra-skeptics persist in forwarding the concept that the observed atmospheric CO2 growth is not caused by human actions. And sadly, since this notion is extremely pleasing to those folks (politicians et al.) who are actively fighting legislation aimed at limiting greenhouse gas emissions, it is widely incorporated into their stump speeches. Some even go so far as to suggest that the respiration of 6.5 (and growing) billion humans plays a role in the CO2 increases—again, pure nonsense. We humans breathe out only what we take in, and since we eat plants, which extract atmospheric CO2 for their carbon source in producing carbohydrates, it is a completely closed loop. (Now if we ate coal, or drank oil, perhaps things would be different.)

Fighting bad science with bad science is just a bad idea. There are numerous reasons to oppose restrictions on greenhouse gas emissions, but the notion that they aren’t contributing to increasing atmospheric concentrations of carbon dioxide isn’t one of them.

In future articles, if I have time between combating alarmist outbreaks, I may point out some other ultra-skeptic fallacies—such as, “The build-up of atmospheric greenhouse gases isn’t responsible for elevating global average surface temperatures” or “Natural variations can fully explain the observed ‘global warming’.”

Tuesday, March 17, 2009

US Energy Dept Report on Techniques to Ensure Safe, Effective Geologic Carbon Sequestration

DOE Releases Report on Techniques to Ensure Safe, Effective Geologic Carbon Sequestration
Comprehensive Report Describes New and Emerging Methods to Monitor, Verify, and Account for CO2 Stored in Geologic Formations
March 17, 2009

Washington, DC — The Office of Fossil Energy's National Energy Technology Laboratory (NETL) has created a comprehensive new document that examines existing and emerging techniques to monitor, verify, and account for carbon dioxide (CO2) stored in geologic formations. The report, titled Monitoring, Verification, and Accounting of CO2 Stored in Deep Geologic Formations, should prove to be an invaluable tool in reducing greenhouse gas emissions to the atmosphere through geologic sequestration.

The report was prepared by NETL with input from the seven Regional Carbon Sequestration Partnerships. Its main goals are to—
  • Provide an overview of monitoring, verification, and accounting (MVA) techniques that are currently in use or are being developed.
  • Summarize the Energy Department’s MVA research and development program.
  • Present information that can be used by regulatory organizations, project developers, and national and state policymakers to ensure the safety and efficacy of carbon storage projects.

Emissions of CO2 have increased from an insignificant level two centuries ago to more than 30 billion tons worldwide today. As a result, atmospheric levels of CO2 have risen from preindustrial levels of 280 parts per million (ppm) to more than 380 ppm today. If no effort is made to reduce CO2 emissions, yearly release from the United States could increase by one third from 2005 to 2030.

Carbon capture and storage will help reduce this growth by capturing CO2 before it is emitted into the atmosphere. Geologic sequestration—the storage of CO2 in deep geologic formations such as depleted oil and gas reservoirs, unmineable coal seams, and saline formations—has emerged as an important and viable option in a wide-ranging portfolio of technologies.

Reliable and cost-effective MVA techniques are critical to making geologic storage a safe, effective, and acceptable method for reducing greenhouse gas emissions. Additionally, MVA provides data that can be used to—
  • Verify national inventories of greenhouse gases.
  • Assess reductions of greenhouse gas emissions at geologic sequestration sites.
  • Evaluate potential regional, national, and international greenhouse gas reduction goals.
The Office of Fossil Energy supports a number of carbon capture and storage initiatives including a vigorous MVA research and development program.

Sunday, March 15, 2009

WaPo: Mr. Obama's next step on stem cell research

A Moral Stand. WaPo Editorial
Mr. Obama's next step on stem cell research
Sunday, March 15, 2009; A18

PRESIDENT OBAMA'S pronouncement on stem cell research last week, as we noted at the time, was only a partial decision. He decreed that federal funding of such research could go forward on a much broader scale than President George W. Bush had permitted. But he didn't say whether it could proceed on stem cells derived from embryos created specifically for the purpose of research. This is in large part an ethical question. Mr. Obama is right to turn to scientists for advice on the matter, but he should not hide behind them in making the ultimate decision.

Embryonic stem cell research is thought to hold great promise for the treatment of Parkinson's and other debilitating diseases and conditions. But many Americans are troubled by the destruction of human embryos that such research requires. As a result, Mr. Bush limited federal funding to research on stem cell lines in existence at the time of his 2001 decision; there would be no incentive for further creation or destruction of embryos for experimentation.

A breakthrough came in 2007: Scientists learned to develop stem cells from adult skin cells. Some argued that this would end the need to use embryos. Others, though, said that the field was too young to close off any avenue, and that the embryonic lines available under Mr. Bush's order had proved too limiting.

Mr. Obama accepted the latter argument, and we supported him. In so doing, though, he shunned a possible compromise: to allow research on stem cell lines grown from embryos that were created in fertility laboratories but never implanted. Thousands are frozen and awaiting destruction; with permission of the egg and sperm donors, they might satisfy researchers' needs. Mr. Obama did not embrace this opportunity to reach out to opponents -- not all of whom, of course, would have been satisfied by such a compromise.

The president has asked the National Institutes of Health to develop guidelines for research. Scientists can develop rules to make sure donors are dealt with ethically. If the scientists so believe, they can present reasons why existing frozen embryos aren't enough -- why research would benefit from having embryos created. But it's not the job of the scientist to decide whether those reasons outweigh concerns about such a practice. That's the president's job. He should listen to the scientists' arguments, make his decision and -- as Mr. Bush did in 2001 -- explain it to the American people.

Friday, March 13, 2009

Global hurricane activity has decreased to the lowest level in 30 years

Great Depression! Global hurricane activity reaches new lows. By Ryan N Maue
Global hurricane activity has decreased to the lowest level in 30 years.

Excerpts:

Very important: global hurricane activity includes the 80-90 tropical cyclones that develop around the world during a given calendar year, including the 12-15 that occur in the North Atlantic (Gulf of Mexico and Caribbean included). The heightened activity in the North Atlantic since 1995 is included in the data used to create this figure.

As previously reported here and here at Climate Audit, and chronicled at my Florida State Global Hurricane Update page, both Northern Hemisphere and overall Global hurricane activity has continued to sink to levels not seen since the 1970s. Even more astounding, when the Southern Hemisphere hurricane data is analyzed to create a global value, we see that Global Hurricane Energy has sunk to 30-year lows, at the least. Since hurricane intensity and detection data is problematic as one goes back in time, when reporting and observing practices were different than today, it is possible that we underestimated global hurricane energy during the 1970s. See notes at bottom to avoid terminology discombobulation.

Using a well-accepted metric called the Accumulated Cyclone Energy index, or ACE for short (Bell and Chelliah 2006), which has been used by Klotzbach (2006) and Emanuel (2005) (PDI is analogous to ACE), and most recently by myself in Maue (2009), simple analysis shows that 24-month running sums of global ACE or hurricane energy have plummeted to levels not seen in 30 years. Why use 24-month running sums instead of simply yearly values? Since a primary driver of the Earth's climate from year to year, the El Nino Southern Oscillation (ENSO), acts on time scales on the order of 2-7 years, and since the bulk of the Southern Hemisphere hurricane season occurs from October through March, a reasonable interpretation of global hurricane activity requires a better metric than simple calendar-year totals. The 24-month running sum is analogous to the idea of "what have you done for me lately".
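
As a sketch of what the 24-month running sum does, the snippet below applies it to made-up monthly ACE values (not the author's data); the window simply asks how much hurricane energy the globe has produced over the last two years, so an October-March Southern Hemisphere season is never split across a calendar-year boundary.

```python
import numpy as np
import pandas as pd

# Hypothetical monthly global ACE values (illustrative only, not real data).
rng = np.random.default_rng(0)
months = pd.date_range("1979-01-01", "2009-03-01", freq="MS")
monthly_ace = pd.Series(rng.gamma(2.0, 30.0, size=len(months)), index=months)

# 24-month running sum: each point answers "how much hurricane energy has
# the globe produced over the past two years?"
ace_24mo = monthly_ace.rolling(window=24).sum()

print(ace_24mo.tail())
```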

During the past 6 months, extending back to October of 2008 when the Southern Hemisphere tropical season was gearing up, global ACE has crashed due to two consecutive years of well-below-average Northern Hemisphere hurricane activity. To avoid confusion, I am not specifically addressing the North Atlantic, which was above normal in 2008 (in terms of ACE), but the hemisphere (and/or globe) as a whole. The North Atlantic represents only 1/10 to 1/8 of global hurricane energy output on average but deservedly commands disproportionate media attention due to the devastating societal impacts of recent major hurricane landfalls.


Why the record low ACE?

During the past two-plus years, the Earth's climate has cooled under the effects of a dramatic La Nina episode. The Pacific Ocean basin typically sees much weaker hurricanes during such episodes, with shorter lifecycles and therefore less ACE. Conversely, due to well-researched upper-atmospheric flow (e.g., vertical shear) configurations favorable to Atlantic hurricane development and intensification, La Nina falls tend to favor very active seasons in the Atlantic (word of warning for 2009). This offsetting relationship, high in the Atlantic and low in the Pacific, is a topic of discussion in my GRL paper, which will be a separate topic in a future posting. Thus, Western North Pacific (typhoon) activity was well below normal in 2007 and 2008 (see table). Same for the Eastern North Pacific. The Southern Hemisphere, which includes the southern Indian Ocean from the coast of Mozambique across Madagascar to the coast of Australia, into the South Pacific and Coral Sea, saw below-normal activity as well in 2008. Through March 12, 2009, the Southern Hemisphere ACE is about half of what's expected in a normal year, with a multitude of very weak, short-lived hurricanes. All of these numbers tell a very simple story: just as there are active periods of hurricane activity around the globe, there are inactive periods, and we are currently experiencing one of the most impressive inactive periods, now for almost 3 years.


Bottom Line

Under global warming scenarios, hurricane intensity is expected to increase (on the order of a few percent), but MANY questions remain as to how much, where, and when. This science is very far from settled. Indeed, Al Gore has dropped the related slide from his PowerPoint (btw, is he addicted to the Teleprompter as well?). Many papers have suggested that these changes are already occurring, especially in the strongest of hurricanes, e.g. this and that and here, due to warming sea-surface temperatures (the methodology and data issues with each of these papers have been discussed here at CA, and will be discussed even more in the coming months). The notion that the overall global hurricane energy or ACE has collapsed does not contradict the above papers but provides an additional, perhaps less publicized piece of the puzzle. Indeed, the very strong interannual variability of global hurricane ACE (energy), highly correlated with ENSO, suggests that the role of tropical cyclones in climate is modulated very strongly by the big movers and shakers in large-scale, global climate. The perceptible (and perhaps measurable) impact of global warming on hurricanes in today's climate is arguably a pittance compared to the reorganization and modulation of hurricane formation locations and preferred tracks/intensification corridors dominated by ENSO (and other natural climate factors). Moreover, our understanding of the complicated role of hurricanes in climate is, to be charitable, nebulous. We must increase our understanding of the current climate's hurricane activity.

Background:
During the summer and fall of 2007, as the Atlantic hurricane season failed to live up to the hyperbolic prognostications of the seasonal hurricane forecasters, I noticed that the rest of the Northern Hemisphere hurricane basins, which include the Western/Central/Eastern Pacific and Northern Indian Oceans, was on pace to produce the lowest Accumulated Cyclone Energy or ACE since 1977. ACE is the convolution or combination of a storm's intensity and longevity. Put simply, a long-lived very powerful Category 3 hurricane may have more than 100 times the ACE of a weaker tropical storm that lasts for less than a day. Over a season or calendar year, all individual storm ACE is added up to produce the overall seasonal or yearly ACE. Detailed tables of previous monthly and yearly ACE are on my Florida State website.


Previous Basin Activity: Hurricane ACE

[table]

The table does not include the Northern Indian Ocean, which can be deduced as the portion of the Northern Hemisphere total not included in the three major basins. Nevertheless, 2007 saw the lowest ACE since 1977, and 2008 continued the dramatic downturn in hurricane energy or ACE. The following stacked bar chart demonstrates the highly variable, year-to-year behavior of Northern Hemisphere (NH) ACE. The smaller inset line graph plots the raw data and trend (or lack thereof). Thus, during the past 60 years, with the data at hand, Northern Hemisphere ACE undergoes significant interannual variability but exhibits no significant statistical trend.

[graph]

So what to expect in 2009? Well, the last Northern Hemisphere storm was Typhoon Dolphin in mid-December 2008, and no ACE has been recorded so far. The Southern Hemisphere is below normal by just about any definition of storm activity (unless you have access to the Elias sports bureau statistic creativity department), and the season is quickly running out. With La Nina-like conditions in the Pacific, a persistence forecast of below-average global cyclone activity seems like a very good bet. Now if only the Dow Jones index didn't correlate so well with the Global ACE lately…

Notes:

Hurricane is the term for Tropical Cyclone specific to the North Atlantic, Gulf of Mexico, Caribbean Sea, and the Pacific Ocean from Hawaii eastward to the Mexican coast. Other names around the world include Typhoon, Cyclone, and Willy-Willy (Oz), but hurricane is used generically to avoid confusion.

Accumulated Cyclone Energy (ACE) is easily calculated from best-track hurricane datasets: the one-minute maximum sustained wind is squared and summed over the tropical lifecycle of a tropical storm or hurricane.
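
A minimal sketch of that bookkeeping, using made-up wind histories; the 35-kt tropical-storm threshold and the 10^-4 scaling below are the usual ACE conventions, assumed here rather than stated in the post.

```python
# Sketch of the ACE calculation: square the six-hourly one-minute maximum
# sustained winds (knots) while the system is at tropical-storm strength,
# sum, and scale by 1e-4 (common convention; thresholds are assumptions).

def ace(six_hourly_winds_kt, threshold_kt=35):
    return sum(v ** 2 for v in six_hourly_winds_kt if v >= threshold_kt) * 1e-4

# Made-up wind histories for illustration:
long_lived_major     = [100] * 40  # ~10 days near 100 kt
short_tropical_storm = [35] * 3    # under a day at minimal tropical-storm strength

print(ace(long_lived_major))                               # 40.0
print(ace(short_tropical_storm))                           # ~0.37
print(ace(long_lived_major) / ace(short_tropical_storm))   # well over 100x
```

The two toy storms reproduce the comparison made in the Background section above: a long-lived, powerful hurricane racks up more than 100 times the ACE of a brief, weak tropical storm.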

Tuesday, March 10, 2009

WaPo: President Obama lifts the limits on federally funded research but puts off key moral questions

Stem Cell Questions. WaPo Editorial
President Obama lifts the limits on federally funded research but puts off key moral questions.
WaPo, Tuesday, March 10, 2009; A12

PRESIDENT OBAMA did the right thing yesterday when he reversed President George W. Bush's limitations on federal funding for embryonic stem cell research. The potential for cures and treatments of debilitating diseases with these versatile cells is enormous. But this type of experimentation is thick with ethical and moral questions, many of which Mr. Obama put off answering.

"We will develop strict guidelines, which we will rigorously enforce, because we cannot ever tolerate misuse or abuse," the president said yesterday at the White House. But he offered little indication of where he would draw those lines. In effect since August 2001, Mr. Bush's limits were offered as a compromise between the needs of scientists and the moral and ethical convictions of those troubled by the stem cell extraction process that destroys the embryos. Mr. Bush permitted federal funding of experimentation, but only on stem cell lines that existed at the time of his announcement. In practice, those 21 viable stem cell lines proved too few, and many scientists said the restrictions were holding back research. The breakthrough in 2007 that made human skin cells function like embryonic stem cells has great potential. But there are still questions about the efficacy of that approach. Mr. Obama says he wants all types of experimentation in this arena to be done "responsibly."

Mr. Obama will allow federal funding to be used for stem cell research on lines derived from embryos since 2001 and into the future. He has directed the National Institutes of Health to devise within 120 days the guidelines that will regulate how this research is conducted. But will research be performed only on stem cell lines grown from the thousands of frozen embryos in fertility clinics that have been slated for destruction? Mr. Obama didn't say. The 1995 legislation known as the Dickey-Wicker Amendment bans federal money from being used to create or destroy human embryos for research, but not research on stem cells from such embryos once they have been created.

Aside from saying, "As a person of faith, I believe we are called to care for each other and work to ease human suffering," the president has not given a hint as to where he stands on some thorny questions. Should Dickey-Wicker be repealed? He leaves it up to Congress to decide that. Where does he stand on growing human embryos for experimentation in general and using them for stem cells in particular? It's unclear.

The White House said that Mr. Obama doesn't want to prejudge the NIH guidelines but that this will not be the last we'll hear from Mr. Obama on this subject. We hope not. Some of these ethical questions need to be dealt with in the political arena, and not just by scientists.

Executive order on human embryonic stem cells

THE WHITE HOUSE
Office of the Press Secretary
_________________________________________
For Immediate Release
March 9, 2009
EXECUTIVE ORDER
- - - - - - -
REMOVING BARRIERS TO RESPONSIBLE SCIENTIFIC RESEARCH INVOLVING HUMAN STEM CELLS

By the authority vested in me as President by the Constitution and the laws of the United States of America, it is hereby ordered as follows:

Section 1. Policy. Research involving human embryonic stem cells and human non-embryonic stem cells has the potential to lead to better understanding and treatment of many disabling diseases and conditions. Advances over the past decade in this promising scientific field have been encouraging, leading to broad agreement in the scientific community that the research should be supported by Federal funds.

For the past 8 years, the authority of the Department of Health and Human Services, including the National Institutes of Health (NIH), to fund and conduct human embryonic stem cell research has been limited by Presidential actions. The purpose of this order is to remove these limitations on scientific inquiry, to expand NIH support for the exploration of human stem cell research, and in so doing to enhance the contribution of America's scientists to important new discoveries and new therapies for the benefit of humankind.

Sec. 2. Research. The Secretary of Health and Human Services (Secretary), through the Director of NIH, may support and conduct responsible, scientifically worthy human stem cell research, including human embryonic stem cell research, to the extent permitted by law.

Sec. 3. Guidance. Within 120 days from the date of this order, the Secretary, through the Director of NIH, shall review existing NIH guidance and other widely recognized guidelines on human stem cell research, including provisions establishing appropriate safeguards, and issue new NIH guidance on such research that is consistent with this order. The Secretary, through NIH, shall review and update such guidance periodically, as appropriate.

Sec. 4. General Provisions. (a) This order shall be implemented consistent with applicable law and subject to the availability of appropriations.

(b) Nothing in this order shall be construed to impair or otherwise affect:

(i) authority granted by law to an executive department, agency, or the head thereof; or
(ii) functions of the Director of the Office of Management and Budget relating to budgetary, administrative, or legislative proposals.

(c) This order is not intended to, and does not, create any right or benefit, substantive or procedural, enforceable at law or in equity, by any party against the United States, its departments, agencies, or entities, its officers, employees, or agents, or any other person.

Sec. 5. Revocations. (a) The Presidential statement of August 9, 2001, limiting Federal funding for research involving human embryonic stem cells, shall have no further effect as a statement of governmental policy.

(b) Executive Order 13435 of June 20, 2007, which supplements the August 9, 2001, statement on human embryonic stem cell research, is revoked.

BARACK OBAMA
THE WHITE HOUSE,
March 9, 2009.

Friday, March 6, 2009

Opposite Views on Climate Feedbacks (and perhaps the answer lies in the middle)

Opposite Views on Climate Feedbacks (and perhaps the answer lies in the middle). By Chip Knappenberger
Master Resource, March 5, 2009

Just how much warming should we expect from rising levels of atmospheric greenhouse gases (GHGs)? The answer largely hinges on how much extra warming might be generated by the initial warming—that is, how strong (and in what direction) are the feedbacks from water vapor and clouds.

By most estimates (including climate model outcomes), these feedbacks are positive and result in about a doubling of the warming that would result from greenhouse gas increases alone. By others, however, the total feedbacks are negative, and imply that the total warming will be less than the warming from greenhouse gas increases alone, and only a fraction of that which is commonly expected.

The ultimate warming experienced across the 21st century will depend on the combination of greenhouse gas emissions and how the climate responds to them. The feedback issue is an essential part of the latter, for it spells the difference between a high climate sensitivity to greenhouse gas doubling (say 3-4ºC) and a much lower one (1-2ºC).
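
A standard way to write this relationship down (a textbook sketch, not something taken from the article) uses a no-feedback warming $\Delta T_0$ and a combined feedback factor $f$:

```latex
\Delta T = \frac{\Delta T_0}{1 - f}
\qquad
\begin{cases}
f = 0.5 & \Rightarrow\ \Delta T = 2\,\Delta T_0 \quad \text{(feedbacks roughly double the warming)}\\
f < 0   & \Rightarrow\ \Delta T < \Delta T_0 \quad \text{(net negative feedback gives less warming than the greenhouse gases alone)}
\end{cases}
```

Taking the often-quoted no-feedback value of roughly 1.2ºC for a doubling of CO2, a feedback factor near 0.6 lands in the 3ºC range, while a net negative factor keeps the result near or below 1ºC; that is the gap between the two camps described below.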

As to where the answer lies, the devil is in the details, and in this instance he is hard at work, as the processes involved are exceedingly complex—difficult not only to fully understand, but even to adequately measure.

In a “Perspectives” piece in a recent issue of Science magazine, Andrew Dessler and Steven Sherwood attempt to put to rest any notion that the feedbacks on global temperatures are anything but positive and significant. Dessler and Sherwood point out that water vapor plays the largest role in the feedback process—higher temperatures (from greenhouse gases) lead to more water vapor in the atmosphere, which leads to even higher temperatures still (as water vapor is itself a strong greenhouse gas).

But actual hard evidence that water vapor is increasing in the atmosphere has been hard to come by. Dessler and Sherwood review the recent literature on the topic, including an important contribution from Dessler et al. published last year, and conclude that there is now sufficient evidence that atmospheric water vapor is increasing very much in line with climate model expectations and that this increase produces roughly twice the global temperature rise of anthropogenic greenhouse gas enhancement alone. Meaning, of course, that all is generally right in model world, or as they put it: “There remain uncertainties in our simulations of the climate, but evidence for the water vapor feedback—and the large future warming it implies—is now strong.”

But apparently Dessler and Sherwood didn’t convince everyone that this is the case. One notable person who was less than impressed that this was the whole story was Roy Spencer, who has himself been working on the feedbacks issue. Spencer points out that water is actually involved in two feedback processes—the first, through water vapor as described by Dessler and Sherwood, and the second, through water droplets, or, more commonly, clouds. Changes in the patterns (horizontal, vertical, and temporal) and characteristics (droplet size, brightness, etc.) of cloud cover play an important role not only in the earth’s climate, but in how the climate responds to changes in the greenhouse effect. And, as you may have guessed from their ephemeral nature, the behavior of clouds is not particularly well understood, and even less well modeled.

Spencer has been looking into the cloud part of the feedback processes. Over the past several years, during the period when Dessler et al. (2008) finds a positive feedback from increases in water vapor, Spencer, in his investigations, finds that cloud cover changes produce a feedback in the opposite direction. And when he adds these two effects together, he finds the total feedback from warming-induced changes in water in the atmosphere to be negative (that is, the cloud effect dominates the vapor effect). Granted, Spencer’s investigations are far from complete and even farther from being generally accepted, but they do raise important concerns as to the ability of examinations of short-term behavior to diagnose long-term response (an approach relied on both by Spencer and by Dessler and Sherwood). Spencer concludes that “unless you know both [vapor] and [cloud] feedbacks, you don’t know the sensitivity of the climate system, and so you don’t know how much global warming there will be in the future.” Virtually the opposite sense of things from that put forth by Dessler and Sherwood.

Obviously, the final arbiter will be the earth’s climate itself, as it is the true integrator of all forces imparted upon it. But, still today, we struggle to even accurately observe the finer details of how it is responding to the changes to which it is being continually subjected. And we are further still from understanding the processes involved sufficiently to produce unassailable models of the climate’s behavior, much less future projections of its response (as evidenced by the recent slowdown in the rate of global temperature increase despite ever-growing greenhouse gas emissions). And so the process of science continues…

[Breaking news: A new peer-reviewed paper has just been published in the journal Theoretical and Applied Climatology by researcher Garth Paltridge and colleagues, which finds that the increase in atmospheric water vapor that, according to Dessler and Sherwood, most definitely accompanies the increase in temperature is absent in one of the primary databases used to study climate behavior—the so-called NCEP reanalysis data. The authors admit that perhaps there are errors contained in this dataset which may explain their results, but as it stands now (and unless some errors are identified) the reanalysis data supports a negative water vapor feedback. Paltridge et al. conclude:

Negative trends in [water vapor] as found in the NCEP data would imply that long-term water vapor feedback is negative—that it would reduce rather than amplify the response of the climate system to external forcing such as that from increasing atmospheric CO2. In this context, it is important to establish what (if any) aspects of the observed trends survive detailed examination of the impact of past changes of radiosonde instrumentation and protocol within the various international networks.

Lead author Garth Paltridge describes the trials and tribulations of trying to get this result (which runs contrary to climate model expectations) published in an enlightening article over at ClimateAudit, including how at least one of the Dessler and Sherwood authors knew of Paltridge’s soon-to-be-published results and yet made no mention of it in their Science piece. Hmmm, so much for an open discussion of the science on this issue.]

References:

Dessler, A.E., and S. C. Sherwood, 2009. A matter of humidity. Science, 323, 1020-1021.

Dessler, A.E., et al., 2008. Water-vapor climate feedback inferred from climate fluctuations, 2003-2008. Geophysical Research Letters, 35, L20704.

Spencer, R., and W.D. Braswell. 2008. Potential biases in feedback diagnosis from observations data: a simple model demonstration. Journal of Climate, 21, 5624-5628.

Friday, February 27, 2009

Energy Dept Partner Begins Injecting 50,000 Tons of Carbon Dioxide in Michigan Basin

DOE Partner Begins Injecting 50,000 Tons of Carbon Dioxide in Michigan Basin
Project Expected to Advance National Carbon Sequestration Program, Create Jobs
February 27, 2009

Washington, D.C. — Building on an initial injection project of 10,000 metric tons of carbon dioxide (CO2) into a Michigan geologic formation, a U.S. Department of Energy (DOE) team of regional partners has begun injecting 50,000 additional tons into the formation, which is believed capable of storing hundreds of years worth of CO2, a greenhouse gas that contributes to climate change.

DOE’s Midwest Regional Carbon Sequestration Partnership (MRCSP), led by Battelle of Columbus, Ohio, began injecting the CO2 this week in the Michigan Basin near Gaylord, Mich., in a deep saline formation, the Silurian-age Bass Island dolomite. The MRCSP is one of seven partnerships in DOE’s Carbon Sequestration Partnership Program, which was created to assess optimal CO2 storage approaches in each region of the country. The program is managed for DOE’s Office of Fossil Energy by the National Energy Technology Laboratory (NETL).

"This injection test, one of three performed by our Midwest partner, will significantly increase our understanding of CO2 storage technologies and practices in a real-world setting," said Victor Der, Principal Deputy Assistant Secretary for Fossil Energy. "This project will not only provide important information about promising sequestration techniques but it will also go a long way toward creating jobs in the energy sector."

When the current project is completed, the total 60,000 metric ton injection at the Michigan site will mark the largest deep saline reservoir injection in the United States to date and will allow scientists to more fully evaluate how CO2 moves through the basin’s geologic formation. Injections are expected to take place at an average rate of 250 tons per day up to a maximum rate of 600 tons. The 6-month project and related activities of the MRCSP are expected to create more than 230 jobs and 2,900 total project job years. The latter figure represents the number of full-time jobs per year times the number of years that the jobs are supported.
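
As a rough plausibility check on those figures (simple arithmetic, not taken from the release), the quoted rates bracket an injection period of roughly three to seven months for the 50,000 tons:

```python
# Rough duration check for the 50,000-ton injection at the quoted rates.
co2_to_inject_tons = 50_000
avg_rate_tpd, max_rate_tpd = 250, 600

print(co2_to_inject_tons / avg_rate_tpd)  # 200 days at the average rate
print(co2_to_inject_tons / max_rate_tpd)  # ~83 days if sustained at the maximum
```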

During the Michigan basin injection process, the Midwest team will provide additional insight to the knowledge gained from the initial test in 2008. The team will record geochemical changes to the system, as well as the distribution of the CO2 along the wellbore. A larger volume of CO2 injected over a longer period of time will also provide scientists with additional insight into temperature and pressure responses in the geologic formation, as well as any seasonal changes to the system.

Since the test is taking place within an existing oil and gas field with continuing enhanced oil recovery operations — conducted by the well owner, Core Energy LLC — the area is ideal for the injection test. The area already contains much of the needed infrastructure, such as CO2 compressors, injection systems, existing wells, and pipelines, including an 8-mile-long transport pipeline.

The CO2 being injected comes from a natural gas processing plant owned by DTE Energy, located near Gaylord, from which the CO2 is transported via the 8-mile pipeline to the well. The depth of the injection (3,500 feet) is significantly below the 1,000-foot level of drinking water sources and does not pose any danger to them.

DOE launched the Carbon Sequestration Partnership Program in 2003 to develop and validate technologies to store and monitor CO2 in various geologic formations around the country as part of a national strategy to combat global climate change.

The MRCSP team includes more than 30 partners from state and federal organizations, leading universities, state geological surveys, nongovernmental organizations, and private companies in the eight-state region of Indiana, Kentucky, Maryland, Michigan, New York, Ohio, Pennsylvania, and West Virginia. In addition to Battelle, Core Energy, and DTE, other participants include the Michigan Geological Repository for Research and Education at Western Michigan University, Stanford University Geophysics Department, Schlumberger, and the Michigan Department of Environmental Quality’s Office of Geological Survey.

Wednesday, February 25, 2009

US Energy Dept Project Promotes Low-Impact Drilling

DOE Project Leads to New Alliance to Promote Low-Impact Drilling
Alliance to Fund, Transfer Technologies to Minimize Environmental Impact of Drilling for Oil and Natural Gas

Washington, DC — A project supported by the Office of Fossil Energy’s National Energy Technology Laboratory (NETL) has given rise to a major new research consortium to promote advanced technology for low-impact oil and gas drilling. Announced earlier this month by the Houston Advanced Research Center (HARC) and Texas A&M University, the University/National Laboratory Alliance will fund and transfer advanced technologies to accelerate development of domestic oil and natural gas resources with minimal environmental impact.

The alliance has its roots in a project funded through the Office of Fossil Energy’s Oil and Natural Gas Environmental Program. The goal of the 3½-year project, which is drawing to a close, has been to identify and develop low-impact drilling systems for use in environmentally sensitive areas such as desert ecosystems and coastal margins. Among other accomplishments, the project has led to the creation of the Environmentally Friendly Drilling Program (EFD), which will continue with support from the energy industry and other government organizations after NETL sponsorship ends on March 31, 2009. The new alliance is part of the EFD.

"This is an excellent example of how the government’s investment in advanced, environmentally friendly technologies to develop domestic energy resources has encouraged industry interest and leveraged the taxpayer dollar," said Victor Der, Principal Deputy Assistant Secretary for Fossil Energy. "Technology advancement is the key to simultaneously addressing issues of energy security, supply, affordability, and environmental quality."

According to Rich Haut, manager of the new alliance, its goal is "to fund the development of low-impact systems that can be used in environmentally sensitive regions and share the latest research findings concerning these systems with leaders of energy, academia, environmental organizations, and government. . . . We will consider all aspects of energy resource recovery, not only traditional oil and natural gas production methods but also unconventional production, such as natural gas from shale or coal-bed methane."

In addition to HARC and Texas A&M, founding members of the alliance include:
  • Argonne National Laboratory
  • Los Alamos National Laboratory
  • Sam Houston State University
  • The University of Arkansas
  • The University of Colorado
  • The University of Wyoming
  • Utah State University
  • West Virginia University
The new University/National Laboratory Alliance is an outgrowth of an NETL-supported project with Texas A&M University entitled "Field Testing of Environmentally Friendly Drilling Systems." The $2.3 million project, which started on September 30, 2005, and will end on March 31, 2009, has resulted in a number of other significant accomplishments. These include:
  • Identifying more than 90 specific technologies related to the footprint of oil and natural gas operations that, if widely commercialized and applied, could help industry achieve more than a 90 percent reduction in environmental impact.
  • Creating more than 20 jobs that lasted for the duration of the project, and contributing to future job growth by developing technologies to make oil and gas resources that are environmentally restricted today producible tomorrow.
  • Establishing an Oil & Gas Desert Test Center near Pecos, Texas, on the edge of the Chihuahua desert, to evaluate low-impact drilling technology in desert ecosystems such as those found in the Western United States.
  • Establishing a systems approach to optimize drilling decisions and ensure that the activities selected satisfy chosen criteria; the approach has been successfully used in the EFD program to determine the optimum system for a given site.
  • Developing a small footprint, low-impact process based on sound engineering and biological principles to convert drilling wastes to a useable product.

Thursday, February 19, 2009

Most US late-stage human clinical trials done abroad

Most Clinical Trials Done Abroad, by Shirley S Wang
WSJ, Feb 19, 2009

Most testing for the U.S. drug industry's late-stage human trials is now done at sites outside the country, where results often can be obtained cheaper and faster, according to a study.

The study found that 13,521 of 24,206 sites being used in November 2007 for studies sponsored by the 20 largest U.S. drug makers were international, and that the number of countries conducting testing has doubled over the past 10 years. The study was published in Wednesday's New England Journal of Medicine.

The findings add to concerns about the ethical treatment of participants and the integrity of the research data produced in developing countries. Experts also say patients in developing countries may be taken advantage of because they are poorer and less familiar with the research process.

In November, Indian drug regulators halted a trial of a Wyeth vaccine after an infant died, in order to investigate whether babies were properly screened before being enrolled in the study. In a Polish study in 2007 of a bird-flu vaccine being developed by Novartis AG, two elderly patients died who should have been excluded based on age. Other companies faced criticism earlier in the decade about testing drugs in populations that couldn't afford the medicines.

"Clearly there are major challenges both in terms of ethical oversight of the research and the scientific rigor," said Seth Glickman, an assistant professor of emergency medicine at the University of North Carolina-Chapel Hill who was first author of the study.

Helping to make overseas trials cheaper and faster, patients in developing countries are often more willing to enroll in studies because of lack of alternative treatment options, and often they aren't taking other medicines. Such "drug-naive" patients can be sought after because it is easier to show that experimental treatments are better than placebos, rather than trying to show an improvement over currently available drugs.

Reviewing published studies, authors of the journal article found that proper research oversight and adequate informed consent for participants were inconsistent in developing countries. In one study reviewed, only 56% of 670 researchers who conducted trials in those countries said their studies had been approved by ethics boards or health officials. In another study, only 18% of researchers appropriately informed participants about the study before enrolling them.

The article comes at a time when some say the U.S. is moving in the wrong direction with regard to ethical treatment of study participants. Last year, the Food and Drug Administration updated its guidelines for conducting international clinical trials, adopting a standard used by many countries and organizations known as "good clinical practices."

The shift has been controversial. Critics believe the updated guidelines are less ethically rigorous and more industry friendly compared to the former guidelines, known as the Declaration of Helsinki. One version of that declaration forbade placebo-controlled trials and had provisions in it about companies' obligations to provide access to medicines to those populations in which the treatments had been tested.

It "set a higher standard," said Sonal Singh, an assistant professor at Wake Forest who studies international clinical trials and wasn't involved in Wednesday's report. "You're kind of dispensing with an ethical document," he said of the updated guidelines.

The FDA says that placebo-controlled trials are necessary under certain circumstances and that it also encourages post-market medication access to be discussed during protocol design. The new standards ensure protection for participants by mandating that studies be reviewed by international ethics committees and that informed consent be obtained from all participants, the agency says.

"Good clinical practice is meant to assure quality clinical trials, as well as the implementation of high-level clinical trial ethics," said David Lepay, senior adviser for clinical science at the FDA. "We do not see that there are fundamental differences between [good clinical practice] and other ethical standards in assuring the fundamental rights of subjects."

The authors of the new report also suggest that bureaucracy and regulatory hurdles in the U.S. are partly responsible for making going abroad so enticing. The requirements stretch out the amount of time it takes to complete a study and can add to costs as well. "Many of the policies in regards to the regulatory framework are well intentioned," said Dr. Glickman. "They have the unintended effect of being very onerous from the administrative standpoint."

In the U.S., each site seeking to conduct a study must have its ethics board approve it. But many studies these days are considered "multisite," where one company or sponsor runs the same trial at different centers and pools the data. The U.S. review process means redundant effort for such studies, according to Kevin Schulman, a professor of medicine and business administration at Duke University and another study author.

A Prenatal Link to Alzheimer's?

A Prenatal Link to Alzheimer's? By Ron Winslow
Researchers Propose Process in Fetal Development Is Reactivated Later in Life
WSJ, Feb 19, 2009

New research at Genentech Inc. is challenging conventional thinking about Alzheimer's disease, providing a provocative theory about its cause and suggesting potential new targets for therapies to treat it.

The researchers propose that a normal process in which excess nerve cells and nerve fibers are pruned from the brain during prenatal development is somehow reactivated in the adult brain and "hijacked" to cause the death of such cells in Alzheimer's patients.

The dominant view of Alzheimer's disease today is that it is caused by deposits called beta amyloid that accumulate in the brain because of bad luck or other unknown reasons, degrading and destroying nerve cells and robbing victims of their memory.

The new findings offer evidence that "Alzheimer's is not just bad luck, but rather it is the activation of a pathway that is there for development purposes," says Marc Tessier-Lavigne, executive vice president, research drug discovery, at Genentech. "It suggests a different way of looking at Alzheimer's disease."

[photos in original article]

The report, being published Thursday in the journal Nature, is based on laboratory and mouse experiments, and further study is needed to validate the hypothesis.

Genentech, a South San Francisco, Calif., biotech company, says it has identified potential drug candidates based on the findings, but even if they prove promising, it would take several years for any potential treatment to be developed.

Beta amyloid, a fragment of a larger molecule called amyloid precursor protein, or APP, has long been the dominant focus of Alzheimer's research. Many drug companies have compounds in development that are intended to block or clear the buildup of beta amyloid plaques in the brain. But the track record for developing effective drugs has been unimpressive so far. Moreover, some people accumulate beta amyloid in the brain without any apparent effect on memory, adding to confusion about its role in Alzheimer's.

During human development, the prenatal brain makes about twice the number of nerve cells it needs, Dr. Tessier-Lavigne explained. Those neurons, in turn, each make hundreds of nerve fibers that seek to make connections with other cells. The cells and nerve fibers that succeed in making connections survive -- while those that don't naturally trigger a self-destruction mechanism called apoptosis that clears out the unneeded cells.

"We make too many, and then we prune back," Dr. Tessier-Lavigne said. "The system gets sculpted so you have the right set of connections."

What he and his colleagues, including scientists from the Salk Institute, La Jolla, Calif., found is that the amyloid precursor protein linked to Alzheimer's also plays a critical role in triggering the prenatal pruning process. But the beta amyloid that appears to kill nerve cells in Alzheimer's patients isn't involved in the developing embryo. Instead, the pruning is sparked by another fragment of APP called N-APP, causing a cascade of events that results in the death of excess nerve cells and nerve fibers.

"This suggests that APP may go through a novel pathway rather than beta amyloid to cause Alzheimer's disease," says Paul Greengard, a scientist at Rockefeller University, New York, and a Nobel laureate who wasn't involved in the research. He called the paper "an important step" in understanding the pathology of Alzheimer's -- something that is necessary to develop better drugs.

Don Nicholson, a Merck & Co. vice president and author of a commentary that accompanies the Tessier-Lavigne study in Nature, said the paper doesn't rule out a role for beta amyloid. He added that given the intense focus on the role of beta amyloid in the disease, the finding that another part of the precursor protein may be important in Alzheimer's is "unexpected biology."

Exactly what triggers the reappearance in the adult brain of a process fundamental to its early prenatal development isn't clear and is the subject of continuing research, Dr. Tessier-Lavigne said. Meantime, there are several steps in the cascade of events that lead to the death of the developing neurons and nerve fibers. If the process reflects the unwanted death of such cells in Alzheimer's, it presents several places where a drug could block or affect the process to possibly prevent the damage.

"We've identified a mechanism of nerve-cell death and degeneration involving amyloid precursor protein in the embryo," he said. "What Alzheimer's is doing is hijacking not only the molecule but the whole mechanism of degeneration."

Meantime, a second paper published last month by a team including researchers at Buck Institute for Age Research, Novato, Calif., reported that a protein called netrin-1 appears to regulate production of beta amyloid. The finding, which appeared in the journal Cell Death and Differentiation, is behind the authors' belief that Alzheimer's is the result of normal processes going awry.

Together the papers add to intriguing evidence that beta amyloid is perhaps only part of the Alzheimer's story. "What we're seeing is a completely different view of the disease," said Dale Bredesen, a Buck Institute researcher and co-author of the paper. The brain has to make connections and break connections all the time. Alzheimer's, he suggests, is the result when that process is out of balance.

Tuesday, February 17, 2009

To save the planet, build more skyscrapers—especially in California

Green Cities, Brown Suburbs. By Edward L. Glaeser
To save the planet, build more skyscrapers—especially in California.
http://www.city-journal.org/2009/19_1_green-cities.html

On a pleasant April day in 1844, Henry David Thoreau—the patron saint of American environmentalism—went for a walk along the Concord River in Massachusetts. With a friend, he built a fire in a pine stump near Fair Haven Pond, apparently to cook a chowder. Unfortunately, there hadn’t been much rain lately, the fire soon spread to the surrounding grass, and in the end, over 300 acres of prime woodland burned. Thoreau steadily denied any wrongdoing. “I have set fire to the forest, but I have done no wrong therein, and now it is as if the lightning had done it,” he later wrote. The other residents of Concord were less forgiving, taking a reasonably dim view of even inadvertent arson. “It is to be hoped that this unfortunate result of sheer carelessness, will be borne in mind by those who may visit the woods in future for recreation,” the Concord Freeman opined.

Thoreau’s accident illustrates a point that is both paradoxical and generally true: if you want to be good to the environment, stay away from it. Move to high-rise apartments surrounded by plenty of concrete. Americans who settle in leafy, low-density suburbs will leave a significantly deeper carbon footprint, it turns out, than Americans who live cheek by jowl in urban towers. And a second paradox follows from the first. When environmentalists resist new construction in their dense but environmentally friendly cities, they inadvertently ensure that it will take place somewhere else—somewhere with higher carbon emissions. Much local environmentalism, in short, is bad for the environment.

Matthew Kahn, a professor of economics at UCLA, and I have quantified the first paradox. We begin by estimating the amount of carbon dioxide that an average household would emit if it settled in each of the 66 major metropolitan areas in the United States. Then we calculate, for 48 of those areas, the difference between what that average household would emit if it settled in the central city and what it would emit in the suburbs. (The remaining 18 areas had too little data for our calculations.) A few key points about our methodology follow; if you’re interested in all the methodological details, click here.

First, by “average household,” we mean average in terms of income and family size, so that we aren’t comparing urban singles with large suburban families. However, we don’t want to standardize the physical size of the home. People who move to suburban Dallas aren’t likely to live in apartments as small as those in Manhattan. Smaller housing units are one of the important environmental benefits of big-city living.

Second, we try to estimate the energy use from a typical new home in an area—specifically, one built in the last 20 years—which sometimes means something quite different from an average one. The reason is simple. We aren’t playing a ratings game to figure out which city pollutes least; rather, we’re trying to determine where future home construction would do the least environmental damage.

We calculate carbon emissions from four different sources: home heating (that is, fuel oil and natural gas); electricity; driving; and public transportation. Residential energy use and non-diesel motor fuel are each responsible for about 1.2 billion tons of carbon dioxide emissions each year, out of total U.S. emissions of 6 billion tons. So these sources together reflect about 40 percent of America’s carbon footprint. Our procedure is admittedly imperfect and incomplete: for example, we do not include carbon dioxide generated as a by-product of workplace activity.
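To make the tally concrete, here is a minimal, purely illustrative Python sketch of the kind of four-source household calculation described above; every number in it is an invented placeholder, not a figure from the Glaeser-Kahn study.

    # Illustrative sketch of a four-source household carbon tally
    # (home heating, electricity, driving, public transit).
    # All values are made-up placeholders, in tons of CO2 per household
    # per year; they are not data from the study.

    def household_emissions(heating, electricity, driving, transit):
        # Sum a household's annual emissions across the four sources counted.
        return heating + electricity + driving + transit

    # Hypothetical standardized household in one metro area's center and its suburbs.
    city_total = household_emissions(heating=2.0, electricity=3.5, driving=4.0, transit=0.5)
    suburb_total = household_emissions(heating=3.0, electricity=5.0, driving=7.5, transit=0.5)

    print("central city:", city_total)                    # 10.0
    print("suburbs:", suburb_total)                       # 16.0
    print("city-suburb gap:", suburb_total - city_total)  # 6.0, the second quantity the authors report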

The table below shows some of the results of our research. The five metropolitan areas with the lowest levels of carbon emissions are all in California: San Francisco, San Jose, San Diego, Los Angeles, and Sacramento. These areas have remarkably low levels of both home heating and electricity use. There are cold places, like Rochester, that don't air-condition much and thus use comparatively little electricity. There are warm places, like Houston, that don't heat much and thus have comparatively low heating emissions. But coastal California has little of both sorts of emissions, because of its extremely temperate climate and because California's environmentalists have battled for rules that require energy-efficient appliances, like air conditioners and water heaters, and for green sources of electricity, such as natural gas and hydropower. (Some analysts argue that this greenness is partly illusory—see "California's Potemkin Environmentalism," Spring 2008—but certainly, by our measures, California homes use less energy.) Also, despite the stereotypes about California highways and urban sprawl, some of these five cities, like San Francisco, have only moderate levels of transportation emissions, since their residents actually live at relatively high densities, which cuts down on driving.

Read more.

Edward L. Glaeser is a professor of economics at Harvard University, a City Journal contributing editor, and a Manhattan Institute senior fellow. His article describes research jointly performed with Matthew Kahn of UCLA.

Large-Scale Test to Inject One Million Metric Tonnes of Carbon Dioxide into Saline Formation

Carbon Sequestration Partner Initiates Drilling of CO2 Injection Well in Illinois Basin
Large-Scale Test to Inject One Million Metric Tonnes of Carbon Dioxide into Saline Formation
US Energy Dept, February 17, 2009

Washington, D.C. — The Midwest Geological Sequestration Consortium (MGSC), one of seven regional partnerships created by the U.S. Department of Energy (DOE) to advance carbon sequestration technologies nationwide, has begun drilling the injection well for their large-scale carbon dioxide (CO2) injection test in Decatur, Illinois. The test is part of the development phase of the Regional Carbon Sequestration Partnerships program, an Office of Fossil Energy initiative launched in 2003 to determine the best approaches for capturing and permanently storing gases that can contribute to global climate change.

The large-scale project will capture CO2 from the Archer Daniels Midland (ADM) Ethanol Production Facility in Decatur, Ill., and inject it into a deep saline formation more than a mile underground. Starting in early 2010, up to one million metric tonnes of CO2 from the ADM ethanol facility will be compressed to a liquid-like dense phase and injected over a three-year period.

The rock formation targeted for the injection is the Mt. Simon Sandstone, at a depth between 6,000 and 7,000 feet. The Mt. Simon Sandstone is the thickest and most widespread saline reservoir in the Illinois Basin, with an estimated CO2 storage capacity of 27 to 109 billion metric tonnes. Analysis of data collected during the characterization phase of the project indicated that the lower Mt. Simon formation has the necessary geological characteristics to be a good injection target.

In January, ADM — in collaboration with the Illinois State Geological Survey at the University of Illinois, which leads the MGSC — was issued an Underground Injection Control permit by the Illinois Environmental Protection Agency for the injection well. Obtaining the permit is significant because it allows the consortium to proceed with drilling, making the MGSC the first DOE Regional Partnership to begin drilling a development phase injection well. The drilling is expected to take about two months to complete.

Following injection, a comprehensive monitoring program will be implemented to ensure that the injected CO2 is safely and permanently stored. The position of the underground CO2 plume will be tracked, and deep subsurface, groundwater, and surface monitoring around the injection site will be conducted. The monitoring program will be evaluated yearly and modified as needed.

The project under which this effort is being performed will, on average, create nearly 250 full-time jobs per year. These jobs will be supported throughout the project's life of more than ten years, thus resulting in more than 2,500 job-years (calculated as the number of full-time jobs per year times the number of years that the jobs are supported).
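For readers who want the job-years arithmetic spelled out, a one-line check using the release's own round numbers looks like this:

    # Job-years = full-time jobs per year x number of years supported (per the release's definition).
    jobs_per_year = 250
    years_supported = 10
    print(jobs_per_year * years_supported)  # 2500, consistent with "more than 2,500 job-years"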

MGSC is one of seven regional partnerships in a nationwide network that is investigating the comparative merits of numerous carbon capture and storage approaches to determine those best suited for different regions of the country. The consortium is investigating options for the 60,000 square mile Illinois Basin, which underlies most of Illinois, southwestern Indiana, and western Kentucky. Emissions in this area exceed 304 million metric tonnes of CO2 yearly, mostly attributed to the region's 126 coal-fired power plants.

Friday, February 13, 2009

China, Rice, and GMOs: Navigating the Global Rift on Genetic Engineering

China, Rice, and GMOs: Navigating the Global Rift on Genetic Engineering. By Ron Herring
The Asia-Pacific Journal, Jan 12, 2009

A recent article in Nature [1] asked provocatively: Is China ready for GM rice? The title reflects widely shared anxiety over genetic engineering in agriculture. The use of the term “GM” specifically conjures a politically charged object: “GMOs” or “genetically modified organisms.” Is anyone ready for FrankenFoods? Strawberries with fish genes? Human cloning? The question has an ominous overtone, though both reporter and venue are identified with science. The question derives its energy from the decision of the Chinese Government to go full speed ahead with genetically engineered rice to confront what the state constructs as a gathering Malthusian crisis of hunger. What the article does not tell the reader is that the farmers are way ahead of the government: ready and able. Transgenic rice – officially unauthorized within China – has for several years been showing up in exports from China to Europe, to Japan, to New Zealand – and probably many other places that simply are not checking.

To ask if China is “ready” for “GM” rice is then doubly loaded. The necessity of getting ready implies threat; “GM” ties a specific cultivar to global anxiety about transgenic crops. The anxiety is multi-pronged: does the spread of transgenics entail threats of corporate dominance? Environmental risk? Food safety? The anxiety is heightened because these crops are spreading faster globally than perhaps any previous agricultural innovation, both through official channels of firm and state and underground, like films on DVDs or business software on CDs.[2] The transgenic genie is out of the bottle.

Then the question of who must be “ready” becomes especially curious. Farmers are clearly ready. As in many countries, cultivators in China risk prosecution to grow unauthorized transgenic crops, including Bt rice. They do so because they are impatient with bureaucratic delays and unwilling to pay corporate technology fees. And in fact, though urban consumers of GM politics think otherwise, there is not a lot to get ready for on farm: all the technology is in the seed, typically with a few altered genes, often only one. There is no more preparation than in playing an illegal DVD of a Bollywood film, once you know how to operate a player.

But is the state ready? Here the construction of transgenic rice as a special category designated by “GM” indicates why the issue carries political freight. Being “ready” implies a state of preparation, alertness, and consequences of not being ready, all of which are bad. No one was ready for the financial meltdown of 2008, most especially pensioners and homeowners. Is China ready for democracy? Open internet? But no one has ever asked -- in Europe, or in China, or in India -- if nations were “ready” for transgenic pharmaceuticals – which have been with us, and thoroughly normalized, since successful production of human insulin via transgenic bacteria began in 1978. There are no FrankenPills on posters. Useful to urban consumers and endorsed by the authority of medical science, transgenic pharmaceuticals have not drawn protests. Agriculture is different. The category “GM” as site of risk has become so normalized in political discourse about agriculture that no one ever asks: what is especially risky about any particular cultivar? Is China ready for “GM rice” really means: is the state ready to confront the political and administrative complexities of seed surveillance contrary to farmer interests?

The answer is probably "no." We already know that stealth transgenic rice – and unauthorized Bt cotton as well – are being grown by Chinese farmers without permission of the state. What Jane Qiu's article highlights is why the state or farmers or anyone else should care.

The Government of China, like many governments in nations with large agricultural sectors – e.g. India, Brazil – officially promotes and invests in biotechnology as a means of responding to what are constructed as urgent crises on the land. Rice stands for the larger problematic of increasing food production. Much of the corporate propaganda for transgenic technologies evokes the Malthusian threat, but here the evocation of urgency is from the Chinese state. This is no small issue: regimes incapable of feeding their populations have not fared well historically. Nor have their citizens. Being dependent on the global economy for fuel and food runs counter to imperatives of statecraft itself, across many ideological gradients. The threat conjured in China is quite explicit: inadequate productive capacity projected into the future. Against this threat is posed a promise: technical change in plant breeding. Genetic engineering – the possibility of rearranging DNA in plants to produce traits that are not in the genome of the plant itself, such as insect resistance, virus resistance, enhanced nutrient content, and on the horizon drought and salinity resistance – has long been official policy of the Chinese government. The controversy implied by the Nature article rests on two changes in the context of biotechnology: first, rice would be the first food crop authorized officially in China, and secondly, rice as a plant raises questions of agro-ecology not presented by cotton, China’s first transgenic. But the same recombinant DNA technology that the state constructs as promise has been constructed as threat in a very powerful global discourse.[3]

What exactly is the threat that China may or may not be ready for? “GM” is a political label, but it is one that sticks: it has political effects. All plants in agriculture are genetically modified. We no longer live in the world of Gregor Mendel puttering with peas: rather, plant genomes have been for decades radically altered and re-assembled in order to get phenotypic variation that plant breeders and farmers want. Transgenic techniques came later, and may indeed cause less disruption of gene networks than alternative [non “GM”] techniques [Batista et al 2008], but are socially constructed as something one must be ready for. No other kind of plant is subjected to the level of scrutiny of a plant bred by recombinant DNA techniques. Nor do transgenic pharmaceuticals constitute a special object of regulation, surveillance and control. Recombinant DNA techniques are constructed as threats only in agriculture.

The thing China may or may not be "ready" for is the global governance regime that sets transgenic plants apart. "GM" rice constitutes a plant that must be plugged into international norms of bio-safety as laid out in the Cartagena Protocol. The Protocol itself is the product of transnational advocacy networks and EU politics; it was resisted by major transgenic crop exporters such as the United States and Argentina. The protocol reflects the fact that half the globe holds "GMOs" to require special surveillance, monitoring, and governance.[4] To be ready is to have institutions that can promise effective rural governmentality; in this sense the question is rhetorical: China lacks that kind of state, as do most nations.

A global rift divides the planet into places that see special needs for bio-safety regulation of “GMOs” – except pharmaceuticals – and those that express no more concern with transgenic plants than with agricultural plants in general. The world is divided between an American construction of “GM” plants as “substantially equivalent” to their non-“GM” equivalents – because no difference can be found by scientific measurement – and a European view privileging the “precautionary principle” – that something truly terrible may be lurking in the new gene networks created by DNA splicing. Prince Charles refers to rDNA work on plants – but not pharmaceuticals – as “playing God,” entering “realms that belong to God and God alone.” Hubris is the culprit; genetic engineering, in this view, involves a "gigantic experiment I think with nature and the whole of humanity which has gone seriously wrong. Why else are we facing all these challenges, climate change and everything?" An empty vessel has been created into which multiple anxieties may be bundled, and its name is GM.

The European discourse of playing God does not play well in Asia; it presupposes the God of Genesis, a creator with a plan, a garden, absolute control and a stable equilibrium of species. And in general the Apocalyptic vision of European political activism has not penetrated beyond small numbers of urban professionals in Asia, where grounds of objection to transgenics have to do with consumer preference and resistance to corporate globalization. China is the case that confounds the discourse; not MNCs, but Chinese scientists have been the drivers of transgenic research and development. China showed how public sector investments in transgenics could target specific problems in agriculture without signing away the farm.

China was the leader among non-OECD nations in responding to biotechnology as a potential growth sector. Recombinant DNA techniques first became viable in laboratories in 1973; by 1980, patents on transgenic organisms became possible in the United States, as always the first-mover in creating and strengthening property in novel fields. With potential property to be made, and valuable discoveries in medicine and pharmaceuticals, a de facto global race began. In India, which established a Department of Biotechnology early, one heard the refrain “we missed the industrial revolution, we cannot afford to miss out on the information revolution.” Much of Asia responded in similar ways, with grand plans for state backing of biotechnology in the mode of developmental statism, but China was the clear leader and only success story. Though much of the political discourse is about MNCs and patents, China represents a now-common alternative dynamic: public funding of transgenic crops by developmental states.[5]

China's early efforts in biotechnology began with strong state backing in the 1970s, focused on both food crops and cotton. Standard techniques of tissue culture and cell fusion were used to modify plants before the advent of advanced recombinant DNA techniques in the early 1980s. The so-called 863 plan for advancing biotechnology research started in 1986.[6] The Ministry of Agriculture reported in 1996 that more than 190 genes had been transferred to more than 100 organisms, including plants, micro-organisms and animals. Investment levels were high, and addressed an indigenous sense of the most serious agricultural problems. The Bt cotton developed in China enables insect resistance from within the plant. It was a priority not only because of the massive land investment China had in cotton, but also because of the widely recognized externalities of heavy pesticide use: deleterious environmental and health consequences. China's Bt cotton is now growing both legally and illegally in far-flung Asian locations.

Southeast Asian states feared that China would become hegemonic in this new information-intensive sector, and ramped up plans for autonomous development defensively [Barboza 2003]. But plans in Southeast Asia were cut back after a profound European U-turn on genetic engineering in agriculture. Like commercial firms in the United States, European states initially saw the genomics revolution in biology as a potential source of profit and national development; European firms were early leaders; they were backed, especially in France, by governments. The turn away from biotechnology came as a result of transnational social movements joining hands across the Atlantic in opposing corporate environmental irresponsibility. By the end of the 1990s, Europe had crossed over, from support for genetic engineering to attempts to protect its economy from American transgenic imports.[7] Whereas American policy moved to the USFDA conclusion of "substantial equivalence" and society followed in train, Europe moved to a "precautionary principle," led by social activists.

But not all opposition targeted all biotechnology: food was the crux of the anti-GM campaign in Europe. “White” biotechnologies, such as biodegradable plastics and other industrial applications, as well as “red” biotechnologies in medicine and pharmaceuticals remained strongly supported in Europe [EB 64.3 2006]. In these applications of rDNA technologies, there are large human utilities, such as avoiding death. Food is different. There being no benefit to consumers in GM-food – with a few caveats about reduction of pesticide residues and externalities – European consumers were free to support campaigns to restrict agricultural biotechnology not only in Europe, but all over the world. The most successful efforts were in Africa, as Robert Paarlberg’s new book Starved for Science documents. The WTO has ruled that the European standards are contradicted by EU science, but the EU U-turn remains both politically sticky in Europe and consequential internationally. The EU declaration on “GMOs” structurally segregated world markets: GM or GM-free. It became quite clear in the late 1990s where the smart money would go in poor countries hoping to export to Europe.

China’s current interest in regulation of transgenic rice derives directly from this global regulatory rift. An early leader in state-led biotechnology development, China slowed its approach after the EU U-turn. Cotton is one thing, food another. Bt cotton from China’s public sector not only performed well, and reduced pesticide poisoning of farmers and farm workers, but was smuggled out of China and thrives as stealth seeds in other parts of Asia.[8] Bt cotton is of no concern to powerful players in the international system; national governments such as Vietnam and Pakistan prefer to look the other way in order to avoid a confrontation with both farmers as political agents and their own incapacity to build viable Cartagena-friendly bio-safety regimes. Rice is food, and thus another kettle of fish.

What China is not ready for is another assault on the integrity of its export products; that assault derives from EU regulations as to what food is acceptable and what is not. Spot checks carried out by several EU countries, including Germany, the UK, and France, have, since 2006, found Chinese shipments of rice and rice products to contain evidence of a genetically-engineered rice, specifically Bt63. Bt63 is not authorized for commercial cultivation either in China or in the EU; its import into the EU is banned. The formal resolution of the China-EU conflict was to require all rice and rice products from China to have a certificate that there is no transgenic Bt63 content; one predicts a strong market for certificates over time. Japan and New Zealand, which have similar EU-like restrictions, reported similar findings.[9]

The Cartagena Protocol requires that "Living Modified Organisms" be clearly identified in international trade; the criterion for an LMO is essentially the same as that for a GMO. This is not surprising: EU support of transnational opponents of biotechnology succeeded in crafting soft law stigmatizing transgenics and their downstream products, whether or not any DNA or trans-gene protein survives processing. Surveillance is to be "from farm to fork." Though the reality of food systems would seem to make this level of control a dream only bureaucrats could conjure, the consequences are serious. Failure of the Chinese government to enforce the protocol indicates not only non-compliance with international soft law, but inability of the state to control transgenic organisms within its own boundaries or in its exports. China is hardly alone in failure to regulate crops – seed police are hard to find – but China does face strong international pressure for tighter regulation of safety in exports in general. Bt proteins have not been shown to kill pets or people, but the net effect is to undermine confidence in Chinese exports to nations with strict regulations.

Though this threat to export products is the main objective risk of growing transgenics in China – the Bt itself has not been shown to be unsafe for humans or animals, and many Bt crops are regularly consumed – the Nature article is more concerned about environmental effects. Given China’s disturbing record on environmental protection, how serious a risk is transgenic rice? In general, Bt crops present a difficult question for environmental policy: if we compare the Bt plants to traditional cultivars, cultivated in traditional ways, the transgenics reduce pesticide use and therefore seem environmentally friendly. Nevertheless, one seldom finds transgenic crops discussed in the frame of biodiversity preservation or sustainability. Rather, the environmental risk assessment of transgenic crops typically poses questions about the potential for gene flow in the environment.

Here the Chinese official caution regarding Bt rice raises the importance of disaggregating transgenic crops. Bt rice raises more, and more serious, questions of agro-ecology than does Bt cotton, China's most successful biotechnology venture. Gene flow from Bt cotton presents little if any potential risk; like many cultivated crops, cotton is highly specialized, with no evidence of crossing with wild relatives. Without crossing, there is no gene flow. If genes flow, there is a question of fitness: will the wild and weedy relatives of the cultivated plant now gain an advantage in fitness in the environment from addition of the trait from the transgene [e.g., insect resistance]? Will this fitness advantage be such that they begin to dominate, thus upsetting agro-ecologies in new ways? This is the "super-weed" scenario stressed by opponents of rDNA technology: FrankenPlants. With cotton, the answer to these questions is almost certainly not; with rice, there is a much greater possibility of agro-ecological risk. Rice is first of all a grass – a more promiscuous kind of plant than cotton – and secondly has wild and weedy relatives in and around cultivated fields.

The bureaucratically sensible resolution would seem to be to test the crops under Chinese conditions. But testing itself comes under attack when the object is "GM." Uncertainties abound: how long a testing period is long enough to determine safety? For proponents of the precautionary principle, the answer is "forever." For the US FDA, the answer is "not much": if composition tests show the same range of variation in transgenic plants as in comparable non-transgenic cultivars [i.e. comparing apples to apples, rather than to oranges], there is no reason for special regulation or labels. The American position risks riding on the side of hubris: we know what we know. The European position imposes nearly impossible[10] standards: how can you prove that something will not happen? Do you check your brakes every time you take your car out to drive? Do you avoid any airplane that could conceivably crash and burn? Do you demand a demonstration that your cell phone cannot cause cancer?

Of course we all – Europeans and middle-class activists of transnational advocacy networks in poorer countries – dismiss as alarmist "risks" from cell phones. But there has been a recent upsurge in caution concerning cell phones in regard to brain damage from a presumably authoritative source: the director of the University of Pittsburgh Medical Center's cancer center.[11] Why do we disregard such warnings – and seldom check our car's brakes or inquire into the maintenance record of our next flight's plane? Because the disutility of ascertaining certainty far outweighs a subjective assessment of risk. Moreover, negatives are impossible to prove: how could there be even in principle decisive proof that no critical system on any given 747 will fail? No one can live with the precautionary principle; not only are there innumerable known unknowns, but – and here Donald Rumsfeld for once got something right – the sheer number of unknown unknowns is everywhere daunting. Farmers in China, like those in India, Pakistan, Brazil, Vietnam and much of the world grow Bt transgenics because they make life marginally easier, slightly more profitable, and slightly less destructive of their very local environments. If there are distal and uncertain risks, they pale by comparison to the real risks of pesticide poisoning and crop failure. Farmers make this calculation whether governments approve or not, just as desperate Americans try remedies not yet approved by the Food and Drug Administration.

Do farmers then worry about biodiversity, as the Nature article clearly thinks they should? Yes and no: they worry about destruction of helpful predators on the pests of their crops, but they recognize that spraying poisons across the fields kills friends and foes alike, including some farmers and farm workers. Bt plants, in contrast, are targeted to a class of pests, and their toxin is contained in the plant tissues. Bt plants represent a kind of poetic justice: if a pest leaves the plant alone, it will not be harmed; if it attacks the plant, it will die. The advantage to the farmer is that the pro-toxin stays in plant tissues, instead of rivers, soils, lungs, birds, toads, ladybugs.

In this one instance of conflicting pressures on the state in China is contained the global cognitive rift around transgenic organisms, much as the history of imperialism can be drawn from a single cup of tea. The discourse is one of threat and promise, of state responsibilities and international norms. The dichotomous—threat/promise—construction of technical change in agriculture resonates with previous attempts to promote or stop technical change; the "green revolution" of nitrogen-responsive grain varieties still launches many pages of paper. Agriculture is symbolic terrain on which much larger conflicts are joined.

The lessons from China's consideration of Bt rice then illustrate larger points about transnational politics of "GMOs." First, disaggregation is necessary to make sense. China's development and deployment of an indigenous Bt cotton raised no real controversy; rice is a food crop, and the politics around food differ fundamentally from those around purely utilitarian technologies, whether cotton or insulin. Second, rice is not cotton in terms of gene flow: careful science is necessary to sort out risks and benefits to farmers; risks to farmers and agro-ecological systems are much greater in rice than cotton. Third, there is no reason to assume, as is often done instrumentally, that biotechnology entails corporate dominance of either farmers or national governments. China is the giant exception, but not the only one. Finally, nothing in the battle for the formal-legal high ground makes much difference on the real ground. Though the EU battles the US and WTO over whether or not transgenic crops should be allowed, the decision will ultimately be made by farmers.[12] It is the agency of people close to the seeds that will settle the question; in China, that decision leans toward transgenic rice, just as it previously did toward transgenic cotton. It is hard to conjure the kind of state that could regulate the seed choices of millions of farmers across dozens of crops; but even if such surveillance and control could be imagined, it is hard not to think that there are better things to do.

Ron Herring teaches political economy and political ecology in the political science department at Cornell. He is the author most recently of Transgenics and the Poor: Biotechnology in Development Studies and co-editor, with Rina Agarwala, of Whatever Happened to Class? Reflections from South Asia.

Full article w/notes and references here.

Wednesday, January 28, 2009

Comments On Mooney On Marburger & Science Policy

Mooney Talks Past Marburger II: Science Policy Boogaloo. By David Bruggeman
Prometheus, January 27th, 2009

Today I’ll get into some issues in Mooney’s hatchet job where he and Marburger talk past each other. All quotations not otherwise attributed are from Mooney.

I’d like to indulge in one final Bush-era diatribe against the longest-ever serving White House science adviser: John Marburger, who has been a poor advocate indeed for the science world.

Since when is the president’s science adviser a science advocate? Let’s look at the underlying law dictating how the Office of Science and Technology Policy should operate (Public Law 94-282). Some relevant text:

The Act authorizes OSTP to:

Advise the President and others within the Executive Office of the President on the impacts of science and technology on domestic and international affairs;
Lead an interagency effort to develop and implement sound science and technology policies and budgets;
Work with the private sector to ensure Federal investments in science and technology contribute to economic prosperity, environmental quality, and national security;
Build strong partnerships among Federal, State, and local governments, other countries, and the scientific community;
Evaluate the scale, quality, and effectiveness of the Federal effort in science and technology.

There’s a lot of wiggle room here. But what isn’t here is some dictum that scientific outcomes advanced by OSTP dictate policy outcomes. This path is a small reach from the encouragement of open inquiry and publication without censorship. Many people can’t resist the urge to reach.

Marburger was responsible to the President, first and foremost. He provided scientific and technical information to the White House when needed, in the way that they wanted it. It's within their rights to dictate how they want the information and how they use it - if they do at all. The same is true for President Obama and his future OSTP director. Would Dr. Holdren resign if President Obama opted for climate change policies different from those he recommends? I doubt it.

This is the problem some are concerned about with respect to Holdren - advocacy over advice. If Energy Secretary Chu’s confirmation hearings are any indication, expect some walkback of Holdren’s strong climate policy statements in the future.

Mooney writes as though Marburger is the only person focused on science policy as budget policy. ASTRA, Research!America, and many scientific societies are very interested in science budgets, often to the exclusion of most anything else. The whole post-war debate over how the federal government would support scientific research revolved around how federal research dollars would be treated.

Mooney discusses specific budget numbers, and their recent decline (in terms of real dollars, accounting for inflation). The implication is that blame for this can be squarely placed at the feet of Bush and Marburger. However, there has been little in the way of leadership from the Democratic Congress to make sure that authorized levels of funding from the America COMPETES Act were appropriated. These dollars are a perennial loser in budget battles. A Democratically-controlled Congress found it either impossible or undesirable to fight for that money. Where’s your disappointment in them, Chris?

“let’s note that on the question of ethics, the Bush administration was also wrong, and the 2001 policy in fact unethical, because it designated several cell lines as eligible for research that did not meet basic ethics guidelines for informed consent”

While there is a problem here - many of the stem cell lines were obtained in ways that did not follow accepted informed-consent procedures - Mooney ignores what is - at least politically - a much bigger ethical issue. There is a legitimate ethical consideration related to definitions of life. Science can inform that decision, but not dictate it. To ignore that consideration, which a president is far more likely to weigh, is to keep attacking a straw man.

“This [Marburger's claim that the visibility of the science community was a political strategy of the Democratic party], too, is false. I’m happy to say that I watched the entire politics and science issue evolve over the course of the Bush administration. It wasn’t that the Democrats stirred up the scientists; rather, the scientists stirred up the Democrats and other progressive advocates.”

These chicken-or-the-egg arguments miss a relevant political use of the "War on Science." I have noted then-Senator Clinton's use of the term to include decreased aerospace research, muddling what had been focused on concerns over misrepresented data into traditional budget squabbles. This mission creep aside, it was more common to see Democrats simply lumping allegations of scientific tampering or misuse into their laundry lists of Bush Administration malfeasance. What little direct mention science received in the recent inaugural address is consistent with this framing - "we will restore science to its rightful place." Democrats took advantage of a potential new voting bloc and added to their rhetorical weapons. Those are, so far, the only explicit outcomes of listening to the science advocates.

Science and technology can thrive while being ignored in certain policy decisions. It is completely possible for resources to flow toward research and development, and for policies to encourage the use of science and technology, while scientific information that would undercut desired policies is shunted aside. So it is possible for both Mooney and Marburger to be right.

A war suggests a total effort that simply isn’t there in the case of the “War on Science.” There were certain instances of science-related conduct that were problematic and/or skirted the intent of the law (and may be again), but there was no systematic subversion, nor any particular master plan, which is what I expect when I see a war. Mooney and others don’t do science advocacy any favors if they continue to cling to this unrealistic notion of a “War on Science” long past the point of political effectiveness.