The biofuels bust

The news that the UK is to slow the adoption of biofuels, and that the European Parliament has called for a reduction in the EU’s targets for biofuel adoption, provides a good moment to take stock of one of the most rapid turnarounds we’ve seen in science policy. Only two years ago, biofuels were seen by many as a benign way for developed countries to increase their energy security and reduce their greenhouse gas emissions without threatening their citizens’ driving habits. Now the biofuel boom is being blamed for soaring food prices, and the environmental benefits are increasingly in doubt. It’s rare to see the rationale for a proposed technological fix for a major societal problem fall apart quite so quickly, and there must surely be some lessons here for other areas of science and policy.

The UK’s volte-face was prompted by a government-commissioned report led by the environmental scientist Ed Gallagher. The Gallagher Review is quite an impressive document, given the rapidity with which it has been put together. The issue it deals with is in many ways typical of a kind of problem we see arising more and more often, in which difficult and uncertain science comes together with equally uncertain economics, mediated by the unpredictability of human and institutional responses in a rapidly changing environment.

The first issue is whether, looking at the whole process of growing crops for biofuels, including the energy inputs for agriculture and for the conversion process, one actually ends up with a lower output of greenhouse gases than one would using petrol or diesel. Even this most basic question is more difficult than it might seem, as illustrated by the way the report firmly but politely disagrees with a Nobel Laureate in atmospheric chemistry, Paul Crutzen, who last year argued that, if emissions of nitrogen oxides during agriculture were properly accounted for, biofuels actually produce more greenhouse gases than the fossil fuels they replace. Nonetheless, the report finds a wide range of achievable greenhouse gas savings; corn bioethanol, for example, at its best produces a saving of about 35%, but at its worst it actually produces a net increase in greenhouse gases of nearly 30%. Other types of biofuel are better; both Brazilian ethanol from sugar cane and biodiesel from palm oil can achieve savings of between 60% and 70%. But, and this is a big but, these figures assume these crops are grown on existing agricultural land. If new land needs to be taken into cultivation, there’s typically a large release of carbon. Taking into account the carbon cost of changing land use means that there’s a considerable pay-back time before any greenhouse gas savings arise at all. In the worst cases, this can amount to hundreds of years.
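To see how these pay-back times arise, a rough sketch of the arithmetic helps (the numbers here are purely illustrative, not figures from the Gallagher Review): the pay-back time is simply the carbon debt incurred by clearing the land, divided by the annual greenhouse gas saving the biofuel then delivers. So if converting a hectare of grassland or forest released of the order of 200 tonnes of CO2, and the biofuel grown on that hectare displaced fossil fuel to the tune of 2 tonnes of CO2 a year, it would take 200/2 = 100 years before the switch produced any net saving at all.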

This raises two linked questions: how much land is available for growing biofuels, and how far can we expect competition from biofuel uses of food crops to drive further increases in food prices? There seems to be a huge amount of uncertainty surrounding these issues. Certainly the situation will be eased if new technologies arise for the production of cellulosic ethanol, but these aren’t necessarily a panacea, particularly if they involve changes in land use. The degree to which recent food price increases can be directly attributed to the growth in biofuels is controversial, but no-one can doubt that, in a world with historically low stocks of staple foodstuffs, any increase in demand will result in higher prices than would otherwise have occurred. The price of food is already indirectly coupled to the price of oil, because modern intensive agriculture demands high energy inputs; the extensive use of biofuels makes that coupling direct.

It’s easy to be wise in hindsight, but one might wonder how much of this could have been predicted. I wrote about biofuels here two years ago, and re-reading that entry – Driving on sunshine – it seems that some of the drawbacks were easier to anticipate than others. What’s sobering about the whole episode, though, is that it shows how complicated things can get when science, politics and economics are closely coupled in situations needing urgent action in the face of major uncertainties.

Synthetic biology – summing up the debate so far

The UK’s research council for biological sciences, the BBSRC, has published a nice overview of the potential ethical and social dimensions to the development of synthetic biology. The report – Synthetic biology: social and ethical challenges (737 KB PDF) – is by Andrew Balmer & Paul Martin at the University of Nottingham’s Institute for Science and Society.

The different and contested definitions and visions that people have for synthetic biology are identified at the outset; the authors distinguish between four rather different conceptions of the field. There’s the Venter approach, which consists of taking a stripped-down organism with a minimal genome and building desired functions into that. The identification of modular components and the genetic engineering of whole pathways forms a second, related approach. Both of these visions of synthetic biology still rely on the re-engineering of existing DNA-based life; a more ambitious, but much less fully realised, programme of synthetic biology attempts to make wholly artificial cells from non-biological molecules. A fourth strand, less far-reaching in its ambitions, attempts to make novel biomolecules by mimicking the post-transcriptional modification of proteins that is such a source of variety in biology.

What broader issues are likely to arise from this enterprise? The report identifies five areas to worry about: the potential problems and dangers of the uncontrolled release of synthetic organisms into the biosphere; the worry that these techniques could be misused to create new pathogens for bioterrorism; the potential for the creation of monopolies through an unduly restrictive patenting regime; and the implications for trade and global justice. Most far-reaching of all, of course, are the philosophical and cultural implications of creating artificial life, with its connotations of transgressing the “natural order”, and the problems of defining the meaning and significance of life itself.

The recommended prescriptions fall into a well-rehearsed pattern – the need for early consideration of governance and regulation, and the desirability of carrying the public along, through early public engagement and a resistance to the temptation to overhype the potential applications of the technology. As ever, dialogue between scientists and civil society groups, ethicists and social scientists is recommended – a dialogue which, the authors think, will only be credible if there is a real possibility that some lines of research would be abandoned if they were considered too ethically problematic.

On expertise

Whose advice should we trust when we need to make judgements about difficult political questions with a technical component? Science sociologists Harry Collins and Robert Evans, of Cardiff University, believe that this question of expertise is the most important issue facing science studies at the moment. I review their book on the subject, Rethinking Expertise, in an article – Spot the physicist – in this month’s Physics World.

Asbestos-like toxicity of some carbon nanotubes

It has become commonplace amongst critics of nanotechnology to compare carbon nanotubes to asbestos, on the basis that they are both biopersistent, inorganic fibres with a high aspect ratio. Asbestos is linked to a number of diseases, most notably the incurable cancer mesothelioma, of which there are currently 2000 new cases a year in the UK. A paper published in Nature Nanotechnology today, from Ken Donaldson’s group at the University of Edinburgh, provides the best evidence to date that some carbon nanotubes – specifically, multi-wall nanotubes longer than 20 µm or so – do lead to the same pathogenic effects in the mesothelium as asbestos fibres.

The basis of the toxicity of asbestos and other fibrous materials is now reasonably well understood; it derives from the physical nature of the materials rather than their chemical composition. In particular, fibres are expected to be toxic if they are long – longer than about 20 µm – and rigid. The mechanism of this pathogenicity is believed to be related to frustrated phagocytosis. Phagocytes are the cells whose job it is to engulf and destroy intruders – when they detect a foreign body like a fibre, they attempt to engulf it, but are unable to complete the process if the fibre is too long and rigid. Instead they release a burst of toxic products, which have no effect on the fibre but cause damage to the surrounding tissues. There is every reason to expect this mechanism to be active for nanotubes which are sufficiently long and rigid.

Donaldson’s group tested the hypothesis that long carbon nanotubes would have a similar effect to asbestos by injecting nanotubes into the peritoneal cavity of mice, exposing the mesothelium directly to nanotubes and allowing the response to be monitored directly. This is a proven assay for the initial toxic effects of asbestos that subsequently lead to the cancer mesothelioma.

Four multiwall nanotube samples were studied. Two of these samples had long fibres – one was a commercial sample, from Mitsui, and the other was produced in a UK academic laboratory. The other two samples had short, tangled fibres, and were commercial materials from NanoLab Inc, USA. The nanotubes were compared with two controls of long- and short-fibre amosite asbestos and one of non-fibrous, nanoparticulate carbon black. The two nanotube samples containing a significant fraction of long (>20 µm) nanotubes, together with the long-fibre amosite asbestos, produced a pathological response of inflammation, the production of foreign-body giant cells, and the development of granulomas, a characteristic lesion. The nanotubes with short fibres, like the short-fibre asbestos sample and the carbon black, produced little or no pathogenic effect. A number of other controls provide good evidence that it is indeed the physical form of the nanotubes, rather than any contaminants, that leads to the pathogenic effect.

The key finding, then, is that not all carbon nanotubes are equal when it comes to their toxicity. Long nanotubes produce an asbestos-like response, while short nanotubes and particulate, graphene-like materials don’t. The experiments don’t directly demonstrate the development of the cancer mesothelioma, but it would be reasonable to suppose this would be the eventual consequence of the pathogenic changes observed.

The experiments do seem to rule out a role for other possible contributing factors (presence of metallic catalyst residues), but they do not address whether other mechanisms of toxicity might be important for short nanotubes.

Most importantly, the experiments do not say anything about issues of dose and exposure. In the experiments, the nanotubes were injected directly into the peritoneal cavity; to establish whether environmental or workplace exposure to nanotubes presents a danger, one needs to know how likely it is that realistic exposures to inhaled nanotubes would lead to enough of them crossing from the lungs to the mesothelium to produce toxic effects. This is the most urgent question now waiting for further research.

It isn’t clear what proportion of the carbon nanotubes now being produced on an industrial, or at least pilot-plant, scale would have the characteristics – particularly in their length – that would lead to the risk of these toxic effects. However, those nanotubes that are already in the market-place are mostly in the form of advanced composites, in which the nanotubes are tightly bound in a resin matrix, so it seems unlikely that these will pose an immediate danger. We need, with some urgency, research into what might happen to the nanotubes in such products over their whole lifecycle, including after disposal.

How to think about science studies

I’ve been passing my driving time recently listening to the podcasts of an excellent series from the Canadian Broadcasting Corporation, called How to think about science. It’s simply a series of long interviews with academics, generally from the field of science studies. I’ve particularly enjoyed the interviews with the historian of science Simon Schaffer, the sociologists Ulrich Beck and Brian Wynne, the science studies guru Bruno Latour, and Evelyn Fox Keller, who has written some interesting books about the tacit philosophies underlying modern biology. With one or two exceptions, even the interviews with people I find less convincing still provided me with a few thought-provoking insights.

That strange academic interlude, the “science wars”, gets the occasional mention – this was the time when claims from science studies about the importance of social factors in the construction of scientific knowledge provoked a fierce counter-attack from people anxious to defend science against what they saw as an attack on its claims to objective truth. My perception is that the science wars ended in an armistice, though there are undoubtedly some people still holding out in the jungle, unaware that the war is over. Although the series is clearly presented from the science studies side of the argument, most contributors reflect the terms of the peace treaty, accepting the claims of science to be a way of generating perhaps uniquely reliable knowledge, while still insisting on the importance of the social in the way that knowledge is constructed, and criticising inappropriate ways of using scientific or pseudo-scientific arguments, models and metaphors in public discourse.

USA lagging Europe in nanotechnology risk research

How much resource is being devoted to assessing the potential risks of the nanotechnologies that are currently at or close to market? Not nearly enough, say campaigning groups, while governments, on the other hand, release impressive sounding figures for their research spend. Most recently, the USA’s National Nanotechnology Initiative has estimated its 2006 spend on nano-safety research as $68 million, which sounds very impressive. However, according to Andrew Maynard, a leading nano-risk researcher based at the Woodrow Wilson Center in Washington DC, we shouldn’t take this figure at face value.

Maynard comments on the figure on the SafeNano blog, referring to an analysis he recently carried out, described in a news release from the Woodrow Wilson Center’s Project on Emerging Nanotechnologies. It seems that the $68 million figure is obtained by adding up all sorts of basic nanotechnology research, some of which has only tangential relevance to problems of risk. If one applies a tighter definition of research that is either highly relevant to nanotechnology risk – such as a direct toxicology study – or substantially relevant – such as a study of the fate in the body of medical nanoparticles – the numbers fall substantially. Only $13 million of the $68 million was highly relevant to nanotechnology risk, with this number increasing to $29 million if the substantially relevant category is included too. This compares unfavourably with European spending, which amounts to $24 million in the highly relevant category alone.

Of course, it isn’t the headline figure that matters; what’s important is whether the research is relevant to the actual and potential risks that are out there. The Project on Emerging Nanotechnologies has done a great service by compiling an international inventory of nanotechnology risk research which allows one to see clearly just what sort of risk research is being funded across the world. It’s clear from this that suggestions that nanotechnology is being commercialised with no risk research at all being done are wide of the mark; what requires further analysis is whether all the right research is being done.

How can nanotechnology help solve the world’s water problems?

The lack of clean water for much of the world’s population currently leads to suffering and premature death for millions of people, and as population pressures increase, climate change starts to bite, and food supplies become tighter (perhaps exacerbated by an ill-considered move to biofuels), these problems will only intensify. It’s possible that nanotechnology may be able to contribute to solving these problems (see this earlier post, for example). A couple of weeks ago, Nature magazine ran a special issue on water, which included a very helpful review article: Science and technology for water purification in the coming decades. This article (which seems to be available without subscription) is all the more helpful for not focusing specifically on nanotechnology, instead making it clear where nanotechnology could fit into existing technologies to create affordable and workable solutions.

One sometimes hears the criticism that there’s no point worrying about the promise of new nanotechnological solutions when workable solutions are already known but aren’t being implemented, for political or economic reasons. That’s an argument that’s not without force, but the authors do begin to address it by outlining what’s wrong with existing technical solutions: “These treatment methods are often chemically, energetically and operationally intensive, focused on large systems, and thus require considerable infusion of capital, engineering expertise and infrastructure.” We should therefore be looking for decentralised solutions that can be easily, reliably and cheaply installed using local expertise, preferably without the need for large-scale industrial infrastructure.

To start with the problem of sterilising water to kill pathogens: the traditional method is chlorination. This isn’t ideal, as some pathogens are remarkably tolerant of chlorine, and it can lead to toxic by-products. Ultraviolet sterilisation, on the other hand, offers a lot of promise – it’s good against bacteria, though less effective against viruses. But in combination with photocatalytic surfaces of titanium dioxide nanoparticles it could be very effective. What is required here is either much cheaper sources of ultraviolet light (which could come from new nanostructured semiconductor light-emitting diodes) or new types of nanoparticles whose surfaces are excited by longer-wavelength light, including sunlight.
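A rough calculation shows why the wavelength matters (this uses the commonly quoted band gap for titanium dioxide, not a figure from the review): anatase TiO2 has a band gap of about 3.2 eV, and a photon needs at least that energy to excite the photocatalyst, which corresponds to a wavelength of roughly 1240 eV·nm / 3.2 eV ≈ 390 nm – that is, in the near ultraviolet. Only a few per cent of the energy in sunlight arrives at such short wavelengths, which is why either cheap UV sources or nanoparticles engineered to absorb visible light are needed.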

Another problem is the removal of contamination by toxic chemicals, which can arise either naturally or through pollution. Problem contaminants include heavy metals, arsenic, pesticide residues, and endocrine disrupters; the difficulty is that these can have dangerous effects even at rather low concentrations, which can’t be detected without expensive laboratory-based analysis equipment. Here methods for robust, low cost chemical sensing would be very useful – perhaps a combination of molecular recognition elements integrated in nanofluidic devices could do the job.

The reuse of waste water poses hard problems because of the high content of organic matter that needs to be removed, in addition to other contaminants. Membrane bioreactors combine the sorts of microbes that are exploited in the activated sludge processes of conventional sewage treatment with ultrafiltration through a membrane, to get faster throughputs of waste water. The tighter the pores in the membrane, the more effective it is at removing suspended material, but the problem is that such membranes quickly get blocked up. One solution is to line the micro- and nano-pores of the membranes with a single layer of hairy molecules – one of the paper’s co-authors, MIT’s Anne Mayes, developed a particularly elegant scheme for doing this by exploiting the self-assembly of comb-shaped copolymers.

Of course, most of the water in the world is salty (97.5%, to be precise), so the ultimate solution to water shortages is desalination. Desalination costs energy – necessarily so, as the second law of thermodynamics puts a lower limit on the cost of separating pure water from the higher entropy solution state. This theoretical limit is 0.7 kWh per cubic meter, and to date the most efficient practical process uses a not at all unreasonable 4 kWh per cubic meter. Achieving these figures, and pushing them down further, is a matter of membrane engineering, achieving precisely nanostructured pores that resist fouling and yet are mechanically and chemically robust.
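It’s worth seeing where that theoretical limit comes from (this is a textbook estimate, not a number taken from the review): the minimum work needed to extract pure water from seawater is set by its osmotic pressure, roughly 27 bar, which is about 2.7 × 10^6 joules per cubic meter of fresh water produced. Dividing by 3.6 × 10^6 joules per kWh gives approximately 0.75 kWh per cubic meter, consistent with the 0.7 kWh figure quoted; real processes do worse because they run at finite rates and recover only part of the feed water.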

A methanol economy?

Transport accounts for between a quarter and a third of primary energy use in developed economies, and currently this comes almost entirely from liquid hydrocarbon fuels. Anticipating a world with much more expensive oil and a need to dramatically reduce carbon dioxide emissions, many people have been promoting the idea of a hydrogen economy, in which hydrogen, generated in ways that minimise CO2 emissions, is used as a carrier of energy for transportation. Despite its superficial attractiveness and high-profile political support, the hydrogen economy has many barriers to overcome before it becomes technically and economically feasible. Perhaps most pressing of these difficulties is the question of how this light, low-energy-density gas can be stored and transported. An entirely new pipeline infrastructure would be needed to move the hydrogen from the factories where it is made to filling stations, and, even more challengingly, new technologies for storing hydrogen in vehicles will need to be developed. Early hopes that nanotechnology would provide new and cost-effective solutions to these problems – for example, using carbon nanotubes to store hydrogen – don’t seem to be bearing fruit so far. Since using a gas as an energy carrier causes such problems, why don’t we stick with a flammable liquid? One very attractive candidate is methanol, whose benefits have been enthusiastically promoted by George Olah, a Nobel prize-winning chemist from the University of Southern California, whose book Beyond Oil and Gas: The Methanol Economy describes his ideas in some technical detail.
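Some approximate, widely quoted energy densities illustrate the storage problem (these are standard round numbers, not figures from Olah’s book): hydrogen compressed to 700 bar stores around 40 kg per cubic metre, and with a heating value of roughly 120 MJ/kg that works out at only about 5 MJ per litre of tank volume; methanol, at about 0.8 kg per litre and roughly 20 MJ/kg, carries around 16 MJ per litre, and petrol roughly double that again. A liquid fuel simply packs far more energy into an ordinary tank.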

The advantage of methanol as a fuel is that it is entirely compatible with the existing infrastructure for distributing and using gasoline; pipes, pumps and tanks would simply need some gaskets changed to switch over to the new fuel. Methanol is an excellent fuel for internal combustion engines; even the most hardened petrol-head should be convinced by the performance figures of a recently launched methanol-powered Lotus Exige. In the future, though, greater fuel efficiency might be possible using direct methanol fuel cells, if that technology can be improved.

Currently methanol is made from natural gas, but in principle it should be possible to make it economically by reacting carbon dioxide with hydrogen. Given a clean source of energy to make hydrogen (Olah is an evangelist for nuclear power, but if the scaling problems for solar energy were solved that would work too), one could recycle the carbon dioxide from fossil fuel power stations, in effect getting one more pass of energy out of it before releasing it into the atmosphere. Ultimately, it should be possible to extract carbon dioxide directly from the atmosphere, achieving an almost completely carbon-neutral energy cycle. In addition to its use as a transportation fuel, methanol could also serve as a feedstock for the petrochemical industry. In this way we could, in effect, convert atmospheric carbon dioxide into plastic.
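For reference, the basic chemistry here is the catalytic hydrogenation of carbon dioxide – the standard methanol synthesis reaction, rather than a detail taken from the book: CO2 + 3H2 → CH3OH + H2O. All of the clean energy input is carried by the hydrogen; the carbon dioxide acts, in effect, as a carrier that turns that energy into an easily handled liquid.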

A culture of improvement

If one wants to comment on the future of technology, it’s a good idea to have some understanding of its history. A new book by Robert Friedel, A Culture of Improvement: Technology and the Western Millennium, takes on the ambitious task of telling the story of the development of technology in Europe and North America over the last thousand years.

The book is largely a very readable narrative history of technology, with some rather understated broader arguments. One theme is suggested by the title; in Friedel’s view the advance of technology has been driven not so much by the spectacular advances of the great inventors as by a mindset that continually seeks to make incremental improvements in existing technologies. The famous inventors, the James Watts and Alexander Graham Bells of history, certainly get due space, but there’s also an emphasis on placing the best-known inventions in the context of the less well known precursor technologies from which they sprang, and on the way engineers and workers continuously improved the technologies once they were introduced. Another theme is the way in which the culture of improvement was locked into place, as it were, by the institutions that promoted technical and scientific education, and by the media that brought new scientific and technical ideas to a wide audience.

This provokes some revision of commonly held ideas about the relationship between science and engineering. In Friedel’s picture, the role of science has been, less to provide fundamental discoveries that engineers can convert into practical devices, and more to provide the mental framework that permits the process of incremental improvement. Those who wish to de-emphasise the importance of science for innovation often point to the example of the development of the steam engine – “thermodynamics owes much more to the steam engine than the steam engine owes to thermodynamics”, the saying goes. This of course is true as far as it goes – the academic subject of thermodynamics was founded by Sadi Carnot’s analysis of the steam engines that were already in widespread use, and which had been extensively developed without the benefit of much theoretical knowledge. But it neglects the degree to which an understanding of formal thermodynamics underlay the development of the more sophisticated types of engines that are still in use today. Rudolph Diesel’s efforts to develop the engine that bears his name, and which is now so important, were based on an explicit project to use the thermodynamics he had learned from his professor, Carl Linde (who also made huge contributions to the technology of refrigeration), to design the most efficient possible internal combustion engine.
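A one-line reminder of the formula at the heart of that project may help (this is standard textbook thermodynamics, not a quotation from Friedel): the Carnot limit on the efficiency of any heat engine working between a hot source at temperature T_hot and a cold sink at T_cold is η = 1 − T_cold/T_hot. Diesel’s pursuit of very high compression ratios, and hence very high peak temperatures, follows directly from trying to raise T_hot and so push a real engine closer to this bound.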

Some aspects of the book are open to question. The focus on Europe, and the European offshoots in North America, is justified by the premise that there was something special in this culture that led to the “culture of improvement”; one could argue, though, that the period of unquestioned European technological advantage was a relatively short fraction of the millennium under study (it’s arguable, for example, that China’s medieval technological lead over Europe persisted well into the 18th century). And many will wonder whether technological advances always lead to “improvement”. A chapter on “the corruption of improvement” discusses the application of technology to weapons of mass destruction, but one feels that Friedel’s greatest revulsion is prompted by the outcome of the project to apply the culture of improvement to the human race itself. It’s useful to be reminded that the outcome of this earlier project for “human enhancement” was, particularly in the USA and Scandinavia, a programme of forced sterilisation of those deemed unfit to reproduce that persisted well into living memory. In Germany, of course, this “human enhancement” project moved beyond sterilisation to industrial-scale systematic murder of the disabled and those who were believed to be threats to “racial purity”.

Another UK government statement on nanotechnology

As I mentioned on Wednesday, the UK government took the opportunity of Thursday’s nano-summit organised by the consumer advocate group Which? to release a statement about nanotechnology. The Science Minister’s speech didn’t announce anything new or dramatic – the minister did “confirm our commitment to keep nanotechnology as a Government priority”, though as the event’s chair, Nick Ross, observed, the Government has a great many priorities. The full statement (1.3 MB PDF) is at least a handy summary of what otherwise would be a rather disjointed set of measures and activities.

The other news from the Which? event was the release of the report from their Citizen’s Panel. Some summaries, as well as a complete report, are available from the Which? website. Some flavour of the results can be seen in this summary: “Panellists were generally excited about the potential that nanotechnologies offer and were keen to move ahead with developing them. However, they also recognised the need to balance this with the potential risks. Panellists identified many opportunities for nanotechnologies. They appreciated the range of possible applications and certain specific applications, particularly for health and medicine. The potential to increase consumer choice and to help the environment were also highlighted, along with the opportunity to ‘start again’ by designing new materials with more useful properties. Other opportunities they highlighted were potential economic developments for the UK (and the jobs this might create) and the potential to help developing countries (with food or cheaper energy).” Balanced against this generally positive attitude were concerns about safety, regulation, information, questions about the accessibility of the technology to the poor and the developing world, and worries about possible long-term environmental impacts.

The subject of nanotechnology was introduced at the meeting with this short film.