Déjà vu all over again?

Today the UK’s Royal Commission on Environmental Pollution released a new report on the potential risks of new nanomaterials and the implications of this for regulation and the governance of innovation. The report – Novel Materials in the Environment: The case of nanotechnology – is well-written and thoughtful, and will undoubtedly have considerable impact. Nonetheless, four years after the Royal Society report on nanotechnology, and nearly two years after the Council of Science and Technology’s critical verdict on the government’s response to that report, some of the messages are depressingly familiar. There are real uncertainties about the potential impact of nanoparticles on human health and the environment; to reduce these uncertainties some targeted research is required; and this research isn’t going to appear by itself, so some co-ordinated programs are needed. So what’s new this time around?

Andrew Maynard picks out some key messages. The Commission is very insistent on the need to move beyond considering nanomaterials as a single class; attempts to regulate solely on the basis of size are misguided and instead one needs to ask what the materials do and how they behave. In terms of the regulatory framework, the Commission was surprisingly (to some observers, I suspect) sanguine about the suitability and adaptability of the EU’s regulatory framework for chemicals, REACH, which, it believes, can readily be modified to meet the special challenges of nanomaterials, as long as the research needed to fill the knowledge gaps gets done.

Where the report does depart from some previous reports is in a rather subtle and wide-ranging discussion of the conceptual basis of regulation for fast-moving new technologies. It identifies three contrasting positions, none of which it finds satisfactory. The “pro-innovation” position calls for regulators to step back and let the technology develop unhindered, pausing only when positive evidence of harm emerges. “Risk-based” approaches allow for controls to be imposed, but only when clear scientific grounds for concern can be stated, and with a balance between the cost of regulating and the probability and severity of the danger. The “precautionary” approach puts the burden of proof on the promoters of new technology to show that it is, beyond any reasonable doubt, safe, before it is permitted. The long history of unanticipated consequences of new technology warns us against the first stance, while the second position assumes that the state of knowledge is sufficient to do these risk/benefit analyses with confidence, which isn’t likely to be the case for most fast-moving new technologies. But the precautionary approach falls down, too, if, as the Commission accepts, the new technologies have the potential to yield significant benefits that would be lost if they were to be rejected on the grounds of inevitably incomplete information. To resolve this dilemma, the Commission seeks an adaptive system of regulation that aims, above all, to avoid technological inflexibility. The key, in their view, is to innovate in a way that doesn’t lead society down paths from which it is difficult to reverse, if new information should arise about unanticipated threats to health or the environment.

The report has generated a substantial degree of interest in the press, and, needless to say, the coverage doesn’t generally reflect these subtle discussions. At one end, the coverage is relatively sober, for example Action urged over nanomaterials, from the BBC, and Tight regulation urged on nanotechnology, from the Financial Times. In the Daily Mail, on the other hand, we have Tiny but toxic: Nanoparticles with asbestos-like properties found in everyday goods. Notwithstanding Tim Harper’s suggestion that some will welcome this sort of coverage if it injects some urgency into the government’s response, this is not a good place for nanotechnology to be finding itself.

Nanocosmetics in the news

Uncertainties surrounding the use of nanoparticles in cosmetics made the news in the UK yesterday; this followed a press release from the consumer group Which? – Beauty must face up to nano. This is related to a forthcoming report in their magazine, in which a variety of cosmetic companies were asked about their use of nanotechnologies (I was one of the experts consulted for commentary on the results of these inquiries).

The two issues that concern Which? are some continuing uncertainties about nanoparticle safety and the fact that it hasn’t generally been made clear to consumers that nanoparticles are being used. Their head of policy, Sue Davies, emphasizes that their position isn’t blanket opposition: “We’re not saying the use of nanotechnology in cosmetics is a bad thing, far from it. Many of its applications could lead to exciting and revolutionary developments in a wide range of products, but until all the necessary safety tests are carried out, the simple fact is we just don’t know enough.” Of 67 companies approached for information about their use of nanotechnologies, only 8 replied with useful information, prompting Sue to comment: “It was concerning that so few companies came forward to be involved in our report and we are grateful for those that were responsible enough to do so. The cosmetics industry needs to stop burying its head in the sand and come clean about how it is using nanotechnology.”

On the other hand, the companies that did supply information include many of the biggest names – L’Oreal, Unilever, Nivea, Avon, Boots, Body Shop, Korres and Green People – all of whom use nanoparticulate titanium dioxide (and, in some cases, nanoparticulate zinc oxide). This makes clear just how widespread the use of these materials is (and goes some way to explaining where the estimated 130 tonnes of nanoscale titanium dioxide being consumed annually in the UK is going).

The story is surprisingly widely covered by the media (considering that yesterday was not exactly a slow news day). Many focus on the angle of lack of consumer information, including the BBC, which reports that “consumers cannot tell which products use nanomaterials as many fail to mention it”, and the Guardian, which highlights the poor response rate. The story is also covered in the Daily Telegraph, while the Daily Mail, predictably, takes a less nuanced view. Under the headline The beauty creams with nanoparticles that could poison your body, the Mail explains that “the size of the particles may allow them to permeate protective barriers in the body, such as those surrounding the brain or a developing baby in the womb.”

What are the issues here? There is, if I can put it this way, a cosmetic problem, in that there are some products on the market making claims that seem at best unwise – I’m thinking here of the claimed use of fullerenes as antioxidants in face creams. It may well be that these ingredients are present in such small quantities that there is no possibility of danger, but given the uncertainties surrounding fullerene toxicology, putting products like this on the market doesn’t seem very smart, and is likely to cause reputational damage to the whole industry. There is a lot more data about nanoscale titanium dioxide, and the evidence that these particular nanoparticles aren’t able to penetrate healthy skin looks reasonably convincing. They deliver an unquestionable consumer benefit, in terms of screening out harmful UV rays, and the alternatives – organic small-molecule sunscreens – are far from being above suspicion. But, as pointed out by the EU’s Scientific Committee on Consumer Products, there does remain uncertainty about the effect of titanium dioxide nanoparticles on damaged and sun-burned skin. Another issue, recently highlighted by Andrew Maynard, is the degree to which the action of light on TiO2 nanoparticles causes reactive and potentially damaging free radicals to be generated. This photocatalytic activity can be suppressed by choosing the crystalline structure (the rutile form of titanium dioxide should be used, rather than anatase), by introducing dopants, and by coating the surface of the nanoparticles. The research cited by Maynard makes it clear that not all sunscreens use grades of titanium dioxide that completely suppress photocatalytic activity.

This poses a problem. Consumers don’t at present have ready access to information as to whether nanoscale titanium dioxide is used at all, let alone whether the nanoparticles in question are in the rutile or anatase form. Here, surely, is a case where if the companies following best practice provided more information, they might avoid having their reputations damaged by less careful operators.

What’s meant by “food nanotechnology”?

A couple of weeks ago I took part in a dialogue meeting in Brussels organised by the CIAA, the Confederation of the Food and Drink Industries of the EU, about nanotechnology in food. The meeting involved representatives from big food companies, from the European Commission and agencies like the European Food Safety Authority, together with consumer groups like BEUC, and the campaigning group Friends of the Earth Europe. The latter group recently released a report on food nanotechnology – Out of the laboratory and on to our plates: Nanotechnology in food and agriculture; according to the press release, this “reveals that despite concerns about the toxicity risks of nanomaterials, consumers are unknowingly ingesting them because regulators are struggling to keep pace with their rapidly expanding use.” The position of the CIAA is essentially that nanotechnology is an interesting technology currently in research rather than having yet made it into products. One can get a good idea of the research agenda of the European food industry from the European Technology Platform Food for Life. As the only academic present, I tried in my contribution to clarify a little the different things people mean by “food nanotechnology”. Here, more or less, is what I said.

What makes the subject of nanotechnology particularly confusing and contentious is the ambiguity of the definition of nanotechnology when applied to food systems. Most people’s definitions are something along the lines of “the purposeful creation of structures with length scales of 100 nm or less to achieve new effects by virtue of those length-scales”. But when one attempts to apply this definition in practice one runs into difficulties, particularly for food. It’s this ambiguity that lies behind the difference of opinion we’ve already heard about today concerning how widespread the use of nanotechnology in foods already is. On the one hand, Friends of the Earth says they know of 104 nanofood products on the market already (and some analysts suggest the number may be more than 600). On the other hand, the CIAA (the Confederation of Food and Drink Industries of the EU) maintains that, while active research in the area is going on, no actual nanofood products are yet on the market. In fact, both parties are, in their different ways, right; the problem is the ambiguity of definition.

The issue is that food is naturally nano-structured, so that too wide a definition ends up encompassing much of modern food science, and indeed, if you stretch it further, some aspects of traditional food processing. Consider the case of “nano-ice cream”: the FoE report states that “Nestlé and Unilever are reported to be developing a nano-emulsion based ice cream with a lower fat content that retains a fatty texture and flavour”. Without knowing the details of this research, what one can be sure of is that it will involve essentially conventional food processing technology in order to control fat globule structure and size on the nanoscale. If the processing technology is conventional (and the economics of the food industry dictates that it must be), what makes this nanotechnology, if anything does, is the fact that analytical tools are available to observe the nanoscale structural changes that lead to the desirable properties. What makes this nanotechnology, then, is simply knowledge. In the light of the new knowledge that new techniques give us, we could even argue that some traditional processes, which it now turns out involve manipulation of the structure on the nanoscale to achieve some desirable effects, would constitute nanotechnology if it were defined this widely. For example, traditional whey cheeses like ricotta are made by creating the conditions for the whey proteins to aggregate into protein nanoparticles. These subsequently aggregate to form the particulate gels that give the cheese its desirable texture.

It should be clear, then, that there isn’t a single thing one can call “nanotechnology” – there are many different technologies, producing many different kinds of nanomaterials. These different types of nanomaterials have quite different risk profiles. Consider cadmium selenide quantum dots, titanium dioxide nanoparticles, sheets of exfoliated clay, fullerenes like C60, casein micelles, phospholipid nanosomes – the risks and uncertainties of each of these examples of nanomaterials are quite different and it’s likely to be very misleading to generalise from any one of these to a wider class of nanomaterials.

To begin to make sense of the different types of nanomaterial that might be present in food, there is one very useful distinction. This is between engineered nanoparticles and self-assembled nanostructures. Engineered nanoparticles are covalently bonded, and thus are persistent and generally rather robust, though they may have important surface properties such as catalysis, and they may be prone to aggregate. Examples of engineered nanoparticles include titanium dioxide nanoparticles and fullerenes.

In self-assembled nanostructures, though, molecules are held together by weak forces, such as hydrogen bonds and the hydrophobic interaction. The weakness of these forces renders them mutable and transient; examples include soap micelles, protein aggregates (for example the casein micelles formed in milk), liposomes and nanosomes and the microcapsules and nanocapsules made from biopolymers such as starch.

So what kind of food nanotechnology can we expect? Here are some potentially important areas:

• Food science at the nanoscale. This is about using a combination of fairly conventional food processing techniques supported by the use of nanoscale analytical techniques to achieve desirable properties. A major driver here will be the use of sophisticated food structuring to achieve palatable products with low fat contents.
• Encapsulating ingredients and additives. The encapsulation of flavours and aromas at the microscale to protect delicate molecules and enable their triggered or otherwise controlled release is already widespread, and it is possible that decreasing the lengthscale of these systems to the nanoscale might be advantageous in some cases. We are also likely to see a range of “nutraceutical” molecules come into more general use.
• Water dispersible preparations of fat-soluble ingredients. Many food ingredients are fat-soluble; as a way of incorporating these in food and drink without fat, manufacturers have developed stable colloidal dispersions of these materials in water, with particle sizes in the range of hundreds of nanometers. For example, the substance lycopene, which is familiar as the molecule that makes tomatoes red and which is believed to offer substantial health benefits, is marketed in this form by the German company BASF.

What is important in this discussion is clarity – definitions are important. We’ve seen discrepancies between estimates of how widespread food nanotechnology is in the marketplace now, and these discrepancies lead to unnecessary misunderstanding and distrust. Clarity about what we are talking about, and a recognition of the diversity of technologies we are talking about, can help remove this misunderstanding and give us a sound basis for the sort of dialogue we’re participating in today.

Nanoparticles down the drain

With significant amounts of nanomaterials now entering markets, it’s clearly worth worrying about what’s going to happen to these materials after disposal – is there any danger of them entering the environment and causing damage to ecosystems? These are the concerns of the discipline of nano-ecotoxicology; on the evidence of the conference I was at yesterday, on the Environmental effects of nanoparticles, at Birmingham, this is an expanding field.

From the range of talks and posters, there seems to be a heavy focus (at least in Europe) on those few nanomaterials which really are entering the marketplace in quantity – titanium dioxide, of sunscreen fame, and nano-silver, with some work on fullerenes. One talk, by Andrew Johnson, of the UK’s Centre for Ecology and Hydrology at Wallingford, showed nicely what the outline of a comprehensive analysis of the environmental fate of nanoparticles might look like. His estimate is that 130 tonnes of nano-titanium dioxide a year is used in sunscreens in the UK – where does this stuff ultimately go? Down the drain and into the sewers, of course, so it’s worth worrying what happens to it then.

At the sewage plant, solids are separated from the treated water, and the first thing to ask is where the titanium dioxide nanoparticles go. The evidence seems to be that a large majority end up in the sludge. Some 57% of this treated sludge is spread on farmland as fertilizer, while 21% is incinerated and 17% goes to landfill. There’s work to be done, then, in determining what happens to the nanoparticles – do they retain their nanoparticulate identity, or do they aggregate into larger clusters? One needs then to ask whether those that survive are likely to cause damage to soil microorganisms or earthworms. Johnson presented some reassuring evidence about earthworms, but there’s clearly more work to be done here.
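The arithmetic behind these disposal routes is easy to sketch. Below is a rough mass balance in Python; the 130 tonnes/year input and the 57/21/17 sludge-disposal split are the figures quoted above, while the 90% capture-to-sludge fraction is purely an illustrative assumption, not a number from Johnson’s talk:

```python
# Rough mass balance for nano-TiO2 passing through UK sewage treatment.
# Input (130 t/yr) and the sludge split (57% farmland, 21% incineration,
# 17% landfill) are from the talk described above; the 90% fraction
# captured in sludge is an assumed, illustrative value.

TIO2_INPUT_TONNES = 130.0   # estimated annual UK use in sunscreens
SLUDGE_CAPTURE = 0.90       # assumed fraction retained in sewage sludge

sludge = TIO2_INPUT_TONNES * SLUDGE_CAPTURE
effluent = TIO2_INPUT_TONNES - sludge   # remainder leaves with treated water

destinations = {"farmland": 0.57, "incineration": 0.21, "landfill": 0.17}
flows = {dest: sludge * frac for dest, frac in destinations.items()}

for dest, tonnes in flows.items():
    print(f"{dest}: {tonnes:.1f} t/yr")
print(f"to rivers with effluent: {effluent:.1f} t/yr")
```

Even with these crude assumptions, the point of the exercise is clear: most of the material ends up on farmland, so soil organisms are the first place to look for effects.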

Making a series of heroic assumptions, Johnson made some estimates of how many nanoparticles might end up in the river. Taking a worst case scenario, with a drought and heatwave in the southeast of England (they do happen, I’m old enough to remember) he came up with an estimate of 8 micrograms/litre in the Thames, which is still more than an order of magnitude less than the concentrations that have been shown to start to affect, for example, rainbow trout. This is reassuring, but, as one questioner pointed out, one still might worry about the nanoparticles accumulating in sediments to the detriment of filter feeders.
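The safety margin implied by those numbers can be made explicit. In this sketch the 8 µg/L worst-case prediction is from the talk; the 100 µg/L effect threshold for rainbow trout is a hypothetical round number standing in for “more than an order of magnitude higher”:

```python
# Safety-margin check for the worst-case Thames estimate above.
# Predicted concentration (8 ug/L) is from the talk; the effect
# threshold used here (100 ug/L) is an assumed illustrative value.

predicted_ug_per_l = 8.0
effect_threshold_ug_per_l = 100.0   # hypothetical round number

margin = effect_threshold_ug_per_l / predicted_ug_per_l
print(f"margin of safety: {margin:.1f}x")
```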

Responsible nanotechnology – from discourse to practice

Like many academics, I’ve come back from my summer holiday only to leave immediately for a flurry of conferences. This year has been particularly busy. Last week saw me give a talk at a conference on phase separation in Cambridge; this week I’ve been in and out of a conference at Sheffield on thin polymer films; and next week I’m giving talks successively at one conference honouring Dame Julia Higgins and another on the environmental effects of nanoparticles. Yesterday, though, I found myself not amongst scientists, but in the Manchester Business School for a conference on Nanotechnology, Society and Policy.

There were some interesting and provocative talks looking at the empirical evidence for the development, or otherwise, of regional clusters with particular strengths in nanotechnology; under discussion was the issue of whether new industries based on nanotechnologies would inevitably be attracted to existing technological clusters like Silicon Valley and the Boston area, or whether the diverse nature of the technologies grouped under this banner would diffuse this clustering effect.

In the governance section, the University of Twente’s Arie Rip, one of the doyens of European science studies, spoke on the title “Discourse and practice of responsible nanotechnology development”. I must admit that I’d had a preconception that this would be a talk critical of the way so many people had adopted the rhetoric of “responsible development” simply as a way of promoting the subject and deflecting criticism. However, Rip’s message was actually rather more optimistic than this. His view was that, however much such talk begins as rhetoric, it does translate into real practice, and the interactions we’re seeing between technology and society, in the form of public dialogue, discussions between companies and campaigning groups, and the development of codes of practice really are creating “soft structures” and “soft law” that are beginning to have a real, and beneficial, effect on the way these technologies are being introduced.

Can nanotechnology really be green?

This essay was first published in Nature Nanotechnology, February 2007, Volume 2 No 2 pp71-72 (doi:10.1038/nnano.2007.12), abstract here.

In discussions of the possibility of a public backlash against nanotechnology, the comparison that is always made is with the European reaction against agricultural biotechnology. “Will nanotechnology turn out to be the new GM?” is an omnipresent question; for nanotechnology proponents a nagging worry, and for opponents a source of hope. Yet, up to now, there’s one important difference – the major campaigning groups – most notably Greenpeace – have so far resisted taking an unequivocal stance against nanotechnology. The reason for this isn’t a sudden outbreak of harmony between environmental groups and the multinationals that are most likely to bring nanotechnology to market in a big way. Instead, it’s a measure of the force of the argument that nanotechnology may lead to new opportunities for sustainable development. Even the most vocal outright opponent of nanotechnology – the small Canada-based group ETC – has recently conceded that nanotechnology might have a role to play in the developing world. Is nanotechnology really going to be the first new technology that big business and the environmental movement can unite behind, or is this the most successful example yet of a greenwash from global techno-science?

The selling points of nanotechnology for the sustainability movement are easily stated. In the lead are the prospects of nano-enabled clean energy and clean water, with some vaguer and more general notions of nanotechnology facilitating cleaner and more sustainable modes of production sitting in the background. On the first issue, many people have argued – perhaps most persuasively the late Richard Smalley – that nanotechnology of a fairly incremental kind has the potential to make a disruptive change to our energy economy. For example, we’re currently seeing rapid growth in solar energy. But the contribution that conventional solar cells can make to our total energy economy is currently limited, not by the total energy supplied by the sun, but by our ability to scale up production of photovoltaics to the massive areas that would be needed to make a real impact. A number of new and promising nano-enabled photovoltaic technologies are positioning themselves to contribute, not by competing with existing solar cells on conversion efficiency, but by their potential for being cheap to produce in very large areas. Meanwhile, as the availability of clean, affordable water becomes more of a problem in many parts of the world, nanotechnology also holds promise. Better control of the nanoscale structure of separation membranes, and surface treatments to prevent fouling, all have the potential to increase the effectiveness and lower the price of water purification.

How can we distinguish between the promises that come so easily in grant applications and press releases, and the true potential that these technologies might have for sustainable development? We need to consider both technical possibilities and the socio-economic realities.

Academic scientists often underestimate the formidable technical obstacles standing in the way of scaling up promising laboratory innovations. In the case of alternative, nano-enabled photovoltaics, difficulties with lifetime and stability are still problematic, while many processing issues remain to be ironed out before large-scale production can take place. But one reason for optimism is simply the wide variety of possible approaches being tried: polymer-based photovoltaics, in which optimal control of self-assembled nanoscale structure could lead to efficient solar cells being printed over very large areas; photochemical cells using dye-sensitised nanoparticles (Grätzel cells) and other hybrid designs involving semiconductor nanoparticles; and III-V semiconductor heterojunction cells in combination with large-area solar concentrators. Surely, one might hope, at least one of these approaches might bear fruit.

The socio-economic realities may prove to be more intractable, at least in some cases. The think-tank Demos, together with the charity Practical Action, recently organised a public engagement event about the possible applications of nanotechnology to clean water in Zimbabwe, which emphasised how remote some of these discussions are from the real problems of poor communities. In the words of Demos’s Jack Stilgoe, “The gulf between Western technoscience and applications for poor communities is far wider than I’d imagined. Ask people what they want from new technologies and they talk about the rope and washer pump, which would stop things (like snakes) falling into their wells.” It’s clear that for nanotechnology to have a real impact in the developing world, a good understanding of local contexts will be vital.

Perhaps, in addition to these promises of direct solutions to sustainability problems, there are some deeper currents here. Given the emphasis that has been given by many writers to the importance of learning from nature in nanotechnology, it’s perhaps not surprising that we’re seeing this idea of nanotechnology as being derived from natural sources, and thus intrinsically benign, cropping up as an important framing device. Referring to the water-repellency of nanostructured surfaces as the “lotus leaf effect” is perhaps the most effective example, both lending itself to comforting imagery and connecting with the long-established symbolism of the lotus leaf as intrinsically, and naturally, spotless and stain-free.

Whatever these deeper cultural contexts, nanotechnology certainly finds itself in the frontline of another important shift, this time in science funding policies. In many countries, the UK included, we’re seeing a shift in emphasis in the aims of publicly funded science, away from narrowly discipline-based objectives, and towards goals defined through societal needs, and in particular towards mitigating global problems such as climate change. As an intrinsically multidisciplinary, and naturally goal-oriented, enterprise, nanotechnology fits very naturally into this new framework and applications of nanotechnology addressing sustainability issues will certainly see increasing emphasis.

Sceptics may see this as just another example of a misguided search for technical fixes for problems that are ultimately socio-political in origin. It may be true that in the past such an approach has simply led to further problems, but nonetheless I strongly believe that we currently have no choice but to continue to look to technological progress to help ameliorate our most pressing difficulties. The “deep green” school may argue that our problems would be cured by abandoning our technological civilisation and returning to simpler ways, but this view utterly fails to recognise the degree to which supporting the earth’s current and projected population levels depends on advanced technology and in particular on intensive energy use. We are existentially dependent on technology, but we know that the technology we have is not sustainable. Green nanotechnology, then, is not just a convenient slogan but an inescapable necessity.

What the public think about nanomedicine

A major new initiative on the use of nanotechnology in medicine and healthcare has recently been launched by the UK government’s research councils; around £30 million (US$60 million) is expected to be available for large scale “Grand Challenge” style projects. The closing date for the first call has just gone by, so we will see in a few months how the research community has responded to this opportunity. What’s worth commenting on now, though, is the extent to which public engagement has been integrated into the process by which the call has been defined.

As the number of potential applications of nanotechnology to healthcare is very large, and the funds available relatively limited, there was a need to focus the call on just one or two areas; in the end the call is for applications of nanotechnology in healthcare diagnostics and the targeted delivery of therapeutic agents. As part of the program of consultations with researchers, clinicians and industry people that informed the decision to focus the call in this way, a formal public engagement exercise was commissioned to get an understanding of the hopes and fears the public have about the potential use of nanotechnology in medicine and healthcare. The full report on this public dialogue has just been published by EPSRC, and this is well worth reading.

I’ll be writing in more detail later both about the specific findings of the dialogue, and on the way the results of this public dialogue were incorporated in the decision-making process. Here, I’ll just draw out three points from the report:

  • As has been found by other public engagement exercises, there is a great deal of public enthusiasm for the potential uses of nanotechnology in healthcare, and a sense that this is an application that needs to be prioritised over some others.
  • People value potential technologies that empower them to have more control over their own health and their own lives, while potential technologies that reduce their sense of control are viewed with more caution.
  • People have concerns about who benefits from new technologies – while people generally see nothing intrinsically wrong with business driving nanotechnology, there’s a concern that public investment in science should result in appropriate public value.
“Plastics are precious – they’re buried sunshine”

A disappearing dress from the Wonderland project. Photo by Alex McGuire at the London College of Fashion.

I’m fascinated by the subtle science of polymers, and it’s a cause of regret to me that the most common manifestations of synthetic polymers are in the world of cheap, disposable plastics. The cheapness and ubiquity of plastics, and the problems caused when they’re carelessly thrown away, blind us to the utility and versatility of these marvellously mutable materials. But there’s something temporary about their cheapness; it’s a consequence of the fact that they’re made from oil, and as oil becomes scarcer and more expensive we’ll need to appreciate the intrinsic value of these materials much more.

These thoughts are highlighted by a remarkable project put together by the artist and fashion designer Helen Storey and my Sheffield friend and colleague, chemist Tony Ryan. At the centre of the project is an exhibition of exquisitely beautiful dresses, designed by Helen and made from fabrics handmade by textile designer Trish Belford. The essence of fashion is transience, and these dresses literally don’t last long; the textiles they are made from are water soluble and are dissolved during the exhibition in tanks of water. The process of dissolution has a beauty of its own, captured in this film by Pinny Grylls.

Another film, by the fashion photographer Nick Wright, reminds us of the basic principles underlying the thermodynamics of polymer dissolution. The exhibition will be moving to the Ormeau Baths Gallery in Belfast in October, and you will be able to read more about it in that month’s edition of Vogue.

    The biofuels bust

    The news that the UK is to slow the adoption of biofuels, and that the European Parliament has called for a reduction in the EU’s targets for biofuel adoption, is a good point to mark one of the most rapid turnarounds we’ve seen in science policy. Only two years ago, biofuels were seen by many as a benign way for developed countries to increase their energy security and reduce their greenhouse gas emissions without threatening their citizens’ driving habits. Now, we’re seeing the biofuel boom being blamed for soaring food prices, and the environmental benefits are increasingly in doubt. It’s rare to see the rationale for a proposed technological fix for a major societal problem fall apart quite so quickly, and there must surely be some lessons here for other areas of science and policy.

    The UK’s volte-face was prompted by a government commissioned report led by the environmental scientist Ed Gallagher. The Gallagher Review is quite an impressive document, given the rapidity with which it has been put together. This issue is in many ways typical of a kind of problem we see increasingly often, in which difficult and uncertain science comes together with equally uncertain economics, through the unpredictability of human and institutional responses, in a rapidly changing environment.

    The first issue is whether, looking at the whole process of growing crops for biofuels, including the energy inputs for agriculture and for the conversion process, one actually ends up with a lower output of greenhouse gases than one would using petrol or diesel. Even this most basic question is more difficult than it might seem, as illustrated by the way the report firmly but politely disagrees with a Nobel Laureate in atmospheric chemistry, Paul Crutzen, who last year argued that, if emissions of nitrogen oxides during agriculture were properly accounted for, biofuels actually produce more greenhouse gases than the fossil fuels they replace. Nonetheless, the report finds a wide range of achievable greenhouse gas savings; corn bioethanol, for example, at its best produces a saving of about 35%, but at its worst it actually produces a net increase in greenhouse gases of nearly 30%. Other types of biofuel are better; both Brazilian ethanol from sugar cane and biodiesel from palm oil can achieve savings of between 60% and 70%. But, and this is a big but, these figures assume these crops are grown on existing agricultural land. If new land needs to be taken into cultivation, there’s typically a large release of carbon. Taking into account the carbon cost of changing land use means that there’s a considerable pay-back time before any greenhouse gas savings arise at all. In the worst cases, this can amount to hundreds of years.
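    The land-use argument above is, at bottom, simple arithmetic: converting land releases a one-off pulse of carbon, which the biofuel then has to pay off year by year through its (often modest) annual savings. A minimal sketch, using hypothetical figures of my own rather than numbers from the Gallagher Review:

```python
# Toy illustration of the "carbon payback" idea for land-use change.
# The figures below are hypothetical, chosen only to show the logic;
# they are not taken from the Gallagher Review.

def payback_years(land_use_release: float, annual_saving: float) -> float:
    """Years before cumulative greenhouse gas savings offset the one-off
    carbon release from converting land. Both arguments are in the same
    units per hectare, e.g. tonnes of CO2-equivalent per hectare."""
    if annual_saving <= 0:
        # A fuel that saves nothing (or emits more than fossil fuel,
        # as in the worst corn ethanol cases) never pays back its debt.
        return float("inf")
    return land_use_release / annual_saving

# Suppose clearing a hectare releases 700 tCO2e, and the biofuel grown
# on it saves 2 tCO2e per year relative to petrol or diesel.
print(payback_years(700.0, 2.0))  # 350.0 - "hundreds of years"
```

With plausible numbers for cleared forest or peatland, the payback time easily runs to centuries, which is why the report's headline savings only hold when crops are grown on existing agricultural land.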

    This raises the linked questions – how much land is available for growing biofuels, and how much can we expect that the competition from biofuel uses of food crops will lead to further increases in food prices? There seems to be a huge amount of uncertainty surrounding these issues. Certainly the situation will be eased if new technologies arise for the production of cellulosic ethanol, but these aren’t necessarily a panacea, particularly if they involve changes in land-use. The degree to which recent food price increases can be directly attributed to the growth in biofuels is controversial, but no-one can doubt that, in a world with historically low stocks of staple foodstuffs, any increase in demand will result in higher prices than would otherwise have occurred. The price of food is already indirectly coupled to the price of oil because modern intensive agriculture demands high energy inputs, but the extensive use of biofuels makes that coupling direct.

    It’s easy to be wise in hindsight, but one might wonder how much of this could have been predicted. I wrote about biofuels here two years ago, and re-reading that entry – Driving on sunshine – it seems that some of the drawbacks were easier to anticipate than others. What’s sobering about the whole episode, though, is that it shows how complicated things can get when science, politics and economics are closely coupled in situations needing urgent action in the face of major uncertainties.

    Synthetic biology – summing up the debate so far

    The UK’s research council for biological sciences, the BBSRC, has published a nice overview of the potential ethical and social dimensions to the development of synthetic biology. The report – Synthetic biology: social and ethical challenges (737 KB PDF) – is by Andrew Balmer & Paul Martin at the University of Nottingham’s Institute for Science and Society.

    The different and contested definitions and visions that people have for synthetic biology are identified at the outset; the authors distinguish between four rather different conceptions of synthetic biology. There’s the Venter approach, consisting of taking a stripped-down organism with a minimal genome, and building desired functions into that. The identification of modular components and the genetic engineering of whole pathways forms a second, but related, approach. Both of these visions of synthetic biology still rely on the re-engineering of existing DNA based life; a more ambitious, but much less completely realised, program for synthetic biology attempts to make wholly artificial cells from non-biological molecules. A fourth strand, which seems less far-reaching in its ambitions, attempts to make novel biomolecules by mimicking the post-transcriptional modification of proteins that is such a source of variety in biology.

    What broader issues are likely to arise from this enterprise? The report identifies five areas to worry about. There are the potential problems and dangers of the uncontrolled release of synthetic organisms into the biosphere; the worry that these techniques could be misused to create new pathogens for bioterrorism; the potential for the creation of monopolies through an unduly restrictive patenting regime; and implications for trade and global justice. Most far-reaching of all, of course, are the philosophical and cultural implications of creating artificial life, with its connotations of transgressing the “natural order”, and the problems of defining the meaning and significance of life itself.

    The recommended prescriptions fall into a well-rehearsed pattern – the need for early consideration of governance and regulation, the desirability of carrying the public along through early public engagement, and resistance to the temptation to overhype the potential applications of the technology. As ever, dialogue between scientists and civil society groups, ethicists and social scientists is recommended – a dialogue which, the authors think, will only be credible if there is a real possibility that some lines of research would be abandoned if they were considered too ethically problematic.