Whose advice should we trust when we need to make judgements about difficult political questions with a technical component? Science sociologists Harry Collins and Robert Evans, of Cardiff University, believe that this question of expertise is the most important issue facing science studies at the moment. I review their book on the subject, Rethinking Expertise, in an article – Spot the physicist – in this month’s Physics World.
It has become commonplace amongst critics of nanotechnology to compare carbon nanotubes to asbestos, on the basis that they are both biopersistent, inorganic fibres with a high aspect ratio. Asbestos is linked to a number of diseases, most notably the incurable cancer mesothelioma, of which there are currently 2000 new cases a year in the UK. A paper published in Nature Nanotechnology today, from Ken Donaldson’s group at the University of Edinburgh, provides the best evidence to date that some carbon nanotubes – specifically, multi-wall nanotubes longer than 20 µm or so – do lead to the same pathogenic effects in the mesothelium as asbestos fibres.
The basis of toxicity of asbestos and other fibrous materials is now reasonably well understood; it arises from the physical nature of the fibres rather than their chemical composition. In particular, fibres are expected to be toxic if they are long – longer than about 20 µm – and rigid. The mechanism of this pathogenicity is believed to involve frustrated phagocytosis. Phagocytes are the cells whose job is to engulf and destroy intruders – when they detect a foreign body like a fibre, they attempt to engulf it, but cannot complete the process if the fibre is too long and rigid. Instead they release a burst of toxic products, which have no effect on the fibre but instead damage the surrounding tissues. There is every reason to expect this mechanism to be active for nanotubes that are sufficiently long and rigid.
Donaldson’s group tested the hypothesis that long carbon nanotubes would have a similar effect to asbestos by injecting nanotubes into the peritoneal cavity of mice, exposing the mesothelium directly to nanotubes, and then monitoring the response directly. This is a proven assay for the initial toxic effects of asbestos that subsequently lead to the cancer mesothelioma.
Four multiwall nanotube samples were studied. Two of these samples had long fibres – one was a commercial sample, from Mitsui, and another was produced in a UK academic laboratory. The other two samples had short, tangled fibres, and were commercial materials from NanoLab Inc, USA. The nanotubes were compared to two controls of long and short fibre amosite asbestos and one of non-fibrous, nanoparticulate carbon black. The two nanotube samples containing a significant fraction of long (>20 µm) nanotubes, together with the long-fibre amosite asbestos, produced a characteristic pathological response of inflammation, the production of foreign body giant cells, and the development of granulomas, a characteristic lesion. The nanotubes with short fibres, like the short fibre asbestos sample and the carbon black, produced little or no pathogenic effect. A number of other controls provide good evidence that it is indeed the physical form of the nanotubes rather than any contaminants that leads to the pathogenic effect.
The key finding, then, is that not all carbon nanotubes are equal when it comes to their toxicity. Long nanotubes produce an asbestos-like response, while short nanotubes and particulate graphene-like materials don’t produce this response. The experiments don’t directly demonstrate the development of the cancer mesothelioma, but it would be reasonable to suppose this would be the eventual consequence of the pathogenic changes observed.
The experiments do seem to rule out a role for other possible contributing factors (presence of metallic catalyst residues), but they do not address whether other mechanisms of toxicity might be important for short nanotubes.
Most importantly, the experiments say nothing about issues of dose and exposure. In the experiments, the nanotubes were injected directly into the peritoneal cavity; to establish whether environmental or workplace exposure to nanotubes presents a danger, one needs to know how likely it is that realistic exposures to inhaled nanotubes would lead to enough nanotubes crossing from the lungs to the mesothelium to cause toxic effects. This is the most urgent question now awaiting further research.
It isn’t clear what proportion of the carbon nanotubes now being produced on industrial, or at least pilot plant, scale, would have the characteristics – particularly in their length – that would lead to the risk of these toxic effects. However, those nanotubes that are already in the market-place are mostly in the form of advanced composites, in which the nanotubes are tightly bound in a resin matrix, so it seems unlikely that these will pose an immediate danger. We need, with some urgency, research into what might happen to the nanotubes in such products over their whole lifecycle, including after disposal.
I’ve been passing my driving time recently listening to the podcasts of an excellent series from the Canadian Broadcasting Corporation, called How to think about science. It’s simply a series of long interviews with academics, generally from the field of science studies. I’ve particularly enjoyed the interviews with historian of science Simon Schaffer, sociologists Ulrich Beck and Brian Wynne, science studies guru Bruno Latour, and Evelyn Fox Keller, who has written some interesting books about some of the tacit philosophies underlying modern biology. With one or two exceptions, even those interviews with people I find less convincing still provided me with a few thought-provoking insights.
That strange academic interlude, the “science wars”, gets the occasional mention – this was the time when claims from science studies about the importance of social factors in the construction of scientific knowledge provoked a fierce counter-attack from people anxious to defend science against what they saw as an attack on its claims to objective truth. My perception is that the science wars ended in an armistice, though there are undoubtedly some people still holding out in the jungle, unaware that the war is over. Although the series is clearly presented from the science studies side of the argument, most contributors reflect the terms of the peace treaty, accepting the claims of science to be a way of generating perhaps uniquely reliable knowledge, while still insisting on the importance of the social in the way that knowledge is constructed, and criticising inappropriate ways of using scientific or pseudo-scientific arguments, models and metaphors in public discourse.
How much resource is being devoted to assessing the potential risks of the nanotechnologies that are currently at or close to market? Not nearly enough, say campaigning groups, while governments, on the other hand, release impressive sounding figures for their research spend. Most recently, the USA’s National Nanotechnology Initiative has estimated its 2006 spend on nano-safety research as $68 million, which sounds very impressive. However, according to Andrew Maynard, a leading nano-risk researcher based at the Woodrow Wilson Center in Washington DC, we shouldn’t take this figure at face value.
Maynard comments on the figure on the SafeNano blog, referring to an analysis he recently carried out, described in a news release from the Woodrow Wilson Center’s Project on Emerging Nanotechnologies. It seems that this figure is obtained by adding up all sorts of basic nanotechnology research, some of which might have only tangential relevance to problems of risk. If one applies a tighter definition of research that is either highly relevant to nanotechnology risk – such as a direct toxicology study – or substantially relevant – such as a study of the fate in the body of medical nanoparticles – the numbers fall sharply. Only $13 million of the $68 million was highly relevant to nanotechnology risk, with this number increasing to $29 million if the substantially relevant category is included too. This compares unfavourably with European spending, which amounts to $24 million in the highly relevant category alone.
Of course, it isn’t the headline figure that matters; what’s important is whether the research is relevant to the actual and potential risks that are out there. The Project on Emerging Nanotechnologies has done a great service by compiling an international inventory of nanotechnology risk research which allows one to see clearly just what sort of risk research is being funded across the world. It’s clear from this that suggestions that nanotechnology is being commercialised with no risk research at all being done are wide of the mark; what requires further analysis is whether all the right research is being done.
The lack of clean water for much of the world’s population currently leads to suffering and premature death for millions of people, and as population pressures increase, climate change starts to bite, and food supplies become tighter (perhaps exacerbated by an ill-considered move to biofuels), these problems will only intensify. It’s possible that nanotechnology may be able to contribute to solving these problems (see this earlier post, for example). A couple of weeks ago, Nature magazine ran a special issue on water, which included a very helpful review article: Science and technology for water purification in the coming decades. This article (which seems to be available without subscription) is all the more helpful for not focusing specifically on nanotechnology, instead making it clear where nanotechnology could fit into existing technologies to create affordable and workable solutions.
One sometimes hears the criticism that there’s no point worrying about the promise of new nanotechnological solutions when workable solutions are already known but aren’t being implemented, for political or economic reasons. That’s an argument that’s not without force, but the authors do begin to address it by outlining what’s wrong with existing technical solutions: “These treatment methods are often chemically, energetically and operationally intensive, focused on large systems, and thus require considerable infusion of capital, engineering expertise and infrastructure.” Thus we should be looking for decentralised solutions that can be easily, reliably and cheaply installed using local expertise, preferably without the need for large-scale industrial infrastructure.
To start with the problem of sterilising water to kill pathogens: traditional methods rely on chlorine. This isn’t ideal, as some pathogens are remarkably tolerant of it, and it can lead to toxic by-products. Ultraviolet sterilisation, on the other hand, offers a lot of promise – it’s good against bacteria, though less effective against viruses. In combination with photocatalytic surfaces of titanium dioxide nanoparticles, however, it could be very effective. What is required here is either much cheaper sources of ultraviolet light (which could come from new nanostructured semiconductor light-emitting diodes) or new types of nanoparticles whose surfaces are excited by longer-wavelength, lower-energy light, including sunlight.
Another problem is the removal of contamination by toxic chemicals, which can arise either naturally or through pollution. Problem contaminants include heavy metals, arsenic, pesticide residues, and endocrine disrupters; the difficulty is that these can have dangerous effects even at rather low concentrations, which can’t be detected without expensive laboratory-based analysis equipment. Here methods for robust, low cost chemical sensing would be very useful – perhaps a combination of molecular recognition elements integrated in nanofluidic devices could do the job.
The reuse of waste water poses hard problems because of the high content of organic matter that needs to be removed, in addition to other contaminants. Membrane bioreactors combine the sorts of microbes that are exploited in the activated sludge processes of conventional sewage treatment with ultrafiltration through a membrane, to achieve faster throughputs of waste water. The tighter the pores in this sort of membrane, the more effective it is at removing suspended material; the problem is that such membranes quickly get blocked up. One solution is to line the micro- and nano-scale pores of the membranes with a single layer of hairy molecules – one of the paper’s co-authors, MIT’s Anne Mayes, developed a particularly elegant scheme for doing this, exploiting the self-assembly of comb-shaped copolymers.
Of course, most of the water in the world is salty (97.5%, to be precise), so the ultimate solution to water shortages is desalination. Desalination costs energy – necessarily so, as the second law of thermodynamics puts a lower limit on the cost of separating pure water from the higher entropy solution state. This theoretical limit is 0.7 kWh per cubic meter, and to date the most efficient practical process uses a not at all unreasonable 4 kWh per cubic meter. Achieving these figures, and pushing them down further, is a matter of membrane engineering, achieving precisely nanostructured pores that resist fouling and yet are mechanically and chemically robust.
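That 0.7 kWh per cubic meter floor can be roughly recovered from the osmotic pressure of seawater. The sketch below uses the van ’t Hoff approximation with illustrative handbook numbers of my own choosing (35 g/l salinity, ideal dissociation), not figures from the Nature review, so it lands a little above the quoted limit:

```python
# Rough estimate of the thermodynamic minimum work of desalination,
# via the van 't Hoff approximation for the osmotic pressure of seawater.
# All input numbers are illustrative assumptions, not from the article.

R = 8.314          # gas constant, J/(mol K)
T = 298.0          # temperature, K
salinity = 35.0    # g of NaCl per litre of seawater (typical value)
M_NaCl = 58.44     # molar mass of NaCl, g/mol
i = 2              # van 't Hoff factor: NaCl dissociates into two ions

conc = salinity / M_NaCl * 1000.0      # mol per cubic metre (~600 mol/m^3)
osmotic_pressure = i * conc * R * T    # Pa (~3 MPa, i.e. ~30 bar)

# In the limit of vanishing recovery, the minimum work per cubic metre of
# fresh water equals the osmotic pressure (1 Pa = 1 J/m^3).
min_energy_kwh = osmotic_pressure / 3.6e6   # J/m^3 -> kWh/m^3

print(f"Osmotic pressure: {osmotic_pressure / 1e5:.1f} bar")
print(f"Minimum energy:   {min_energy_kwh:.2f} kWh per cubic metre")
```

The crude ideal-solution estimate comes out around 0.8 kWh per cubic meter; accounting for non-ideality brings it down towards the 0.7 figure, and finite recovery ratios push real minima somewhat higher.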
Transport accounts for between a quarter and a third of primary energy use in developed economies, and currently this comes almost entirely from liquid hydrocarbon fuels. Anticipating a world with much more expensive oil and a need to dramatically reduce carbon dioxide emissions, many people have been promoting the idea of a hydrogen economy, in which hydrogen, generated in ways that minimise CO2 emissions, is used as a carrier of energy for transportation purposes. Despite its superficial attractiveness, and high profile political support, the hydrogen economy has many barriers to overcome before it becomes technically and economically feasible. Perhaps most pressing of these difficulties is the question of how this light, low energy density gas can be stored and transported. An entirely new pipeline infrastructure would be needed to move the hydrogen from the factories where it is made to filling stations, and, perhaps even more challengingly, new technologies for storing hydrogen in vehicles will need to be developed. Early hopes that nanotechnology would provide new and cost-effective solutions to these problems – for example, using carbon nanotubes to store hydrogen – don’t seem to be bearing fruit so far. Since using a gas as an energy carrier causes such problems, why don’t we stick with a flammable liquid? One very attractive candidate is methanol, whose benefits have been enthusiastically promoted by George Olah, a Nobel prize winning chemist from the University of Southern California, whose book Beyond Oil and Gas: The Methanol Economy describes his ideas in some technical detail.
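To put the liquid-versus-gas point in perspective, a quick comparison of volumetric energy density makes the case. The figures below are approximate handbook values I’m supplying for illustration (lower heating values, hydrogen at atmospheric pressure), not numbers from Olah’s book:

```python
# Illustrative comparison of volumetric energy density:
# hydrogen gas at 1 atm versus liquid methanol.
# Approximate handbook values, supplied for illustration only.

h2_lhv = 120.0        # lower heating value of hydrogen, MJ/kg
h2_density = 0.0899   # density of hydrogen gas at 0 C and 1 atm, kg/m^3
meoh_lhv = 19.9       # lower heating value of methanol, MJ/kg
meoh_density = 792.0  # density of liquid methanol, kg/m^3

h2_vol = h2_lhv * h2_density        # ~11 MJ per cubic metre of gas
meoh_vol = meoh_lhv * meoh_density  # ~16,000 MJ per cubic metre of liquid

print(f"Hydrogen gas: {h2_vol:.1f} MJ/m^3")
print(f"Methanol:     {meoh_vol:.0f} MJ/m^3")
print(f"Methanol stores roughly {meoh_vol / h2_vol:.0f}x more energy per unit volume")
```

A tank of liquid methanol holds on the order of a thousand times more energy than the same volume of hydrogen at atmospheric pressure, which is why hydrogen must be compressed to hundreds of bar, liquefied, or bound in a storage material before it is practical in a vehicle.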
The advantage of methanol as a fuel is that it is entirely compatible with the existing infrastructure for distributing and using gasoline; pipes, pumps and tanks would simply need some gaskets changed to switch over to the new fuel. Methanol is an excellent fuel for internal combustion engines; even the most hardened petrol-head should be convinced by the performance figures of a recently launched methanol powered Lotus Exige. However, in the future, greater fuel efficiency might be possible using direct methanol fuel cells if that technology can be improved.
Currently methanol is made from natural gas, but in principle it should be possible to make it economically by reacting carbon dioxide with hydrogen. Given a clean source of energy to make hydrogen (Olah is an evangelist for nuclear power, but if the scaling problems for solar energy were solved that would work too), one could recycle the carbon dioxide from fossil fuel power stations, in effect getting one more pass of energy out of it before releasing it into the atmosphere. Ultimately, it should be possible to extract carbon dioxide directly from the atmosphere, achieving in this way an almost completely carbon-neutral energy cycle. In addition to its use as a transportation fuel, it is also possible to use methanol as a feedstock for the petrochemical industry. In this way we could, in effect, convert atmospheric carbon dioxide into plastic.
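As a back-of-the-envelope illustration of the carbon accounting, the synthesis reaction is CO2 + 3 H2 → CH3OH + H2O, and standard molar masses give the mass balance (the numbers here are my own arithmetic, not figures from Olah’s book):

```python
# Mass balance for methanol synthesis from carbon dioxide:
#   CO2 + 3 H2 -> CH3OH + H2O
# Molar masses in g/mol (standard values).

M_CO2 = 44.01
M_H2 = 2.016
M_CH3OH = 32.04

# Tonnes of CO2 consumed and H2 required per tonne of methanol produced.
co2_per_t_meoh = M_CO2 / M_CH3OH      # ~1.37 t of CO2
h2_per_t_meoh = 3 * M_H2 / M_CH3OH    # ~0.19 t of H2

print(f"CO2 consumed: {co2_per_t_meoh:.2f} t per tonne of methanol")
print(f"H2 required:  {h2_per_t_meoh:.2f} t per tonne of methanol")
```

So each tonne of methanol locks up nearly 1.4 tonnes of carbon dioxide, at the cost of roughly 190 kg of cleanly generated hydrogen – which is where the clean energy source comes in.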
If one wants to comment on the future of technology, it’s a good idea to have some understanding of its history. A new book by Robert Friedel,
A Culture of Improvement: Technology and the Western Millennium, takes on the ambitious task of telling the story of the development of technology in Europe and North America over the last thousand years.
The book is largely a very readable narrative history of technology, with some rather understated broader arguments. One theme is suggested by the title; in Friedel’s view the advance of technology has been driven, not so much by the spectacular advances of the great inventors, but by a mindset that continually seeks to make incremental improvements in existing technologies. The famous inventors, the James Watts and Alexander Graham Bells of history, certainly get due space, but there’s also an emphasis on placing the best-known inventions in the context of the less well known precursor technologies from which they sprang, and on the way engineers and workers continuously improved the technologies once they were introduced. Another theme is the way in which the culture of improvement was locked into place, as it were, by the institutions that promoted technical and scientific education, and the media that brought new scientific and technical ideas to a wide audience.
This provokes some revision of commonly held ideas about the relationship between science and engineering. In Friedel’s picture, the role of science has been, less to provide fundamental discoveries that engineers can convert into practical devices, and more to provide the mental framework that permits the process of incremental improvement. Those who wish to de-emphasise the importance of science for innovation often point to the example of the development of the steam engine – “thermodynamics owes much more to the steam engine than the steam engine owes to thermodynamics”, the saying goes. This of course is true as far as it goes – the academic subject of thermodynamics was founded by Sadi Carnot’s analysis of the steam engines that were already in widespread use, and which had been extensively developed without the benefit of much theoretical knowledge. But it neglects the degree to which an understanding of formal thermodynamics underlay the development of the more sophisticated types of engines that are still in use today. Rudolf Diesel’s efforts to develop the engine that bears his name, and which is now so important, were based on an explicit project to use the thermodynamics he had learned from his professor, Carl von Linde (who also made huge contributions to the technology of refrigeration), to design the most efficient possible internal combustion engine.
Some aspects of the book are open to question. The focus on Europe, and the European offshoots in North America, is justified by the premise that there was something special in this culture that led to the “culture of improvement”; one could argue, though, that the period of unquestioned European technological advantage was a relatively short fraction of the millennium under study (it’s arguable, for example, that China’s medieval technological lead over Europe persisted well into the 18th century). And many will wonder whether technological advances always lead to “improvement”. A chapter on “the corruption of improvement” discusses the application of technology to weapons of mass destruction, but one feels that Friedel’s greatest revulsion is prompted by the outcome of the project to apply the culture of improvement to the human race itself. It’s useful to be reminded that the outcome of this earlier project for “human enhancement” was, particularly in the USA and Scandinavia, a programme of forced sterilisation of those deemed unfit to reproduce that persisted well into living memory. In Germany, of course, this “human enhancement” project moved beyond sterilisation to industrial-scale systematic murder of the disabled and those who were believed to be threats to “racial purity”.
As I mentioned on Wednesday, the UK government took the opportunity of Thursday’s nano-summit organised by the consumer advocate group Which? to release a statement about nanotechnology. The Science Minister’s speech didn’t announce anything new or dramatic – the minister did “confirm our commitment to keep nanotechnology as a Government priority”, though as the event’s chair, Nick Ross, observed, the Government has a great many priorities. The full statement (1.3 MB PDF) is at least a handy summary of what otherwise would be a rather disjointed set of measures and activities.
The other news from the Which? event was the release of the report from their Citizen’s Panel. Some summaries, as well as a complete report, are available from the Which? website. Some flavour of the results can be seen in this summary: “Panellists were generally excited about the potential that nanotechnologies offer and were keen to move ahead with developing them. However, they also recognised the need to balance this with the potential risks. Panellists identified many opportunities for nanotechnologies. They appreciated the range of possible applications and certain specific applications, particularly for health and medicine. The potential to increase consumer choice and to help the environment were also highlighted, along with the opportunity to ‘start again’ by designing new materials with more useful properties. Other opportunities they highlighted were potential economic developments for the UK (and the jobs this might create) and the potential to help developing countries (with food or cheaper energy).” Balanced against this generally positive attitude were concerns about safety, regulation, information, questions about the accessibility of the technology to the poor and the developing world, and worries about possible long-term environmental impacts.
The subject of nanotechnology was introduced at the meeting with this short film.
It seems likely that nanotechnology will move a little higher up the UK news agenda towards the end of this week – tomorrow sees the launch event for the results of a citizens’ panel run by the consumer group Which?. This will be quite a high profile event, with a keynote speech by the Science Minister, Ian Pearson, outlining current UK nanotechnology policy. This will be the first full statement on nanotechnology at Ministerial level for some time. I’m on the panel responding to the findings, which I will describe tomorrow.
Howard Lovy returns to his coverage of nanotechnology in popular culture with news of a forthcoming film, Nano Dogs the Movie, in which some lovable family pets acquire super abilities after scoffing some carelessly abandoned nanobots. Not to be outdone, I’ve been conducting my own in-depth cultural research, which has revealed that no less an icon of Saturday morning children’s TV than Scooby Doo has fully entered the nanotechnology age.
In the current retooling of this venerable cartoon, Shaggy and Scooby Doo Get a Clue, the traditional plot standbys (it was the janitor, back-projecting the ghostly figures onto the clouds, and he’d have got away with it if it hadn’t been for those meddling kids) have been swept away to be replaced by an evil nanobot wielding scientist. But the nanobots aren’t all bad; Scooby Doo’s traditionally energising Scooby snacks have themselves been fortified with nanobots, giving him a number of super-dog powers.
I wasn’t able to follow all the plot twists on Sunday morning, as I had to cook the children’s porridge, but it seems that the imprudent nano-scientist had attempted to mis-use his nanobots in order to make his appearance (formerly plump, ageing, balding and with a bad haircut, as you’d expect) more, well, Californian. Naturally, this all ended badly. I’ve seen some less incisive commentaries on the human (or, indeed, canine) enhancement debate.