Politics and the National Nanotechnology Initiative

The view that the nanobusiness and nanoscience establishment has subverted the originally intended purpose of the USA’s National Nanotechnology Initiative has become received wisdom amongst supporters of the Drexlerian vision of MNT. According to this reading of nanotechnology politics,
any element of support for Drexler’s vision of radical nanotechnology has been stripped out of the NNI to make it safe for mundane, near-term applications of incremental nanotechnology like stain-resistant fabric. This position is succinctly expressed in this editorial in the New Atlantis, which claims that the legislators who supported the NNI did so in the belief that it was the Drexlerian vision they were endorsing.

A couple of points about this position worry me. Firstly, we should be clear that there is an important dividing line in the relationship between science and politics that any country should be very wary of crossing. In a democratic country, it’s absolutely right that the people’s elected representatives should have the final say about what areas of science and technology are prioritised for public spending, and indeed what areas of science are left unpursued. But we need to be very careful to make sure that this political oversight of science doesn’t spill over into ideological statements about the validity of particular scientific positions. If supporters of MNT were to argue, on what are essentially ideological grounds, that the government should overrule the judgement of the scientific community about which approach to radical nanotechnology is most likely to work, then I’d suggest they recall the tragic and unedifying history of similar interventions in the past. Biology in the Soviet Union was set back for a generation by Lysenko, who, unable to persuade his colleagues of the validity of his theories of heredity, appealed directly to Stalin. Such perversions aren’t restricted to totalitarian states; Edward Teller used his high-level political connections to impose his vision of the x-ray laser on the USA’s defense research establishment, in the face of almost universal scepticism from other physicists. The physicists were right, and the program was abandoned, but not before many billions of dollars had been wasted.

But there’s a more immediate criticism of the theory that the NNI has been hijacked by nanopants: it simply isn’t right, even from the point of view of Drexler’s supporters. The muddle and inconsistency come across most clearly on the Center for Responsible Nanotechnology’s
blog. While this entry strongly endorses the New Atlantis line, this entry from only a few weeks earlier expresses the opinion that the most likely route to radical nanotechnology will come through wet, soft and biomimetic approaches. Of course, I agree with this (though my vision of what radical nanotechnology will look like is very different from that of supporters of MNT); it is the position I take in my book Soft Machines, and it is also an approach recommended by Drexler himself. Looking across at the USA, I see some great and innovative science being done along these lines. Just look at the work of Ned Seeman, Chad Mirkin, Angela Belcher or Carlo Montemagno, to take four examples that come immediately to mind. Who is funding this kind of work? It certainly isn’t the Foresight Institute – no, it’s all those government agencies that make up the much-castigated National Nanotechnology Initiative.

Of course, supporters of MNT will say that, although this work may be moving in the direction that they think will lead to MNT, it isn’t being done with that goal explicitly stated. To this, I would simply ask whether it isn’t a tiny bit arrogant of the MNT visionaries to think that they are in a better position to predict the outcome of these lines of inquiry than the people who are actually doing the research.

Whenever science funding is allocated, there is a real tension between the short-term and the long-term, and this is a legitimate bone of contention between politicians and legislators, who want to see immediate results in terms of money and jobs for the people they represent, and scientists and technologists with longer term goals. If MNT supporters were simply to argue that the emphasis of the NNI should be moved away from incremental applications towards longer term, more speculative research, then they’d find a lot of common cause with many nanoscientists. But it doesn’t do anyone any good to confuse these truly difficult issues with elaborate conspiracy theories.

Politics in the UK

Some readers may have noticed that we are in the middle of an election campaign here in the UK. Unsurprisingly, science and technology have barely been mentioned at all by any of the parties, and I don’t suppose many people will be basing their voting decisions on science policy. It’s nonetheless worth commenting on the parties’ plans for science and technology.

I discussed the Labour Party’s plans for science for the next three years here – these foresee significant real-terms increases in science funding. The Conservative Party has promised to “at least match the current administration’s spending on science, innovation and R&D”. However, the Conservatives’ spending plans are predicated on finding £35 billion in “efficiency savings”, of which £500 million is to come from reforming the Department of Trade and Industry’s business support programmes. I believe the £200 million support for nanotechnology discussed here comes under this heading, so I think the status of these programmes in a Conservative administration would be far from assured. The Liberal Democrats take a simpler view of the DTI – they plan simply to abolish it, and move science to the Department for Education.

So, on fundamental science support, there seems to be a remarkable degree of consensus, with no-one seeking to roll back the substantial increases in science spending that the Labour Party has delivered. The arguments really are on the margins, about the role of government in promoting applied and near-market research in collaboration with industry. I have some very serious misgivings about the way in which the DTI has handled its support for micro- and nanotechnology. In principle, though, I do think it is essential that the UK government provides such support to businesses, if only because all other governments around the world (including, indeed perhaps especially, the USA) practise exactly this sort of interventionist policy.

Nobel Laureates Against Nanotechnology

This small but distinguished organisation has gained another two members. The theoretical condensed matter physicist Robert Laughlin, in his new book A Different Universe: reinventing physics from the bottom down, offers a rather scathing assessment of nanotechnology, with which Philip Anderson (himself a Nobel Laureate and a giant of theoretical physics), reviewing the book in Nature (subscription required), concurs. Unlike Richard Smalley, Laughlin directs his criticism at the academic version of nanotechnology rather than the Drexlerian version, but adherents of the latter shouldn’t feel too smug, because his criticism applies with even more force to their vision. He blames the seductive power of reductionist belief for the delusion: “The idea that nanoscale objects ought to be controllable is so compelling it blinds a person to the overwhelming evidence that they cannot be”.

Nanotechnologists aren’t the only people singled out for Laughlin’s scorn. Other targets include quantum computing, string theory (“the tragic consequence of an obsolete belief system”) and most of modern biology (“an endless and unimaginably expensive quagmire of bad experiments”). But underneath all the iconoclasm and attitude (and personally I blame Richard Feynman for making all American theoretical physicists want to come across like rock stars) there is a very serious message.

Laughlin’s argument is that reductionism should be superseded as the ruling ideology of science by the idea of emergence. To quote Anderson: “The central theme of the book is the triumph of emergence over reductionism: that large objects such as ourselves are the product of principles of organization and of collective behaviour that cannot in any meaningful sense be reduced to the behaviour of our elementary constituents.” The idea originates with Anderson himself, in a widely quoted article from 1972 – More is different. In this view, the idea that physics can find a “Theory of Everything” is fundamentally wrong-headed. Chemistry isn’t simply the application of quantum mechanics, and biology is not simply reducible to chemistry; the organisational principles that underlie, say, the laws of genetics are just as important as the properties of the things being organised.

Anderson’s views on emergence aren’t as widely known as they should be, in a world dominated by popular science books on string theory and “the search for the God particle”. But they have been influential; an intervention by Anderson is credited or blamed by many people for killing off the Superconducting Supercollider project, and he is one of the founding fathers of the field of complexity. Laughlin explicitly acknowledges his debt to Anderson, but he holds to a particularly strong version of emergence; it isn’t just that there are difficulties in practice in deriving higher-level laws of organisation from the laws describing the interactions of their parts. Because the organisational principles themselves are more important than the detailed nature of the interactions between the things being organised, the reductionist program is wrong in principle, and there’s no sense in which the laws of quantum electrodynamics are more fundamental than the laws of genetics (in fact, Laughlin argues on the basis of the strong analogies between QED and condensed matter field theory that QED itself is probably emergent). To my (philosophically untrained) eye, this seems to put Laughlin’s position quite close to that of the philosopher of science Nancy Cartwright. There’s some irony in this, because Cartwright’s book The Dappled World was bitterly criticised by Anderson himself.

This takes us a long way from nanoscience and nanotechnology. It’s not that Laughlin believes the field is unimportant; in fact he describes the place where nanoscale physics and biology meet as the current frontier of science. But it’s a place that will only be understood in terms of emergent properties. Some of these, like self-assembly, are starting to be understood, but many others are not. What is clear, though, is that the reductionist approach of trying to impose simplicity where it doesn’t exist in nature simply won’t work.

How are we doing?

Howard Lovy’s Nanobot draws attention to an interesting piece in SciDevNet discussing bibliometric measures of the volume and impact of nanotechnology research in various parts of the world. This kind of measurement – in which databases are used to count the number of papers published and the number of times those papers are cited by others – is currently very popular among governments attempting to assess whether the investments they make in science are worthwhile. I was shown a similar set of data about the UK, commissioned by the Engineering and Physical Sciences Research Council, at a meeting last week. The attractions of this kind of analysis are obvious: it is relatively easily commissioned and done, and it yields results that can be plotted in plausible, scientific-looking graphs.

The drawbacks are perhaps less obvious, but they are rather serious. How do you tell which papers are actually about nanotechnology, given the difficulties of defining the subject? The obvious thing to do is to search for papers with “nano” somewhere in the title or abstract – this is what the body charged with evaluating the USA’s National Nanotechnology Initiative has done. What’s wrong with this is that many of the best papers on nanotechnology simply don’t feel the need to include the nano- word in their title. Why should they? The title tells us what the paper is about, which is generally a much more restricted and specific subject than this catch-all word. I’ve been looking up papers on single-molecule electronics today. I’d have thought that everyone would agree that the business of trying to measure the electrical properties of single molecules, one at a time, and wiring them up to make ultra-miniaturised electronic devices, is as hardcore as nanotechnology comes. But virtually none of the crucial papers on the subject over the last five years would have shown up in such a search.
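To make concrete just how crude this keyword approach is, here is a minimal sketch, in Python, of the sort of counting such studies rely on. The paper records, field names and numbers below are entirely hypothetical, invented for illustration; real analyses query citation databases rather than a list held in memory.

```python
# A hypothetical illustration of keyword-based bibliometrics: count papers
# (and their citations) that mention "nano" in the title or abstract.
from dataclasses import dataclass


@dataclass
class Paper:
    title: str
    abstract: str
    citations: int


def counted_as_nano(paper: Paper) -> bool:
    """Crude test: does 'nano' appear anywhere in the title or abstract?"""
    text = f"{paper.title} {paper.abstract}".lower()
    return "nano" in text


# Invented example records, for illustration only.
papers = [
    Paper("Nanoparticle self-assembly at interfaces",
          "We study the nanoscale ordering of colloids...", 120),
    Paper("Electron transport through a single molecule",
          "We wire individual molecules between gold electrodes...", 300),
]

nano_papers = [p for p in papers if counted_as_nano(p)]
print(f"{len(nano_papers)} 'nano' papers, "
      f"{sum(p.citations for p in nano_papers)} citations counted")
# Note that the single-molecule electronics paper - as hardcore as
# nanotechnology comes - is missed entirely, because its title and
# abstract never use the word "nano".
```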

The big picture these studies paint does ring true: the majority of research in nanoscience and nanotechnology is done outside the USA, and this kind of research in China has been growing exponentially in both volume and impact in recent years. But we shouldn’t take the numbers too seriously; if we do, it’s only a matter of time before some science administrator realises that the road to national nanotechnology success is simply to order all the condensed matter physicists, chemists and materials scientists to stick “nano-” somewhere in the titles of all their papers.

Debating nanotechnologies

To the newcomer, the nanotechnology debate must be very confusing. The idea of a debate implies two sides, but there are many actors debating nanotechnology, and they don’t even share a common understanding of what the word means. The following extended post summarises my view of this many-faceted discussion. Regular readers of Soft Machines will recognise all the themes, but I hope that newcomers will find it helpful to find them all in one place.

Nanotechnology has become associated with some very far-reaching claims. Its more enthusiastic adherents believe that it will be utterly transformational in its effects on the economy and society, making material goods of all sorts so abundant as to be essentially free, restoring the environment to a pristine condition, and revolutionising medicine to the point where death can be abolished. Nanotechnology has been embraced by governments all over the world as a source of new wealth, with the potential to take the place of information technology as a driver for rapid economic growth. Breathless extrapolations of a new, trillion-dollar nanotechnology industry arising from nowhere are commonplace. These optimistic visions have led to new funding being lavished on scientists working on nanotechnology, with the total amount being spent a subject of competition between governments across the developed world. As an antidote to all this optimism, NGOs and environmental groups have begun to mobilise against what they see as another example of excessive scientific and technological hubris, falling squarely in the tradition of nuclear energy and genetic modification: technologies which promised great things but delivered, in their view, environmental degradation and social injustice.

And yet, despite this superficial agreement on the transformational power of nanotechnology, whether for good or bad, there are profound disagreements not just about what the technology can deliver, but about what it actually is. The most radical visions originate from the writings of K. Eric Drexler, who wrote an influential and widely read book called “Engines of Creation”. This popularised the term “nanotechnology”, developing the idea that mechanical engineering principles could be applied on a molecular scale to create nano-machines which could build up any desired material or artefact with ultimate precision, atom by atom. It is this vision of nanotechnology, subsequently developed by Drexler in his more technical book Nanosystems, that has entered popular culture through films and science fiction books, perhaps most notably in Neal Stephenson’s novel “The Diamond Age”.

To many scientists, science fiction novels are where Drexler’s visions of nanotechnology should stay. In a falling-out which has become personally vituperative, leading scientific establishment figures, notably the Nobel Laureate Richard Smalley, have publicly ridiculed the Drexlerian project of shrinking mechanical engineering to molecular dimensions. What dominates the scientific research agenda is not the single Drexlerian vision, but instead a rather heterogeneous collection of technologies, whose common factor is simply a question of scale. These evolutionary nanotechnologies typically involve the shrinking down of existing technologies, notably in information technology, to smaller and smaller scales. Some of the products of these developments are already in the shops. The very small, high-density hard disk drives that are now found not just in computers, but in consumer electronics like MP3 players and digital video recorders, rely on the ability to create nanoscale multilayer structures which have entirely new physical properties like giant magnetoresistance. Not yet escaped from the laboratory are new technologies like molecular electronics, in which individual molecules play the role of electronic components. Formidable obstacles remain before these technologies can be integrated to form practical devices that can be commercialised, but the promise is yet another dramatic increase in computing power. Medicine should also benefit from the development of more sophisticated drug delivery devices; this kind of nanotechnology will also play a major role in the development of tissue engineering.

What of the products that are already on shop shelves, boasting of their nanotechnological antecedents? There are two very well publicised examples. The active ingredient in some sunscreens consists of titanium dioxide crystals whose sizes are in the nanoscale range. In this size range, the crystals, and thus the sunscreen, are transparent to visible light, rather than having the intense white characteristic of the larger titanium dioxide crystals familiar in white emulsion paint. Another widely reported application of nanotechnology is in fabric treatments, which give textile fibres properties such as stain resistance by coating them with layers of molecular thickness. These applications, although mundane, result from the principle that matter, when divided on this very fine scale, can have different properties to bulk matter. However, it has to be said that these kinds of products represent the further development of trends in materials science, colloid science and polymer science that have been in train for many years. This kind of incremental nanotechnology, then, does involve new and innovative science, but it isn’t different in character to other applications of materials science that may not have the nano- label. To this extent, the decision to refer to these applications as nanotechnology involves marketing as much as science. But what we will see in the future is more and more of this kind of application making its way to the marketplace, offering real, if not revolutionary, advances over the products that have gone before. These developments won’t be introduced by a single “nanotechnology industry”; rather, these innovations will find their way into the products of all kinds of existing industries, often in rather an unobtrusive way.

The idea of a radical nanotechnology, along the lines mapped out by Drexler and his followers, has thus been marginalised on two fronts. Those interested in developing the immediate business applications of nanotechnology have concentrated on the incremental developments that are close to bringing products to market now, and are keen to downplay the radical visions because they detract from the business credibility of their short-term offerings. Meanwhile the nano-science community is energetically pursuing its own evolutionary agenda. Is it possible that both scientists and the nanobusiness community are too eagerly dismissing Drexler’s ideas – could there be, after all, something in the idea of a radical nanotechnology?

My personal view is that while some of Smalley’s specific objections don’t hold up in detail, and it is difficult to dismiss the Drexlerian proposals out of hand as being contrary to the laws of nature, the practical obstacles they face are very large. To quote Philip Moriarty, an academic nanoscientist with a great deal of experience of manipulating single molecules, “the devil is in the details”, and as soon as one starts thinking through how one might experimentally implement the Drexlerian program a host of practical problems emerge.

But one aspect of Drexler’s argument is very important, and undoubtedly correct. We know that a radical nanotechnology, with sophisticated nanoscale machines operating on the molecular scale, can exist, because cell biology is full of such machines. This is beautifully illustrated in David Goodsell’s recent book Bionanotechnology: Lessons from Nature. But Drexler goes further. He argues that if nature can make effective nanomachines from soft and floppy materials, with the essentially random design processes of evolution, then the products of a synthetic nanotechnology, using the strongest materials and the insights of engineering, will be very much more effective. My own view (developed in my book “Soft Machines”) is that this underestimates the way in which biological nanotechnology exploits, and is optimised for, the peculiar features of the nanoscale world. To take just one example of a highly efficient biological nanomachine, ATP-synthase is a remarkable rotary motor which life-forms as different as bacteria and elephants all use to synthesise the energy storage molecule ATP. The efficiency with which it converts energy from one form to another is very close to 100%, a remarkable result when one considers that most human-engineered energy conversion devices, such as steam turbines and petrol engines, struggle to exceed 50% efficiency. This is one example, then, of a biological nanomachine that is close to optimal. The reason for this is that biology uses design principles very different to those we learn about in human-scale engineering, principles that exploit the special features of the nanoworld. There’s no reason in principle why we could not develop a radical nanotechnology that uses the same design principles as biology, but the result will look very different to the miniaturised cogs and gears of the Drexlerian vision. Radical nanotechnologies will be possible, then, but they will owe more to biology than to conventional engineering.

Discussion of the possible impacts of nanotechnology, both positive and negative, has shown signs of becoming polarised along the same lines as the technical discussion. The followers of Drexler promise, on the one hand, a world in which all material needs are met in abundance, and an end to disease and death. But they’ve also introduced perhaps the most persistent and gripping notion – the idea that artificial, self-replicating nanoscale robots could escape our control and reproduce indefinitely, consuming all the world’s resources and rendering existing life extinct. The idea of this plague of “grey goo” has become firmly embedded in our cultural consciousness, despite some indications of regret from Drexler, who has more recently emphasised that self-replication is neither a desirable nor a necessary feature of a nanoscale robot. The reaction of nano-scientists and business people to the idea of “grey goo” has been one of open ridicule. Actually, it is worth taking the idea seriously enough to give it a critical examination. Implicit in the notion of “grey goo” is the assumption that we will be able to engineer what is effectively a new form of life that is fitter, in a Darwinian sense, and better able to prosper in the earth’s environment than existing life-forms. But if, as argued above, biology at the cell level is already close to optimal for the environment of the earth, then the idea that synthetic nano-robots would enjoy an effortless superiority over natural lifeforms is much more difficult to sustain.

Meanwhile, mainstream nanobusiness and nanoscience have concentrated on one very short-term danger, the possibility that new nanoparticles may be more toxic than their macroscale analogues and precursors. This fear is very far from groundless; since one of the major selling points of nanoparticles is that their properties may be different from those of the analogous matter in a less finely divided state, it isn’t at all unreasonable to worry that toxicity may be another property that depends on size. But I can’t help feeling that there is something odd about the way the debate has become so focused on this one issue; it’s an unlikely alliance of convenience between nanobusiness, nanoscience, government and the environmental movement, all of whom have different reasons for finding it a convenient focus. For the environmental movement, it fits a well-established narrative of reckless corporate interests releasing toxic agents into the environment without due care and attention. For nanoscientists, it’s a very contained problem which suggests a well-defined research agenda (and the need for more funding). By tinkering with regulatory frameworks, governments can be seen to be doing something, and nanobusiness can demonstrate its responsibility by its active participation in the process.

The dominance of nanoparticle toxicity in the debate is a vivid illustration of a danger that James Wilsdon has drawn attention to – the tendency for all debates on the impact of science on society to end up exclusively focused on risk assessment. In the words of a pamphlet by Willis and Wilsdon – “See-through Science” – “in the ‘risk society’ perhaps the biggest risk is that we never get around to talking about anything else.” Nanotechnology – even in its evolutionary form – presents us with plenty of very serious things to talk about. How will privacy and civil liberties survive in a world in which every artefact, no matter how cheap, includes a networked computer? How will medical ethics deal with a blurring of the line between the human and the machine, and the line between remedying illness and enhancing human capabilities?

Some people argue that new technologies like nanotechnology are potentially so dehumanising that we should consciously relinquish them. Bill McKibben, for example, makes this case very eloquently in his book “Enough”. Although I have a great deal of sympathy with McKibben’s rejection of the values of the trans-humanists, who consciously seek to transcend humanity, I don’t think the basic premise of his thesis is tenable. The technology we have already is not enough. Mankind depends for its very existence at current population levels on technology. To take just one example, our agriculture depends on the artificial fixation of nitrogen, which is made possible by the energy we derive from fossil fuels. And yet the shortcomings of our existing technologies are quite obvious, from the eutrophication caused by excessive use of synthetic fertilisers to the prospect of global climate change as a result of our dependence on fossil fuels. As the population of the world begins to stabilise, we have the challenge of developing new technologies that will allow the whole population of the world to enjoy decent standards of living on a sustainable basis. Nanotechnology could play an important role, for example by delivering cheap solar cells and the infrastructure for a hydrogen economy, together with cheap ways of providing clean water. But there’ll need to be real debates about how to set priorities so that the technology brings benefits to the poor as well as the rich.

Soft Machines in agreement with the ETC group shock…

Soft Machines is making a guest appearance on Howard Lovy’s Nanobot, with my impressions of the event at the Science Museum at which the Science Minister, Lord Sainsbury, announced the government response to the Royal Society report on nanotechnology. Howard had hoped that by juxtaposing my report with the report of the ETC group’s Jim Thomas, he’d have an interesting point-counterpoint. Remarkably, Jim and I seem to be rather more in agreement than usual.

I’ll give a more detailed analysis of the government’s written response here later.

Nanotechnology moves up the UK news agenda again

I arrived at my office after my afternoon lecture today to find a note saying a film crew was arriving in 30 minutes; sure enough my colleague, Tony Ryan, and I spent a couple of hours filming interviews amid the bubbling flasks of the chemistry department talking about what nanotechnology is, is not, and might become. This will be boiled down to about a minute and a half on Yorkshire Television’s early evening news magazine. Such is the lot of a would-be science populariser.

The reason for this timing is a bit of pre-positioning that’s going on by the media in the UK at the moment. We’re expecting some significant nanotechnology related news on Friday, so people are getting their stories ready.

Quotations for the week

This week’s quotation on Soft Machines comes from that pioneer of British empiricism, Sir Francis Bacon:

It cannot be that axioms established by argumentation can suffice for the discovery of new works, since the subtlety of nature is greater many times than the subtlety of argument.

I write this with Philip Moriarty in mind, since he’s going to be taking a break from participating in debates on Soft Machines and elsewhere. I would like to record my gratitude to Philip, who has made a tremendous contribution to this blog over the last couple of months, and a really important one to the wider debate, not least by forcefully reminding us how subtle and complex surface physics can be. As another oft-quoted saying goes (usually attributed to Wolfgang Pauli):

God made solids, but surfaces were the work of the devil.

Who do we think we are?

I’m grateful for this glowing endorsement from TNTlog, and I’m impressed that it takes as few as two scientist bloggers to make a trend. But I’m embarrassed that, in his response, Howard Lovy seems to have taken the implied criticism so personally. I’ve always enjoyed reading NanoBot. I don’t always agree with Howard’s take on various issues, but he’s always got interesting things to say, and his insistence on the importance of appreciating the interaction between nanotechnology and wider culture is spot-on.

But I think Howard’s pained sarcasm – “Scientists, go write about yourselves, and we in the public will read with wide-eyed wonder about the amazing work you’re doing and thank you for lowering yourselves to speak what you consider to be our language” – misses the mark. There are many ways in which scientists can contribute to this debate besides this crude and demeaning de haut en bas caricature, and many of them respond to real deficiencies in the way mainstream journalists cover science.

To many journalists, science is marked by breakthroughs, which are conveniently announced by press releases from publicity-hungry university or corporate press offices, or from the highly effective news offices of the scientific glamour magazines, Nature and Science. But scientists never read press releases, and they very rarely write them, because the culture of science doesn’t marry at all well with journalism’s event-driven way of working. Very rarely, real breakthroughs are made, though often their significance isn’t recognised at the time. But the usual pattern is one of incremental advances, continuous progress and a mixture of cooperation and competition between labs across the world working in the same area. If scientists can write about science as it really is practised, with all its debates and uncertainties, unfiltered by press offices, that seems to me to be entirely positive. It’s also less likely, rather than more likely, to lead to the glorification and self-aggrandisement of scientists that Howard seems to think is our aim.

Converging technologies in Europe and the USA

Last Thursday saw a meeting in London to introduce to the UK a report that came out last summer on the convergence of nanotechnology, biotechnology, information technology and neuroscience. Converging technologies for a diverse Europe can essentially be thought of as the European answer to the 2002 report from the USA, Converging Technologies for Improving Human Performance. The speaker line-up, besides me, included social scientists, futurologists, an arms control expert and an official from the European Commission. What was striking to me was how much this debate was framed in terms of Europe trying to position itself somewhat apart from the USA, though perhaps this isn’t surprising in view of the broader flow of international politics at the moment.

It’s almost a cliché that public opinion is very different on the two continents, with the USA being much more uninhibited in its welcoming of new technology than the more technophobic Europeans. George Gaskell, a sociologist from the London School of Economics, presented survey data that at first seems to confirm this view. In his 2002 surveys, he found that while 50% of people in the USA were sure that nanotechnology would be positive in its outcome, only 29% of Europeans were so optimistic. But the picture isn’t as simple as it first appears; the figures for the proportion who thought that nanotechnology would make things worse were not actually that different – 4% in the USA compared to 6% in Europe. The Europeans were simply taking the attitude that they didn’t know enough to judge. The absence of any across-the-board distrust of technology is shown by a comparison of attitudes to three key technologies – nuclear energy, computers and information technology, and biotechnology. The data showed almost overwhelming opposition to nuclear power, equally overwhelming enthusiasm for computers and communication technology, and a mixed picture for biotech. The key issue for acceptance proves not to be any deep enthusiasm for, or distrust of, technology in general; it’s simply a balance of benefits and risks, together with a judgement on how far the governance and regulation of the technology can be trusted.

Where there is a big difference between Europe and the USA is in the importance of the military in driving research. Jürgen Altmann, a physicist turned arms-control expert from the University of Dortmund, is very worried about the military applications of nanotechnology, and his worries are nicely summarised in this pdf handout. His view is that the USA is currently running an arms race against itself, wasting resources that could otherwise be used both to boost economic competitiveness and to counter, by more appropriate and low-tech means, the real threat that both the USA and Europe face. Others, of course, will differ on the nature of the threat and the best way to counter it.

The balance between civil and military research and development was also highlighted by Elie Faroult, from the Research Directorate of the European Commission, who pointed out with some glee that the EU was now considerably ahead of the USA in investment in most civil research, and that this trend is accelerating as the USA squeezes spending on non-military science. For him, this gave Europe the opportunity to develop a distinctive set of research goals which emphasised social coherence and environmental sustainability as well as economic competitiveness. But having taken the obligatory side-swipe at the USA, he finished by saying that, of course, looking to the future, it wasn’t the USA that Europe was really in competition with. The real competitor for both the USA and Europe was China.