Nobel Laureates Against Nanotechnology

This small but distinguished organisation has gained another two members. The theoretical condensed matter physicist Robert Laughlin, in his new book A Different Universe: reinventing physics from the bottom down, has a rather scathing assessment of nanotechnology, with which Philip Anderson (himself a Nobel Laureate and a giant of theoretical physics), reviewing the book in Nature (subscription required), concurs. Unlike Richard Smalley’s, Laughlin’s criticism is directed at the academic version of nanotechnology rather than the Drexlerian version, but adherents of the latter shouldn’t feel too smug, because Laughlin’s criticism applies with even more force to their vision. He blames the seductive power of reductionist belief for the delusion: “The idea that nanoscale objects ought to be controllable is so compelling it blinds a person to the overwhelming evidence that they cannot be”.

Nanotechnologists aren’t the only people singled out for Laughlin’s scorn. Other targets include quantum computing, string theory (“the tragic consequence of an obsolete belief system”) and most of modern biology (“an endless and unimaginably expensive quagmire of bad experiments”). But underneath all the iconoclasm and attitude (and personally I blame Richard Feynman for making all American theoretical physicists want to come across like rock stars) is a very serious message.

Laughlin’s argument is that reductionism should be superseded as the ruling ideology of science by the idea of emergence. To quote Anderson: “The central theme of the book is the triumph of emergence over reductionism: that large objects such as ourselves are the product of principles of organization and of collective behaviour that cannot in any meaningful sense be reduced to the behaviour of our elementary constituents.” The origin of this idea is Anderson himself, in a widely quoted article from 1972 – More is Different. In this view, the idea that physics can find a “Theory of Everything” is fundamentally wrong-headed. Chemistry isn’t simply the application of quantum mechanics, and biology is not simply reducible to chemistry; the principles of organisation that underlie, say, the laws of genetics are just as important as the properties of the things being organised.

Anderson’s views on emergence aren’t as widely known as they should be, in a world dominated by popular science books on string theory and “the search for the God particle”. But they have been influential; an intervention by Anderson is credited or blamed by many people for killing off the Superconducting Supercollider project, and he is one of the founding fathers of the field of complexity. Laughlin explicitly acknowledges his debt to Anderson, but he holds to a particularly strong version of emergence; it isn’t just that there are practical difficulties in deriving higher-level laws of organisation from the laws describing the interactions of their parts. Because the organisational principles themselves are more important than the detailed nature of the interactions between the things being organised, the reductionist programme is wrong in principle, and there’s no sense in which the laws of quantum electrodynamics are more fundamental than the laws of genetics (in fact, Laughlin argues, on the basis of the strong analogies between QED and condensed matter field theory, that QED itself is probably emergent). To my (philosophically untrained) eye, this seems to put Laughlin’s position quite close to that of the philosopher of science Nancy Cartwright. There’s some irony in this, because Cartwright’s book The Dappled World was bitterly criticised by Anderson himself.

This takes us a long way from nanoscience and nanotechnology. It’s not that Laughlin believes the field is unimportant; in fact, he describes the place where nanoscale physics and biology meet as the current frontier of science. But it’s a place that will only be understood in terms of emergent properties. Some of these, like self-assembly, are starting to be understood, but many others are not. What is clear, though, is that the reductionist approach of trying to impose simplicity where it doesn’t exist in nature simply won’t work.

Nanotechnology and the developing world

There’s a rather sceptical commentary from Howard Lovy about a BBC report on a study from Peter Singer and coworkers. At the centre of the report is a list of areas in which the authors feel that nanotechnology can make positive contributions to the developing world. Howard’s piece attracted some very sceptical comments from Jim Thomas, of the ETC Group. Jim is very suspicious of high-tech “solutions” to the problems of the developing world which don’t take account of local cultures and conditions. In particular, he sees the role of multinational companies as being particularly problematic, especially with regard to issues of ownership, control and intellectual property.

I see the problem of multinational companies in rather different terms. To take a concrete example, I’d cited the case of insecticide-treated mosquito nets for the control of malaria as a place where nanoscale technology could make a direct impact (and Jim did seem to agree, with some reservations, that this could, in some circumstances, be an appropriate solution). The technical problem with insecticide-treated mosquito nets is that the layer of active material isn’t very robustly attached, so the effectiveness of the nets falls away too rapidly with time, and even more rapidly when the nets are washed. One solution is to use micro- or nano-encapsulation of the insecticide to achieve long-lasting controlled release. The necessary technology to do this is being developed in agrochemical multinationals. The problem, though, is that their R&D efforts are steered by the monetary size of the markets they project; they’d much rather develop termite defences for wealthy suburbanites in Florida than mosquito nets. The problem, then, isn’t that these multinationals will impose technical fixes on the developing world; it’s that they’ll simply ignore the developing world entirely, and potentially valuable technologies won’t reach the places where they could do some good.

Overcoming this market failure will need intervention from governments, foundations and NGOs, as well as some active and informed technology brokering. Looked at in this light, it seems to me that the Singer paper is a useful contribution.

How are we doing?

Howard Lovy’s Nanobot draws attention to an interesting piece in SciDevNet discussing bibliometric measures of the volume and impact of nanotechnology research in various parts of the world. This kind of measurement – in which databases are used to count the number of papers published and the number of times those papers are cited by other papers – is currently very popular among governments attempting to assess whether the investments they make in science are worthwhile. I was shown a similar set of data about the UK, commissioned by the Engineering and Physical Sciences Research Council, at a meeting last week. The attractions of this kind of analysis are obvious: it is relatively easily commissioned and done, and it yields results that can be plotted in plausible and scientific-looking graphs.

The drawbacks are perhaps less obvious, but they are rather serious. How do you tell which papers are actually about nanotechnology, given the difficulties of defining the subject? The obvious thing to do is to search for papers with “nano” somewhere in the title or abstract – this is what the body charged with evaluating the USA’s National Nanotechnology Initiative has done. What’s wrong with this is that the authors of many of the best papers on nanotechnology simply don’t feel the need to include the nano- word in their titles. Why should they? The title tells us what the paper is about, which is generally a much more restricted and specific subject than this catch-all word. I’ve been looking up papers on single molecule electronics today. I’d have thought that everyone would agree that the business of trying to measure the electrical properties of single molecules, one at a time, and wiring them up to make ultra-miniaturised electronic devices, is as hardcore as nanotechnology comes. But virtually none of the crucial papers on the subject over the last five years would have shown up in such a search.
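
To make the point concrete, here is a minimal sketch of the kind of keyword filter being described. This is my own toy illustration, not the actual methodology used by the NNI evaluators or in the SciDevNet study, and the two paper titles are invented for the purpose:

```python
# Toy version of the "count anything with 'nano' in it" approach described above.
# The paper titles below are invented for illustration only.

def looks_like_nano(title: str, abstract: str = "") -> bool:
    """Count a paper as nanotechnology if 'nano' appears in its title or abstract."""
    return "nano" in (title + " " + abstract).lower()

papers = [
    "Carbon nanotube single-electron transistors",  # counted by the filter
    "Electrical conduction through single molecules wired between gold electrodes",
    # the second title is hardcore single-molecule electronics, but the filter misses it
]

for title in papers:
    print(looks_like_nano(title), "-", title)
```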

The big picture that emerges from these studies does ring true: the majority of research in nanoscience and nanotechnology is done outside the USA, and this kind of research in China has been growing exponentially in both volume and impact in recent years. But we shouldn’t take the numbers too seriously; if we do, it’s only a matter of time before some science administrator realises that the road to national nanotechnology success is simply to order all the condensed matter physicists, chemists and materials scientists to stick “nano-” somewhere in the titles of all their papers.

The BBC’s Reith Lectures cover nanotechnology

Every year the BBC broadcasts a series of radio lectures on some rather serious subject, given by an appropriately weighty public intellectual. This year’s series is called “The triumph of technology”, and the fourth lecture (to be broadcast at 8 pm on the 27th April) is devoted to nanotechnology and nanoscience. The lecturer is Lord Broers, who certainly qualifies as a prominent member of that class the British call the Great and Good. He has recently stepped down as Vice-Chancellor of Cambridge University, he’s President of the Royal Academy of Engineering, and he is undoubtedly an ornament to a great number of important committees. But what’s interesting is that he does describe himself as a nanotechnologist. His early academic work was on scanning electron microscopy and e-beam lithography, and before returning to academia he did R&D for IBM.

The introductory lecture – Technology will Determine the Future of the Human Race – has already been broadcast; you can read the text or download an MP3 from the BBC website. This first lecture is rather general, so it will be interesting to see if he develops any of his themes in more unexpected directions.

Disentangling thin polymer films

Many of the most characteristic properties of polymer materials like plastics come from the fact that their long-chain molecules get tangled up. Entanglements between different polymer chains behave like knots, which make a polymer liquid behave like a solid over quite perceptible time scales, just like silly putty. The results of a new experiment show that when you make a polymer film very thin – thinner than the size of an individual polymer molecule – the chains become less entangled with each other, with significant effects on their mechanical properties.

The experiments are published in this week’s Physical Review Letters; I’m a co-author, but the main credit lies with my colleagues Lun Si, Mike Massa and Kari Dalnoki-Veress at McMaster University, Canada. The abstract is here, and you can download the full paper as a PDF (this paper is copyright of the American Physical Society and is available here under the author rights policy of the APS).

This is the latest in a whole series of discoveries of ways in which the properties of polymer films dramatically change when their thicknesses fall towards 10 nm and below. Another example is the discovery that the glass transition temperature of polymer films – the temperature at which a polymer like polystyrene changes from a glassy solid to a gooey liquid – dramatically decreases in thin films. So a material that would in the bulk be a rigid solid may, in a thin enough film, turn into a much less rigid, liquid-like layer (see this technical presentation for more details). Why does this matter? Well, one reason is that, as feature sizes in the microelectronics industry fall below 100 nm, the sharpness with which one can define a line in a thin film of a polymer resist could limit the perfection of the features one is making. So the fact that the mechanical properties of the polymer themselves change, purely as a function of size, could lead to problems.
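
To give a rough sense of the size of this effect, the thickness dependence of the glass transition in thin supported polystyrene films is often summarised in the literature by an empirical fit of the form below; the expression and parameter values are indicative ones quoted from the wider literature, not taken from the paper discussed above.

$$ T_g(h) = T_g(\infty)\left[1 - \left(\frac{A}{h}\right)^{\delta}\right] $$

Here h is the film thickness, T_g(∞) is the bulk glass transition temperature, and A and δ are fitting parameters; for polystyrene, values of roughly A ≈ 3 nm and δ ≈ 1.8 are commonly quoted, so the depression only becomes appreciable once the film is a few tens of nanometres thick or thinner.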

Nanoscience, small science, and big science

Quite apart from the obvious pun, it’s tempting to think of nanoscience as typical small science. Most of the big advances are made by small groups working in universities on research programmes devised, not by vast committees, but by the individual professors who write the grant applications. Equipment is often quite cheap, by scientific standards – a state-of-the-art atomic force microscope might cost $200,000, and doesn’t need a great deal of expensive infrastructure to house it. If you have the expertise and the manpower, but not the money, you could build one yourself for perhaps a tenth of this price or less. This is an attractive option for scientists in developing countries, and it is one reason why nanoscience has become such a popular field in countries like India and China. It’s all very different from the huge and expensive multinational collaborations that are necessary for progress in particle physics, where a single experiment may involve hundreds of scientists and hundreds of millions of dollars – the archetype of big science.

Big science does impact on the nanoworld, though. Techniques that use the highly intense beams of X-rays obtained from synchrotron sources, like the ESRF at Grenoble, France, and the APS on the outskirts of Chicago, USA, have been vital in determining the structure, at the atomic level, of the complex and efficient nanomachines of cell biology. Neutron beams, too, are unique probes of the structure and dynamics of nanoscale objects like macromolecules. To get a beam of neutrons intense enough for this kind of structural study, you either need a research reactor, like the one at the Institut Laue-Langevin in Grenoble (at which I am writing this), or a spallation source, such as ISIS, near Oxford in the UK. The latter consists of a high-energy synchrotron, of the kind developed for particle physics, which smashes pulses of protons into a heavy metal target, producing showers of neutrons.

Synchrotron and neutron sources are run on a time-sharing basis; individual groups apply for time on a particular instrument, and the best applications are allocated a few days of (rather frenetic) experimentation. So in this sense, even these techniques have the character of small science. But the facilities themselves are expensive – the world’s most advanced spallation source for neutrons, the SNS currently being built in Oak Ridge, TN, USA, will cost more than $1.4 billion, and the Japanese source J-PARC, a few years behind the SNS, has a budget of $1.8 billion. With this big money comes real politics. How do you set the priorities for the science that is going to be done, not next year, but in ten years’ time? Do you emphasise the incremental research that you are certain will produce results, or do you gamble on untested ideas that just might produce a spectacular pay-off? This is the sort of rather difficult and uncomfortable discussion I’ve been involved in for the last couple of days – I’m on the Scientific Council of ILL, which has just been having one of its twice-yearly meetings.

Cancer and nanotechnology

There’s a good review in Nature Reviews Cancer (with free access) about the ways in which nanotechnology could help the fight against cancer – Cancer Nanotechnology: Opportunities and Challenges. The article, by Ohio State University’s Mauro Ferrari, concentrates on two themes – how nanotechnology can help diagnose and monitor cancer, and how it could lead to more effective targeting and delivery of anti-cancer agents to tumours.

The extent to which we urgently need better ways of wrapping up therapeutic molecules and getting them safely to their targets is highlighted by a striking figure the article quotes – if you inject monoclonal antibodies and monitor how many of those molecules reach a target within an organ, the fraction is less than 0.01%. The rest are wasted, which is bad news if these molecules are expensive and difficult to make, and even worse news if, like many anti-cancer drugs, they are highly toxic. How can we make sure that every one of these drug molecules gets to where it is needed? One answer is to stuff them into a nanovector – a nanoscale particle that protects the enclosed drug molecules and delivers them to where they are needed. The simplest example of this approach uses a liposome – a bag made from a lipid bilayer. Liposome-encapsulated anti-cancer drugs are now used clinically in the treatment of Kaposi’s sarcoma and breast and ovarian cancers. But lots of work remains to make nanovectors that are more robust, more resistant to non-specific protein adsorption and, above all, specifically targeted to the cells they need to reach. Such specific targeting could be achieved by coating the nanovectors with antibodies with specific molecular recognition properties for groups on the surface of the cancer cells. The article cites one cautionary tale illustrating that this is all more complicated than it looks – a recent simulation suggests that targeting a drug precisely to a tumour can sometimes make matters worse, by causing the tumour to break up. It may be necessary not just to target the drug carriers to a tumour, but to make sure that the spatial distribution of the drug through the tumour is right.

The future will probably see complex nanovectors engineered to perform multiple functions: protecting the drugs, getting them through all the barriers and pitfalls that lie between the point at which the drug is administered and the part of the body where it is needed, and releasing them at their target. The recently FDA-approved breast cancer drug Abraxane is an advance in the right direction; one can think of it as a nanovector that combines two functions. The core of the nanovector consists of a nanoparticulate form of the drug itself; dispersing it so finely dispenses with the need for toxic solvents. And bound to the drug nanoparticle are protein molecules which help the nanoparticles get across the cells that line blood vessels. It’s clear that as more and more functions are designed into nanovectors, there’s a huge amount of scope for increases in drug effectiveness – increases that could amount to orders of magnitude.

Massive Change

Philip Ball’s column in the March edition of Nature Materials (subscription required) draws attention to the designer Bruce Mau’s Massive Change project.

Massive Change is an exhibition, currently on show at the Art Gallery of Ontario in Toronto, a website, and a book, all with the ambitious aim of showing how design can change the world for the better. Design here is interpreted broadly, to encompass town planning, architecture and, above all, technology, and the aims are summarised in bold manifesto statements. These three examples give the flavour:

  • We will bring energy to the entire world
  • We will eradicate poverty
  • We will eliminate the need for raw material and banish all waste
Nanotechnology, in various guises, makes frequent appearances in support of these goals, though it’s the incremental and evolutionary versions rather than the Drexlerian kind that are invoked. Nonetheless, advances in materials science are described in these visionary terms:

    Material has traditionally been something to which design is applied. New methods in the fields of nanotechnology have rendered material as the object of design development. Instead of designing a thing, we design a designing thing. In the process we have created superhero substances endowed with superlative characteristics, from the hyperbolic to the almost human. Materials now have strength, agility, memory, intelligence. Mere matter no longer, materials have become active carriers of meaning and program.

One can quibble at the hyperbole and the lack of detail, but I can’t help applauding a project which is both idealistic and assertive, in the sense that it stresses the view that we aren’t simply helpless victims of the progress of technology, but that we can imagine the outcomes we want and decide how to use technology to get there.

New book on Nanoscale Science and Technology

Nanoscale Science and Technology is a new, graduate-level interdisciplinary textbook which has just been published by Wiley. It’s based on the Masters course in Nanoscale Science and Technology that we run jointly between the Universities of Leeds and Sheffield.

[Book cover image: Nanoscale Science and Technology]

The book covers most aspects of modern nanoscale science and technology. It ranges from “hard” nanotechnologies, like the semiconductor nanotechnologies that underlie applications such as quantum dot lasers, and applications of nanomagnetism like giant magnetoresistance read-heads, via semiconducting polymers and molecular electronics, through to “soft” nanotechnologies such as self-assembling systems and bio-nanotechnology. I co-wrote a couple of chapters, but the heaviest work was done by my colleagues Mark Geoghegan, at Sheffield, and Ian Hamley and Rob Kelsall, at Leeds, who, as editors, have done a great job of knitting together the contributions of a number of authors with different backgrounds to make a coherent whole.

Nanotechnology and public engagement

The UK government today announced funding for work on public engagement in a number of technology areas, including nanotechnology, under their Science Wise scheme. There are two schemes related to nanotechnology. One of these, “Nanodialogues”, will be run by the thinktank Demos, and will carry out four experiments in “upstream public engagement”. At the back of everybody’s mind, as people try to design these schemes, is a previous, not entirely happy, experiment in public engagement over the genetic modification of food, GM Nation. There’s a general will to learn from the shortcomings of that experience.

Entirely coincidentally, I was in London today, at the Greenpeace UK headquarters, for the first meeting of the steering group of a pilot experiment in nano-public engagement. This is a project to run a citizens’ jury about nanotechnology. The project is supported by Greenpeace UK, the Guardian newspaper, and the Cambridge University Nanoscience Centre, and operations will be run by an outfit from Newcastle University with experience of this sort of thing, Policy, Ethics and Life Sciences. I’m chairing the science advisory panel.

It’s too early to say much about the project yet, but I’ll be reporting on the process as it unfolds over the spring and summer. It’s unknown territory for me, but even this first meeting was fascinating. We had representatives from the NGOs Greenpeace and ETC, high-level representation from government and research councils, and a few academics. Just getting this bunch round the table in the first place was impressive enough, but I was surprised at how easily the group was able to reach a consensus.