Radio Nanotechnology

The BBC’s spoken word radio station, Radio 4, is giving nanotechnology full billing at the moment (perhaps they are getting bored with the election). In addition to last night’s Reith Lecture, given by Lord Broers, the consumer programme You and Yours covered the subject in some depth this lunchtime (listen to it here).

The piece included a long interview with Ann Dowling, who chaired the working group behind the Royal Society report; a walk round the Science Museum exhibition, Nanotechnology: small science, big deal; and an interview with Erik van der Linden of Wageningen Agricultural University in the Netherlands, who talked about nanotechnology in food, mostly in the context of converting plant protein into meat substitutes and of encapsulating nutraceuticals and flavours. There was, of course, a spokesman from Nanotex telling us all about stain-resistant trousers.

There was no mention at all, however, of molecular manufacturing. I rather suspect that this will be interpreted in some quarters as a conspiracy of silence.

Politics and the National Nanotechnology Initiative

The view that the nanobusiness and nanoscience establishment has subverted the originally intended purpose of the USA’s National Nanotechnology Initiative has become received wisdom amongst supporters of the Drexlerian vision of MNT. According to this reading of nanotechnology politics,
any element of support for Drexler’s vision of radical nanotechnology has been stripped out of the NNI to make it safe for mundane, near-term applications of incremental nanotechnology like stain-resistant fabric. This position is succinctly expressed in this editorial in the New Atlantis, which claims that the legislators who supported the NNI did so in the belief that it was the Drexlerian vision they were endorsing.

A couple of points about this position worry me. Firstly, we should be clear that there is an important dividing line in the relationship between science and politics that any country should be very wary of crossing. In a democratic country, it’s absolutely right that the people’s elected representatives should have the final say about what areas of science and technology are prioritised for public spending, and indeed what areas of science are left unpursued. But we need to be careful that this political oversight of science doesn’t spill over into ideological rulings on the validity of particular scientific positions. If supporters of MNT were to argue that the government should overrule, on essentially ideological grounds, the judgement of the scientific community about which approach to radical nanotechnology is most likely to work, then I’d suggest they recall the tragic and unedifying history of similar interventions in the past. Biology in the Soviet Union was set back for a generation by Lysenko, who, unable to persuade his colleagues of the validity of his theory of genetics, appealed directly to Stalin. Such perversions aren’t restricted to totalitarian states; Edward Teller used his high-level political connections to impose his vision of the x-ray laser on the USA’s defense research establishment, in the face of almost universal scepticism from other physicists. The physicists were right, and the program was abandoned, but not before many billions of dollars had been wasted.

But there’s a more immediate criticism of the theory that the NNI has been hijacked by nanopants: it isn’t right even from the point of view of Drexler’s supporters. The muddle and inconsistency come across most clearly on the Center for Responsible Nanotechnology’s
blog. While this entry strongly endorses the New Atlantis line, this entry only a few weeks earlier expresses the opinion that the most likely route to radical nanotechnology will come through wet, soft and biomimetic approaches. Of course, I agree with this (though my vision of what radical nanotechnology will look like is very different from that of supporters of MNT); it is the position I take in my book Soft Machines; it is also, of course, an approach recommended by Drexler himself. Looking across at the USA, I see some great and innovative science being done along these lines. Just look at the work of Ned Seeman, Chad Mirkin, Angela Belcher or Carlo Montemagno, to take four examples that come immediately to mind. Who is funding this kind of work? It certainly isn’t the Foresight Institute – no, it’s all those government agencies that make up the much castigated National Nanotechnology Initiative.

Of course, supporters of MNT will say that, although this work may be moving in the direction that they think will lead to MNT, it isn’t being done with that goal explicitly stated. To this, I would simply ask whether it isn’t a tiny bit arrogant of the MNT visionaries to think that they are in a better position to predict the outcome of these lines of inquiry than the people who are actually doing the research.

Whenever science funding is allocated, there is a real tension between the short-term and the long-term, and this is a legitimate bone of contention between politicians and legislators, who want to see immediate results in terms of money and jobs for the people they represent, and scientists and technologists with longer term goals. If MNT supporters were simply to argue that the emphasis of the NNI should be moved away from incremental applications towards longer term, more speculative research, then they’d find a lot of common cause with many nanoscientists. But it doesn’t do anyone any good to confuse these truly difficult issues with elaborate conspiracy theories.

Politics in the UK

Some readers may have noticed that we are in the middle of an election campaign here in the UK. Unsurprisingly, science and technology have barely been mentioned at all by any of the parties, and I don’t suppose many people will be basing their voting decisions on science policy. It’s nonetheless worth commenting on the parties’ plans for science and technology.

I discussed the Labour Party’s plans for science for the next three years here – this foresees significant real-terms increases in science funding. The Conservative Party has promised to “at least match the current administration’s spending on science, innovation and R&D”. However, the Conservatives’ spending plans are predicated on finding £35 billion in “efficiency savings”, of which £500 million is to come from reforming the Department of Trade and Industry’s business support programmes. I believe it is under this heading that the £200 million support for nanotechnology discussed here falls, so I think the status of these programmes under a Conservative administration would be far from assured. The Liberal Democrats take a simpler view of the DTI – they simply plan to abolish it, and move science to the Department for Education.

So, on fundamental science support, there seems to be a remarkable degree of consensus, with no-one seeking to roll back the substantial increases in science spending that the Labour Party has delivered. The arguments really are on the margins, about the role of government in promoting applied and near-market research in collaboration with industry. I have many very serious misgivings about the way in which the DTI has handled its support for micro- and nano- technology. In principle, though, I do think it is essential that the UK government does provide such support to businesses, if only because all other governments around the world (including, indeed perhaps especially, the USA) practise exactly this sort of interventionist policy.

Paint-on lasers and land-mine detection

One of the many interesting features of semiconducting polymers is that they can be made to lase. By creating a population of excited electronic states, a situation can be achieved whereby light is amplified by the process of stimulated emission, giving rise to an intense beam of coherent light. Because semiconducting polymers can be laid down in a thin film from a simple solution, it’s tempting to dream of lasers that are fabricated by simple and cheap processes, like printing, or are simply painted on to a surface. The problem with this is that, so far (and as far as I know), the necessary population of excited states has only been achieved by illuminating the material with another laser. This optical pumping, as it is called, is obviously less useful than the situation where the laser can be pumped electrically, as is the case in the kind of inorganic semiconductor lasers that are now everyday items in CD and DVD players. But a paper in this week’s Nature (abstract free, subscription required for full article) demonstrates another neat use for lasing action in semiconducting polymers – as an ultrasensitive detector for explosives. See also this press release.

The device relies on the fact that lasing is a highly non-linear effect; if an optically-pumped polymer laser is exposed to a material which influences only a few molecules at its surface, this can still kill the lasing action entirely. The molecule that is being used in this work, done at MIT by Timothy Swager’s group, is particularly sensitive to the explosive TNT. This device can work as a sensor that would be sensitive enough (and this needs to be in the parts per billion range) to detect the tiny traces of TNT vapour that a buried land-mine would emit.
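The sensing principle described above can be caricatured numerically. The sketch below is a toy threshold model, not a model of the MIT device, and all the numbers in it are invented for illustration: it simply shows how, when a laser is pumped just above threshold, quenching even a modest fraction of the gain molecules pushes the threshold above the pump level and extinguishes the output almost completely.

```python
# Toy illustration of why lasing makes an ultrasensitive detector.
# All parameter values are hypothetical; this is a sketch of threshold
# behaviour, not of any real polymer laser.

def laser_output(pump, gain_per_molecule, n_active, loss, slope=1.0, spont=1e-4):
    """Idealised laser: negligible (spontaneous) output below threshold,
    output growing linearly with pump above it."""
    round_trip_gain = gain_per_molecule * n_active
    threshold = loss / round_trip_gain  # pump needed for gain to balance loss
    if pump <= threshold:
        return spont * pump             # weak spontaneous emission only
    return spont * pump + slope * (pump - threshold)

# Pump the film just above threshold...
clean = laser_output(pump=1.05, gain_per_molecule=1e-3, n_active=1000, loss=1.0)

# ...then let an analyte quench 10% of the active molecules: the gain drops,
# the threshold rises above the pump level, and lasing switches off entirely.
quenched = laser_output(pump=1.05, gain_per_molecule=1e-3, n_active=900, loss=1.0)

print(clean, quenched)
```

The point of the sketch is the disproportion: a 10% change in the number of active molecules produces a drop in output of more than two orders of magnitude, which is the non-linearity the sensor exploits.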

This work, rather unsurprisingly, is supported by MIT’s Institute for Soldier Nanotechnologies. The development of these ultrasensitive sensors for the detection of chemicals in the environment forms a big part of the research effort in evolutionary nanotechnology. On the science side, this is driven by the fact that detecting the effects of molecules interacting with surfaces is intrinsically a lot easier in systems with nanoscaled components, simply because the surface in a nanostructured device has a great deal more influence on its properties than it would in a bulk material. On the demand side, the needs of defense and homeland security are, now more than ever, setting the research agenda in the USA.

Nobel Laureates Against Nanotechnology

This small but distinguished organisation has gained another two members. The theoretical condensed matter physicist Robert Laughlin, in his new book A Different Universe: reinventing physics from the bottom down, has a rather scathing assessment of nanotechnology, with which Philip Anderson (who is himself a Nobel Laureate and a giant of theoretical physics), reviewing the book in Nature (subscription required), concurs. Unlike Richard Smalley, Laughlin’s criticism is directed at the academic version of nanotechnology, rather than the Drexlerian version, but adherents of the latter shouldn’t feel too smug because Laughlin’s criticism applies with even more force to their vision. He blames the seductive power of reductionist belief for the delusion: “The idea that nanoscale objects ought to be controllable is so compelling it blinds a person to the overwhelming evidence that they cannot be”.

Nanotechnologists aren’t the only people singled out for Laughlin’s scorn. Other targets include quantum computing, string theory (“the tragic consequence of an obsolete belief system”) and most of modern biology (“an endless and unimaginably expensive quagmire of bad experiments”). But underneath all the iconoclasm and attitude (and personally I blame Richard Feynman for making all American theoretical physicists want to come across like rock stars), is a very serious message.

Laughlin’s argument is that reductionism should be superseded as the ruling ideology of science by the idea of emergence. To quote Anderson: “The central theme of the book is the triumph of emergence over reductionism: that large objects such as ourselves are the product of principles of organization and of collective behaviour that cannot in any meaningful sense be reduced to the behaviour of our elementary constituents.” The origin of this idea is Anderson himself, in a widely quoted article from 1972 – More is Different. In this view, the idea that physics can find a “Theory of Everything” is fundamentally wrong-headed. Chemistry isn’t simply the application of quantum mechanics, and biology is not simply reducible to chemistry; the organisational principles that underlie, say, the laws of genetics are just as important as the properties of the things being organised.

Anderson’s views on emergence aren’t as widely known as they should be, in a world dominated by popular science books on string theory and “the search for the God particle”. But they have been influential; an intervention by Anderson is credited or blamed by many people for killing off the Superconducting Supercollider project, and he is one of the founding fathers of the field of complexity. Laughlin explicitly acknowledges his debt to Anderson, but he holds to a particularly strong version of emergence; it isn’t just that there are difficulties in practice in deriving higher level laws of organisation from the laws describing the interactions of their parts. Because the organisational principles themselves are more important than the detailed nature of the interactions between the things being organised, the reductionist program is wrong in principle, and there’s no sense in which the laws of quantum electrodynamics are more fundamental than the laws of genetics (in fact, Laughlin argues on the basis of the strong analogies between QED and condensed matter field theory that QED itself is probably emergent). To my (philosophically untrained) eye, this seems to put Laughlin’s position quite close to that of the philosopher of science Nancy Cartwright. There’s some irony in this, because Cartwright’s book The Dappled World was bitterly criticised by Anderson himself.

This takes us a long way from nanoscience and nanotechnology. It’s not that Laughlin believes that the field is unimportant; in fact he describes the place where nanoscale physics and biology meet as being the current frontier of science. But it’s a place that will only be understood in terms of emergent properties. Some of these, like self-assembly, are starting to be understood, but many others are not. What is clear, though, is that the reductionist approach of trying to impose simplicity where it doesn’t exist in nature simply won’t work.

Nanotechnology and the developing world

There’s a rather sceptical commentary from Howard Lovy about a BBC report on a study from Peter Singer and coworkers. At the centre of the report is a list of areas in which the authors feel that nanotechnology can make positive contributions to the developing world. Howard’s piece attracted some very sceptical comments from Jim Thomas, of the ETC Group. Jim is very suspicious of high-tech “solutions” to the problems of the developing world which don’t take account of local cultures and conditions. In particular, he sees the role of multinational companies as being particularly problematic, especially with regard to issues of ownership, control and intellectual property.

I see the problem of multinational companies in rather different terms. To take a concrete example, I cited the case of insecticide-treated mosquito nets for the control of malaria as a place where nanoscale technology could make a direct impact (and Jim did seem to agree, with some reservations, that this could, in some circumstances, be an appropriate solution). The technical problem with insecticide-treated mosquito nets is that the layer of active material isn’t very robustly attached, so the effectiveness of the nets falls away too rapidly with time, and even more rapidly when the nets are washed. One solution is to use micro- or nano-encapsulation of the insecticide to achieve long-lasting controlled release. The necessary technology to do this is being developed in agrochemical multinationals. The problem, though, is that their R&D efforts are steered by the monetary size of the markets they project. They’d much rather develop termite defenses for wealthy suburbanites in Florida than mosquito nets. The problem, then, isn’t that these multinationals will impose technical fixes on the developing world; it’s that they’ll just ignore the developing world entirely, and potentially valuable technologies simply won’t reach the places where they could do some good.

Overcoming this market failure needs intervention from governments, foundations and NGOs, as well as some active and informed technology brokering. Looking at it in this light, it seems to me that the Singer paper is a useful contribution.

How are we doing?

Howard Lovy’s Nanobot draws attention to an interesting piece in SciDevNet discussing bibliometric measures of the volume and impact of nanotechnology research in various parts of the world. This kind of measurement – in which databases are used to count the number of papers published and the number of times those papers are cited by other papers – is currently very popular among governments attempting to assess whether the investments they make in science are worthwhile. I was shown a similar set of data about the UK, commissioned by the Engineering and Physical Sciences Research Council, at a meeting last week. The attractions of this kind of analysis are obvious: it is relatively easily commissioned and done, and it yields results that can be plotted in plausible and scientific-looking graphs.

The drawbacks perhaps are less obvious, but are rather serious. How do you tell what papers are actually about nanotechnology, given the difficulties of defining the subject? The obvious thing to do is to search for papers with “nano” in the title or abstract somewhere – this is what the body charged with evaluating the USA’s National Nanotechnology Initiative has done. What’s wrong with this is that many of the best papers on nanotechnology simply don’t feel the need to include the nano- word in their title. Why should they? The title tells us what the paper is about, which is generally a much more restricted and specific subject than this catch-all word. I’ve been looking up papers on single molecule electronics today. I’d have thought that everyone would agree that the business of trying to measure the electrical properties of single molecules, one at a time, and wiring them up to make ultra-miniaturised electronic devices, was as hardcore as nanotechnology comes. But virtually none of the crucial papers on the subject over the last five years would have shown up on such a search.
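The keyword test and its blind spot can be made concrete in a few lines. The paper titles below are invented for illustration; the point is simply that a naive substring search flags the papers that happen to use the nano- word, while missing single-molecule-electronics work entirely.

```python
# A minimal sketch of the naive bibliometric test: count a paper as
# "nanotechnology" if "nano" appears in its title. Titles are invented.

titles = [
    "Nanotubes for stain-resistant fabrics",
    "Charge transport through a single benzenedithiol molecule",
    "Nanoparticle assembly with DNA linkers",
    "Conductance measurements on individual molecular junctions",
]

def counts_as_nano(title):
    """The catch-all keyword test used in bibliometric surveys."""
    return "nano" in title.lower()

flagged = [t for t in titles if counts_as_nano(t)]
print(len(flagged))  # 2 of 4: both single-molecule papers are missed,
                     # despite being as hardcore as nanotechnology comes
```

Real bibliometric studies refine this with abstracts and keyword lists, but the underlying problem is the same: the count measures vocabulary, not science.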

The big picture that these studies paint does ring true: the majority of research in nanoscience and nanotechnology is done outside the USA, and this kind of research in China has been growing exponentially in both volume and impact in recent years. But we shouldn’t take the numbers too seriously; if we do, it’s only a matter of time before some science administrator realises that the road to national nanotechnology success is simply to order all the condensed matter physicists, chemists and materials scientists to stick “nano-” somewhere in the titles of all their papers.

The BBC’s Reith Lectures cover nanotechnology

Every year the BBC broadcasts a series of radio lectures on some rather serious subject, given by an appropriately weighty public intellectual. This year’s series is called “The triumph of technology”, and the fourth lecture (to be broadcast at 8 pm on the 27th April), is devoted to nanotechnology and nanoscience. The lecturer is Lord Broers, who certainly qualifies as a prominent member of that class that the British call the Great and Good. He’s recently stepped down from being Vice-Chancellor of Cambridge University, he’s President of the Royal Academy of Engineering, and is undoubtedly an ornament to a great number of important committees. But what’s interesting is that he does describe himself as a nanotechnologist. His early academic work was on scanning electron microscopy and e-beam lithography, and before returning to academia he did R&D for IBM.

The introductory lecture – Technology will Determine the Future of the Human Race – has already been broadcast; you can read the text or download an MP3 from the BBC website. This first lecture is rather general, so it will be interesting to see if he develops any of his themes in more unexpected directions.

Disentangling thin polymer films

Many of the most characteristic properties of polymer materials like plastics come from the fact that their long chain molecules get tangled up. Entanglements between different polymer chains behave like knots, which make a polymer liquid behave like a solid over quite perceptible time scales, just like silly putty. The results of a new experiment show that when you make the polymer film very thin – thinner than an individual polymer molecule – the chains become less entangled with each other, with significant effects on their mechanical properties.

The experiments are published in this week’s Physical Review Letters; I’m a co-author but the main credit lies with my colleagues Lun Si, Mike Massa and Kari Dalnoki-Veress at McMaster University, Canada. The abstract is here, and you can download the full paper as a PDF (this paper is copyright the American Physical Society and is available here under the author rights policy of the APS).

This is the latest in a whole series of discoveries of ways in which the properties of polymer films dramatically change when their thicknesses fall towards 10 nm and below. Another example is the discovery that the glass transition temperature of polymer films – the temperature at which a polymer like polystyrene changes from a glassy solid to a gooey liquid – dramatically decreases in thin films. So a material that would in the bulk be a rigid solid may, in a thin enough film, turn into a much less rigid, liquid-like layer (see this technical presentation for more details). Why does this matter? Well, one reason is that, as feature sizes in the microelectronics industry fall below 100 nm, the sharpness with which one can define a line in a thin film of a polymer resist could limit the perfection of the features one is making. So the fact that the mechanical properties of the polymer themselves change, purely as a function of size, could lead to problems.
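For the glass transition effect mentioned above, there is a widely used empirical description, Tg(h) = Tg(bulk)·(1 − (A/h)^δ), fitted to measurements on supported polystyrene films. The sketch below uses the commonly quoted fit parameters (A of order 3 nm, δ of order 1.8); treat the numbers as illustrative rather than definitive, since the fitted values vary between experiments and substrates.

```python
# Sketch of the empirical thickness dependence of the glass transition
# temperature of a supported polystyrene film:
#     Tg(h) = Tg_bulk * (1 - (A / h)**delta)
# Parameter values are illustrative fits from the literature, not exact.

TG_BULK = 373.0   # K, glass transition of bulk polystyrene
A = 3.2           # nm, characteristic length from fits to thin-film data
DELTA = 1.8       # dimensionless exponent from the same fits

def tg(h_nm):
    """Glass transition temperature (K) of a film of thickness h_nm."""
    return TG_BULK * (1.0 - (A / h_nm) ** DELTA)

for h in (100, 50, 20, 10):
    print(f"h = {h:3d} nm: Tg = {tg(h):5.1f} K "
          f"(reduced by {TG_BULK - tg(h):4.1f} K)")
```

The qualitative message matches the text: at 100 nm the film is essentially bulk-like, but as the thickness falls towards 10 nm the predicted Tg drops by tens of kelvin, enough to turn a rigid glassy solid into a much softer, liquid-like layer at room temperature processing conditions.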

Nanoscience, small science, and big science

Quite apart from the obvious pun, it’s tempting to think of nanoscience as typical small science. Most of the big advances are made by small groups working in universities on research programs devised, not by vast committees, but by the individual professors who write the grant applications. Equipment is often quite cheap, by scientific standards – a state of the art atomic force microscope might cost $200,000, and doesn’t need a great deal of expensive infrastructure to house it. If you have the expertise and the manpower, but not the money, you could build one yourself for perhaps a tenth of this price or less. This is an attractive option for scientists in developing countries, and this is one reason why nanoscience has become such a popular field in countries like India and China. It’s all very different from the huge and expensive multinational collaborations that are necessary for progress in particle physics, where a single experiment may involve hundreds of scientists and hundreds of millions of dollars – the archetype of big science.

Big science does impact on the nanoworld, though. Techniques that use the highly intense beams of x-rays obtained from synchrotron sources like the ESRF at Grenoble, France, and the APS, on the outskirts of Chicago, USA, have been vital in determining the structure, at the atomic level, of the complex and efficient nanomachines of cell biology. Neutron beams, too, are unique probes of the structure and dynamics of nanoscale objects like macromolecules. To get a beam of neutrons intense enough for this kind of structural study, you either need a research reactor, like the one at the Institut Laue-Langevin, in Grenoble (at which I am writing this), or a spallation source, such as ISIS, near Oxford in the UK. The latter consists of a high energy synchrotron, of the kind developed for particle physics, which smashes pulses of protons into a heavy metal target, producing showers of neutrons.

Synchrotron and neutron sources are run on a time-sharing basis; individual groups apply for time on a particular instrument, and the best applications are allocated a few days of (rather frenetic) experimentation. So in this sense, even these techniques have the character of small science. But the facilities themselves are expensive – the world’s most advanced spallation source for neutrons, the SNS currently being built in Oak Ridge, TN, USA, will cost more than $1.4 billion, and the Japanese source J-PARC, a few years behind SNS, has a budget of $1.8 billion. With this big money comes real politics. How do you set the priorities for the science that is going to be done, not next year, but in ten years’ time? Do you emphasise the incremental research that you are certain will produce results, or do you gamble on untested ideas that just might produce a spectacular pay-off? This is the sort of rather difficult and uncomfortable discussion I’ve been involved in for the last couple of days – I’m on the Scientific Council of the ILL, which has just been having one of its twice-yearly meetings.