Letters from Nano-land

What academic journals should one read to get the latest news about nanotechnology research? This isn’t as easy a question to answer as one might think, and the difficulty reflects the fact that nanoscience and nanotechnology have still not really gelled into a coherent scientific culture. So nanotechnology done by physicists will often end up in physics journals (Physical Review Letters being the most prestigious), while that done by chemists will similarly end up in chemistry journals. The nearest things we have to specialised nanotechnology journals are the general materials science journals like Nature Materials and Advanced Materials, both of which are essential reading. A recent addition to this space, though, is explicitly pitching to be the nanotechnology journal of choice – the American Chemical Society’s journal Nano Letters. It is winning a lot of friends in the nanoscience community: the time between a paper being submitted and it appearing is very short, which appeals to impatient authors, and the editorial board reads like a list of some of the most distinguished nanoscientists anywhere. And the impact factor – a crucial measure of where a journal stands in the scientific pecking order, defined as the average number of times papers appearing in the journal are cited by other papers – is high. Nature Materials is still at the top of the pile (not counting Nature and Science, of course), with an impact factor of 13.53, but Nano Letters, at 8.45, has already shaded ahead of Advanced Materials, at 8.08. The long-established Institute of Physics journal Nanotechnology trails a long way behind at 3.32. Journals, and their editorial policies, are important in defining emerging fields, so it’s interesting to take a snapshot of how the Nano Letters editors see the field, on the basis of the papers published in the current edition.
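(For concreteness, the standard impact factor is computed over a two-year window: citations received this year by papers the journal published in the previous two years, divided by the number of those papers. A minimal sketch, with made-up numbers:)

```python
# Minimal sketch of the standard two-year impact factor calculation.
# The numbers below are invented for illustration, not real journal data.

def impact_factor(citations_this_year: int, papers_prev_two_years: int) -> float:
    """Citations received this year by papers published in the previous
    two years, divided by the number of those papers."""
    return citations_this_year / papers_prev_two_years

# A journal whose 500 papers from the last two years drew 4,225 citations:
print(round(impact_factor(4225, 500), 2))  # -> 8.45
```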

Carbon nanotubes are clearly still objects of nanofascination, accounting for five of the twenty-five papers in the issue. It’s largely the electronic properties of the nanotubes that excite, rather than their mechanical properties, and this theme of nanoelectronics is continued with another five papers on semiconductor nanowires. Soft nanotechnology and bio-nanotechnology form another important theme, accounting for eleven papers. There’s some overlap; a couple of papers use the self-assembling properties of biological molecules like DNA and peptides to guide the assembly of inorganic nanotubes and nanowires. Experiment dominates over theory, with only three purely theoretical papers. Most of the papers are quite a long way from any applications. The work that’s closest to market includes a paper on the use of quantum dots for magnetic resonance imaging, and one on using titanium dioxide nanoparticles for the solar generation of hydrogen. At the other end of the scale, there’s a paper on the use of the scanning tunneling microscope to mechanically position and react individual molecules on a surface.

It’s interesting to ask where, geographically, the papers come from. As one would expect from a USA-based journal, the largest contribution comes from the USA, with 56% of the papers. Europe accounts for 36%, with a fair spread of countries represented, while the remainder come from Canada. Interestingly, this issue contains no contributions at all from the Far East. In fact, over the whole of 2005 only 2% of the papers in Nano Letters came from China.

I’m not entirely sure what all this means, but one thing that strikes me is that there’s relatively little relationship between this (small) sample of what the academic nano-community thinks is exciting work and what is currently being commercialised by industry. An optimist would take this as a sign that there is a significant pipeline of work that will be ready to commercialise maybe 5–10 years from now.

Grey Goo won’t get you across the Valley of Death

The UK’s main funder of academic nanoscience and nanotechnology – the Engineering and Physical Sciences Research Council (EPSRC) – has published a report of a review of its nanotechnology portfolio held last summer. The report – released in a very low-key way last November – is rather critical of the UK’s nanotechnology performance, noting that it falls below what the UK would hope for in both quality and quantity, and recommends an urgent review of the EPSRC’s strategy in this area. This review is just getting under way (and I’m one of the academics on the working party).

Unlike many other countries, the UK has no dedicated nanotechnology programme (the Department of Trade and Industry does have a programme in micro- and nanotechnology, but this is very near-term and focused on current markets and applications). With the exception of two (small-scale, by international comparison) nanotechnology centres, at Oxford and Cambridge, nanoscience and nanotechnology proposals are judged in competition with other proposals in physics, chemistry and materials science. There’s no earmarked funding for nanotechnology, and the amount of funding given to the area is simply the aggregate of lots of decisions on individual proposals. This means, of course, that even estimating the total size of the UK’s nanotechnology spend is a difficult task that depends on a grant-by-grant judgement of what is nanotechnology and what is not.

This situation isn’t entirely bad; it probably means that the UK has been less affected by the worst excesses of academic nanohype than countries in which funding has been much more directly tied to the nanotechnology brand. But it does mean that the UK’s research in this area has lacked focus, has been developed without any long-term strategy, and has seen very little attempt to build research capacity. Now is probably not a bad time to look ahead at where the interesting opportunities in nanotechnology will be, not next year, but in ten to fifteen years’ time, and to try to refocus academic nanoscience in a way that will create those longer-term opportunities.

One of the perceptions mentioned in the report was that the quality of work was rather patchy, particularly in areas like nanomaterials, with some work of very moderate quality being done. One panellist on the theme day review memorably called this sort of research “grey goo” – work that is neither particularly exciting scientifically nor, despite its apparently applied character, particularly likely to be commercialised. Everyone in government is concerned about the so-called “valley of death” – that trough in the cycle of commercialisation of a good idea which comes after the basic research has been done, but when products and revenues still seem a long way off. Much government intervention aims to get good ideas across this melodramatically named rift, but this carries a real danger. Clearly, funding high-quality basic science doesn’t help you here, but there’s a horribly tempting false syllogism – that if a proposal isn’t interesting fundamental science, then it might be just the sort of innovative applied research that gets good ideas closer to market. Well, it might be, but it’s probably more likely simply to be mediocre “sort-of-applied” work that will never yield a commercial product – it might be “grey goo”. I don’t think this is solely a UK problem – in my view every funding agency should ask itself: ‘are we funding “grey goo” in a doomed attempt to get across the “valley of death”?’

Keeping the nanotech score

A claim from Lux Research (reported in Small Times here) that China is now second only to the USA in its output of academic nanoscience papers is being met with some scepticism over on Nanodot. While there is clearly a real and important story about the huge recent growth in nanoscience capability in China, I’m also a bit sceptical about the central claim of this story, about China’s publication share. Of course, I don’t know the detailed methodology of the publications study the Lux report cites. But I do know how a study which reached a very similar conclusion, commissioned for the UK’s science funding agency EPSRC, was done. Essentially, a database search was done for papers with “nano”, or some compound thereof, in the title.

I can do this too. If we look in Web of Science at papers published in 2004 and 2005 with “nano”, or a compound thereof, in the title or abstract, we find that of a total of 59,938 papers, 10,546 – 18% – have at least one address from China. This is still behind the USA, with 28%, but ahead of Japan, at 11%, and Germany, at 8%. The UK is further behind still, at 4%. (Actually, the label “UK” shows up on only a pitiful total of 27 papers – 2,370 are listed under England, with Wales and Scotland adding a further 487. I never realised British science had such separatist tendencies!) Of course, working out the sums this way gives a set of percentages that add up to more than 100%, since many papers have coauthors from different countries.
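(The share arithmetic itself is trivial; as a quick sketch, using the counts just quoted:)

```python
# Quick sketch of the publication-share arithmetic, using the counts
# quoted above from the Web of Science search.

total_nano_papers = 59938   # 2004-05 papers with "nano*" in title/abstract
china_papers = 10546        # papers with at least one address in China

print(f"China's share: {100 * china_papers / total_nano_papers:.0f}%")  # -> 18%

# Summing such shares over every country gives more than 100%, because a
# paper with coauthors in several countries counts once for each country.
```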

What’s wrong with this is perhaps only clear to scientists working in the field. When I think of what I believe to be the most significant papers in nanoscience, most of them simply don’t mention “nano” anywhere in the title. Why should they? Unless they are actually about carbon nanotubes, their title and abstract will generally refer to something much more specific than the rather general and all-encompassing “nano” label. We can get some feel for the fraction of significant and relevant papers excluded by this methodology by asking what proportion of papers by leaders in the nanoscience field would actually show up in such a search. Taking a few more or less random US nanoscientists: only 24% of Whitesides’s papers would show up, 50% of James Heath’s, and even the rather radical and hardcore nanoscience of Ned Seeman and Fraser Stoddart passes the “nano” test only 54% and 31% of the time respectively. Mark Ratner, despite being a prominent “nano” author, would similarly have nearly 70% of his publications slip undetected through the “nano” net.

And here in the UK, are we lagging behind quite so badly? Maybe, but again, if we look at the output of some of our most prominent nanoscientists, we find most of it is missed by this kind of bibliometric analysis. Of Richard Friend’s 35 papers, only 20% show up in this kind of search, while my Sheffield colleague, quantum dot guru Maurice Skolnick, similarly produced 35 papers, of which precisely one passed the nano-test.

I’m labouring the point now, and I’m sure the Lux people would say they’ve done their search in a much more sophisticated way. But I’m still convinced that any kind of mechanistic, keyword-based search of the scientific literature is likely to lead to a highly distorted result, simply because what counts as “nanoscience” is so ill-defined. What you are seeing is not an accurate measure of nanoscience output, but a reflection of how strong the fashion is for attaching a “nano” label to one’s work. This is, of course, somewhat unfair to people who are studying nanotubes, for example, who can hardly avoid putting “nano” in their titles and abstracts, but one is strongly tempted to view the ratio (nano papers/total papers) as a kind of “nanohype index”. There is clearly genuine growing strength in China’s nanoscience output, and there is probably cause for concern in the UK, but these rather crude measures need to be taken with a substantial pinch of salt.

(And how do I score myself on the nanohype index? 7% on a total of 15 papers, I’m perversely proud to report.)
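Spelled out as a toy function, with my own numbers from above:

```python
# The tongue-in-cheek "nanohype index": the fraction of an author's papers
# with "nano", or a compound of it, in the title or abstract.

def nanohype_index(nano_papers: int, total_papers: int) -> float:
    return nano_papers / total_papers

# My own score, as confessed above: 1 "nano" paper out of 15.
print(f"{nanohype_index(1, 15):.0%}")  # -> 7%
```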

Swimming strategies for nanobots

Physics works differently at the nanoscale, and this means that design principles that are familiar in the macro-scale world may not work when shrunk. A great example of this is the problem of how you would propel a nanoscale swimmer through water. To a human-scale swimmer, water resists forward motion by virtue of its inertia. But on the nanoscale, it is the viscosity of water that is the dominant factor. To imagine what it would feel like to swim at the nanoscale, imagine being immersed in a vat of the stickiest molasses.
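To put a number on this, the relevant dimensionless quantity is the Reynolds number, Re = ρvL/η, which measures the ratio of inertial to viscous forces. Here’s a rough sketch, with order-of-magnitude speeds and sizes assumed for illustration:

```python
# Order-of-magnitude sketch: the Reynolds number Re = rho*v*L/eta compares
# inertial to viscous forces for a swimmer of speed v and size L in water.
# The speeds and sizes below are assumed for illustration.

rho = 1.0e3   # density of water, kg/m^3
eta = 1.0e-3  # viscosity of water, Pa.s

def reynolds(speed_m_per_s: float, size_m: float) -> float:
    return rho * speed_m_per_s * size_m / eta

print(f"human swimmer:    Re ~ {reynolds(1.0, 1.0):.0e}")    # ~1e+06: inertia dominates
print(f"100 nm 'nanobot': Re ~ {reynolds(1e-6, 1e-7):.0e}")  # ~1e-07: viscosity dominates
```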

The mathematics of this situation is intriguing, and it’s been known for a while that any simple back-and-forth motion won’t get you anywhere. Imagine a scallop trying to swim by opening its shell slowly and then shutting it suddenly. This strategy works fine in the macroscopic world, but on the nanoscale you can show that all the ground the scallop gains when it shuts its shell is lost again when it opens it, no matter how big the difference in speed between the forward and backward strokes. To get anywhere, you need some kind of non-reciprocal motion – a motion that looks different when time-reversed. In 2004, Ramin Golestanian and coworkers showed that three spheres joined together could make a nanoscale swimmer. Here’s an article about this work in Physical Review Focus, with a link to a neat animation; here’s another article in Technology Review: Teaching Nanotech to Swim.
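The underlying reason – often called the “scallop theorem” – comes straight from the structure of the equations of motion at low Reynolds number; in outline:

```latex
% Sketch of the "scallop theorem". At vanishing Reynolds number the flow
% obeys the Stokes equations:
\[
  \eta \nabla^{2}\mathbf{u} = \nabla p , \qquad \nabla \cdot \mathbf{u} = 0 .
\]
% These contain no time-derivative term, so the flow has no inertia and no
% memory: the instantaneous velocity field is set entirely by the swimmer's
% instantaneous shape and rate of shape change. The net displacement over a
% stroke therefore depends only on the sequence of shapes visited, not on
% how fast they are traversed -- a reciprocal stroke (slow open, fast close)
% exactly undoes itself, and only a non-reciprocal cycle of shapes, like
% that of the three linked spheres, can produce net motion.
```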

This story has moved forward in two ways since those reports. Earlier this year, Ramin Golestanian, together with Tannie Liverpool, from Leeds University, and Armand Ajdari, from ESPCI in Paris, analysed another way of propelling a nanoscale submarine. In this work, published in Physical Review Letters in June (abstract here; subscription required for the full article), they considered a nanoscale vessel with an enzyme attached to the hull at one point. The enzyme catalyses a chemical reaction that produces a stream of products, like a rocket’s exhaust. Like a rocket, this has the effect of propelling the vessel along, but the physics underlying the effect is quite different. It’s not the inertia of the exhaust that propels the vessel forward; instead, it is the collisions of the product molecules, as they undergo random Brownian motion, that have the net effect of pushing our nanobot forward.

And today, Nature published an experimental report of a miniature swimmer (editor’s summary; full paper requires subscription) which illustrates some of these principles. In this work (from Bibette and coworkers, also at ESPCI in Paris), chains of magnetic nanoparticles form a tail which wiggles when an oscillating magnetic field is applied, pulling a payload along.

Ramin has just joined us in the physics department at Sheffield, so I look forward to working with him to take some more steps on the road to a swimming nanobot.

Anti-cancer nanotechnology – a two-pronged attack

Drug delivery – and in particular the delivery of anti-cancer therapeutics – has emerged as one of the major applications of nanotechnology in medicine. There’s a nice brief review of the subject by Ruth Duncan in this month’s issue of Nano Today – Nanomedicine gets clinical. An interesting paper in this week’s Nature reports a significant new development from Sasisekharan’s group at MIT, in which two drugs are combined in a single delivery system. This nanovector first selectively targets tumour tissue, then releases a drug which cuts off the blood supply to the tumour, isolating and starving it, and finally releases a second drug which directly attacks the tumour cells.

The full article can be found here, and there’s a commentary about it here. A subscription to Nature may be required for these articles, but you can also look at Nature’s editor’s summary and the press release from MIT.

What’s interesting about this work is the way it brings together quite a lot of different tricks to make something that is starting to look like a piece of integrated nanoengineering. A drug is coupled to a polymer which slowly breaks down in water; this drug–polymer conjugate is then prepared in the form of nanoparticles. These nanoparticles, together with a second drug, are encapsulated in a liposome – a self-assembled hollow shell formed by a phospholipid sheet which has folded round on itself to form an enclosed surface. The size of these liposomes is controlled so that they selectively find their way into the tumour tissue, and their surfaces are decorated with hairy layers of water-soluble polymer to hide them from the immune system. Individually, none of these features is novel, but their combination in an integrated nanosystem is very impressive.

A better alligator clip for molecular electronics

The dream of molecular electronics is to wire up circuits using individual molecules as the basic components. A basic problem is how to connect your (typically semiconducting) molecules to the metallic connectors; the leading candidate at the moment is to use molecules with a terminal thiol (-SH) group. Thiols stick very effectively to the surface of gold; this thiol–gold chemistry has quietly become one of the most widely used tools of today’s nanotechnologists, and has been referred to as a molecular alligator clip. But it’s not without its drawbacks: rather than bonding to a single metal atom, the thiol group complexes with a group of neighbouring gold atoms, and the electrical properties of the bond through the single linking sulphur atom aren’t ideal. Two papers in this week’s Science magazine suggest an alternative.

The two papers – by Siaj and McBreen (Université Laval, Québec) and by Nuckolls and coworkers (Columbia University) (subscription required for access to the full articles) – both describe ways of linking a molecule to a metal surface by a double bond (i.e. M=C–, where M is a metal atom and C is the terminal carbon of an organic molecule). The surface-bonded organic molecule can then be used to initiate polymerisation by a method known as ring-opening metathesis polymerisation (ROMP). This is very interesting, because ROMP provides a way of growing organic semiconducting molecules with great precision. In short, we have here a better alligator clip for wiring up molecular electronics.

Making molecules work

The operation of most living organisms, from bacteria like E. coli to multi-cellular organisms like ourselves, depends on molecular motors. These are protein-based machines which convert chemical energy to mechanical energy; the work our muscles do depends on many billions of these nanoscale machines all operating together, while individual motors propel bacteria or move materials around inside our cells. Molecular motors work in a very different way from the motors we are familiar with on the macroscopic scale, as has been revealed by some stunning experiments combining structural biology with single-molecule biophysics. A good place to start getting a feel for how they work is with these movies of biological motors from Ronald Vale at UCSF.

The motors we use at the macroscopic scale to convert chemical energy to mechanical energy are heat engines, like petrol engines and steam turbines. The fuel is first burnt to convert chemical energy to heat energy, and this heat energy is then converted to useful work. Heat engines rely on the fact that you can maintain part of the engine at a higher temperature than the general environment. For example, in a petrol engine you burn the fuel in a cylinder, and then extract work by allowing the hot gases to expand against a piston. A nanoscale petrol engine wouldn’t work, because the heat would diffuse out through the cylinder walls, cooling the gas down before it had a chance to expand: the time taken for a hot body to cool down to ambient temperature scales with the square of its size. At the nanoscale, you can’t maintain significant temperature gradients for any useful length of time, so nanoscale motors have to work at constant temperature. The way biological molecular motors manage this is by exploiting molecular shape change – the power stroke is provided by a molecule changing shape in response to the binding and unbinding of fuel molecules and their products.
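To get a feel for the numbers, here’s a rough sketch of that cooling-time scaling, assuming the thermal diffusivity of water:

```python
# Rough estimate of how long a hot region of size L stays hot: thermal
# diffusion gives a cooling time of order tau ~ L^2 / alpha.

alpha = 1.4e-7  # thermal diffusivity of water, m^2/s (approximate)

def cooling_time_s(size_m: float) -> float:
    return size_m ** 2 / alpha

print(f"1 cm cylinder:   ~{cooling_time_s(1e-2):.0f} s")  # ~700 s
print(f"100 nm cylinder: ~{cooling_time_s(1e-7):.1e} s")  # ~7e-08 s
# Tens of nanoseconds: any heat a nanoscale engine generated would leak
# away long before it could be converted into useful work.
```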

In our research at Sheffield we’ve been trying to learn from nature, making crude synthetic molecular motors that operate in the same way, using molecular shape changes. The molecule we use is a polymer with weak acidic or basic groups along the backbone. For a polyacid, for example, in acidic conditions the molecule is uncharged and hydrophobic, and it takes up a collapsed, compact shape. But when the acid is neutralised, the molecule ionises and becomes much more hydrophilic, expanding substantially in size. So, in principle, we can use the expansion of a single molecule to do work.
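As a toy picture of the trigger, the degree of ionisation of a weak polyacid can be estimated from the Henderson–Hasselbalch relation (real polyelectrolytes deviate from this ideal, isolated-group picture, and the pKa below is assumed purely for illustration):

```python
# Toy model of the polyacid's switch: the fraction of acid groups ionised
# at a given pH, from the ideal Henderson-Hasselbalch relation. Real
# polyelectrolytes deviate from this, but it captures the transition from
# uncharged/compact to charged/expanded.

def ionised_fraction(pH: float, pKa: float = 4.7) -> float:  # pKa assumed
    return 1.0 / (1.0 + 10 ** (pKa - pH))

for pH in (3.0, 5.0, 7.0):
    print(f"pH {pH}: {ionised_fraction(pH):.0%} ionised")
# pH 3.0: ~2% (collapsed, hydrophobic); pH 7.0: ~100% (expanded, hydrophilic)
```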

How can we clock the motor, so that rather than expanding just once, our molecule repeatedly cycles between the expanded and compact shapes? In biology, this happens because the reaction of the fuel molecule is actually catalysed by the motor molecule itself. Our chemistry isn’t good enough to do this yet, so we use a much cruder approach.

We use a class of chemical reactions in which the chemical conditions spontaneously oscillate, despite the reactants being added completely steadily. The most famous of these is the Belousov–Zhabotinsky reaction (see here for an explanation and a video of the experiment). With the help of Steve Scott from the University of Leeds, we’ve developed an oscillating reaction in which the acidity spontaneously oscillates over a range sufficient to trigger the shape change in our polyacid molecules.
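For a flavour of how steady feed plus autocatalysis and delayed inhibition can yield spontaneous oscillation, here’s a minimal sketch integrating the classic two-variable Oregonator model of the Belousov–Zhabotinsky reaction, with standard textbook parameter values (our pH oscillator is a different chemistry; this is just the textbook illustration of the principle):

```python
# Minimal sketch: the two-variable Oregonator, a standard reduced model of
# the Belousov-Zhabotinsky reaction, showing spontaneous oscillation under
# steady conditions. (Our pH oscillator is chemically different; this just
# illustrates the relaxation-oscillation principle.)
from scipy.integrate import solve_ivp

eps, q, f = 0.04, 8e-4, 1.0  # standard textbook parameter values

def oregonator(t, state):
    x, z = state  # x ~ activator (HBrO2); z ~ oxidised catalyst (inhibitor)
    dx = (x * (1.0 - x) - f * z * (x - q) / (x + q)) / eps
    dz = x - z
    return [dx, dz]

sol = solve_ivp(oregonator, (0.0, 50.0), [0.1, 0.1],
                method="LSODA", max_step=0.05)
print(f"x cycles between {sol.y[0].min():.2g} and {sol.y[0].max():.2g}")
```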

You can see a progress report on our efforts in a paper in Faraday Discussions 128; the abstract is here and you can download the full paper as a PDF here (this is available under the author rights policy of the Royal Society of Chemistry, who own the copyright). We’ve been able to demonstrate the molecular shape change in response to the oscillating chemical reaction both macroscopically and at the single-chain level in a self-assembled structure. What we’ve not yet been able to do is directly measure the force generated by a single molecule; in principle we should be able to do this with an atomic force microscope whose tip is connected to a single molecule, the other end of which is grafted to a firm surface, but this has proved rather difficult in practice. This is high on our list of priorities for the future, together with some ideas about how we can use this motor to do interesting things, like propel a nanoscale object or pump chemicals across a membrane.

This work is a joint effort of my group in the physics department and Tony Ryan’s group in chemistry. In physics, Mark Geoghegan, Andy Parnell, Jon Howse, Simon Martin and Lorena Ruiz-Perez have all been involved in various aspects of the project, while the chemistry has been driven by Colin Crook and Paul Topham.

A visit from Sir Harry Kroto

We’re having a visit today, here at the University of Sheffield, from Sir Harry Kroto. Sir Harry, who shared the 1996 Nobel Prize in Chemistry with Robert Curl and Richard Smalley, is a graduate of Sheffield University and is here to open a new multidisciplinary research building which is going to be named after him.

Sir Harry gave a public lecture about nanoscience, which was an impassioned statement of his belief that nanoscience and technology (which he believes to be essentially synonymous with chemistry) offers the only way towards achieving a sustainable way of life for the whole of the world’s population.

Paint-on lasers and land-mine detection

One of the many interesting features of semiconducting polymers is that they can be made to lase. By creating a population of excited electronic states, a situation can be achieved whereby light is amplified by the process of stimulated emission, giving rise to an intense beam of coherent light. Because semiconducting polymers can be laid down in a thin film from a simple solution, it’s tempting to dream of lasers that are fabricated by simple and cheap processes, like printing, or are simply painted on to a surface. The problem with this is that, so far (and as far as I know), the necessary population of excited states has only been achieved by illuminating the material with another laser. This optical pumping, as it is called, is obviously less useful than the situation where the laser can be pumped electrically, as is the case in the kind of inorganic semiconductor lasers that are now everyday items in CD and DVD players. But a paper in this week’s Nature (abstract free, subscription required for full article) demonstrates another neat use for lasing action in semiconducting polymers – as an ultrasensitive detector for explosives. See also this press release.

The device relies on the fact that lasing is a highly non-linear effect: if an optically pumped polymer laser is exposed to a material which quenches only a few molecules at its surface, this can still kill the lasing action entirely. The polymer used in this work, done at MIT by Timothy Swager’s group, is particularly sensitive to the explosive TNT. The device can thus work as a sensor sensitive enough – and this needs to be in the parts-per-billion range – to detect the tiny traces of TNT vapour that a buried land-mine emits.
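The sensing principle rests entirely on that threshold nonlinearity; here’s a toy sketch, an idealised gain-versus-loss model with invented numbers, not the detailed physics of the MIT device:

```python
# Toy illustration of why a tiny extra loss kills lasing. In an idealised
# laser, output is ~zero below threshold and grows linearly with pump above
# it, with the threshold set by the cavity loss. (Invented numbers; not a
# detailed model of the MIT device.)

def laser_output(pump: float, loss: float) -> float:
    threshold = loss  # threshold pump taken as proportional to the loss
    return max(0.0, pump - threshold)

pump = 1.05  # pumped 5% above the unperturbed threshold
print(laser_output(pump, loss=1.00))  # 0.05 -> lasing brightly
print(laser_output(pump, loss=1.06))  # 0.0  -> dark: a ~6% extra loss,
# from a few quenched molecules at the surface, switches the laser off
```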

This work, rather unsurprisingly, is supported by MIT’s Institute for Soldier Nanotechnologies. The development of ultrasensitive sensors for the detection of chemicals in the environment forms a big part of the research effort in evolutionary nanotechnology. On the science side, this is driven by the fact that detecting the effects of molecules interacting with surfaces is intrinsically much easier in systems with nanoscale components, simply because the surface of a nanostructured device has a great deal more influence on its properties than it would in a bulk material. On the demand side, the needs of defence and homeland security are, now more than ever, setting the research agenda in the USA.

Nobel Laureates Against Nanotechnology

This small but distinguished organisation has gained another two members. The theoretical condensed matter physicist Robert Laughlin, in his new book A Different Universe: Reinventing Physics from the Bottom Down, delivers a rather scathing assessment of nanotechnology, with which Philip Anderson (himself a Nobel Laureate and a giant of theoretical physics), reviewing the book in Nature (subscription required), concurs. Unlike Richard Smalley’s, Laughlin’s criticism is directed at the academic version of nanotechnology rather than the Drexlerian version, but adherents of the latter shouldn’t feel too smug, because Laughlin’s criticism applies with even more force to their vision. He blames the seductive power of reductionist belief for the delusion: “The idea that nanoscale objects ought to be controllable is so compelling it blinds a person to the overwhelming evidence that they cannot be”.

Nanotechnologists aren’t the only people singled out for Laughlin’s scorn. Other targets include quantum computing, string theory (“the tragic consequence of an obsolete belief system”) and most of modern biology (“an endless and unimaginably expensive quagmire of bad experiments”). But underneath all the iconoclasm and attitude (and personally I blame Richard Feynman for making all American theoretical physicists want to come across like rock stars) there is a very serious message.

Laughlin’s argument is that reductionism should be superseded as the ruling ideology of science by the idea of emergence. To quote Anderson: “The central theme of the book is the triumph of emergence over reductionism: that large objects such as ourselves are the product of principles of organization and of collective behaviour that cannot in any meaningful sense be reduced to the behaviour of our elementary constituents.” The origin of this idea is Anderson himself, in a widely quoted article from 1972 – More is Different. In this view, the idea that physics can find a “Theory of Everything” is fundamentally wrong-headed. Chemistry isn’t simply the application of quantum mechanics, and biology is not simply reducible to chemistry; the organising principles that underlie, say, the laws of genetics are just as important as the properties of the things being organised.

Anderson’s views on emergence aren’t as widely known as they should be, in a world dominated by popular science books on string theory and “the search for the God particle”. But they have been influential; an intervention by Anderson is credited (or blamed) by many for helping to kill off the Superconducting Super Collider project, and he is one of the founding fathers of the field of complexity. Laughlin explicitly acknowledges his debt to Anderson, but he holds to a particularly strong version of emergence: it isn’t just that there are practical difficulties in deriving higher-level laws of organisation from the laws describing the interactions of their parts. Because the organisational principles themselves matter more than the detailed nature of the interactions between the things being organised, the reductionist programme is wrong in principle, and there is no sense in which the laws of quantum electrodynamics are more fundamental than the laws of genetics (in fact, Laughlin argues, on the basis of the strong analogies between QED and condensed matter field theory, that QED itself is probably emergent). To my (philosophically untrained) eye, this seems to put Laughlin’s position quite close to that of the philosopher of science Nancy Cartwright. There’s some irony in this, because Cartwright’s book The Dappled World was bitterly criticised by Anderson himself.

This takes us a long way from nanoscience and nanotechnology. It’s not that Laughlin believes the field is unimportant; in fact, he describes the place where nanoscale physics and biology meet as the current frontier of science. But it’s a place that will only be understood in terms of emergent properties. Some of these, like self-assembly, are starting to be understood; many others are not. What is clear, though, is that the reductionist approach of trying to impose simplicity where it doesn’t exist in nature simply won’t work.