Decelerating change?

Everyone knows the first words spoken by a man on the moon, but what were the last words? This isn’t just a good pub quiz question; it’s also an affront to the notion that technological progress moves inexorably forward. To critics of the idea that technology is relentlessly accelerating, the fact that space travel is now a technology the world has essentially relinquished is a prime argument against the idea of inevitable technological progress. The latest such critic is David Edgerton, whose book The Shock of the Old is now out in paperback.

Edgerton’s book has many good arguments, and serves as a useful corrective to the technological determinism that characterises quite a lot of discussion about technology. His aim is to give a history of innovation which de-emphasises the importance of invention, and to this end he helpfully draws attention to the innovations that occur during the use and adaptation of technologies, often quite old ones. One very important thing this emphasis on innovation in use does is bring into focus neglected innovations of the developing world, like the auto-rickshaw of India and Bangladesh and the long-tailed boat of Thailand. This said, I couldn’t help finding the book frequently rather annoying. Its standard rhetorical starting point is to present, generally without any reference, a “standard view” of the history of technology that wouldn’t be shared by anyone who knows anything about the subject: a series of straw men, in other words. This isn’t to say that there aren’t a lot of naive views about technology in wide circulation, but to suggest, for example, that it is the “conventional story” that the atomic bomb was the product of academic science, rather than of the gigantic military-industrial engineering activity of the Manhattan Project, seems particularly far-fetched.

The style of the book is essentially polemical and anecdotal, and the statistics that buttress the argument tend to be of the factoid kind (such as the striking assertion that the UK is home to 3.8 million unused fondue sets). In this and many other respects I found it a much less satisfying book than Vaclav Smil’s excellent two-volume history of modern technology, Transforming the Twentieth Century: Technical Innovations and Their Consequences and Creating the Twentieth Century: Technical Innovations of 1867-1914 and Their Lasting Impact. These books reach similar conclusions, though Smil’s arguments are supported by substantially more data and carry greater impact for being less self-consciously contrarian.

Smil’s view – and I suspect that Edgerton would share it, though I don’t think he states it so explicitly – is that the period of history with the greatest leap forward in technology wasn’t the present day, but the thirty or forty years of the late 19th and early 20th centuries that saw the invention of the telephone, the automobile, the aeroplane, electricity, mass production, and, most important of all, the Haber-Bosch process.

What, then, of that symbol of what many people take to be the current period of accelerating change – Moore’s law? Moore’s law is an observation about the exponential growth of computer power with time, and one should start with an obvious point about exponential growth: it doesn’t come from accelerating change, but from constant fractional change. If you are able to improve a process by x% a year, you get exponential growth. Moore’s law simply tells us that the semiconductor industry has been immensely successful at implementing incremental improvements to its technology, albeit at a rapid rate. Stated this way, Moore’s law doesn’t seem so out of place in Edgerton’s narrative of technology as being dominated, not by dramatic new inventions, but by many continuous small improvements in technologies old and new.

This story, though, also makes clear how difficult it is to predict, before several generations of this kind of incremental improvement, which technologies are destined to have a major and lasting impact and which will peter out and disappoint their proponents. For me, therefore, the lesson to take away is not that new developments in science and technology might not have major and lasting impacts on society; it is simply that some humility is needed when one tries to identify in advance what will have lasting impact and what those impacts will end up being.
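To make the arithmetic concrete, here’s a minimal sketch of how constant fractional improvement compounds into exponential growth – the 40% annual rate is purely illustrative, not a claim about any real industry:

```python
import math

def capability(initial, annual_rate, years):
    """Constant fractional improvement: the same percentage gain each
    year compounds into exponential growth."""
    return initial * (1 + annual_rate) ** years

rate = 0.40  # illustrative 40% improvement per year
# A constant rate fixes the doubling time: ln(2) / ln(1 + rate)
doubling_time = math.log(2) / math.log(1 + rate)
print(f"Doubling time at {rate:.0%}/year: {doubling_time:.1f} years")

for year in range(0, 11, 2):
    print(f"Year {year:2d}: capability x{capability(1.0, rate, year):6.1f}")
```

No acceleration is needed anywhere in this calculation: the yearly improvement never changes, yet the capability curve is exponential.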

On December 17th, 1972, Eugene A. Cernan spoke the last words by a man on the moon: “OK Jack, let’s get this mutha outta here.”

Invisibility cloaks and perfect lenses – the promise of optical metamaterials

The idea of an invisibility cloak – a material which would divert light undetectably around an object – captured the imagination of the media a couple of years ago. For visible light, the possibility of an invisibility cloak remains a prediction, but it graphically illustrates the potential power of a line of research initiated a few years ago by the theoretical physicist Sir John Pendry of Imperial College, London. Pendry realised that building structures with carefully designed internal arrangements of conductors and dielectrics would allow one to make what are, in effect, new materials with very unusual optical properties. The most spectacular of these new metamaterials would have a negative refractive index. In addition to making an invisibility cloak possible, one could in principle use negative refractive index metamaterials to make a perfect lens, allowing one to use ordinary light to image structures much smaller than the limit of a few hundred nanometres currently set by the wavelength of light for ordinary optical microscopy. Metamaterials have been made which operate in the microwave range of the electromagnetic spectrum, but to make an optical metamaterial one needs to be able to fabricate rather intricate structures at the nanoscale. A recent paper in Nature Materials (abstract, subscription needed for full article) describes exciting and significant progress towards this goal. The paper, whose lead author is Na Liu, a student in the group of Harald Giessen at the University of Stuttgart, describes the fabrication of an optical metamaterial consisting of a regular, three-dimensional array of horseshoe-shaped, sub-micron-sized pieces of gold embedded in a transparent polymer – see the electron micrograph below. This metamaterial doesn’t yet have a negative refractive index, but it shows that a similar structure could have this remarkable property.

An optical metamaterial
An optical metamaterial consisting of split rings of gold in a polymer matrix. Electron micrograph from Harald Giessen’s group at 4. Physikalisches Institut, Universität Stuttgart.

To get a feel for how these things work, it’s worth recalling what happens when light goes through an ordinary material. Light, of course, consists of electromagnetic waves, so as a light wave passes a point in space there’s a rapidly alternating electric field, and any charged particle will feel a force from this alternating field. This leads to something of a paradox – when light passes through a transparent material, like glass or a clear crystal, it seems at first that the light isn’t interacting very much with the material. But since the material is full of electrons and positive nuclei, this can’t be right – all the charged particles in the material must be being wiggled around, and as they are wiggled around they in turn must be behaving like little aerials and emitting electromagnetic radiation themselves. The solution to the paradox comes when one realises that all these waves emitted by the wiggled electrons interfere with each other; the net effect is of a wave propagating forward in the same direction as the light that’s propagating through the material, only with a somewhat different velocity. It’s the ratio of this effective velocity in the material to the velocity the wave would have in free space that defines the refractive index.

Now, in a structure like the one in the picture, we have sub-micron shapes of a metal, which is an electrical conductor. When such a shape sees the oscillating electric field of an incident light wave, the free electrons in the metal slosh around in a collective oscillation called a plasmon mode. These plasmons generate both electric and magnetic fields, whose behaviour depends very sensitively on the size and shape of the object in which the electrons are sloshing around (to be strictly accurate, the plasmons are restricted to the region near the surface of the object; it’s the geometry of the surface that matters). If you design the geometry right, you can find a frequency at which both the magnetic and electric fields generated by the motion of the electrons are out of phase with the fields in the light wave that excites the plasmons – this is the condition for the negative refractive index which is needed for perfect lenses and other exciting possibilities.
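The textbook way of stating this condition – this is Veselago’s classic criterion, not a result of the Stuttgart paper – is in terms of the relative permittivity and permeability. When both responses have negative real parts at the working frequency, causality forces the choice of the negative square root, and the refractive index itself goes negative:

```latex
n^2 = \varepsilon_r \mu_r, \qquad
n = -\sqrt{\varepsilon_r \mu_r}
\quad \text{when} \quad
\operatorname{Re}\varepsilon_r < 0
\ \text{and}\
\operatorname{Re}\mu_r < 0 .
```

Ordinary transparent materials have both quantities positive, so the challenge for metamaterial design is precisely to engineer a structure whose plasmon response drives both negative at the same frequency.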

The metamaterial shown in the diagram has a perfectly periodic pattern, and this is what’s needed if you want a uniform plane wave arriving at the material to excite another uniform plane wave. But, in principle, you should be able to design a metamaterial that isn’t periodic, to direct and concentrate light any way you like, on length scales well below the wavelength of light. Some of the possibilities this might lead to were discussed in an article in Science last year, Circuits with Light at Nanoscales: Optical Nanocircuits Inspired by Metamaterials (abstract, subscription required for full article), by Nader Engheta at the University of Pennsylvania. If we can learn how to make precisely specified, non-periodic arrays of metallic, dielectric and semiconducting shaped elements, we should be able to direct light waves where we want them to go on the nanoscale – well below light’s wavelength. This might allow us to store information, to process information in all-optical computers, to interact with electrons in structures like quantum dots for quantum computing applications, to image structures using light down to the molecular level, and to detect individual molecules with great sensitivity. I’ve said this before, but I’m more and more convinced that this is a potential killer application for advanced nanotechnology – if one really could place atoms in arbitrary, prescribed positions with nanoscale accuracy, this is what one could do with the resulting materials.

The Tata Nano

The Tata Nano – the newly announced one lakh (100,000 rupees) car from India’s Tata group – hasn’t got a lot to do with nanotechnology (see this somewhat bemused and bemusing piece from the BBC), but since it raises some interesting issues I’ll use the name as an excuse to discuss it here.

The extensive coverage in the Western media has been characterised by some fairly outrageous hypocrisy – for example, the UK’s Independent newspaper wonders “Can the world afford the Tata Nano?” (The answer, of course, is that what the world can’t afford are the much bigger cars parked outside all those prosperous Independent readers’ houses.) With a visit to India fresh in my mind, it’s completely obvious to me why all those families one sees precariously perched on motor-bikes would want a small, cheap, economical car, and not at all obvious that those of us in the West, who are used to enjoying on average 11 times (for the UK) or 23 times (for the USA) as much energy per head as the Indians, have any right to complain about the extra carbon dioxide emissions that will result. It’s almost certainly true that the world couldn’t sustain a situation in which all of its 6.6 billion people used as much energy as the Americans and Europeans; the way that equation will be squared, though, must ultimately be by the rich countries getting by with less energy rather than by poorer countries being denied the opportunity to use more. It is to be hoped that this transformation takes place in a way that uses better technology to achieve the same or better living standards for everybody from a lot less energy; the probable alternative is the economic disruption and widespread involuntary cuts in living standards that will follow from a prolonged imbalance of energy supply and demand.

A more interesting question to ask about the Tata Nano is to wonder why it was not possible to leapfrog current technology to achieve something even more economical and sustainable – using, one hesitates to suggest, actual nanotechnology? Why is the Nano made from old-fashioned steel, with an internal combustion engine in the back, rather than, say, being made from advanced lightweight composites and propelled by an electric motor and a hydrogen fuel cell? The answers are actually fairly clear – because of cost, the technological capacity of this (or any other) company, and the requirement for maintainability. Aside from these questions, there’s the problem of infrastructure. The problems of creating an infrastructure for hydrogen as a fuel are huge for any country; liquid hydrocarbons are a very convenient store of energy, and, old though it is, the internal combustion engine is a pretty effective and robust device for converting energy. Of course, we can hope that new technologies will lead to new versions of the Tata Nano and similar cars of far greater efficiency, though realism demands that we understand the need for new technology to fit into existing techno-social systems to be viable.

Grand challenges for UK nanotechnology

The UK’s Engineering and Physical Sciences Research Council introduced a new strategy for nanotechnology last year, and some of the new measures proposed are beginning to come into effect (including, of course, my own appointment as the Senior Strategic Advisor for Nanotechnology). Just before Christmas the Science Minister announced the funding allocations for research for the next few years. Nanotechnology is one of six priority programmes that cut across all the Research Councils (to be precise, the cross-council programme has the imposing title: Nanoscience through Engineering to Application).

One strand of the strategy involves the funding of large scale integrated research programmes in areas where nanotechnology can contribute to issues of pressing societal or economic need. The first of these Grand Challenges – in the area of using nanotechnology to enable cheap, efficient and scalable ways to harvest solar energy – was launched last summer. An announcement on which proposals will be funded will be made within the next few months.

The second Grand Challenge will be launched next summer, in the general area of nanotechnology for healthcare. This is a very broad theme, of course – I discussed some of the potential areas, which include devices for delivering drugs and for rapid diagnosis, in an earlier post. To narrow the area down, there’s going to be an extensive process of consultation with researchers and people in the relevant industries – for details, see the EPSRC website. There’ll also be a role for public engagement; EPSRC is commissioning a citizens’ jury to consider the options and have an input into the decision of which area to focus on.

UK Government outlines nanorisk research needs

The UK government has released a second report reviewing progress and identifying knowledge gaps concerning the potential environmental and health risks of engineered nanoparticles. This is a comprehensive document, breaking the problem down into five areas. The first is the question of how you detect and measure nanoparticles; the second considers the ways in which people and the environment might be exposed to them. The third concerns the assessment of the degree to which some nanoparticles might be toxic to humans, while the fourth considers potential environmental impacts. Finally, a fifth section considers the wider social and economic dimensions of nanotechnology.

The document represents, in part, a response to the very critical verdict on the UK government’s response on nanotoxicology given by the Council for Science and Technology last March. It isn’t, of course, able to address the fundamental criticism: that the Government didn’t act on the Royal Society’s recommendation to set up a coordinated programme of research into the toxicology and health and environmental effects of nanomaterials, with dedicated funding, but instead relied on an ad hoc process of waiting for proposals to come in through peer review, with opportunistic funding from a number of sources. The response from the Royal Society reflects the continuing frustration at opportunities lost: “The Government has recognised the huge potential of nanotechnology and recognised what needs to be done to ensure that advances are realised safely, but by their own admission progress has been slow in some areas. Given the wealth of expertise in UK universities and industries we should be further ahead.”

That’s old ground now, of course, so perhaps it’s worth focusing on some of the positive outcomes the report describes. Quite a lot of work has been carried out, or at least started. In the area of nanoparticles in the environment, for example, the Natural Environment Research Council has funded more than £2.3 million worth of projects, in areas ranging from studies of the toxicity of nanoparticles to fish and other aquatic organisms, to studies of the fate of silicon dioxide nanoparticles from pharmaceutical and cosmetic formulations in wastewaters, and of the effect of silver nanoparticles on natural bacterial populations.

For another view of the positives and negatives of this report, it’s interesting to see the response of nanoparticle expert Andrew Maynard. More shocking is the way this report is mendaciously misquoted in an article in the Daily Mail: Alert over the march of the ‘grey goo’ in nanotechnology Frankenfoods (via TNTlog).

They

From the poem “They” by R.S. Thomas:

The new explorers don’t go
anywhere and what they discover
we can’t see. But they change our lives.

They interpret absence
as presence, measuring it by the movement
of its neighbours. Their world is

an immense place: deep down is as distant
as far out, but arrived at
in no time. These are the new

linguists, exchanging across closed
borders the currency of their symbols….

All the best for the New Year to all my readers.

Delivering genes

Gene therapy holds out the promise of correcting a number of diseases whose origin lies in the deficiency of a particular gene. Given our growing knowledge of the human genome, and our ability to synthesise arbitrary sequences of DNA, one might think that the introduction of new genetic material into cells to remedy the effects of abnormal genes would be straightforward. This isn’t so. DNA is a relatively delicate molecule, and organisms have evolved efficient mechanisms for finding and eliminating foreign DNA. Viruses, on the other hand, whose entire modus operandi is to introduce foreign nucleic acids into cells, have evolved effective ways of packaging their payloads of DNA or RNA and delivering them into cells. One approach to gene therapy co-opts viruses to deliver the new genetic material, though this sometimes has unpredicted and undesirable side-effects. So an effective, non-viral method of wrapping up DNA, introducing it into target cells and releasing it would be very desirable. My colleagues at Sheffield University, led by Beppe Battaglia, have demonstrated an effective and elegant way of introducing DNA into cells, in work recently reported in the journal Advanced Materials (subscription required for full paper).

The technique is based on the use of polymersomes, which I’ve described here before. Polymersomes are bags formed when detergent-like polymer molecules self-assemble to form a membrane which folds round on itself to form a closed surface. They are analogous to the cell membranes of biology, which are formed from soap-like molecules called phospholipids, and to the liposomes that can be made in the laboratory from the same materials. Liposomes are already used to wrap up and deliver molecules in some commercial applications, including some drug delivery systems and some expensive cosmetics. They’ve also been used in the laboratory to deliver DNA into cells, though they aren’t ideal for this purpose, as they aren’t very robust. Block copolymers give one a great deal more flexibility in designing polymersomes with the properties one needs, and this flexibility is exploited to the full in Battaglia’s experiments.

To make a polymersome, one needs a block copolymer – a polymer with two or three chemically distinct sections joined together. One of these blocks needs to be hydrophobic, and one needs to be hydrophilic. The block copolymers used here, developed and synthesised in the group of Sheffield chemist Steve Armes, have two very nice features. The hydrophilic section is composed of poly(2-(methacryloyloxy)ethyl phosphorylcholine) – a synthetic polymer that presents the same chemistry to the adjoining solution as a naturally occurring phospholipid in a cell membrane. This means that polymersomes made from this material are able to circulate undetected within the body for longer than those made from other water-soluble polymers. The hydrophobic block is poly(2-(diisopropylamino)ethyl methacrylate). This is a weak base, so its state of ionisation depends on the acidity of the solution. In a basic solution it is un-ionised, and in this state it is strongly hydrophobic, while in an acidic solution it becomes charged, and in this state it is much more soluble in water. This means that polymersomes made from this material will be stable in neutral or basic conditions, but will fall apart in acid. Conversely, if one has the polymers in an acidic solution, together with the DNA one wants to deliver, and then neutralises the solution, polymersomes will spontaneously form, encapsulating the DNA.
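The sharpness of this pH switch can be pictured with the Henderson–Hasselbalch relation. Here’s a minimal sketch; the pKa of 6.3 is an illustrative value for the weak-base block, not a figure taken from the paper:

```python
def fraction_protonated(pH, pKa=6.3):
    """Henderson-Hasselbalch for a weak base: the fraction of amine
    groups carrying a proton (and hence a positive charge) at a given pH.
    The pKa of 6.3 is illustrative, not a measured value."""
    return 1.0 / (1.0 + 10 ** (pH - pKa))

# The membrane-forming block flips from mostly neutral (hydrophobic,
# membrane intact) to mostly charged (hydrophilic, membrane dissolves)
# over a narrow pH window:
for pH in (7.4, 6.5, 6.0, 5.5):  # blood, then increasingly acidic vesicles
    print(f"pH {pH}: {100 * fraction_protonated(pH):5.1f}% protonated")
```

A drop of a couple of pH units is enough to take the chains from a few percent charged to mostly charged, which is what makes the assembly so cleanly switchable.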

The way these polymersomes work to introduce DNA into cells is sketched in the diagram below. On encountering a cell, the polymersome triggers the process of endocytosis, whereby the cell engulfs the polymersome in a little piece of cell membrane that is pinched off inside the cell. It turns out that the solution inside these endosomes is significantly more acidic than the surroundings, and this triggers the polymersome to fall apart, releasing its DNA. This, in turn, generates an osmotic pressure sufficient to burst open the endosome, releasing the DNA into the cell interior, where it is free to make its way to the nucleus.
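The osmotic step of this argument can be made plausible with a back-of-the-envelope van’t Hoff estimate. The numbers below (chain count, endosome size) are invented for illustration, not taken from the paper:

```python
import math

# van't Hoff estimate: osmotic pressure = c R T, where c is the molar
# concentration of independently moving solute particles.
R = 8.314        # gas constant, J/(mol K)
T = 310.0        # body temperature, K
N_A = 6.022e23   # Avogadro's number, 1/mol

# Purely illustrative numbers: a polymersome releasing ~1e5 polymer
# chains inside an endosome of radius ~200 nm.
n_chains = 1e5
radius = 200e-9                                   # m
volume = (4.0 / 3.0) * math.pi * radius ** 3      # m^3
concentration = n_chains / N_A / volume           # mol/m^3
pressure = concentration * R * T                  # Pa

print(f"Osmotic pressure ~ {pressure / 1e3:.0f} kPa")
```

With these numbers the released chains exert a pressure of order 10 kPa on the endosome membrane, which makes the burst-and-release picture at least numerically plausible.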

The test of the approach is to introduce a section of DNA into a cell and then measure how effectively the corresponding gene is expressed. The DNA used in these experiments was the gene that codes for a protein that fluoresces – the famous green fluorescent protein, GFP, originally obtained from certain jellyfish – making it easy to detect whether the protein coded for by the introduced gene has actually been made. In experiments using cultured human skin cells, the fraction of cells in which the new gene was introduced was very high, while few toxic effects were observed. This contrasts with a control experiment using an existing, commercially available gene delivery system, which was both less effective at introducing genes and actually killed a significant fraction of the cells.

Polymersome endocytosis
A switchable polymersome as a vehicle for gene delivery. Beppe Battaglia, University of Sheffield.

Soft Machines in paperback

My book, Soft Machines: nanotechnology and life, has now been released in the UK as a paperback, with a price of £9.99. It should be available in the USA early in the new year. It’s available from Amazon UK here, and can be preordered from Amazon USA here.

Having an opportunity to make corrections, I re-read the book in the summer. One very embarrassing numerical error needed correcting, and anything to do with the dimensions of semiconductor processes needed to be updated to account for four more years of Moore’s law. But in general I think what I wrote has stood the test of time pretty well.

Bangalore Nano

I’m on my way back from India, where I’ve been at the conference Bangalore Nano 07. The enthusiasm for nanotechnology in India has been well publicised; it’s traditional to bracket the country with China as two rising science powers that see nano as an area in which they can compete on equal terms with the USA, Europe and Japan. So it was great to get an opportunity to see for myself something of what’s going on.

I’ll just mention a couple of highlights from the conference itself. Prof Ramgopal Rao, from the Indian Institute of Technology Bombay, described a very nice-looking project to make an inexpensive point-of-care system for cardiac diagnostics. He began with the gloomy thought that soon more than half the cases of cardiac disease in the world will be in India. If an acute myocardial infarction can be detected early enough, a heart attack can be prevented, but this currently needs expensive and time-consuming tests. The need, then, is for a simple test that’s cheap and reliable enough to be done in a doctor’s office or clinic.

To do this one needs to integrate a microfluidic system to handle the blood sample, a sensor array to detect the appropriate biochemical markers, and a box of electronics to analyse the results. The sensor array and fluid handling system need to be disposable, and to cost no more than a few hundred rupees (i.e. a couple of dollars), while the box should cost only a few thousand rupees, even though the protocols for diagnosis need to be quite sophisticated and robust. Rao is aiming for a working prototype very soon; the biosensor is based on a cantilever which bends when the marker binds to an immobilised antibody. He uses a polymer photoresist to make the cantilever, with an embedded poly-silicon piezo-resistor to measure the deflection (this is far from trivial, as the change in resistance amounts to about 10 parts per million).
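To get a feel for why a 10 parts-per-million resistance change is hard to measure, here’s a minimal sketch of the signal from a standard Wheatstone bridge readout – the bridge arrangement and the supply voltage are my illustrative assumptions, only the ΔR/R figure comes from the talk:

```python
# For a Wheatstone bridge with one active piezoresistor, a small
# fractional resistance change dR/R gives an output voltage of roughly
#   V_out ~ (V_supply / 4) * (dR / R)
v_supply = 3.3           # V; illustrative supply voltage
delta_r_over_r = 10e-6   # the ~10 parts per million quoted in the talk

v_out = (v_supply / 4.0) * delta_r_over_r
print(f"Bridge output ~ {v_out * 1e6:.1f} microvolts")
# ~8 microvolts: small enough that amplifier offset, drift and thermal
# noise all have to be engineered around.
```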

Another nice talk was from Prof T. Pradeep, a surface chemist from the Indian Institute of Technology Madras. He described a water filter incorporating gold and silver nanoparticles mounted on a substrate of alumina, which is particularly effective at removing halogenated organic compounds such as pesticide residues. This is already on the market, with the filter cartridge costing a few hundred rupees. He also mentioned a kit that can test for such pesticide residues with a detection limit of around 25 parts per billion.

The closing talk was given by Prof CNR Rao, and consisted of reflections on the future of nanotechnology. His opinions are worth paying attention to, if for no other reason than that he is undoubtedly the most powerful and influential scientist in India, and his views will shape the way nanotechnology is pursued there. What follows are my notes on his talk, tidied up but not verbatim.

Rao is a materials chemist, and he started by observing that we can now make pretty much any material in any form. But the question is, how can we use these materials, how can we assemble them and integrate them into devices? This is the biggest gap – we need products, devices and machines made from nano-objects, and this is still probably at least 10, maybe 15, years away. But we shouldn’t worry just about products and devices – nanotechnology is a new type of science, which will dissolve the barriers between physics, chemistry and biology and bring in engineering. As an example, many people have made molecular motors. But can they be connected together to do something? This sort of thing needs combinations of molecular science and nanoscience. Soft matter is another area with many good people and interesting work, including some in Bangalore, but there’s still a gap in applying these materials – what about active gels, for example? Similarly, we see big successes in sensors and imaging, but there’s much left to do. As an example of one very big challenge, many people in Bangalore, as everywhere else, suffer from dementia; we know this is related to the nanoscale phenomenon of peptide aggregation, but we need to understand why it happens and how to stop it. Drug delivery and tissue engineering are other examples where nanotechnology can make a real impact on human suffering.

If one wants a role model, Robert Langer is a great example of someone who has produced many new results in tissue engineering and drug delivery, many graduate students and many companies; science in India should be done like this. We must remove the barriers and bureaucracy, to give more freedom to scientists and engineers. At the moment public servants, academics included, cannot get involved in private enterprise, and this must change. Nanotechnology doesn’t take much money – it’s the archetypal knowledge-based industry, and as such it should lead to much more linkage between industry and academia.

Nanotechnology in Korea

One of my engagements in a slightly frantic period last week was a UK-Korea meeting on collaboration in nanotechnology. This had some talks which gave a valuable insight into how the future of nanotechnology is seen in Korea. It’s clearly seen as central to the country’s programme of science and technology; according to some slightly out-of-date figures I have to hand on government spending on nanotechnology, Korea ranks 5th, after the USA, Japan, Germany and France, and somewhat ahead of the UK. Dr Hanjo Lim, of the Korea Science and Engineering Foundation, gave a particularly useful overview.

He started out by identifying the different ways in which going small helps. Nanotechnology exploits a confluence of three types of benefits. Nanomaterials exploit surface matter: benefits that arise from their high surface-to-volume ratio, most obviously in catalysis. They exploit quantum matter: the size-dependent quantum effects that are so important for band-gap engineering and for making quantum dots. And they can exploit soft matter, which is so important at the bio-nano interface. As far as Korea is concerned, as a small country with a well-developed industrial base, he sees four important areas. Applications in information and communication technology will directly impact the strong position Korea has in the semiconductor and display industries, as well as having an impact on automobiles. Robots and ubiquitous devices play to Korea’s general comparative advantage in manufacturing, while applications in nano-foods and medical science are relatively weak in Korea at the moment. Finally, the environmentally important applications in fuel and solar cells and in air and water treatment will be of growing importance in Korea, as everywhere else.

Korea ranks 4th or 5th in the world in terms of nano-patents; the plan is, up to 2010, to expand existing strength in nanotechnology and industrialise it by developing technology specific to applications. Beyond that, the emphasis will be on systems-level integration and the commercialisation of those developments. Clearly, in electronics we are already in the nano era. Korea has a dominant position in flash memory, where Hwang’s law – that memory density doubles every year – represents a more aggressive scaling than Moore’s law. Maintaining this will require perhaps carbon nanotubes or silicon nanowires. Lim finds nanotubes very attractive, but given the need to control their chirality and position, his prediction is that commercialisation is still more than 10 years away. An area that he thinks will grow in importance is the integration of optical interconnects in electronics. This, in his view, will be driven by the speed and heat issues that arise from metal interconnects in CPUs – he reminds us that a typical CPU contains 10 km of electrical wire, so it’s no wonder that heat generation is a big problem, and that Google’s data centres come equipped with 5-storey cooling towers. Nanophotonics will enable the integration of photonic components within silicon multi-chip CPUs – but the problem that silicon is not good for lasers will have to be overcome: either off-chip lasers will have to be used, or silicon laser diodes developed. His prognosis, recognising that we have box-to-box optical interconnects now and that board-to-board interconnects are coming, is that we will have chip-to-chip interconnects on the 1–10 cm scale by 2010, with intra-chip interconnects by 2010-2015.

Anyone interested in more general questions of the way the Korean innovation system is developing will find much to interest them in a recent Demos pamphlet: Korea: Mass innovation comes of age. Meanwhile, I’ll be soon reporting on nanotechnology in another part of Asia; I’m writing this from Bangalore/Bengalooru in India, where I will be talking tomorrow at Bangalore Nano 2007.