What’s meant by “food nanotechnology”?

A couple of weeks ago I took part in a dialogue meeting in Brussels organised by the CIAA, the Confederation of the Food and Drink Industries of the EU, about nanotechnology in food. The meeting involved representatives from big food companies, from the European Commission and agencies like the European Food Safety Authority, together with consumer groups like BEUC, and the campaigning group Friends of the Earth Europe. The latter group recently released a report on food nanotechnology – Out of the laboratory and on to our plates: Nanotechnology in food and agriculture; according to the press release, this “reveals that despite concerns about the toxicity risks of nanomaterials, consumers are unknowingly ingesting them because regulators are struggling to keep pace with their rapidly expanding use.” The position of the CIAA is essentially that nanotechnology is an interesting technology currently at the research stage, rather than one that has yet made it into products. One can get a good idea of the research agenda of the European food industry from the European Technology Platform Food for Life. As the only academic present, I tried in my contribution to clarify a little the different things people mean by “food nanotechnology”. Here, more or less, is what I said.

What makes the subject of nanotechnology particularly confusing and contentious is the ambiguity of the definition of nanotechnology when applied to food systems. Most people’s definitions are something along the lines of “the purposeful creation of structures with length scales of 100 nm or less to achieve new effects by virtue of those length-scales”. But when one attempts to apply this definition in practice one runs into difficulties, particularly for food. It’s this ambiguity that lies behind the difference of opinion we’ve already heard about today over how widespread the use of nanotechnology in food currently is. On the one hand, Friends of the Earth says it knows of 104 nanofood products already on the market (and some analysts suggest the number may be more than 600). On the other hand, the CIAA maintains that, while active research in the area is going on, no actual nanofood products are yet on the market. In fact, both parties are, in their different ways, right; the problem is the ambiguity of definition.

The issue is that food is naturally nano-structured, so that too wide a definition ends up encompassing much of modern food science, and indeed, if you stretch it further, some aspects of traditional food processing. Consider the case of “nano-ice cream”: the FoE report states that “Nestlé and Unilever are reported to be developing a nano-emulsion based ice cream with a lower fat content that retains a fatty texture and flavour”. Without knowing the details of this research, what one can be sure of is that it will involve essentially conventional food processing technology in order to control fat globule structure and size on the nanoscale. If the processing technology is conventional (and the economics of the food industry dictates that it must be), what makes this nanotechnology, if anything does, is the fact that analytical tools are available to observe the nanoscale structural changes that lead to the desirable properties. What makes this nanotechnology, then, is simply knowledge. In the light of the new knowledge that new techniques give us, we could even argue that some traditional processes, which it now turns out involve manipulation of structure on the nanoscale to achieve desirable effects, would constitute nanotechnology if it were defined this widely. For example, traditional whey cheeses like ricotta are made by creating the conditions for the whey proteins to aggregate into protein nanoparticles. These subsequently aggregate to form the particulate gels that give the cheese its desirable texture.

It should be clear, then, that there isn’t a single thing one can call “nanotechnology” – there are many different technologies, producing many different kinds of nano-materials. These different types of nanomaterials have quite different risk profiles. Consider cadmium selenide quantum dots, titanium dioxide nanoparticles, sheets of exfoliated clay, fullerenes like C60, casein micelles, phospholipid nanosomes – the risks and uncertainties of each of these examples of nanomaterials are quite different and it’s likely to be very misleading to generalise from any one of these to a wider class of nanomaterials.

To begin to make sense of the different types of nanomaterial that might be present in food, there is one very useful distinction. This is between engineered nanoparticles and self-assembled nanostructures. Engineered nanoparticles are covalently bonded, and thus are persistent and generally rather robust; they may have important surface properties, such as catalytic activity, and they may be prone to aggregation. Examples of engineered nanoparticles include titanium dioxide nanoparticles and fullerenes.

In self-assembled nanostructures, though, molecules are held together by weak forces, such as hydrogen bonds and the hydrophobic interaction. The weakness of these forces renders them mutable and transient; examples include soap micelles, protein aggregates (for example the casein micelles formed in milk), liposomes and nanosomes and the microcapsules and nanocapsules made from biopolymers such as starch.

So what kind of food nanotechnology can we expect? Here are some potentially important areas:

• Food science at the nanoscale. This is about using fairly conventional food processing, supported by nanoscale analytical techniques, to achieve desirable properties. A major driver here will be the use of sophisticated food structuring to achieve palatable products with low fat contents.
• Encapsulating ingredients and additives. The encapsulation of flavours and aromas at the microscale to protect delicate molecules and enable their triggered or otherwise controlled release is already widespread, and it is possible that decreasing the lengthscale of these systems to the nanoscale might be advantageous in some cases. We are also likely to see a range of “nutraceutical” molecules come into more general use.
• Water-dispersible preparations of fat-soluble ingredients. Many food ingredients are fat-soluble; as a way of incorporating these in food and drink without fat, manufacturers have developed stable colloidal dispersions of these materials in water, with particle sizes in the range of hundreds of nanometres. For example, the substance lycopene, which is familiar as the molecule that makes tomatoes red and which is believed to offer substantial health benefits, is marketed in this form by the German company BASF.

What is important in this discussion is clarity – definitions are important. We’ve seen discrepancies between estimates of how widespread food nanotechnology is in the marketplace now, and these discrepancies lead to unnecessary misunderstanding and distrust. Clarity about what we are talking about, and a recognition of the diversity of technologies we are talking about, can help remove this misunderstanding and give us a sound basis for the sort of dialogue we’re participating in today.

From micro to nano for medical applications

I spent yesterday at a meeting at the Institute of Mechanical Engineers, Nanotechnology in Medicine and Biotechnology, which raised the question of what is the right size for new interventions in medicine. There’s an argument that, since the basic operations of cell biology take place on the nano-scale, that’s fundamentally the right scale for intervening in biology. On the other hand, given that many current medical interventions are very macroscopic, operating on the micro-scale may already offer compelling advantages.

A talk from Glasgow University’s Jon Cooper gave some nice examples illustrating this. His title was Integrating nanosensors with lab-on-a-chip for biological sensing in health technologies, and he began with some true nanotechnology. This involved a combination of fluid handling systems for very small volumes with nanostructured surfaces, with the aim of detecting single biomolecules. This depends on a remarkable effect known as surface enhanced Raman scattering. Raman scattering is a type of spectroscopy that can detect chemical groups, normally with rather low sensitivity. But if one illuminates a metal surface with very sharp asperities, the light field very close to the surface is hugely magnified, increasing sensitivity by a factor of ten million or so. Systems based on this effect, using silver nanoparticles coated so that pathogens like anthrax will stick to them, are already in commercial use. Cooper’s group, however, uses not free nanoparticles but very precisely structured nanosurfaces. Using electron beam lithography his group creates silver split-ring resonators – horseshoe shapes about 160 nm across. With a very small gap one can get enhancements of a factor of one hundred billion, and it’s this that brings single molecule detection into prospect.
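
As an aside, it’s worth seeing where these startling numbers come from. The standard rule of thumb in the surface enhanced Raman scattering literature (my gloss, not something from Cooper’s talk) is that, because both the incident light and the scattered light are amplified near the surface, the signal enhancement goes roughly as the fourth power of the local field enhancement:

```latex
% Rule-of-thumb SERS enhancement factor: both the incident and the scattered
% fields are amplified, so the measured signal enhancement EF goes roughly as
% the fourth power of the local field enhancement.
\[
EF \;\approx\; \frac{\left|E_{\mathrm{loc}}(\omega_{\mathrm{inc}})\right|^{2}
                     \left|E_{\mathrm{loc}}(\omega_{\mathrm{scat}})\right|^{2}}
                    {\left|E_{0}\right|^{4}}
   \;\approx\; \left|\frac{E_{\mathrm{loc}}}{E_{0}}\right|^{4}
\]
```

On this reckoning, a local field enhanced about 60-fold gives a signal enhancement of around ten million, while a 560-fold field enhancement gives around one hundred billion.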

On a larger scale, Cooper described systems to probe the response of single cells – his example involved using a single heart cell (a cardiomyocyte) to screen responses to potential heart drugs. This involved a picolitre-scale microchamber adjacent to an array of micron-sized thermocouples, which allow one to monitor the metabolism of the cell as it responds to a drug candidate. His final example was on the millimetre scale, though its sensors incorporated nanotechnology at some level. This was a wireless device incorporating an electrochemical blood sensor – the idea was that one would swallow this to screen for early signs of bowel cancer. Here’s an example where, obviously, smaller would be better, but how small does one need to go?

Nanoparticles down the drain

With significant amounts of nanomaterials now entering markets, it’s clearly worth worrying about what’s going to happen to these materials after disposal – is there any danger of them entering the environment and causing damage to ecosystems? These are the concerns of the discipline of nano-ecotoxicology; on the evidence of the conference I was at yesterday, on the Environmental effects of nanoparticles, at Birmingham, this is an expanding field.

From the range of talks and posters, there seems to be a heavy focus (at least in Europe) on those few nanomaterials which really are entering the marketplace in quantity – titanium dioxide, of sunscreen fame, and nano-silver, with some work on fullerenes. One talk, by Andrew Johnson, of the UK’s Centre for Ecology and Hydrology at Wallingford, showed nicely what the outline of a comprehensive analysis of the environmental fate of nanoparticles might look like. His estimate is that 130 tonnes of nano-titanium dioxide a year is used in sunscreens in the UK – where does this stuff ultimately go? Down the drain and into the sewers, of course, so it’s worth worrying what happens to it then.

At the sewage plant, solids are separated from the treated water, and the first thing to ask is where the titanium dioxide nanoparticles go. The evidence seems to be that a large majority end up in the sludge. Some 57% of this treated sludge is spread on farmland as fertiliser, while 21% is incinerated and 17% goes to landfill. There’s work to be done, then, in determining what happens to the nanoparticles – do they retain their nanoparticulate identity, or do they aggregate into larger clusters? One needs then to ask whether those that survive are likely to cause damage to soil microorganisms or earthworms. Johnson presented some reassuring evidence about earthworms, but there’s clearly more work to be done here.

Making a series of heroic assumptions, Johnson made some estimates of how many nanoparticles might end up in the river. Taking a worst-case scenario, with a drought and heatwave in the southeast of England (they do happen – I’m old enough to remember), he came up with an estimate of 8 micrograms/litre in the Thames, which is still more than an order of magnitude less than the concentration that has been shown to start to affect, for example, rainbow trout. This is reassuring, but, as one questioner pointed out, one still might worry about the nanoparticles accumulating in sediments to the detriment of filter feeders.
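
For what it’s worth, here’s a toy version of this kind of dilution estimate, just to show the shape of the arithmetic. The removal fraction and drought flow rate are illustrative numbers of my own, not Johnson’s, and his actual model is considerably more sophisticated:

```python
# Back-of-envelope estimate of nano-TiO2 concentration in a river.
# The 130 tonnes/year figure comes from Johnson's talk; the sewage removal
# fraction and drought flow rate are illustrative assumptions, not his inputs.

TONNES_TIO2_PER_YEAR = 130     # UK sunscreen use of nano-titanium dioxide
FRACTION_REACHING_RIVER = 0.1  # assume ~90% ends up in sewage sludge
DROUGHT_FLOW_M3_PER_S = 50.0   # assumed low flow for the Thames

SECONDS_PER_YEAR = 365 * 24 * 3600

mass_to_river_ug = TONNES_TIO2_PER_YEAR * FRACTION_REACHING_RIVER * 1e12  # tonnes -> micrograms
water_volume_litres = DROUGHT_FLOW_M3_PER_S * SECONDS_PER_YEAR * 1000     # m^3 -> litres

print(f"~{mass_to_river_ug / water_volume_litres:.1f} micrograms/litre")  # ~8.2 with these inputs
```

The point is not the precise number, but that even on pessimistic assumptions the answer comes out in the microgram-per-litre range, below reported effect levels.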

The mis-measure of uncertainty

A couple of pieces in the Financial Times today and yesterday offer some food for thought about the problems of commercialising scientific research. Yesterday’s piece – Drug research needs serendipity (free registration may be required) – concentrates on the pharmaceutical sector, but its observations are more widely applicable. Musing on the current problems of big pharma, with their dwindling pipelines of new drugs, David Shaywitz and Nassim Taleb (author of The Black Swan) identify the problem as a failure to deal with uncertainty: “academic researchers underestimated the fragility of their scientific knowledge while pharmaceuticals executives overestimated their ability to domesticate scientific research.”

They identify two types of uncertainty; there’s the negative uncertainty of all the things that can go wrong as one tries to move from medical research to treatments. Underlying this is the simple fact that we know much less about human biology, in all its complexity, than one might think from all the positive headlines and press releases. It’s in response to this negative uncertainty that managers have attempted to impose more structure and focus to make the outcome of research more predictable. But why is this generally in vain? “Answer: spreadsheets are easy; science is hard.” According to Shaywitz and Taleb, this approach isn’t just doomed to fail on its own terms, it’s positively counterproductive. This is because it doesn’t leave any room for another type of uncertainty: the positive uncertainty of unexpected discoveries and happy accidents.

Their solution is to embrace the trend we’re already seeing, for big pharma to outsource more and more of its functions, lowering the barriers to entry and leaving room for “a lean, agile organisation able to capture, consider and rapidly develop the best scientific ideas in a wide range of disease areas and aggressively guide these towards the clinic.”

But how are things for the small and agile companies that are going to be driving innovation in this new environment? Not great, says Jonathan Guthrie in today’s FT, but nonetheless “There is hope yet for science park toilers”. The article considers, from a UK perspective, the problems small technology companies are having raising money from venture capitalists. It starts from the position that the problem isn’t a shortage of money but a shortage of good ideas; perhaps not the end of the age of innovation, but a temporary lull after the excitement of personal computers, the internet and mobile phones. And, for the part of the problem that lies with venture capitalists, misreading this cycle has contributed to their difficulties. In the wake of the technology bubble, venture capital returns aren’t a good advertisement for would-be investors at the moment – “funds set up after 1996 have typically lost 1.4 per cent a year over five years and 1.8 per cent over 10 years, says the British Private Equity and Venture Capital Association.” All is not lost, Guthrie thinks – as the memory of the dotbomb debacles fades, the spectacular returns enjoyed by the most successful technology start-ups will attract money back into the sector. Where will the advances take place? Not in nanotechnology, at least not in the form of the nanomaterials sector as it has been understood up to now: “materials scientists have engineered a UK nanotechnology sector so tiny it is virtually invisible.” Instead Guthrie points to renewable energy and power-saving systems.

USA lagging Europe in nanotechnology risk research

How much resource is being devoted to assessing the potential risks of the nanotechnologies that are currently at or close to market? Not nearly enough, say campaigning groups, while governments, on the other hand, release impressive-sounding figures for their research spend. Most recently, the USA’s National Nanotechnology Initiative has estimated its 2006 spend on nano-safety research at $68 million, which sounds very impressive. However, according to Andrew Maynard, a leading nano-risk researcher based at the Woodrow Wilson Center in Washington DC, we shouldn’t take this figure at face value.

Maynard comments on the figure on the SafeNano blog, referring to an analysis he recently carried out, described in a news release from the Woodrow Wilson Center’s Project on Emerging Nanotechnologies. It seems that this figure is obtained by adding up all sorts of basic nanotechnology research, some of which might have only tangential relevance to problems of risk. If one applies a tighter definition of research that is either highly relevant to nanotechnology risk – such as a direct toxicology study – or substantially relevant – such as a study of the fate in the body of medical nanoparticles – the numbers fall substantially. Only $13 million of the $68 million was highly relevant to nanotechnology risk, with this number increasing to $29 million if the substantially relevant category is included too. This compares unfavourably with European spending, which amounts to $24 million in the highly relevant category alone.

Of course, it isn’t the headline figure that matters; what’s important is whether the research is relevant to the actual and potential risks that are out there. The Project on Emerging Nanotechnologies has done a great service by compiling an international inventory of nanotechnology risk research which allows one to see clearly just what sort of risk research is being funded across the world. It’s clear from this that suggestions that nanotechnology is being commercialised with no risk research at all being done are wide of the mark; what requires further analysis is whether all the right research is being done.

The Tata Nano

The Tata Nano – the newly announced one lakh (100,000 rupees) car from India’s Tata group – hasn’t got a lot to do with nanotechnology (see this somewhat bemused and bemusing piece from the BBC), but since it raises some interesting issues I’ll use the name as an excuse to discuss it here.

The extensive coverage in the Western media has been characterised by some fairly outrageous hypocrisy – for example, the UK’s Independent newspaper wonders “Can the world afford the Tata Nano?” (The answer, of course, is that what the world can’t afford are the much bigger cars parked outside all those prosperous Independent readers’ houses). With a visit to India fresh in my mind, it’s completely obvious to me why all those families one sees precariously perched on motor-bikes would want a small, cheap, economical car, and not at all obvious that those of us in the West, who are used to enjoying on average 11 times (for the UK) or 23 times (for the USA) more energy per head than the Indians, have any right to complain about the extra carbon dioxide emissions that will result. It’s almost certainly true that the world couldn’t sustain a situation in which all of its 6.6 billion people used as much energy as the Americans and Europeans; the way that equation will be balanced, though, ultimately must be by the rich countries getting by with less energy rather than by poorer countries being denied the opportunity to use more. It is to be hoped that this transformation takes place in a way that uses better technology to achieve the same or better living standards for everybody from a lot less energy; the probable alternative is the economic disruption and widespread involuntary cuts in living standards that will follow from a prolonged imbalance of energy supply and demand.

A more interesting question to ask about the Tata Nano is why it was not possible to leapfrog current technology to achieve something even more economical and sustainable – using, one hesitates to suggest, actual nanotechnology. Why is the Nano made from old-fashioned steel, with an internal combustion engine in the back, rather than, say, being made from advanced lightweight composites and propelled by an electric motor and a hydrogen fuel cell? The answers are actually fairly clear: cost, the technological capacity of this (or any other) company, and the requirement for maintainability. Beyond these, there’s the problem of infrastructure. Creating an infrastructure for hydrogen as a fuel would be a huge undertaking for any country; liquid hydrocarbons are a very convenient store of energy, and, old though it is, the internal combustion engine is a pretty effective and robust device for converting energy. Of course, we can hope that new technologies will lead to new versions of the Tata Nano and similar cars of far greater efficiency, though realism demands that we recognise that new technology must fit into existing techno-social systems if it is to be viable.

Nanotechnology in Korea

One of my engagements in a slightly frantic period last week was to go to a UK–Korea meeting on collaboration in nanotechnology. This included some talks which gave a valuable insight into how the future of nanotechnology is seen in Korea. It’s clearly seen as central to the country’s programme of science and technology; according to some slightly out-of-date figures I have to hand about government spending on nanotechnology, Korea ranks 5th, after the USA, Japan, Germany and France, and somewhat ahead of the UK. Dr Hanjo Lim, of the Korea Science and Engineering Foundation, gave a particularly useful overview.

He starts out by identifying the different ways in which going small helps. Nanotechnology exploits a confluence of three types of benefit. Nanomaterials exploit surface matter: advantages that arise from their high surface-to-volume ratio, most obviously in catalysis. They exploit quantum matter: size-dependent quantum effects that are so important for band-gap engineering and for making quantum dots. And they can exploit soft matter, which is so important for the bio-nano interface. As far as Korea is concerned, as a small country with a well-developed industrial base, he sees four important areas. Applications in information and communication technology will directly impact the strong position Korea has in the semiconductor and display industries, as well as having an impact on automobiles. Robots and ubiquitous devices play to Korea’s general comparative advantage in manufacturing, but applications in nano-foods and medical science are relatively weak in Korea at the moment. Finally, the environmentally important applications in fuel and solar cells and in air and water treatment will be of growing importance in Korea, as everywhere else.

Korea ranks 4th or 5th in the world in terms of nano-patents; the plan is, up to 2010, to expand existing strength in nanotechnology and industrialise this by developing technology specific to applications. Beyond that the emphasis will be on systems-level integration and commercialisation of those developments. Clearly, in electronics we are already in the nano-era. Korea has a dominant position in flash memory, where Hwang’s law – that memory density doubles every year – represents a more aggressive scaling than Moore’s law. To maintain this will require perhaps carbon nanotubes or silicon nanowires. Lim finds nanotubes very attractive, but given the need for control of chirality and position his prediction is that commercialisation is still more than 10 years away. An area that he thinks will grow in importance is the integration of optical interconnects in electronics. This, in his view, will be driven by the speed and heat issues in CPUs that arise from metal interconnects – he reminds us that a typical CPU has 10 km of electrical wire, so it’s no wonder that heat generation is a big problem, and Google’s data centres come equipped with five-storey cooling towers. Nanophotonics will enable the integration of photonic components within silicon multi-chip CPUs – but the problem that silicon is not good for lasers will have to be overcome: either lasers off the chip will have to be used, or silicon laser diodes developed. His prognosis, recognising that we have box-to-box optical interconnects now and that board-to-board interconnects are coming, is that we will have chip-to-chip interconnects on the 1–10 cm scale by 2010, with intra-chip interconnects by 2010–2015.
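
To get a feel for how much more aggressive Hwang’s law is than Moore’s law, here’s a trivial sketch (taking Moore’s law, as is conventional, as a doubling roughly every two years):

```python
# Compare the memory density growth implied by Hwang's law (doubling every
# year) with Moore's law (taken here as doubling every two years).

def density_growth(years: float, doubling_time_years: float) -> float:
    """Relative density increase after `years`, for a given doubling time."""
    return 2 ** (years / doubling_time_years)

for years in (5, 10):
    print(f"after {years:2d} years: "
          f"Hwang x{density_growth(years, 1.0):.0f}, "
          f"Moore x{density_growth(years, 2.0):.0f}")

# after  5 years: Hwang x32, Moore x6
# after 10 years: Hwang x1024, Moore x32
```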

Anyone interested in more general questions of the way the Korean innovation system is developing will find much to interest them in a recent Demos pamphlet: Korea: Mass innovation comes of age. Meanwhile, I’ll be soon reporting on nanotechnology in another part of Asia; I’m writing this from Bangalore/Bengalooru in India, where I will be talking tomorrow at Bangalore Nano 2007.

Giant magnetoresistance – from the iPod to the Nobel Prize

This year’s Nobel Prize for Physics, it was announced today, has been awarded to Albert Fert, from Orsay, Paris, and Peter Grünberg, from the Jülich research centre in Germany, for their discovery of giant magnetoresistance, an effect whereby a structure of layers of alternating magnetic and non-magnetic materials, each only a few atoms thick, has an electrical resistance that is very strongly changed by the presence of a magnetic field.

The discovery was made in 1988, and at first seemed an interesting but obscure piece of solid state physics. But very soon it was realised that this effect would make it possible to make very sensitive magnetic read heads for hard disks. On a hard disk drive, information is stored as tiny patterns of magnetisation. The higher the density of information one is trying to store on a hard drive, the weaker the resulting magnetic field, and so the more sensitive the read head needs to be. The new technology was launched onto the market in 1997, and it is this technology that has made possible the ultra-high density disk drives that are used in MP3 players and digital video recorders, as well as in laptops.

The rapidity with which this discovery was commercialised is remarkable. One probably can’t rely on this happening very often, but it is a salutary reminder that discoveries can sometimes move from the laboratory to a truly industry-disrupting product very quickly indeed, if the right application can be found, and if the underlying technology (in this case the nanotechnology required for making highly uniform films only a few atoms thick) is in place.

Towards the $1000 human genome

It currently costs about a million dollars to sequence an individual human genome. One can expect incremental improvements in current technology to drop this price to around $100,000, but the need that current methods have to amplify the DNA will make it difficult for the price to drop further. So, to meet the widely publicised target of a $1000 genome, a fundamentally different technology is needed. One very promising approach uses the idea of threading a single DNA molecule through a nanopore in a membrane, and identifying each base by changes in the ion current flowing through the pore. I wrote about this a couple of years ago, and a talk I heard yesterday from one of the leaders in the field prompts me to give an update.

The original idea for this came from David Deamer and Dan Branton, who filed a patent for the general scheme in 1998. Hagan Bayley, from Oxford, whose talk I heard yesterday, has been collaborating with Reza Ghadiri from Scripps to implement this scheme using a naturally occurring pore-forming protein, alpha-hemolysin, as the reader.

The key issues are the need to get resolution at a single base level, and the correct identification of the bases. They get extra selectivity by a combination of modification of the pore by genetic engineering, and insertion into the pore of small ring molecules – cyclodextrins. At the moment speed of reading is a problem – when the molecules are pulled through by an electric field they tend to go a little too fast. But, in an alternative scheme in which bases are chopped off the chain one by one and dropped into the pore sequentially, they are able to identify individual bases reliably.

Given that the human genome has about 6 billion bases (counting both copies of each chromosome), they estimate that at 1 millisecond reading time per base they’ll need to use 1000 pores in parallel to sequence a genome in under a day (taking into account the need for a certain amount of redundancy for error correction). To prepare the way for commercialisation of this technology, they have a start-up company – Oxford NanoLabs – which is working on making a miniaturised and rugged device, about the size of a palm-top computer, to do this kind of analysis.
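
The arithmetic behind that estimate is easy to sketch. The coverage factor below is an illustrative assumption of mine rather than a figure from the talk:

```python
# Rough throughput arithmetic for nanopore sequencing, as sketched above.
# The coverage (redundancy) factor is an assumed illustrative value.

GENOME_BASES = 6e9   # both copies of each chromosome
READ_TIME_S = 1e-3   # 1 millisecond per base
COVERAGE = 15        # assumed redundancy for error correction
N_PORES = 1000       # pores reading in parallel

time_seconds = GENOME_BASES * COVERAGE * READ_TIME_S / N_PORES
print(f"~{time_seconds / 3600:.0f} hours")  # ~25 hours: about a day, as claimed
```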

[Figure: stochastic sensing – a schematic of a DNA reader using the pore-forming protein alpha-hemolysin. As the molecule is pulled through the pore, the ionic conduction through the pore varies, giving a readout of the sequence of bases. From the website of the Theoretical and Computational Biophysics group at the University of Illinois at Urbana-Champaign.]

Three good reasons to do nanotechnology: 2. For healthcare and medical applications

Part 1 of this series of posts dealt with applications of nanotechnology for sustainable energy. Here I go on to describe why so many people are excited about the possibilities for applying nanotechnology in medicine and healthcare.

It should be no surprise that medical applications of nanotechnology are very prominent in many people’s research agenda. Despite near universal agreement about the desirability of more medical research, though, there are some tensions in the different visions people have of future nanomedicine. To the general public the driving force is often the very personal experience most people have of illness, in themselves or in people close to them, and there’s a lot of public support for more work aimed at the well-known killers of the western world, such as cardiovascular disease, cancer, and degenerative diseases like Alzheimer’s and Parkinson’s. Economic factors, though, are important for those responsible for supplying healthcare, whether that’s the government or a private sector insurer. Maybe it’s a slight exaggeration to say that the policy makers’ ideal would be for people to live in perfect health until they were 85 and then tidily drop dead, but it’s certainly true that the prospect of an ageing population demanding more and more expensive nursing care is one that is exercising policy-makers in a number of prosperous countries. In the developing world, there are many essentially political and economic issues which stand in the way of people being able to enjoy the levels of health we take for granted in Europe and the USA, and matters like the universal provision of clean water are very important. Important though the politics of public health is, the diseases that blight the developing world, such as AIDS, tuberculosis and malaria, still present major scientific challenges. Finally, back in the richest countries of the world, there’s a climate of higher expectations of medicine, in which people look to medicine to do more than fix obvious physical ailments, and to move into the realm of human enhancement and the prolonging of life beyond what might formerly have been regarded as a “natural” lifespan.

So how can nanotechnology help? There are three broad areas.

1. Therapeutic applications of nanotechnology. An important area of focus for medical applications of nanotechnology has been drug delivery. This begins from the observation that when a patient takes a conventionally delivered drug, an overwhelmingly large proportion of the administered drug molecules don’t end up acting on the biological systems that they are designed to affect. This is a serious problem if the drug has side effects; the larger the dose that has to be administered to be sure that some of the molecule actually gets to the place where it is needed, the worse these side effects will be. This is particularly obvious, and harrowing, for the intrinsically toxic molecules used as drugs in cancer chemotherapy. Another important driving force for improving delivery mechanisms is the fact that, rather than the simple and relatively robust small molecules that have been the main active ingredients in drugs to date, we are turning increasingly to biological molecules like proteins (such as monoclonal antibodies) and nucleic acids (for example, DNA for gene therapy and small interfering RNAs). These allow very specific interventions into biological processes, but the molecules are delicate, and are easily recognised and destroyed in the body. To deliver a drug, current approaches include attaching it to a large water-soluble polymer molecule which is essentially invisible to the body, or wrapping it up in a self-assembled nanoscale bag – a liposome – formed from soap-like molecules such as phospholipids or block copolymers. Attaching the drug to a dendrimer – a nanoscale treelike structure which may have a cavity in its centre – is conceptually midway between these two approaches. The current examples of drug delivery devices that have made it into clinical use are fairly crude, but future generations of drug delivery vehicles can be expected to include “stealth” coatings to make them less visible to the body, mechanisms for targeting them to their destination tissue or organ, and mechanisms for releasing their payload when they get there. They may also incorporate systems for reporting their progress back to the outside world, even if this is only the passive device of containing some agent that shows up strongly in a medical scanner.

Another area of therapeutics in which nanotechnology can make an impact is tissue engineering and regenerative medicine. Here it’s not so much a question of making artificial substitutes for tissues or organs; ideally, it’s a question of providing the environment in which a patient’s own cells will develop in such a way as to generate new tissue – that is, of persuading those cells to differentiate to take up the specialised form of a particular organ. Our cells are social organisms, which respond to chemical and physical signals as they develop and differentiate to produce tissues and organs, and the role of nanotechnology here is to provide an environment (or scaffold) which gives the cells the right physical and chemical signals. Once again, self-assembly is one way forward here, providing soft gels which can be tagged with the right chemical signals to persuade the cells to do the right thing.

2. Diagnostics. Many disease states manifest themselves through the presence of specific molecules, so the ability to detect and identify these molecules quickly and reliably, even when they are present at very low concentrations, would be very helpful for the rapid diagnosis of many different conditions. The relevance of nanotechnology is that many of the most sensitive ways of detecting molecules rely on interactions between the molecule and a specially prepared surface; the much greater importance of the surface relative to the bulk in nanostructured materials makes it possible to make sensors of great sensitivity. Sensors for the levels of relatively simple chemicals, such as glucose or thyroxine, could be integrated with devices that release the chemicals needed to rectify any imbalances (these integrated devices go by the dreadful neologism of “theranostics”); recognising pathogens by recognising stretches of DNA would give a powerful way of identifying infectious diseases without the need for time-consuming and expensive culturing steps. One obvious and much-pursued goal would be to find a way of reading, at the single molecule level, a whole DNA sequence, making it possible to obtain an individual’s whole genome cheaply.

3. Innovation and biomedical research. A contrarian point of view, which I’ve heard frequently and forcibly expressed by a senior figure from the UK’s pharmaceutical industry, is that the emphasis in nanomedicine on drug delivery is misguided, because fundamentally what it represents is an attempt to rescue bad drug candidates. In this view the place to apply nanotechnology is the drug discovery process itself. It’s a cause for concern for the industry that it seems to be getting harder and more expensive to find new drug candidates, and the hopes that were pinned a few years ago on the use of large scale combinatorial methods don’t seem to be working out. In this view, there should be a move away from these brute force approaches to more rational methods, but this time informed by the very detailed insights into cell biology offered by the single molecule methods of bionanotechnology.