Nanotechnology and the food industry

The use of nanotechnology in the food industry seems to be creeping up the media agenda at the moment. The Times on Saturday published an extended article by Vivienne Parry in its “Body and Soul” supplement, called Food fight on a tiny scale. As the title indicates, the piece is framed around the idea that we are about to see a rerun of the battles about genetic modification of food in the new context of nano-engineered foodstuffs. Another article appeared in the New York Times a few weeks ago: Risks of engineering a better ice cream.

Actually, apart from the rather overdone references to a potential consumer backlash, both articles are fairly well-informed. The body of Vivienne Parry’s piece, in particular, makes it clear why nanotechnology in food presents a confusingly indistinct and diffuse target. Applications in packaging, for example in improving the resistance of plastic bottles to gas permeation, are already with us and are relatively uncontroversial. Longer-range visions of “smart packaging” also offer potential consumer benefits, but may have downsides yet to be fully explored. More controversial, potentially, is the question of the addition of nanoscaled ingredients to food itself.

But this issue is very problematic, simply because so much of food is made up of components which are naturally nanoscaled, and much of traditional cooking and food processing consists of manipulating this nanoscale structure. To give just one example, the traditional process of making whey cheeses like ricotta consists of persuading whey proteins like beta-lactoglobulin to form nanoparticles each containing a small number of molecules, and then getting those nanoparticles to aggregate in an open gel structure, giving the cheese its characteristic mechanical properties. The first example in the NY Times article – controlling the fat particle size in ice cream to get richer-feeling low-fat ice cream – is best understood as simply an incremental development of conventional food science, which uses the instrumentation and methodology of nanoscience to better understand and control food nanostructure.

There is, perhaps, more apparent ground for concern with food additives that are prepared in a nanoscaled form and directly added to foods. The kinds of molecules we are talking about here are molecules which add colour, flavour and aroma, and increasingly molecules which seem to confer some kind of health benefit. One example of this kind of thing is the substance lycopene, which is available from the chemical firm BASF as a dispersion of particles which are a few hundred nanometers in size. Lycopene is the naturally occurring dye molecule that makes tomatoes red, for which there is increasing evidence of health benefits (hence the unlikely sounding claim that tomato ketchup is good for you). Like many other food component molecules, it is not soluble in water, but it is soluble in fat (as anyone who has cooked an olive oil or butter based tomato sauce will know). Hence, if one wants to add it to a water based product, like a drink, one needs to disperse it very finely for it to be available to be digested.

One can expect, then, more products of this kind, in which a nanoscaled preparation is used to deliver a water or oil soluble ingredient, often of natural origin, which on being swallowed will be processed by the digestive system in the normal way. What about the engineered nanoparticles, that are soluble in neither oil nor water, that have raised toxicity concerns in other contexts? These are typically inorganic materials, like carbon in its fullerene forms, or titanium dioxide, as used in sunscreen, or silica. Some of these inorganic materials are used in the form of micron scale particles as food additives. It is conceivable (though I don’t know of any examples) that nanoscaled versions might be used in food, and that these might fall within a regulatory gap in the current legal framework. I talked about the regulatory implications of this, in the UK, a few months ago in the context of a consultation document issued by the UK’s Food Standards Agency. The most recent research report from the UK government’s Nanotechnology Research Coordination Group reveals that the FSA has commissioned a couple of pieces of research about this, but the FSA informs me that it’s too early to say much about what these projects have found.

I’m guessing that the media interest in this area has arisen largely from some promotional activity from the nanobusiness end of things. The consultancy Cientifica recently released a report, Nanotechnologies in the food industry, and there’s a conference in Amsterdam this week on Nano and Microtechnologies in the Food and Healthfood Industries.

I’m on my way to London right now, to take part in a press briefing on Nanotechnology in Food at the Science Media Centre. My family seems to be interacting a lot with the press at the moment, but I don’t suppose I’ll do as well as my wife, whose activities last week provoked this classic local newspaper headline in the Derbyshire Times: School Axe Threat Fury. And people complain about scientific writing being too fond of stacked nouns.

Nanotechnology in the UK – judging the government’s performance

The Royal Society report on nanotechnology – Nanoscience and nanotechnologies: opportunities and uncertainties – was published in 2004, and the government responded to its recommendations early in 2005. At that time, many people were disappointed by the government response (see my commentary here); now the time has come to judge whether the government is meeting its commitments. The body that is going to make that judgement is the Council for Science and Technology. This is the government’s highest level advisory committee, reporting directly to the Prime Minister. The CST Nanotechnology Review is now underway, with a public call for evidence now open. Yesterday I attended a seminar in London organised by the working party.

I’ve written already of my disappointment with the government response so far, for example here, so you might think that I’d be confident that this review would be rather critical of the government. However, close reading of the call for evidence reveals a fine piece of “Yes Minister” style legerdemain; the review will judge, not whether the government’s response to the Royal Society report was itself adequate, but solely whether the government had met the commitments it made in that response.

One of the main purposes of yesterday’s seminar was to see if there had been any major new developments in nanotechnology since the publication of the Royal Society report. Some people expressed surprise at how rapid the introduction of nanotechnology into consumer products had been, though as ever it is difficult to judge how many of these applications can truly be described as nanotechnology, and equally how many other applications are in the market which do involve nanotechnology, but which don’t advertise the fact. However, one area in which there has been a demonstrable and striking proliferation is in nanotechnology road-maps, of which there are now, apparently, a total of seventy-six.

Printing devices

I spent a couple of days earlier this week at a conference in Manchester called “Printing of Functional Materials”. The premise of this meeting was the growing realisation that printing technologies, both the traditional, like silk-screen and gravure, and modern, like ink-jet, offer scalable, cheap and flexible ways of precisely depositing small quantities of materials on surfaces. Traditional inks are just vehicles for pigments to create static images, but there’s no reason why you can’t use printing to deposit materials that are conductors or semiconductors of electricity, which are electro-luminescent, or which have some biological functionality. Indeed, as one of the organisers of the conference has shown, one can even use ink-jet printing to deposit living human cells, with potential applications in tissue engineering.

The degree of commercial interest in these technologies was indicated by the fact that, unusually for an academic conference, more than a third of the attendees were from the commercial sector. Many of these were from the cluster of small and medium companies developing ink-jet technologies from around Cambridge, but European and American concerns were well represented too. My impression is that the sector closest to maturity in this area is in electrically functional devices, where there’s a great deal of momentum to drive down the cost of RFID and to develop cheap, flexible displays. But there are still many materials issues to solve. It’s not easy to get a complex fluid to flow in the right way to form the tiny, well-defined droplets needed for good ink-jet performance, and formulating the ink in a way that makes it dry to give the best properties is awkward too. Silver inks illustrate the problems – commercial inks to write conducting lines sometimes use silver nanoparticles. Making the silver particles very small is helpful in making them coalesce well to make a continuous silver layer; the melting point of materials is lowered when they are in nanoparticulate form, making them sinter at lower temperatures. But then you have to work hard to stop the particles aggregating in the ink (it’s particularly undesirable, of course, if they aggregate in the ink-jet nozzle and block it up). To stabilise them, you need to coat them with surfactants or polymer molecules. But then this organic coating needs to be driven off by a heating step to get good conduction, and this compromises your ability to print on paper and plastics, which can’t take much heating. It seems to me that this technology has a huge amount of promise, but there’s a lot of materials science and colloid science to be done before it can fulfill its potential.
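The size-dependent sintering effect described above can be sketched with the Gibbs–Thomson relation for melting-point depression. The silver material parameters below are rough literature values, chosen only to illustrate the trend, not authoritative data:

```python
# Sketch of melting-point depression for silver nanoparticles via the
# Gibbs-Thomson relation: T_m(d) = T_bulk * (1 - 4*sigma / (H_f * rho * d)).
# All material parameters are approximate values for silver, used here
# purely to illustrate why smaller particles sinter at lower temperatures.

T_BULK = 1235.0   # bulk melting point of silver, K
SIGMA = 0.14      # solid-liquid interfacial energy, J/m^2 (approximate)
H_F = 1.05e5      # latent heat of fusion, J/kg (approximate)
RHO = 10490.0     # density of solid silver, kg/m^3

def melting_point(d_nm: float) -> float:
    """Estimated melting point (K) of a silver particle of diameter d_nm."""
    d = d_nm * 1e-9  # convert nm to metres
    return T_BULK * (1.0 - 4.0 * SIGMA / (H_F * RHO * d))

for d_nm in (100.0, 20.0, 5.0):
    print(f"{d_nm:6.1f} nm particle -> estimated melting point {melting_point(d_nm):7.1f} K")
```

With these placeholder numbers the depression is negligible at 100 nm but exceeds 100 K for 5 nm particles, which is the effect the ink formulators are exploiting.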

Ken Donaldson on nanoparticle toxicology

I’ve been running in and out of a three day course on nanotechnology intended for chemists working in the chemical industry (Nanotechnology for Chemists), organised by me and my colleagues at Sheffield on behalf of the Royal Society of Chemistry. Yesterday I swapped from being a lecturer to being a pupil, to hear a lecture about nanoparticle toxicity, given by Ken Donaldson of the University of Edinburgh, the UK’s leading toxicologist specialising in the effects of environmental nanoparticles. This is a brief summary of his lecture as I understood it (all misunderstandings and misapprehensions are my fault, of course).

His lecture began with the disclaimer that most nanotechnology won’t pose a health risk at all; what’s at issue is the single class of free (i.e. not incorporated in a matrix, as happens in a nanocomposite material), manufactured, insoluble nanoparticles. Of the potential portals of entry – the lungs, the gut and the skin – he felt that the main danger was the lungs, so the main potential danger, both for industrial workers and users, was nanoparticles in the air.

It’s been known for a long time that particles cause lung disease; he gave a number of examples (illustrated by gruesome photographs), including coal miner’s lung, cancer and silicosis from quartz particles, and asbestos. The latter causes a number of diseases, including mesothelioma, a particularly nasty cancer seen only in people exposed to asbestos, characterised by a long latency period and a uniformly fatal outcome. So it’s clear that particles do accumulate in the lungs.

In terms of what we know about the effect of nanoparticle exposures, there are four distinct domains. What we know most about are the nanoparticles derived from combustion. We also know a fair amount about bulk manufactured particles, like titanium dioxide, which have been around a long time and which typically contain significant fractions of nanosized particles. Of course, the effects of nanoparticles used in medical contexts have been well studied. The final area is the least studied – the effect of engineered free nanoparticles.

So what can we learn from environmental nanoparticles? The origin of these particles is overwhelmingly from combustion; in the UK only 13% of exposure comes from non-combustion sources, usually the processes of natural atmospheric chemistry. The most important class of nanoparticles by far are those deriving from traffic exhaust, which account for 60% of exposure. These particles have a basic size of tens of nanometers, though they clump with time into micron sized aggregates, which are very easily respirable.

These particles have no problem getting deep within the lungs. Of the 40 nm particles, perhaps 30% can get to the very delicate tissues in the periphery of the lung, where they deposit very efficiently (smaller particles actually are less effective at getting to the lung as they tend to be taken up in the nose). The structures they interact with deep in the lung – the bronchial epithelial cells – are very small and fragile, and the distances separating airways from the blood are very small. Here the particles cause inflammation, which is essentially a defence reaction. We’re familiar with inflammation of the skin, marked by swelling – fluid bathes the region and white blood cells engulf damaged tissue and microbes, leading to pain, heat, redness and loss of function. Of course in the lung one can’t see the inflammation, and there are no pain receptors, so inflammation can be less obvious, though the swelling can easily cut off air flow, leading to very disabling and life-threatening conditions.

It’s believed that there is a generic mechanism for lung inflammation by combustion-derived nanoparticles, despite the wide variety of different kinds of chemistry in these particles. All these have in common the production of free radicals, which leads to oxidative stress, which in turn leads to inflammation. Different types of nanoparticles cause oxidative stress through different mechanisms. Metal nanoparticles – as found in welding fumes – yield one mechanism, surface-borne organics (as found in soot) have another, and materials like carbon black, titanium dioxide and polystyrene latex, which are not very intrinsically toxic, operate through some generic surface mechanism. Clearly it is the surface area that is important, so nanoparticles cause more inflammation than the same mass of fine respirable particles, in the 2-3 micron range, composed of the same materials. In passing one can note that diesel fumes are particularly harmful, dealing a triple blow through their combination of surfaces, metals and organics. These pathways to oxidative stress are very well understood now, so this is a well-founded paradigm.
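The surface-area argument above can be made concrete with a little arithmetic: for spherical particles of a fixed total mass, total surface area scales as 1/diameter. A quick sketch, using illustrative numbers (a milligram of material, density roughly that of titanium dioxide):

```python
import math

def total_surface_area(mass_kg: float, density: float, diameter_m: float) -> float:
    """Total surface area (m^2) when a given mass is divided into equal spheres."""
    particle_volume = math.pi * diameter_m**3 / 6.0   # volume of one sphere
    n_particles = mass_kg / (density * particle_volume)
    return n_particles * math.pi * diameter_m**2      # area of one sphere times count

# Illustrative comparison: 1 mg of material (density 4230 kg/m^3, roughly
# TiO2) as 2.5 micron "fine respirable" particles vs 20 nm nanoparticles.
fine = total_surface_area(1e-6, 4230.0, 2.5e-6)
nano = total_surface_area(1e-6, 4230.0, 20e-9)
print(f"fine: {fine:.2e} m^2, nano: {nano:.2e} m^2, ratio: {nano / fine:.0f}x")
```

The same mass presents 125 times more surface as 20 nm particles than as 2.5 micron particles, which is why dose expressed as mass understates the hazard of the nanoscale form.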

Inflammation due to the oxidative stress caused by nanoparticles from pollution then leads to a number of different diseases, including cardiovascular disease, asthma, scarring, cancer and chronic obstructive pulmonary disease. Their involvement in cardiovascular disease is perhaps unexpected, and to understand it we need to understand where the nanoparticles go. We have some rather hypothetical toxicokinetics based on a few experiments using radioactive, insoluble tracer particles. Having entered the nose or lung, a few studies suggest that they can go directly to the brain. The route from the lung to the blood is well understood, and once in the blood there are many possible ultimate destinations. It’s doubtful that nanoparticles could enter the blood directly from the gut or skin. A mechanism for the involvement of nanoparticles in cardiovascular disease is suggested by studies in which healthy Swedish student volunteers rode a bike in an atmosphere of diesel fumes (at levels comparable to highly polluted city streets). This leads to measurable vascular dysfunction throughout the whole body, and a reduction in the ability to dissolve blood clots (similar effects will be observed in smokers, who self-administer nanoparticles). This suggests that pollution nanoparticles could cause cardiovascular disease either through lung inflammation or through the direct effect of blood-borne particles, leading to the worsening of coronary artery disease or increased blood clotting.

A study using radioactive carbon has suggested that nanoparticles can enter the brain directly from the nose, via the olfactory bulb – this is the route into the central nervous system used by the polio virus, and it doesn’t require crossing the blood-brain barrier. Studies of brain tissue in people living in highly polluted cities like Mexico City have shown pathological changes similar to those seen in victims of Parkinson’s and Alzheimer’s occurring as a result of the effect of pollution-derived nanoparticles.

The potential comparison between carbon nanotubes and asbestos is worth considering. Very large exposures to asbestos in the past have caused many cases of fatal lung disease. The characteristics of asbestos which cause this disease – and these characteristics are physical, not chemical – are that the fibres are thin, persistent in the body, and long. Carbon nanotubes certainly match the first two requirements, but it is not obvious that they fulfill the third. Asbestos fibres need to be 20 microns long to demonstrate toxic effects; if they are milled to shorter lengths the toxicity goes away. Carbon nanotubes of this length tend to curl up and clump. On the other hand rat experiments on the effect of nanotubes on the lungs show distinctive fibrosing lesions. Donaldson has just written an extensive review article about nanotube toxicity which will be published soon.

From the regulatory point of view there are some difficulties, as regulations usually specify exposure limits in terms of mass concentration, while clearly it is surface area that is important. In the USA, NIOSH is thinking of reducing limits by a factor of 5 for ultrafine TiO2. Fibres, though, are regulated by number density. The difficulties for carbon nanotubes are that they are probably too small to see by standard microscopy, and they curl up, so although they should be classified as fibres by WHO definitions they probably aren’t going to be detected. In terms of workplace protection local exhaust ventilation is much the best option, with almost all masks being fairly useless. This applies, for example, to the masks used by some cyclists in polluted cities. They can, however, take comfort from the fact that their exposure to nanoparticles is significantly smaller than the exposure of the people inside the vehicles who are causing the pollution.

My conclusion, then, is that if you are worried about inhaling free nanoparticles (and you should be), you should stop travelling by car.

Nanoparticle toxicity: The Royal Society bites back

Last week saw a little bit more bad publicity for the nascent nano industry, in the shape of a news report from the BBC highlighting a call from the Royal Society for industry to disclose the data from its safety testing of free nanoparticles in consumer products. The origin of the report was a press release from the Royal Society, quoting Ann Dowling, the chair of the Royal Society/Royal Academy of Engineering study of nanotechnology.

The pretext for the Royal Society press release was the recent publication of an inventory of consumer products using nanotechnology by the Woodrow Wilson Centre Project on Emerging Nanotechnologies. But this call for disclosure was already one of the recommendations in the Royal Society’s report, and it’s not hard to sense the growing frustration within the Royal Society that, two years on from the publication of that report, we’re not much further forward in implementing many of its recommendations.

On my nanotechnology bookshelf

Following my recent rather negative review of a recent book on nanotechnology, a commenter asked me for some more positive recommendations about books on nanotechnology that are worth reading. So here’s a list of nanotechnology books old and new with brief comments. The only criterion for inclusion on this list is that I have a copy of the book in question; I know that there are a few obvious gaps. I’ll list them in the order in which they were published:

Engines of Creation, by K. Eric Drexler (1986). The original book which launched the idea of nanotechnology into popular consciousness, and still very much worth reading. Given the controversy that Drexler has attracted in recent years, it’s easy to forget that he’s a great writer, with a very fertile imagination. What Drexler brought to the idea of nanotechnology, which then was dominated, on the one hand by precision mechanical engineering (this is the world that the word nanotechnology, coined by Taniguchi, originally came from), and on the other by the microelectronics industry, was an appreciation of the importance of cell biology as an exemplar of nanoscale machines and devices and of ultra-precise nanoscale chemical operations.

Nanosystems: Molecular Machinery, Manufacturing, and Computation , by K. Eric Drexler (1992). This is Drexler’s technical book, outlining his particular vision of nanotechnology – “the principles of mechanical engineering applied to chemistry” – in detail. Very much in the category of books that are often cited, but seldom read – I have, though, read it, in some detail. The proponents of the Drexler vision are in the habit of dismissing any objection with the words “it’s all been worked out in ‘Nanosystems'”. This is often not actually true; despite the deliberately dry and textbook-like tone, and the many quite complex calculations (which are largely based on science that was certainly sound at the time of writing, though there are a few heroic assumptions that need to be made), many of the central designs are left as outlines, with much detail left to be filled in. My ultimate conclusion is that this approach to nanotechnology will turn out to have been a blind alley, though in the process of thinking through the advantages and disadvantages of the mechanical approach we will have learned a lot about how radical nanotechnology will need to be done.

Molecular Devices and Machines : A Journey into the Nanoworld , by Vincenzo Balzani, Alberto Credi and Margherita Venturi (2003). The most recent addition to my bookshelf, I’ve not finished reading it yet, but it’s good so far. This is a technical (and expensive) book, giving an overview of the approach to radical nanotechnology through supramolecular chemistry. This is perhaps the part of academic nanoscience that is closest to the Drexler vision, in that the explicit goal is to make molecular scale machines and devices, though the methods and philosophy are rather different from the mechanical approach. A must, if you’re fascinated by cis-trans isomerisation in azobenzene and intermolecular motions in rotaxanes (and if you’re not, you probably should be).

Bionanotechnology : Lessons from Nature, by David Goodsell (2004). I’m a great admirer of the work of David Goodsell as a writer and illustrator of modern cell biology, and this is a really good overview of the biology that provides both inspiration and raw materials for nanobiotechnology.

Soft Machines : Nanotechnology and Life, by Richard Jones (2004). Obviously I can’t comment on this, apart from to say that three years on I wouldn’t have written it substantially differently.

Nanotechnology and Homeland Security: New Weapons for New Wars , by Daniel and Mark Ratner (2004). I still resent the money I spent on this cynically titled and empty book.

Nanoscale Science and Technology, eds Rob Kelsall, Ian Hamley and Mark Geoghegan (2005). A textbook at the advanced undergraduate/postgraduate level, giving a very broad overview of modern nanoscience. I’m not really an objective commentator, as I co-wrote two of the chapters (on bionanotechnology and macromolecules at interfaces), but I like the way this book combines the hard (semiconductor nanotechnology and nanomagnetism) and the soft (self-assembly and bionano).

Nanofuture: What’s Next For Nanotechnology , by J. Storrs Hall (2005). Best thought of as an update of Engines of Creation, this is an attractive and well-written presentation of the Drexler vision of nanotechnology. I entirely disagree with the premise, of course.

Nano-Hype: The Truth Behind the Nanotechnology Buzz, by David Berube (2006). A book, not about the science, but about nanotechnology as a social and political phenomenon. I reviewed it in detail here. I’ve been referring to it quite a lot recently, and am increasingly appreciating the dry humour hidden within its rather complete historical chronicle.

The Dance of Molecules : How Nanotechnology is Changing Our Lives , by Ted Sargent (2006). Reviewed by me here, it’s probably fairly clear that I didn’t like it much.

The Nanotech Pioneers : Where Are They Taking Us?, by Steve Edwards (2006). In contrast to the previous one, I did like this book, which I can recommend as a good, insightful and fairly nanohype-free introduction to the area. I’ve written a full review of this, which will appear in “Physics World” next month (and here also, copyright permitting).

A quiet policy shift for UK nanotechnology

The centrepiece of the UK’s publically funded nanotechnology effort has been the Department of Trade and Industry’s Micro and Nanotechnology manufacturing initiative (MNT). This had a high profile launch in July 2003 in a speech by the science minister, Lord Sainsbury, with an initial commitment of £90 million. When, last year, the Secretary of State for Trade and Industry announced an increase of DTI nanotechnology funding to £200 million, the future of the MNT program seemed assured. But a close reading of recent announcements from the DTI makes it clear that whatever extra funding they may be putting into nanotechnology, it’s not going into the MNT program.

Technology and innovation policy at the DTI is now informed by a Technology Strategy Board, made up largely from figures from industry and venture capital. This board’s first annual report (PDF) was published in November 2005, and contained this recommendation:
“We also recommend incorporating nanotechnology in the competitions for underpinning technologies, such as advanced materials, to avoid confusion. It is important, however, that the DTI keeps track of expenditure on nano-projects to be able to honour its commitments to Parliament in this area.”

It’s now clear that this recommendation has been followed. The spring competition for collaborative R&D, announced here, and to be formally launched on April 26th, does not include a separate micro- and nano- theme. Instead, the call is based around what the DTI calls innovation platforms – societal challenges which many technologies can be combined to address. Undoubtedly, some of these areas will call for nano-enabled solutions. Novel Technologies For Low-Cost, High Efficiency Electronics And Lighting Systems mentions plastic electronics and light emitting diodes as potential technologies of interest, while Low Carbon Energy Technologies talks about the need for novel solar cells.

This is an interesting shift of emphasis. The MNT program had few friends in the world of academic nanoscience and technology; it always seemed happier with the micro- than the nano- , and the insistence that programs be business-oriented seemed on occasion to shade into a positive antipathy to academic nanoscience and led to the perception that the program was considerably friendlier to consultants than either technologists or scientists. On the other hand, the idea of building an applied research program around problems to be solved, rather than technological solutions looking for problems, seems one that is well worth trying.

What needs to happen for this to work? Firstly, the ongoing MNT program needs to become much more effective at connecting the best parts of the UK nanoscience base to potential users of the new technologies, and it needs to give the impression of being a little more forward-looking in the technologies it’s sponsoring. Then the Technology Program is going to have to work hard to make sure that the right scientists are engaged and nanotechnology gets an appropriate share of the resources, meeting the very specific commitment to a certain level of spending on nanotechnology made by the Minister.

It’s going to be important to get this right. As I discussed a couple of weeks ago, there’s growing evidence of an external perception of the UK nanotechnology program as being diffuse, unfocused and ineffective. Given the general strength of the UK science base, the UK should be in a much better position; there’s a real danger that this could turn into a big missed opportunity.

Taking the high road to large scale solar power

In principle there’s more than enough sunlight falling on the earth to meet all our energy needs in a sustainable way, but the prospects for large scale solar energy are dimmed by a dilemma. We have very efficient solar cells made from conventional semiconductors, but they are too expensive and difficult to manufacture in very large areas to make a big dent in our energy needs. On the other hand, there are prospects for unconventional solar cells – Graetzel cells or polymer photovoltaics – which can perhaps be made cheaply in large areas, but whose efficiencies and lifetimes are too low. In an article in this month’s Nature Materials (abstract, subscription required for full article, see also this press release), Imperial College’s Keith Barnham suggests a way out of the dilemma.

The efficiencies of the best solar cells available today exceed 30%, and there is every reason to suppose that this figure can be substantially increased with more research. These solar cells are based, not on crystalline silicon, like standard solar cell modules, but on carefully nanostructured compound semiconductors like gallium arsenide (III-V semiconductors, in the jargon). By building up complex layered structures it is possible efficiently to harvest the energy of light of all wavelengths. The problem is that these solar cells are expensive to make, relying on sophisticated techniques for building up different semiconductor layers, like molecular beam epitaxy, and currently are generally only used for applications where cost doesn’t matter, such as on satellites. Barnham argues that the cost disadvantage can be overcome by combining these efficient solar cells with low-cost systems for concentrating sunlight – in his words “our answer to this particular problem is ‘Smart Windows’, which use small, transparent plastic lenses that track the sun and act as effective blinds for the direct sunlight, when combined with innovative light collectors and small 3rd-generation cells,” and he adds “Even in London a system like this would enable a typical office behind a south-facing wall to be electrically self-sufficient.”
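The economic logic of Barnham's concentrator approach can be sketched with a toy cost model: each square metre of expensive III-V cell sits behind C square metres of cheap collecting optics, so the cell cost per unit aperture falls as 1/C. Every number below is an invented placeholder for illustration, not a figure from the article:

```python
# Toy cost model for concentrator photovoltaics. All costs are invented
# placeholders; only the 1/C scaling of cell cost is the point being made.

def cost_per_peak_watt(cell_cost_per_m2: float,
                       optics_cost_per_m2: float,
                       concentration: float,
                       efficiency: float,
                       peak_insolation: float = 1000.0) -> float:
    """System cost per peak watt, per square metre of collecting aperture."""
    cost_per_m2 = cell_cost_per_m2 / concentration + optics_cost_per_m2
    watts_per_m2 = peak_insolation * efficiency
    return cost_per_m2 / watts_per_m2

# A (hypothetically) very expensive 35%-efficient multi-junction cell behind
# cheap plastic lenses at 500x, vs an unconcentrated 15% flat-plate panel.
concentrated = cost_per_peak_watt(50000.0, 100.0, 500.0, 0.35)
flat_plate = cost_per_peak_watt(300.0, 0.0, 1.0, 0.15)
print(f"concentrator: {concentrated:.2f} per Wp, flat plate: {flat_plate:.2f} per Wp")
```

Even with a cell two orders of magnitude more expensive per square metre, the concentration ratio and the higher efficiency can bring the cost per peak watt below that of the flat plate, which is the essence of Barnham's argument.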

Even with conventional technologies, Barnham calculates that if all roofs and south-facing walls were covered in solar cells this would represent three times the total generating capacity of the UK’s current nuclear program – that is, 36 GW. This represents a really substantial dent in the energy needs of the UK, and if we believe Barnham’s calculation that his system would deliver about three times as much energy as conventional solar cells, this represents pretty much a complete solution to our energy problems. What is absent from the article, though, is an estimate of the total production capacity that’s likely to be achievable; Barnham merely observes that the UK semiconductor industry has substantial spare capacity after the telecoms downturn. This is the missing calculation that needs to be done before we can accept Barnham’s optimism.
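The headline arithmetic quoted above is easy to check. Taking the article's figure of 36 GW as three times the UK's nuclear capacity implies roughly 12 GW of nuclear, and applying Barnham's claimed threefold improvement over conventional cells:

```python
# Back-of-envelope check of the figures quoted above. The 36 GW figure and
# both factor-of-three claims come from the article; the ~12 GW of implied
# UK nuclear capacity follows directly from them.

conventional_pv_gw = 36.0                           # roofs + south walls, conventional cells
implied_nuclear_gw = conventional_pv_gw / 3.0       # "three times nuclear"
barnham_system_gw = 3.0 * conventional_pv_gw        # claimed 3x conventional cells

print(f"implied UK nuclear capacity: {implied_nuclear_gw:.0f} GW")
print(f"conventional PV on roofs/walls: {conventional_pv_gw:.0f} GW")
print(f"Barnham's concentrator system: {barnham_system_gw:.0f} GW")
```

Note these are peak (nameplate) figures; average delivered power would be considerably lower once the UK's insolation and weather are folded in, which only sharpens the need for the missing production-capacity estimate.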

Pitching to Intel

There was some mockery of Apple in nanotech circles for branding their latest MP3 player the iPod Nano, merely, it seemed, because it was impressively thin (at least compared to my own much-loved first generation model). Rationalisations that its solid state memory was made with a 65 nm process didn’t seem to cut much ice with the sceptics. Nonetheless, what feels superficially obvious, that microelectronics companies are deeply involved with nanotechnology, both in their current products, and in their plans for the future, really is true.

This was made clear to me yesterday; I was in Newcastle, at a small meeting put together by the regional technology transfer organisation CENAMPS, in which nano academics from some northern UK Universities were pitching their intellectual wares to a delegation from Intel. Discussion ranged from near term materials science to the further reaches of quantum computing and new neuroscience-inspired, adaptive and multiply connected paradigms for computing without software.

The research needs of Intel, and other microelectronics companies, are made pretty clear by the International Semiconductor Technology Roadmap. In the near-term, what seem on the surface to be merely incremental improvements in reducing critical dimensions need to be underwritten by simultaneous improvements in all kinds of unglamorous but vital materials, like dielectrics, resists, and glues. Even to achieve their current performance, these materials are already pretty sophisticated, and to deliver ever-more demanding requirements for properties like dielectric constant and thermal expansivity will rely even more on the nanoscale control of structure of these materials. Much of this activity takes place under the radar of casual observers, because it consists of business-to-business transactions in unglamorous sounding sectors like chemicals and adhesives, but the volumes, values (and margins) are pretty substantial. Meanwhile, as their products shrink, these companies are huge and demanding consumers of nanometrology products.

In the medium term, to keep Moore’s law on track is going to demand that CMOS gets a radical makeover. Carbon nanotube transistors are a serious possibility – they’re now in the road-map – but the obstacles to integrating them in large-scale systems are formidable, and we’re only talking about a window of ten years or so to do this. And then, beyond 2020, we need to go beyond CMOS to something quite revolutionary, like molecular electronics or quantum computing. This is a daunting prospect, given that these technologies barely exist in the lab.

And what will be the societal and economic forces driving the development of nano-electronics twenty years out? Now, it’s the need to sell every teenager an MP3 player and a digital camera. Tomorrow, it’s going to be the end of broadcast television, and putting video-on-demand systems into every family home. By 2025, it’s most likely going to be the need to keep the ageing baby boomers out of old people’s homes and hospitals, and able to live independently. Robots equipped with something much closer to real intelligence, ubiquitous sensing and continuous medical monitoring look like good bets to me.