Nanotechnology and the food industry

The use of nanotechnology in the food industry seems to be creeping up the media agenda at the moment. The Times on Saturday published an extended article by Vivienne Parry in its “Body and Soul” supplement, called Food fight on a tiny scale. As the title indicates, the piece is framed around the idea that we are about to see a rerun of the battles about genetic modification of food in the new context of nano-engineered foodstuffs. Another article appeared in the New York Times a few weeks ago: Risks of engineering a better ice cream.

Actually, apart from the rather overdone references to a potential consumer backlash, both articles are fairly well-informed. The body of Vivienne Parry’s piece, in particular, makes it clear why nanotechnology in food presents a confusingly indistinct and diffuse target. Applications in packaging, for example in improving the resistance of plastic bottles to gas permeation, are already with us and are relatively uncontroversial. Longer-range visions of “smart packaging” also offer potential consumer benefits, but may have downsides yet to be fully explored. More controversial, potentially, is the question of the addition of nanoscaled ingredients to food itself.

But this issue is very problematic, simply because so much of food is made up of components which are naturally nanoscaled, and much of traditional cooking and food processing consists of manipulating this nanoscale structure. To give just one example, the traditional process of making whey cheeses like ricotta consists of persuading whey proteins like beta-lactoglobulin to form nanoparticles each containing a small number of molecules, and then getting those nanoparticles to aggregate in an open, gel structure, giving the cheese its characteristic mechanical properties. The first example in the NY Times article – controlling the fat particle size in ice cream to get richer feeling low fat ice cream – is best understood as simply an incremental development of conventional food science, which uses the instrumentation and methodology of nanoscience to better understand and control food nanostructure.

There is, perhaps, more apparent ground for concern with food additives that are prepared in a nanoscaled form and directly added to foods. The kinds of molecules we are talking about here are molecules which add colour, flavour and aroma, and increasingly molecules which seem to confer some kind of health benefit. One example of this kind of thing is the substance lycopene, which is available from the chemical firm BASF as a dispersion of particles which are a few hundred nanometers in size. Lycopene is the naturally occurring dye molecule that makes tomatoes red, for which there is increasing evidence of health benefits (hence the unlikely sounding claim that tomato ketchup is good for you). Like many other food component molecules, it is not soluble in water, but it is soluble in fat (as anyone who has cooked an olive oil or butter based tomato sauce will know). Hence, if one wants to add it to a water based product, like a drink, one needs to disperse it very finely for it to be available to be digested.

One can expect, then, more products of this kind, in which a nanoscaled preparation is used to deliver a water- or oil-soluble ingredient, often of natural origin, which on being swallowed will be processed by the digestive system in the normal way. What about engineered nanoparticles that are soluble in neither oil nor water, and that have raised toxicity concerns in other contexts? These are typically inorganic materials, like carbon in its fullerene forms, or titanium dioxide, as used in sunscreen, or silica. Some of these inorganic materials are used in the form of micron-scale particles as food additives. It is conceivable (though I don’t know of any examples) that nanoscaled versions might be used in food, and that these might fall within a regulatory gap in the current legal framework. I talked about the regulatory implications of this in the UK a few months ago, in the context of a consultation document issued by the UK’s Food Standards Agency. The most recent research report from the UK government’s Nanotechnology Research Coordination Group reveals that the FSA has commissioned a couple of pieces of research on this, but the FSA informs me that it’s too early to say much about what these projects have found.

I’m guessing that the media interest in this area has arisen largely from some promotional activity from the nanobusiness end of things. The consultancy Cientifica recently released a report, Nanotechnologies in the food industry, and there’s a conference in Amsterdam this week on Nano and Microtechnologies in the Food and Healthfood Industries.

I’m on my way to London right now, to take part in a press briefing on Nanotechnology in Food at the Science Media Centre. My family seems to be interacting a lot with the press at the moment, but I don’t suppose I’ll do as well as my wife, whose activities last week provoked this classic local newspaper headline in the Derbyshire Times: School Axe Threat Fury. And people complain about scientific writing being too fond of stacked nouns.

A molecular computer that plays tic-tac-toe

I remember, when I was a (probably irritatingly nerdy) child, being absolutely fascinated by making a tic-tac-toe playing automaton out of match-boxes and beads, following a plan in one of Martin Gardner’s books. So my eye was caught by an item on Martyn Amos’s blog, reporting on a recent paper in Nano Letters (abstract and graphic freely available, subscription required for article) from a group in Columbia University, demonstrating a tic-tac-toe playing computer made, not from matchboxes or even more high-tech transistors, but from individual molecules.

The basic logic gate of this molecular computer is a single short DNA strand of a prescribed sequence which can act as a catalyst – a deoxyribozyme. Like the protein molecules used in the molecular computing and signalling operations inside living cells, these molecular logic gates operate by allostery. This is the principle that when one molecule binds to the gate molecule, the gate changes its shape, making it either easier or harder for a second, different, molecule to bind. In this way you can get differential catalytic activity – that is, a situation in which the logic gate molecule will only catalyse a reaction to produce an output if a given input molecule is present. This simple situation defines a gate that implements the logical operation YES; if you need two inputs to stimulate the catalytic activity, you have an AND gate; and if you have an AND gate whose catalytic activity can be suppressed by the presence of a third molecule, you have the logical operation xANDyANDNOTz. It is these three logical operations that are integrated in their molecular computer, which can play a complete game of tic-tac-toe (or noughts and crosses, as we call it round here) against a human opponent.
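To make the gate logic concrete, here’s a minimal sketch in Python (rather than DNA) of the three gate types treated purely as Boolean functions; the function names are mine, and the sketch says nothing about how the real deoxyribozyme chemistry achieves this behaviour.

```python
# A minimal sketch (my own naming, nothing to do with the actual chemistry) of
# the three deoxyribozyme gate types treated purely as Boolean functions.
# Each input stands for the presence (True) or absence (False) of a particular
# short DNA input strand; the output stands for catalytic activity, which in
# the real system is read out as a fluorescent cleavage product.

def yes_gate(x: bool) -> bool:
    # Catalysis is switched on by a single input strand.
    return x

def and_gate(x: bool, y: bool) -> bool:
    # Catalysis requires both input strands to be present.
    return x and y

def x_and_y_and_not_z_gate(x: bool, y: bool, z: bool) -> bool:
    # An AND gate whose activity is suppressed by a third, inhibitory strand.
    return x and y and not z

if __name__ == "__main__":
    print(x_and_y_and_not_z_gate(True, True, False))  # True: output produced
    print(x_and_y_and_not_z_gate(True, True, True))   # False: suppressed
```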

The Columbia group have integrated a total of 128 logic gates, plausibly describing the result as the first “medium-scale integrated molecular circuit”. In their implementation, the gates were in solution, in macroscopic quantities, in a multi-well plate, and the outputs were determined by detecting the fluorescence of the output molecules. But there’s no reason in principle why this kind of molecular computer could not be scaled down to the level of single or a few molecules, paving the way, as the authors state at the end of their paper, “for the next generation of fully autonomous molecular devices”.

The work was done by Joanne Macdonald and Milan Stojanovic, of Columbia University, and Benjamin Andrews and Darko Stefanovic of the University of New Mexico – there’s a useful website for the collaboration here. Also on the author list are five NYC high school students, Yang Li, Marko Sutovic, Harvey Lederman, Kiran Pendri, and Wanhong Lu, who must have got a great introduction to the excitement of research by their involvement in this project.

For Spanish speaking readers

A couple of weeks ago, Spanish television broadcast an extended interview with me by the academic, writer, and broadcaster Eduardo Punset (bio in English here). This is the interview I gave on my visit to Sevilla a few months ago. A full transcript of the interview, in Spanish, is now available on the web-site of Radio Televisión Española.

Does “Soft Machines” present arguments for Intelligent Design?

I’m normally pretty pleased when my book Soft Machines gets any kind of notice, but a recent rather favourable review of it leaves me rather troubled. The review is on the website of a new organisation called Truth in Science, whose aim is “to promote good science education in the UK”. This sounds very worthy, but of course the real aim is to introduce creationist thinking into school science lessons, under the guise of “teaching the controversy”. The controversy in question is, of course, the suggestion that “intelligent design” is a real scientific alternative to the Darwinian theory of evolution as an explanation of the origin and development of life.

The review approvingly quotes a passage from Soft Machines about the lack of evidence for how the molecular machine ATP synthase developed as evidence that Darwinian theory has difficulties. Luckily, my Darwinian credentials aren’t put in doubt – the review goes on to say “Despite the lack of hard evidence for how molecules are meant to have evolved via natural selection, Jones believes that evolution must have occurred because it is possible re-create a sort of molecular evolution ‘in silico’ – or via computer simulation. However, as more is discovered about the immense complexity of molecular systems, such simulations become increasing difficult to swallow.” This is wrong on a couple of counts. Firstly, as Soft Machines describes, we have real experiments – not in-silico ones – notably Sol Spiegelman’s, which show that molecules really can evolve. The second point is more subtle and interesting. Actually, there’s a strong argument that it is in complex molecular systems that Darwinian evolution’s real power is seen. It’s in searching the huge, multidimensional conformational spaces that define the combinatorially vast number of possible protein conformations, for example, that evolution is so effective.

The review signs off with a reiteration of a very old argument about design: “In the final chapter, ‘Our nanotechnological future’, Jones acknowledges that our ‘…only true example of a nanotechnology…is cell biology…’. Could that lead to an inference of design?” Maybe, like many scientists, I have brought this sort of comment on myself by talking extensively about “Nature’s design principles”. The point, though, is that evolution is a design method, and a very powerful one (so powerful that we’re seeing more use of it in entirely artificial contexts, such as in software engineering). However, design doesn’t necessarily need a designer.
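As a toy illustration of that power – and emphatically not a model of Spiegelman’s experiments or of anything in Soft Machines – here is a short sketch of evolutionary search: a population of random sequences, copied with errors and subjected to selection, finds an arbitrary 30-base target in a space of 4^30 (roughly 10^18) possibilities that blind sampling would essentially never hit. The target, population size and mutation rate are arbitrary choices for the illustration.

```python
import random

# Toy illustration of Darwinian search: mutation plus selection homes in on a
# target sequence drawn from a space of 4**30 (about 10**18) possibilities.
# The target, population size and mutation rate are arbitrary choices.

ALPHABET = "ACGU"
TARGET = "ACGUACGUACGUACGUACGUACGUACGUAC"   # 30 bases

def fitness(seq):
    """Number of positions that match the target."""
    return sum(a == b for a, b in zip(seq, TARGET))

def mutate(seq, rate=0.02):
    """Copy a sequence with occasional random copying errors."""
    return "".join(random.choice(ALPHABET) if random.random() < rate else base
                   for base in seq)

population = ["".join(random.choice(ALPHABET) for _ in range(len(TARGET)))
              for _ in range(200)]

for generation in range(1000):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(TARGET):
        break
    parents = population[:50]                # selection: the fittest quarter
    population = [population[0]] + [mutate(random.choice(parents))
                                    for _ in range(199)]  # copy with errors

print(f"reached {population[0]} after {generation} generations")
```

Run as written, this typically converges in a few hundred generations, which is the point: variation plus selection is an extraordinarily efficient way of searching spaces far too large to enumerate.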

“Truth in Science” may present itself as simply wishing to encourage a critical approach to evaluating competing scientific theories, but a little research reveals the true motives of its sponsors. The first name on the Board of Directors is Andy Mckintosh, Professor of Thermodynamics and Combustion Science at Leeds University. Far from being a disinterested student of purported controversies in evolutionary theory, he is revealed by this interview to be a young earth creationist:
“So you believe in a world created about 6,000 years ago, cursed on account of sin, then devastated by Noah’s Flood?
“Absolutely. There’s nothing in real science (if you take all the assumptions into account) to contradict that view.”

I don’t have a problem if people want to believe in the literal truth of either of the creation stories in Genesis. But I don’t think it is honest to pretend that a belief which, in reality, is based on faith, has any relationship to science, and I think it’s quite wrong to attempt to have these beliefs insinuated into science education in publicly funded schools.

Review of David Berube’s Nanohype in Chemical and Engineering News

My review of David Berube’s book Nano-Hype: The Truth Behind the Nanotechnology Buzz has been published in Chemical and Engineering News, the magazine of the American Chemical Society.

The review (which seems to be available without subscription) is a reworked, expanded and generally better edited version of what I wrote about Nanohype earlier this year on this blog.

DNA as a constructional material

The most sophisticated exercises in using self-assembly to make nanoscale structures and machines have used, as a constructional material, the biomolecule DNA. This field was pioneered by NYU’s Ned Seeman. DNA is not exactly stuff we’re familiar with as a constructional material, though, so I don’t suppose many people have much of a feel for some of its basic mechanical properties, like its stiffness. An elegant experiment, reported in Science at the end of last year, Rapid Chiral Assembly of Rigid DNA Building Blocks for Molecular Nanofabrication (abstract free, subscription required for full article), sheds a lot of light on this question.

The achievement of this work, reported also in this Science News article, was to devise a method of making rigid DNA tetrahedra, with edges less than 10 nm in size, at high (95%) yield (previous methods of making DNA polyhedra had much lower yields than this). A model of one of these tetrahedra is shown below. But, not satisfied with just making these tetrahedra, Russell Goodman (a graduate student in Andrew Turberfield’s group at Oxford) was able to image them with an atomic force microscope and measure the response of a tetrahedron to being compressed by the AFM tip. In this way he was able to measure the spring constant of each tetrahedron.

The spring constants he found had an average of 0.18 N/m, which is reasonable in the light of what we know about the stiffness of DNA double helices. We can use this number to estimate the stiffness – the Young’s modulus – of a solid made by coupling together many of these tetrahedra. The precise value will depend on how the tetrahedra are linked, but a good estimate is about 20 MPa. Compared with a covalently bonded solid like diamond (whose modulus, at around 1000 GPa, is 50 thousand times greater than that of our DNA solid), it is very much floppier. In fact, this modulus is in the range of a relatively hard rubber, of the kind a shoe sole might be made of. On the other hand, given that the material would be mostly water, it’s pretty stiff – probably about a thousand times stiffer than Jello, which is similarly made up of a network of biopolymers in water.
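For anyone who wants to see where the figure of about 20 MPa comes from, here is a back-of-envelope sketch; it assumes the modulus of a hypothetical solid of linked tetrahedra scales roughly as the measured spring constant divided by the tetrahedron size, and the exact prefactor would depend on how the tetrahedra are joined.

```python
# Back-of-envelope version of the stiffness estimate above. It assumes the
# modulus of a hypothetical solid of linked tetrahedra scales roughly as
# E ~ k / L, with k the measured spring constant and L the tetrahedron size;
# the true prefactor depends on how the tetrahedra are joined together.

k = 0.18            # measured spring constant of one tetrahedron, N/m
L = 10e-9           # characteristic tetrahedron size (edge < 10 nm), m

E_dna = k / L       # ~1.8e7 Pa, i.e. roughly 20 MPa
E_diamond = 1.0e12  # Young's modulus of diamond, ~1000 GPa, in Pa

print(f"DNA tetrahedron solid: ~{E_dna / 1e6:.0f} MPa")
print(f"Diamond is ~{E_diamond / E_dna:,.0f} times stiffer")
```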

A DNA tetrahedron

A rigid tetrahedron formed by self-assembly from DNA, figure from Goodman et al, Science 310 p1661 (2005)

Software control of matter at the atomic and molecular scale

The UK’s physical sciences research council, the EPSRC, has just issued a call for an “ideas factory” with the theme “Software control of matter at the atomic and molecular scale”, a topic proposed by Nottingham University nanophysicist Philip Moriarty. The way these programs work is that 20-30 participants, selected from many different disciplines, spend a week trying to think through new and innovative approaches to a very challenging problem. At the end of the process, it is hoped that some definite research proposals will emerge, and £1.5 million (i.e. not far short of US$ 3 million) has been set aside to fund these. The challenge, as defined by the call, is as follows:

“Can we design and construct a device or scheme that can arrange atoms or molecules according to an arbitrary, user-defined blueprint? This is at the heart of the idea of the software control of matter – the creation, perhaps, of a “matter compiler” which will interpret software instructions to output a macroscopic product in which every atom is precisely placed. Even partial progress towards this goal would significantly open up the range of available functional materials, permitting meta-materials with interesting electronic, optoelectronic, optical and magnetic properties.

One route to this goal might be to take inspiration from 3-d rapid prototyping devices, and conceive of some kind of pick-and-place mechanism operating at the atomic or molecular level, perhaps based on scanning probe techniques. On the other hand, the field of DNA nanotechnology gives us examples of complex structures built by self-assembly, in which the program to guide the construction is implicit within the structure of the building blocks themselves. This problem, then, goes beyond surface chemistry and the physics of self-assembly to some fundamental questions in computer science.

This ideas factory should attract surface physicists and chemists, including specialists in scanning probe and nanorobotic techniques, and those with an interest in self-assembling systems. Theoretical chemists, developmental biologists, and computer scientists, for example those interested in agent-based and evolutionary computing methods and emergent behaviour, will also be able to contribute.”

I’d encourage anyone who is eligible to receive EPSRC research funding (i.e. scientists working in UK universities and research institutes, broadly speaking) who is interested in taking part in this event to apply using the form on the EPSRC website. One person who won’t be getting any funding from this is me, because I’ve accepted the post of director of the activity.

Two forthcoming books

I’ve recently been looking over the page proofs of two interesting popular science books which are due to be published soon, both on subjects close to my heart. “The Middle World – the Restless Heart of Reality” by Mark Haw, is a discursive, largely historical book about Brownian motion. Of all the branches of physics, statistical mechanics is the one that is least well known in the wider world, but its story has both intellectual fascination and real human interest. The phenomenon of Brownian motion is central to understanding the way biology works, and indeed, as I’ve argued at length here and in my own book, learning how to deal with it and how to exploit it is going to be a prerequisite for success in making nanoscale machines and devices. Mark’s book does a nice job of bringing together the historical story, the relevance of Brownian motion to current science in areas like biophysics and soft matter physics, and its future importance in nanotechnology.

Martyn Amos (who blogs here), has a book called “Genesis Machines: The New Science of Biocomputing” coming out soon. Here the theme is the emerging interaction between computing and biology. This interaction takes a number of forms; the bulk of the book concerns Martyn’s own speciality, the various ways in which the biomolecule DNA can be used to do computations, but this leads on to synthetic biology and the re-engineering of the computing systems of individual cells. To me this is perhaps the most fascinating and potentially important area of science there is at the moment, and this book is an excellent introduction.

Neither book is out yet, but both can be preordered: The Middle World – the Restless Heart of Reality from Amazon, and Genesis Machines: The New Science of Biocomputing from Amazon UK.

ETC makes the case against nanomedicine

The most vocal and unequivocal opponent of nanotechnology – the ETC group – has turned its attention to nanomedicine, with a new report, Nanotech Rx, taking a sceptical look at the recent shift of emphasis we’ve seen towards medical applications of nanotechnology. The report, though, makes more sense as a critique of modern medicine in general than as a specific commentary on nanotechnology. Particularly in the context of health in the third world, the main thrust of the case is that enthusiasts of technocentric medicine have systematically underplayed the importance of non-technological factors (hygiene, better food, etc) in improving general health. As they say, “the global health crisis doesn’t stem from a lack of science innovation or medical technologies; the root problem is poverty and inequality. New medical technologies are irrelevant for poor people if they aren’t accessible or affordable.” However, in an important advance from ETC’s previous blanket opposition to nanotechnology, they do concede that “nanotech R&D related to water is potentially significant for the developing world. Access to clean water could make a greater contribution to global health than any single medical intervention.”

The debate about human enhancement also gets substantial discussion, with a point of view strongly influenced by disability rights activist Gregor Wolbring. (Newcomers to this debate could do a lot worse than to start with the recent Demos pamphlet, Better Humans?, which collects essays from authors with a variety of points of view, including Wolbring himself.) ETC correctly identifies the crypto-transhumanist position taken in some recent government publications, and gets succinctly to the nub of the matter as follows: “Certain personality traits (e.g., shyness), physical traits (e.g., “average” strength or height), cognitive traits (e.g., “normal” intelligence) will be deemed undesirable and correctable (and gradually unacceptable, not to be tolerated). The line between enhancement and therapy – already blurry – will be completely obliterated.” I agree that there’s a lot to be concerned about here, but the issue as it now stands doesn’t have a lot to do with nanotechnology – current points of controversy include the use of SSRIs to “treat” shyness, and modafinil to allow soldiers to go without sleep. However, in the future nanotechnology certainly will be increasingly important in permitting human enhancement, in areas such as the development of interfaces with the brain and in regenerative medicine, and so it’s not unreasonable to flag the area as one to watch.

Naturally, the evils of big pharma get a lot of play. There are the well publicised difficulties big pharma seems to have in maintaining their accustomed level of innovation, the large marketing budgets and the concentration on “me-too” drugs for the ailments of the rich west, and the increasing trend to outsource clinical trials to third world countries. Again, these are all very valid concerns, but they don’t seem to have a great deal of direct relevance to nanotechnology.

In the context of the third world, one of the most telling criticisms of the global pharmaceutical industry has been the lack of R&D spend on diseases that affect the poor. Things have recently changed greatly for the better, thanks to Bill and Melinda and their ilk. ETC recognise the importance of public-private partnerships (PPPs) of the kind supported by organisations like the Bill and Melinda Gates Foundation, despite some evident distaste that this money has come from the disproportionately rich. “Ten years ago, there was not a single PPP devoted to the development of “orphan drugs” – medicines to treat diseases with little or no financial profit potential – and today there are more than 63 drug development projects aimed at diseases prevalent in the global South.” As an example of a Bill and Melinda supported project, ETC quote a project to develop a new synthetic route to the anti-malarial agent artemisinin. This is problematic for ETC, as the project uses synthetic biology, to which ETC is instinctively opposed; yet since artemisinin-based combination treatments seem to be the only effective way of overcoming the problem of drug-resistant malaria, it seems difficult to argue that these treatments shouldn’t be universally available.

The sections of the report that are directly concerned with those areas of nanomedicine currently receiving the most emphasis seem rather weak. The section on the use of nanotechnology for drug delivery discusses only one example, a long way from the clinic, and doesn’t really comment at all on the current big drive to develop new anti-cancer therapies based on nanotechnology. I’m also surprised that ETC don’t talk more about the current hopes for the widespread application of nanotechnology in diagnostics and sensor devices, not least because this raises some important issues about the degree to which diagnosis can simply be equated to the presence or absence of some biochemical marker.

At the end of all this, ETC are still maintaining their demand for a “moratorium on nanotechnology”, though this seems at odds with statements like this: “Nanotech R&D devoted to safe water and sustainable energy could be a more effective investment to address fundamental health issues.” I actually find more to agree with in this report than in previous ETC reports. And yet I’m left with the feeling that, even more than in previous reports, ETC has not managed to get to the essence of what makes nanotechnology special.

Is nanoscience different from nanotechnology?

In definitions of nanotechnology, it has now become conventional to distinguish between nanoscience and nanotechnology. One definition that is now very widely used is the one introduced by the 2004 Royal Society report, which defined these terms thus:

“Nanoscience is the study of phenomena and manipulation of materials at atomic, molecular and macromolecular scales, where properties differ significantly from those at a larger scale. Nanotechnologies are the design, characterisation, production and application of structures, devices and systems by controlling shape and size at nanometre scale.”

This echoed the definitions introduced earlier in the 2003 ESRC report, Social and Economic Challenges of Nanotechnology (PDF), which I coauthored, in which we wrote:

“We should distinguish between nanoscience, which is here now and flourishing, and nanotechnology, which is still in its infancy. Nanoscience is a convergence of physics, chemistry, materials science and biology, which deals with the manipulation and characterisation of matter on length scales between the molecular and the micron-size. Nanotechnology is an emerging engineering discipline that applies methods from nanoscience to create usable, marketable, and economically viable products.”

And this formulation was itself derivative; I was strongly influenced at the time by a very similar formulation from George Whitesides.

Despite having played a part in propagating this conventional wisdom, I’m now beginning to wonder how valid or helpful the distinction between nanoscience and nanotechnology actually is. Increasingly, it seems to me that the distinction tends to presuppose a linear model of technology transfer. In this picture, which was very widely held in post-war science policy discussions, we imagine a simple progression from fundamental research, predominantly curiosity driven, through a process of applied research, by which possible applications of the knowledge derived from fundamental science are explored, to the technological development of these applications into products or industrial processes. What’s wrong with this picture is that it doesn’t really describe how innovations in the history of technology have actually occurred. In many cases, inventions have been put into use well before the science that explains how they work was developed (the steam engine being one of many examples of this), and in many others it is actually the technology that has facilitated the science.

Meanwhile, the way science and technology are organised has changed greatly from the situation of the 1950s, ’60s and ’70s. At that time, a central role both in the generation of pure science and in its commercialisation was played by the great corporate laboratories, like AT&T’s Bell Labs in the USA, and in the UK the central laboratories of companies like ICI and GEC. For better or worse, these corporate labs have disappeared or been reduced to shadows of their former size, as deregulation and global competition have stripped away the monopoly rents that ultimately financed them. Without the corporate laboratories to broker the process of taking innovation from the laboratory to the factory, we are left with a much more fluid and confusing situation, in which there is much more pressure on universities to move beyond pure science, to find applications for their research, and to convert this research into intellectual property that can provide future revenue streams. Small research-based companies spring up whose main assets are their intellectual property and the knowledge of their researchers, and bigger companies talk about “open innovation”, in which invention is just another function to be outsourced.

A useful concept for understanding the limitations of the linear model in this new environment is the idea of “mode II knowledge production” (introduced, I believe, by Gibbons et al. (1994), The New Production of Knowledge, London: Sage). Mode II science would be fundamentally interdisciplinary, and motivated explicitly by applications rather than by the traditional discipline-based criteria of academic interest. These applications don’t necessarily have to be immediately convertible into something marketable; the distinction is that in this kind of science one is motivated not by exploring or explaining some fundamental phenomenon, but by the drive to make some device or gadget that does something interesting (nano-gizmology, as I’ve called this phenomenon in the past).

So in this view, nanotechnology isn’t simply the application of nanoscience. Its definition is as much sociological as scientific. Prompted, perhaps, by observing the material success of many academic biologists who’ve founded companies in the biotech sector, and motivated by changes in academic funding climates and the wider research environment, physicists, chemists and materials scientists have taken a much more aggressively application-driven and commercially oriented approach to their science. Or to put it another way, nanotechnology is simply the natural outcome of an outbreak of biology envy amongst physical scientists.