The Royal Society’s verdict on the UK government’s nanotech performance

The UK’s science and engineering academies – the Royal Society and the Royal Academy of Engineering – were widely praised for their 2004 report on nanotechnology – Nanoscience and nanotechnologies: opportunities and uncertainties, which was commissioned by the UK government. So it’s interesting to see, two years on, how well they think the government is doing at implementing their recommendations. The answer is given in a surprisingly forthright document, published a couple of days ago, which is their formal submission to the review of UK nanotechnology policy by the Council of Science and Technology. The press release that accompanies the submission makes their position fairly clear. Ann Dowling, the chair of the 2004 working group, is quoted as saying “The UK Government was recognised internationally as having taken the lead in encouraging the responsible development of nanotechnologies when it commissioned our 2004 report. So it is disappointing that the lack of progress on our recommendations means that this early advantage has been lost.”

Nanotechnology and the food industry

The use of nanotechnology in the food industry seems to be creeping up the media agenda at the moment. The Times on Saturday published an extended article by Vivienne Parry in its “Body and Soul” supplement, called Food fight on a tiny scale. As the title indicates, the piece is framed around the idea that we are about to see a rerun of the battles about genetic modification of food in the new context of nano-engineered foodstuffs. Another article appeared in the New York Times a few weeks ago: Risks of engineering a better ice cream.

Actually, apart from the rather overdone references to a potential consumer backlash, both articles are fairly well-informed. The body of Vivienne Parry’s piece, in particular, makes it clear why nanotechnology in food presents a confusingly indistinct and diffuse target. Applications in packaging, for example in improving the resistance of plastic bottles to gas permeation, are already with us and are relatively uncontroversial. Longer-range visions of “smart packaging” also offer potential consumer benefits, but may have downsides yet to be fully explored. More controversial, potentially, is the question of the addition of nanoscaled ingredients to food itself.

But this issue is very problematic, simply because so much of food is made up of components which are naturally nanoscaled, and much of traditional cooking and food processing consists of manipulating this nanoscale structure. To give just one example, the traditional process of making whey cheeses like ricotta consists of persuading whey proteins like beta-lactoglobulin to form nanoparticles each containing a small number of molecules, and then getting those nanoparticles to aggregate in an open, gel structure, giving the cheese its characteristic mechanical properties. The first example in the NY Times article – controlling the fat particle size in ice cream to get richer feeling low fat ice cream – is best understood as simply an incremental development of conventional food science, which uses the instrumentation and methodology of nanoscience to better understand and control food nanostructure.

There is, perhaps, more apparent ground for concern with food additives that are prepared in a nanoscaled form and directly added to foods. The kinds of molecules we are talking about here are molecules which add colour, flavour and aroma, and increasingly molecules which seem to confer some kind of health benefit. One example of this kind of thing is the substance lycopene, which is available from the chemical firm BASF as a dispersion of particles which are a few hundred nanometers in size. Lycopene is the naturally occurring dye molecule that makes tomatoes red, for which there is increasing evidence of health benefits (hence the unlikely sounding claim that tomato ketchup is good for you). Like many other food component molecules, it is not soluble in water, but it is soluble in fat (as anyone who has cooked an olive oil or butter based tomato sauce will know). Hence, if one wants to add it to a water based product, like a drink, one needs to disperse it very finely for it to be available to be digested.

One can expect, then, more products of this kind, in which a nanoscaled preparation is used to deliver a water or oil soluble ingredient, often of natural origin, which on being swallowed will be processed by the digestive system in the normal way. What about the engineered nanoparticles, that are soluble in neither oil nor water, that have raised toxicity concerns in other contexts? These are typically inorganic materials, like carbon in its fullerene forms, or titanium dioxide, as used in sunscreen, or silica. Some of these inorganic materials are used in the form of micron scale particles as food additives. It is conceivable (though I don’t know of any examples) that nanoscaled versions might be used in food, and that these might fall within a regulatory gap in the current legal framework. I talked about the regulatory implications of this, in the UK, a few months ago in the context of a consultation document issued by the UK’s Food Standards Agency. The most recent research report from the UK government’s Nanotechnology Research Coordination Group reveals that the FSA has commissioned a couple of pieces of research about this, but the FSA informs me that it’s too early to say much about what these projects have found.

I’m guessing that the media interest in this area has arisen largely from some promotional activity from the nanobusiness end of things. The consultancy Cientifica recently released a report, Nanotechnologies in the food industry, and there’s a conference in Amsterdam this week on Nano and Microtechnologies in the Food and Healthfood Industries.

I’m on my way to London right now, to take part in a press briefing on Nanotechnology in Food at the Science Media Centre. My family seems to be interacting a lot with the press at the moment, but I don’t suppose I’ll do as well as my wife, whose activities last week provoked this classic local newspaper headline in the Derbyshire Times: School Axe Threat Fury. And people complain about scientific writing being too fond of stacked nouns.

A molecular computer that plays tic-tac-toe

I remember, when I was a (probably irritatingly nerdy) child, being absolutely fascinated by making a tic-tac-toe playing automaton out of match-boxes and beads, following a plan in one of Martin Gardner’s books. So my eye was caught by an item on Martyn Amos’s blog, reporting on a recent paper in Nano Letters (abstract and graphic freely available, subscription required for article) from a group in Columbia University, demonstrating a tic-tac-toe playing computer made, not from matchboxes or even more high-tech transistors, but from individual molecules.

The basic logic gate of this molecular computer is a single short DNA strand of a prescribed sequence which can act as a catalyst – a deoxyribozyme. Like the protein molecules used in the molecular computing and signalling operations inside living cells, these molecular logic gates operate by allostery. This is the principle that when one molecule binds to the gate molecule, the gate changes shape, making it either easier or harder for a second, different molecule to bind. In this way you can get differential catalytic activity – that is, you can get a situation where the logic gate molecule will only catalyse a reaction to produce an output if a given input molecule is present. This simple situation would define a gate that implemented the logical operation YES; if you needed two inputs to stimulate the catalytic activity, you would have an AND gate, and if you have an AND gate whose catalytic activity can be suppressed by the presence of a third molecule, you have the logical operation xANDyANDNOTz. It is these three logical operations that are integrated in their molecular computer, which can play a complete game of tic-tac-toe (or noughts and crosses, as we call it round here) against a human opponent.
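In schematic terms, the three gate types reduce to simple boolean functions. Here is a toy Python sketch of that logic (nothing here reflects the actual DNA sequences or biochemistry – a “True” input just stands for the presence of the corresponding input strand, and a “True” output for catalytic activity releasing the fluorescent product):

```python
# Schematic boolean models of the three deoxyribozyme gate types.
# An input of True means the corresponding input oligonucleotide is
# present; an output of True means the gate would be catalytically
# active, cleaving its substrate to release a fluorescent output.

def yes_gate(x):
    # active only when its single input is present
    return x

def and_gate(x, y):
    # active only when both inputs are present
    return x and y

def x_and_y_and_not_z_gate(x, y, z):
    # active when x and y are present and the inhibitor z is absent
    return x and y and not z
```

Wiring up 128 such gates across the wells of a plate, with each well corresponding to a square of the board, is then enough to encode a complete game strategy.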

The Columbia group have integrated a total of 128 logic gates, plausibly describing the result as the first “medium-scale integrated molecular circuit”. In their implementation, the gates were in solution, in macroscopic quantities, in a multi-well plate, and the outputs were determined by detecting the fluorescence of the output molecules. But there’s no reason in principle why this kind of molecular computer cannot be scaled down to the level of one or a few molecules, paving the way, as the authors state at the end of their paper, “for the next generation of fully autonomous molecular devices”.

The work was done by Joanne Macdonald and Milan Stojanovic, of Columbia University, and Benjamin Andrews and Darko Stefanovic of the University of New Mexico – there’s a useful website for the collaboration here. Also on the author list are five NYC high school students, Yang Li, Marko Sutovic, Harvey Lederman, Kiran Pendri, and Wanhong Lu, who must have got a great introduction to the excitement of research by their involvement in this project.

For Spanish speaking readers

A couple of weeks ago, Spanish television broadcast an extended interview with me by the academic, writer, and broadcaster Eduardo Punset (bio in English here). This is the interview I gave on my visit to Sevilla a few months ago. A full transcript of the interview, in Spanish, is now available on the web-site of Radio Televisión Española.

Does “Soft Machines” present arguments for Intelligent Design?

I’m normally pretty pleased when my book Soft Machines gets any kind of notice, but a recent rather favourable review of it leaves me rather troubled. The review is on the website of a new organisation called Truth in Science, whose aim is “to promote good science education in the UK”. This sounds very worthy, but of course the real aim is to introduce creationist thinking into school science lessons, under the guise of “teaching the controversy”. The controversy in question is, of course, the suggestion that “intelligent design” is a real scientific alternative to the Darwinian theory of evolution as an explanation of the origin and development of life.

The review approvingly quotes a passage from Soft Machines about the lack of evidence for how the molecular machine ATP synthase developed as evidence that Darwinian theory has difficulties. Luckily, my Darwinian credentials aren’t put in doubt – the review goes on to say “Despite the lack of hard evidence for how molecules are meant to have evolved via natural selection, Jones believes that evolution must have occurred because it is possible re-create a sort of molecular evolution ‘in silico’ – or via computer simulation. However, as more is discovered about the immense complexity of molecular systems, such simulations become increasing difficult to swallow.” This is wrong on a couple of counts. Firstly, as Soft Machines describes, we have real experiments – not in-silico ones – notably from Sol Spiegelman, that show that molecules really can evolve. The second point is more subtle and interesting. Actually, there’s a strong argument that it is in complex molecular systems that Darwinian evolution’s real power is seen. It’s in searching the huge, multidimensional conformational spaces that define the combinatorially vast number of possible protein conformations, for example, that evolution is so effective.
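To illustrate what an in-silico evolution experiment of this general kind looks like – a toy sketch only, not Spiegelman’s experiment (which was done with real molecules) nor any particular simulation discussed in the book – one can subject a population of random “sequences” to repeated rounds of mutation and selection. The target sequence and parameters below are entirely made up; the point is that selection finds the target in a combinatorial space (20^6 possibilities even for this tiny example) far too large to search by chance alone:

```python
import random

random.seed(0)
TARGET = "MKVLAT"                   # a made-up 6-residue target sequence
ALPHABET = "ACDEFGHIKLMNPQRSTVWY"   # the 20 amino-acid letters

def fitness(seq):
    # number of positions matching the target
    return sum(a == b for a, b in zip(seq, TARGET))

def mutate(seq, rate=0.2):
    # point mutations: each site changes with probability `rate`
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in seq)

def evolve(pop_size=50, generations=1000):
    # start from a purely random population
    pop = ["".join(random.choice(ALPHABET) for _ in TARGET)
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == len(TARGET):
            break
        # selection with elitism: the fitter half survives unchanged
        # and also produces mutated offspring
        survivors = pop[:pop_size // 2]
        pop = survivors + [mutate(random.choice(survivors))
                           for _ in survivors]
    return max(pop, key=fitness)
```

Even this crude scheme homes in on the target within a few hundred generations, while random search would be expected to take tens of millions of trials – which is the sense in which selection is an effective way of exploring vast combinatorial spaces.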

The review signs off with a reiteration of a very old argument about design: “In the final chapter, ‘Our nanotechnological future’, Jones acknowledges that our ‘…only true example of a nanotechnology…is cell biology…’. Could that lead to an inference of design?” Maybe, like many scientists, I have brought this sort of comment on myself by talking extensively about “Nature’s design principles”. The point, though, is that evolution is a design method, and a very powerful one (so powerful that we’re seeing more use of it in entirely artificial contexts, such as in software engineering). However, design doesn’t necessarily need a designer.

“Truth in Science” may present itself as simply wishing to encourage a critical approach to evaluating competing scientific theories, but a little research reveals the true motives of its sponsors. The first name on the Board of Directors is Andy McIntosh, Professor of Thermodynamics and Combustion Science at Leeds University. Far from being a disinterested student of purported controversies in evolutionary theory, this interview reveals him to be a young earth creationist:
“So you believe in a world created about 6,000 years ago, cursed on account of sin, then devastated by Noah’s Flood?
“Absolutely. There’s nothing in real science (if you take all the assumptions into account) to contradict that view.”

I don’t have a problem if people want to believe in the literal truth of either of the creation stories in Genesis. But I don’t think it is honest to pretend that a belief which, in reality, is based on faith, has any relationship to science, and I think it’s quite wrong to attempt to have these beliefs insinuated into science education in publicly funded schools.

Review of David Berube’s Nanohype in Chemical and Engineering News

My review of David Berube’s book Nano-Hype: The Truth Behind the Nanotechnology Buzz has been published in Chemical and Engineering News, the magazine of the American Chemical Society.

The review (which seems to be available without subscription) is a reworked, expanded and generally better edited version of what I wrote about Nanohype earlier this year on this blog.

DNA as a constructional material

The most sophisticated exercises in using self-assembly to make nanoscale structures and machines have used, as a constructional material, the biomolecule DNA. This field was pioneered by NYU’s Ned Seeman. DNA is not exactly stuff we’re familiar with as a constructional material, though, so I don’t suppose many people have much of a feel for some of its basic mechanical properties, like its stiffness. An elegant experiment, reported in Science at the end of last year, Rapid Chiral Assembly of Rigid DNA Building Blocks for Molecular Nanofabrication (abstract free, subscription required for full article), sheds a lot of light on this question.

The achievement of this work, reported also in this Science News article, was to devise a method of making rigid DNA tetrahedra, with edges less than 10 nm in size, at high (95%) yield (previous methods of making DNA polyhedra had much lower yields than this). A model of one of these tetrahedra is shown below. But, not satisfied with just making these tetrahedra, Russell Goodman (a graduate student in Andrew Turberfield’s group at Oxford) was able to image them with an atomic force microscope and measure the response of a tetrahedron to being compressed by the AFM tip. In this way he was able to measure the spring constant of each tetrahedron.

The spring constants he found had an average of 0.18 N/m, which is reasonable in the light of what we know about the stiffness of DNA double helices. We can use this number to estimate the stiffness – the Young’s modulus – of the solid that would be made if you coupled together many of these tetrahedra. The precise value will depend on how the tetrahedra are linked, but a good estimate is about 20 MPa. Compared with a covalently bonded solid, like diamond (whose modulus, at around 1000 GPa, is 50 thousand times greater than that of our DNA solid), it’s very much floppier. In fact, this modulus is in the range of a relatively hard rubber, of the kind a shoe sole might be made of. On the other hand, given that the material would be mostly water, it’s pretty stiff – probably about a thousand times stiffer than Jello, which is similarly made up of a network of biopolymers in water.
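The arithmetic behind that estimate can be checked in a few lines. For a network of springs of stiffness k and spacing a, dimensional analysis gives a modulus of order E ~ k/a; taking the measured spring constant and the ~10 nm edge length from the paper (the exact prefactor depends on how the tetrahedra are linked):

```python
# Back-of-envelope estimate of the Young's modulus of a hypothetical
# solid built from linked DNA tetrahedra.
k = 0.18       # measured spring constant per tetrahedron, N/m
a = 10e-9      # tetrahedron edge length, ~10 nm

E = k / a      # modulus estimate: E ~ k/a for a network of springs, Pa
print(E / 1e6)            # about 18 MPa, i.e. the "about 20 MPa" quoted

E_diamond = 1000e9        # Young's modulus of diamond, ~1000 GPa
print(E_diamond / E)      # a few tens of thousands of times stiffer
```

The same estimate puts the DNA solid about three orders of magnitude above the ~10 kPa modulus typical of a biopolymer gel like Jello, consistent with the comparison above.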

A DNA tetrahedron

A rigid tetrahedron formed by self-assembly from DNA, figure from Goodman et al, Science 310 p1661 (2005)