Uncertainties about public engagement

The thinktank Demos has released another report on science and public engagement. The Public Value of Science is, in some ways, a follow-up to their earlier pamphlet See-through Science. But whereas the earlier report was rather confident in its diagnosis of the failings of previous attempts to engage the public in science, and in its prescription of a new type of “upstream engagement”, the new report seems much more uncertain in its tone.

On the face of it, this is odd, because the news seems good. There is no evidence of any growing crisis in public confidence in science; on the contrary, the report quotes a recent opinion poll from the UK which found that “86 per cent of people think science ‘makes a good contribution to society’ – up 5 per cent on two years ago.” And the idea of “upstream engagement” is riding high in fashionability, both in government and among the scientific great and good. Nonetheless, there seems to be a nagging worry, a sense that this conversion to real public engagement is only skin deep. It’s true that there’s been some open opposition (for example from Lord Taverne’s organisation, Sense about Science), but this seems to worry Demos less than the feeling that all the attention paid to public engagement still amounts to little more than lip-service, leading to “a well-meaning, professionalised and busy field, propelled along by its own conferences and reports, but never quite impinging on fundamental practices, assumptions and cultures”.

I think they are quite right. The danger they have identified is that all this activity around public engagement still isn’t pulling the levers that need to be pulled to achieve their ambition, which is to steer the direction of the research enterprise itself. The next phase is to work on what they call the “software” of scientific engagement – “the codes, values and norms that govern scientific practice, but which are far harder to access and change.” This is a much more difficult matter than simply setting up a few focus groups and citizens’ juries. In essence, their aim is to use the input from this kind of deliberative process to change the way the scientific community defines “good science”.

This kind of cultural shift isn’t entirely unprecedented. In fact, I’ve argued myself that the rise of nanoscience itself constitutes just such a shift; in this case the definition of good science swung away from testing theories and characterising materials, and towards making widgets or gizmos. But the process of change is difficult, unpredictable and hard to control. It’s not about the Minister for Science issuing a rational order to his obedient research councils; the process is probably closer to the way fashions spread among sub-teenagers. The editors of Nature and Science, like the editors of Smash Hits, might think they have some influence, but they’re at the mercy of the social dynamics of the playground. One obvious difficulty is that the values of the scientific enterprise are now highly globalized. All over the world scientists aspire to publish the same kinds of paper in the same journals, and to be invited to the same conferences. Another difficulty is the sheer self-confidence of the scientific community. Lord Broers’ Reith lectures captured the spirit exactly – paraphrasing Marx, scientists may concede that philosophers and social scientists have done something to understand the world, but scientists and technologists have a deep conviction that it is they who have changed it.

Moving to some more parochial issues, the report identifies some specific barriers that UK scientific politics puts in the way of their vision. The Research Assessment Exercise, which determines the level of baseline research funding in UK universities over a five-year period, operates on a strictly disciplinary basis, using peer review of papers describing original research. There’s been some lip-service paid to the notion that there may be valid outputs that aren’t papers in Physical Review Letters, but I’m not sure many people are going to be willing to gamble on this, and I can’t disagree with Demos’s conclusion that it “reinforces the model of the highly specialised researcher, locked in a cycle of publish-or-perish”. The research councils clearly see some of the problems and are starting some useful initiatives, but they’re hampered by the difficulty the different councils have in working cooperatively. The big picture, though, is that there are precious few career incentives for scientists to divert their efforts in this way, and quite a few significant disincentives.

The big weakness in the Demos analysis, in my view, is its failure to address the power of the market. The authors are very equivocal about the growing emphasis on the commercialisation of university-generated research. Agreeing that in principle this is a good thing, they nonetheless report “growing disquiet among university scientists that the drive for ever closer ties with business is distorting research priorities”, and worry about the effects of this on the openness and integrity of the research process. All these are valid concerns, but what’s missing is a recognition that the market is now the predominant mechanism by which technology impacts on society. Demos says “We believe everyone should be able to make personal choices in their daily lives that contribute to the common good.” The truth is that, the way society is now set up, what people buy is one of the major ways in which these choices are made. And the messages people send through the market by these personal choices might well differ from the messages they would send if you asked them directly. If you ask a bunch of young people where they would like to see money spent to develop nanotechnology, they might well answer that they’d like to see it spent on improving the environment and on ending world poverty; but if they then go and spend their money on iPods and personal care products, their votes are effectively cast for quite different priorities.

This isn’t to say that the market is a very efficient way of setting research priorities – far from it. At the moment we have marketing and product development people making more or less informed guesses (which often turn out to be spectacularly inaccurate) about what people are going to want to buy. Researchers, for their part, are obliged to predict some kind of application for the outcome of their research when they apply for funding, and to do this they end up trying to guess, not so much what the potential markets might be, but what they think will best match the preconceptions of referees and research councils. Somehow the idea that in ten years everyone will want flexible television sets, or personal gene-testing kits, or nutraceutical-laden yoghurts, enters and spreads through the collective mind of the research community like a Pokémon craze. This isn’t to say that these ideas are necessarily wrong; it’s just that the process by which they gain currency is not particularly well controlled or evidence-based. It’s this sort of process that sociologists of science ought to understand, but I’m not convinced they do.

On the road again

I’m sorry that I’ve left my blog unattended for a few days; I went away and forgot that I’d changed the blog’s password, so I couldn’t get to it from my laptop.

I’ve been doing a whistle-stop tour of the Celtic capitals – to Dublin for the meeting of the British Association, where I was appearing in a panel discussion about whether we should use nanotechnology for human enhancement. Then to Edinburgh, where the EuroNanoForum was discussing nanotechnology and the health of the European citizen. I gave a talk in the session on converging technologies, recorded an interview for French radio, and went to an interesting session on public engagement, after which I had the pleasure of meeting my fellow nano-blogger, David Berube. Then, over a supper of haggis, neeps and tatties, I was subjected to what I thought was rather an aggressive interrogation from some of my fellow European citizens about the quality of the British contribution to international food culture. I’ll post something more substantive tomorrow.

Farewell to Nanobot

Howard Lovy announced last week that he’s drawing a line under his popular and entertaining blog Howard Lovy’s Nanobot. I guess this is the natural consequence of his transition from nanobusiness gamekeeper to poacher, with his new post as Director of Communications at the nanotechnology company Arrowhead Research. I’ve never met Howard in person, though I’ve felt I’ve got to know him through exchanges on our respective blogs and through some email correspondence; I’m delighted that he’s found a niche to use his talents in the nanotechnology sector and I wish him all the best in this new phase of his career.

I’ll miss Nanobot. I certainly didn’t agree with everything Howard said, and I wish he’d got to understand the scientific community better. But it’s been a provocative and interesting read, and its emphasis on the way the idea of nanotechnology is being interpreted in the wider world has been helpful and salutary.

Making life from the bottom up

I wrote below about Craig Venter’s vision of synthetic biology – taking an existing, very simple, organism, reducing its complexity even further by knocking out unnecessary genes, and then inserting new genetic material to accomplish the functions you want. One could think of this as a kind of top-down synthetic biology; one is still using the standard mechanisms and functions of natural biology, but one reprogrammes them as desired. Could there be a bottom-up synthetic biology, in which one designs entirely new structures and systems for metabolism and reproduction?

One approach to this goal has been pioneered by Steven Benner at the University of Florida. He’s been concentrating on creating synthetic genetic systems by analogy with DNA, but he’s not shy about where he wants his research to go: “The ultimate goal of a program in synthetic biology is to develop chemical systems capable of self-reproduction and Darwinian-like evolution.” He’s recently written a review of this kind of approach in Nature Reviews Genetics (subscription only): Synthetic biology.

David Deamer, from UC Santa Cruz, has a slightly different take on the same problem in another recent review, this time in Trends in Biotechnology (again, subscription only, I’m afraid). “A giant step towards artificial life?” concentrates on the idea of creating artificial cells by using self-assembling lipids to make liposomes (the very same creatures that L’Oreal uses in its expensive face creams). Encapsulated within these liposomes are some of the basic elements of metabolism, such as the mechanisms for protein synthesis. How close can this approach get to creating something like a living, reproducing organism? In Deamer’s words: “Everything in the system grows and reproduces except the catalytic macromolecules themselves, the polymerase enzymes or ribosomes. Every other part of the system can grow and reproduce, but the catalysts get left behind. This is the final challenge: to encapsulate a system of macromolecules that can make more of themselves, a molecular version of von Neumann’s replicating machine.” He sees a glimmer of hope in the work of David Bartel at MIT, who has made an RNA enzyme that synthesizes short RNA sequences, pointing the way to RNA-based self-replication.

But all these approaches still follow the pattern set by the life we know about on Earth; they depend on the self-assembling properties of a familiar repertoire of lipids and macromolecules, like DNA, RNA and proteins, in watery environments. Could you do without water entirely? Benner is quoted in an article by Philip Ball in this week’s Nature (Water and life: Seeking the solution, subscription required) arguing that you can: “Water is a terrible solvent for life…. We are working to create alternative Darwinian systems based on fundamentally different chemistries. We are using different solvent systems as a way to get a precursor for life on Earth.”

A view from the Greenhouse

Here’s another brief report on the Nottingham nanotechnology debate. It’s from Jack Stilgoe of the thinktank Demos, who was the non-scientist on the panel. He frames the debate in rather nationalistic terms. Is this really just a clash between the habitual rain-soaked pessimism of the British and sunny American optimism, with its associated can-do attitude?

Nanotechnology debate at Nottingham

I don’t know about anybody else, but I enjoyed yesterday’s nanotechnology debate at Nottingham. The whole thing was filmed, and as soon as it’s been edited and tidied up we’ll get the video put up on the web. Given that everyone will soon have the opportunity to judge for themselves how the thing went, I’ll confine myself here to some general observations. There was a big crowd, mostly graduate students attending the surface science summer school, supplemented by a good fraction of the local nanoscientists. The nature of the audience meant that the debate rapidly got quite technical; I don’t think anyone could say that the molecular manufacturing point of view didn’t get a serious hearing. I must say that I was a little apprehensive, given the rancour that has entered previous debates, but I felt the tone was robust yet mutually respectful.

My prize for gnomic aphorism of the evening goes to my fellow-panellist Saul Tendler (bionanotechnologist and pharmacy professor). “If a cat had wheels, who would change its tyres?”

Nanotechnology and human enhancement

A session at the British Association’s annual meeting in September, which this year is being held in Dublin, is devoted to a debate on the topic “Should we enhance ourselves: does nanotechnology have limits”. The debate, which is between 7 pm and 9 pm on Tuesday 6 September, has been put together by Donald Bruce, the Director of the Church of Scotland’s Science, Religion and Technology Project. The speakers are myself, Donald, and Paul Galvin, team leader for Nanobiotechnology at the Tyndall National Institute in Cork.

Under attack

Soft Machines is currently the victim of what amounts to a denial-of-service attack. This post is by way of warning that the site may need to be taken down (temporarily, I hope) later today if it can’t be sorted out.

Update, 20 August. There’s been a bit of improvement, following various measures. Data transfer (normally 20-30 MB a day) is back to about 50 MB a day, from a high of 500 MB a day. I’ve made my peace with the web hosting company. But I do need to move the site onto a different server, which is proving to be a bit of a pain, and which means I’m having to find out more about MySQL than I really want to know.
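For anyone facing a similar move, the heart of the job is simply dumping the blog’s MySQL database on the old server and loading it on the new one. Here is a minimal sketch of that workflow, not the exact steps I followed; the database name, user and password below are all placeholders.

import subprocess

# Minimal sketch of moving a blog's MySQL database between servers, assuming
# the standard mysqldump and mysql command-line tools are installed on both.
# Every name and credential below is a placeholder, not the real one.
DB_NAME = "blog_db"          # hypothetical database name
DB_USER = "blog_user"        # hypothetical MySQL user
DB_PASSWORD = "change-me"    # placeholder; better kept in ~/.my.cnf than hard-coded
DUMP_FILE = "blog_dump.sql"

# 1. On the old server: dump the whole database to a single SQL file.
with open(DUMP_FILE, "w") as dump:
    subprocess.run(
        ["mysqldump", f"--user={DB_USER}", f"--password={DB_PASSWORD}", DB_NAME],
        stdout=dump, check=True,
    )

# 2. Copy blog_dump.sql across (scp, sftp, ...), then on the new server load it
#    into a freshly created, empty database with the same name.
with open(DUMP_FILE) as dump:
    subprocess.run(
        ["mysql", f"--user={DB_USER}", f"--password={DB_PASSWORD}", DB_NAME],
        stdin=dump, check=True,
    )

After that, it’s just a matter of pointing the blog software’s configuration at the new database.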

22 August. The site is now on the new server. I hope most things have transferred OK; please let me know if you find any glitches. The last four comments – from Kurt, Howard Salis, and replies from me to each – aren’t registered in the “Recent Comments” sidebar, but can be found in the appropriate posts, “Commercialising synthetic biology” and “Cheap designer genes”.

Commercialising synthetic biology

What’s going to be the quickest way of achieving some kind of radical nanotechnology, in which sophisticated nanoscale machines carry out complex chemical tasks? Since nature has evolved sophisticated and effective nanomachines that are optimised for the nanoscale environment, an obvious approach is to take components from living systems and reassemble them to do the tasks you want. This is the approach of bionanotechnology. But we could take this logic further. Rather than rebuilding systems from individual biological components, we could take a complete organism, strip out the functions we don’t want, and patch in the genetic code for the components we need. This top-down approach to bionanotechnology is exactly what is being proposed by a new company, Synthetic Genomics Inc, founded in June by Craig Venter. Venter is, of course, the scientist behind the private sector venture to sequence the human genome. The initial focus will be on the use of these partly synthetic organisms to make alternative fuels such as hydrogen and ethanol.

The vehicle for these strange hybrids is likely to be the parasitic bacterium Mycoplasma genitalium, an unwelcome inhabitant of some people’s urinary tracts, which currently has the distinction of having the smallest known genome. This is contained on a mere 580,000 base pairs of DNA, coding for about 480 proteins and 40 RNA molecules. Venter’s group systematically knocked out genes from this organism in an attempt to find a so-called minimal genome. One can think of this as the simplest possible fully functioning life-form (of course, such an organism would be very restricted in the environments it could live in). In Venter’s 1999 paper in Science, Global transposon mutagenesis and a minimal mycoplasma genome, a further 100 proteins were eliminated without fatally compromising the organism’s existence. Having stripped the organism down to a minimal level of complexity, the idea would be to reinsert synthetic genes coding for whatever machinery you require.

There are two questions to ask about this: will it work, and should it be done? It’s certainly a very bold commitment to a very reductionist view of life: in their words “using the genome as a bio-factory, a custom designed, modular cassette system will be developed so that the organism executes specific molecular functions”. As for the ethics of the enterprise, I’m sure even the most enthusiastic technophile would at least pause to think about the implications of attempting to re-engineer life on this scale. Indeed, Venter’s group commissioned their own bioethicists to think about the issues, and this ethical commentary accompanied their original Science article. This is just the beginning of a very big story.

Cheap designer genes

The kind of DNA-based nanotechnology pioneered by New York University’s Ned Seeman is currently the closest thing we have to the radical aim of making nanoscale structures and machines with atomic precision, but the development of the technology is limited by cost. DNA is an expensive molecule – currently it costs about $5000 a gram to make short, synthetic DNA sequences.

The cost of synthetic DNA has been dropping, but a new company is promising orders-of-magnitude drops in cost for much longer sequences of DNA. The company, Codon Devices, is commercialising methods developed in George Church’s group at Harvard Medical School – the method is described in this Nature paper (subscription required for full paper): Accurate multiplex gene synthesis from programmable DNA microchips.

It’s not DNA nanotechnology that the company cites as its major potential market, though. Their ambition is to make synthetic genes for synthetic organisms, in the emerging field of synthetic biology.