Archive for July, 2008

The mis-measure of uncertainty

Thursday, July 31st, 2008

A couple of pieces in the Financial Times today and yesterday offer some food for thought about the problems of commercialising scientific research. Yesterday’s piece – Drug research needs serendipity (free registration may be required) – concentrates on the pharmaceutical sector, but its observations are more widely applicable. Musing on the current problems of big pharma, with their dwindling pipelines of new drugs, David Shaywitz and Nassim Taleb (author of The Black Swan) identify the problem as a failure to deal with uncertainty: “academic researchers underestimated the fragility of their scientific knowledge while pharmaceuticals executives overestimated their ability to domesticate scientific research.”

They identify two types of uncertainty. First, there’s the negative uncertainty of all the things that can go wrong as one tries to move from medical research to treatments. Underlying this is the simple fact that we know much less about human biology, in all its complexity, than one might think from all the positive headlines and press releases. It’s in response to this negative uncertainty that managers have attempted to impose more structure and focus to make the outcome of research more predictable. But why is this generally in vain? “Answer: spreadsheets are easy; science is hard.” According to Shaywitz and Taleb, this approach isn’t just doomed to fail on its own terms, it’s positively counterproductive. This is because it doesn’t leave any room for another type of uncertainty: the positive uncertainty of unexpected discoveries and happy accidents.

Their solution is to embrace the trend we’re already seeing, for big pharma to outsource more and more of its functions, lowering the barriers to entry and leaving room for “a lean, agile organisation able to capture, consider and rapidly develop the best scientific ideas in a wide range of disease areas and aggressively guide these towards the clinic.”

But how are things for the small and agile companies that are going to be driving innovation in this new environment? Not great, says Jonathan Guthrie in today’s FT, but nonetheless “There is hope yet for science park toilers”. The article considers, from a UK perspective, the problems small technology companies are having raising money from venture capitalists. It starts from the position that the problem isn’t shortage of money but shortage of good ideas; perhaps not the end of the age of innovation, but a temporary lull after the excitement of personal computers, the internet and mobile phones. And, for the part of the problem that lies with venture capitalists, misreading this cycle has contributed to their difficulties. In the wake of the technology bubble, venture capital returns aren’t a good advertisement for would-be investors at the moment – “funds set up after 1996 have typically lost 1.4 per cent a year over five years and 1.8 per cent over 10 years, says the British Private Equity and Venture Capital Association.” All is not lost, Guthrie thinks – as the memory of the dotbomb debacles fades, the spectacular returns enjoyed by the most successful technology start-ups will attract money back into the sector. Where will the advances take place? Not in nanotechnology, at least in the form of the nanomaterials sector as it has been understood up to now: “materials scientists have engineered a UK nanotechnology sector so tiny it is virtually invisible.” Instead Guthrie points to renewable energy and power saving systems.

Nanotubes for flexible electronics

Friday, July 25th, 2008

The glamorous applications for carbon nanotubes in electronics focus on the use of individual nanotubes for nanoscale electronics – for example, this single nanotube integrated circuit reported by IBM a couple of years ago. But more immediate applications may come from using thin layers of nanotubes on flexible substrates as conductors or semiconductors – these could be used for thin film transistor arrays in applications like electronic paper. A couple of recent papers report progress in this direction.

From the group of John Rogers, at the University of Illinois, comes a Nature paper reporting integrated circuits on flexible substrates based on nanotubes. The paper (Editor’s summary in Nature, subscription required for full article), whose first author is Qing Cao, describes the manufacture of an array of 100 transistors on a 50 µm plastic substrate. The transistors aren’t that small – their dimensions are in the micron range – so this is the sort of electronics that would be used to drive a display rather than as a CPU or memory. But the performance of the transistors looks like it could be competitive with rival technologies for flexible displays, such as semiconducting polymers.

The difficulty with using carbon nanotubes for electronics this way is that the usual syntheses produce a mixture of different types of nanotubes, some conducting and some semiconducting. Since about a third of the nanotubes have metallic conductivity, a simple mat of nanotubes won’t behave like a semiconductor, because the metallic nanotubes will provide a short-circuit. Rogers’s group get round this problem in an effective, if not terribly elegant, way. They cut the film with grooves, and for an appropriate combination of groove width and nanotube length they reduce the probability of finding a continuous metallic path between the electrodes to a very low level.
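The geometry argument here is essentially a stick-percolation problem, and it can be made concrete with a toy Monte Carlo model. This is my own illustrative sketch, not the actual geometry or parameters used by Rogers’s group: scatter stick-like “tubes” at random between two electrodes, mark roughly a third of them metallic, and estimate how often the metallic sticks alone form a connected path bridging the gap. All function names and parameter values are invented for illustration.

```python
import random, math
from collections import deque

def segments_intersect(p1, p2, p3, p4):
    """True if open segments p1-p2 and p3-p4 cross (collinear touches ignored)."""
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    d1, d2 = cross(p3, p4, p1), cross(p3, p4, p2)
    d3, d4 = cross(p1, p2, p3), cross(p1, p2, p4)
    return (d1*d2 < 0) and (d3*d4 < 0)

def metallic_short_probability(gap, tube_len, density, trials=100, metallic_frac=1/3):
    """Estimate the probability that metallic tubes alone bridge the electrode gap.

    gap       -- electrode spacing (also the height of the simulated region)
    tube_len  -- length of each nanotube, same units as gap
    density   -- total tubes per unit area; about metallic_frac of them are metallic
    """
    shorts = 0
    for _ in range(trials):
        tubes = []
        for _ in range(int(density * gap * gap)):
            if random.random() >= metallic_frac:
                continue  # only metallic tubes can cause a short; skip the rest
            x, y = random.uniform(0, gap), random.uniform(0, gap)
            th = random.uniform(0, math.pi)
            dx, dy = 0.5*tube_len*math.cos(th), 0.5*tube_len*math.sin(th)
            tubes.append(((x-dx, y-dy), (x+dx, y+dy)))
        # Tubes touching the left (x<=0) or right (x>=gap) electrode:
        left = [i for i, (a, b) in enumerate(tubes) if min(a[0], b[0]) <= 0]
        right = {i for i, (a, b) in enumerate(tubes) if max(a[0], b[0]) >= gap}
        adj = {i: [] for i in range(len(tubes))}
        for i in range(len(tubes)):
            for j in range(i+1, len(tubes)):
                if segments_intersect(*tubes[i], *tubes[j]):
                    adj[i].append(j); adj[j].append(i)
        # Breadth-first search from the left electrode to the right one.
        seen, queue = set(left), deque(left)
        while queue:
            i = queue.popleft()
            if i in right:
                shorts += 1
                break
            for j in adj[i]:
                if j not in seen:
                    seen.add(j); queue.append(j)
    return shorts / trials
```

Increasing the gap relative to the tube length, or thinning the film, drives the shorting probability down sharply, which is exactly the knob the grooves give: they cap the distance over which a metallic chain would have to remain connected.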

Another paper, published earlier this month in Science, offers what is potentially a much neater solution to this problem. The paper, “Self-Sorted, Aligned Nanotube Networks for Thin-Film Transistors” (abstract, subscription required for full article), has as its first author Melburne LeMieux, a postdoc in the group of Zhenan Bao at Stanford. They make their nanotube networks by spin-coating from solution. Spin-coating is a simple and very widely used technique for making thin films, which involves depositing a solution on a substrate spinning at a few thousand revolutions per minute. Most of the solution is flung off by the spinning disk, leaving a very thin uniform film, from which the solvent evaporates to leave the network of nanotubes. This simple procedure produces two very useful side-effects. Firstly, the flow in the solvent film has the effect of aligning the nanotubes, with obvious potential benefits for their electronic properties. Even more strikingly, the spin-coating process seems to provide an easy solution to the problem of sorting the metallic and semiconducting nanotubes. It seems that one can prepare the surface so that it is selectively sticky for one or other type of nanotube; a surface presenting a monolayer of phenyl groups preferentially attracts the metallic nanotubes, while an amine coated surface yields nanotube networks with very good semiconducting behaviour, from which high performance transistors can be made.

“Plastics are precious – they’re buried sunshine”

Friday, July 18th, 2008

Disappearing dress at the London College of Fashion
A disappearing dress from the Wonderland project. Photo by Alex McGuire at the London College of Fashion.

I’m fascinated by the subtle science of polymers, and it’s a cause of regret to me that the most common manifestations of synthetic polymers are in the world of cheap, disposable plastics. The cheapness and ubiquity of plastics, and the problems caused when they’re carelessly thrown away, blind us to the utility and versatility of these marvellously mutable materials. But there’s something temporary about their cheapness; it’s a consequence of the fact that they’re made from oil, and as oil becomes scarcer and more expensive we’ll need to appreciate the intrinsic value of these materials much more.

These thoughts are highlighted by a remarkable project put together by the artist and fashion designer Helen Storey and my Sheffield friend and colleague, chemist Tony Ryan. At the centre of the project is an exhibition of exquisitely beautiful dresses, designed by Helen and made from fabrics handmade by textile designer Trish Belford. The essence of fashion is transience, and these dresses literally don’t last long; the textiles they are made from are water soluble and are dissolved during the exhibition in tanks of water. The process of dissolution has a beauty of its own, captured in this film by Pinny Grylls.

Another film, by the fashion photographer Nick Wright, reminds us of the basic principles underlying the thermodynamics of polymer dissolution. The exhibition will be moving to the Ormeau Baths Gallery in Belfast in October, and you will be able to read more about it in that month’s edition of Vogue.

The biofuels bust

Friday, July 11th, 2008

The news that the UK is to slow the adoption of biofuels, and that the European Parliament has called for a reduction in the EU’s targets for biofuel adoption, is a good point to mark one of the most rapid turnarounds we’ve seen in science policy. Only two years ago, biofuels were seen by many as a benign way for developed countries to increase their energy security and reduce their greenhouse gas emissions without threatening their citizens’ driving habits. Now, we’re seeing the biofuel boom being blamed for soaring food prices, and the environmental benefits are increasingly in doubt. It’s rare to see the rationale for a proposed technological fix for a major societal problem fall apart quite so quickly, and there must surely be some lessons here for other areas of science and policy.

The UK’s volte-face was prompted by a government commissioned report led by the environmental scientist Ed Gallagher. The Gallagher Review is quite an impressive document, given the rapidity with which it has been put together. This issue is in many ways typical of a problem we’re increasingly seeing, in which difficult and uncertain science comes together with equally uncertain economics through the unpredictability of human and institutional responses in a rapidly changing environment.

The first issue is whether, looking at the whole process of growing crops for biofuels, including the energy inputs for agriculture and for the conversion process, one actually ends up with a lower output of greenhouse gases than one would using petrol or diesel. Even this most basic question is more difficult than it might seem, as illustrated by the way the report firmly but politely disagrees with a Nobel Laureate in atmospheric chemistry, Paul Crutzen, who last year argued that, if emissions of nitrogen oxides during agriculture were properly accounted for, biofuels actually produce more greenhouse gases than the fossil fuels they replace. Nonetheless, the report finds a wide range of achievable greenhouse gas savings; corn bioethanol, for example, at its best produces a saving of about 35%, but at its worst it actually produces a net increase in greenhouse gases of nearly 30%. Other types of biofuel are better; both Brazilian ethanol from sugar cane and biodiesel from palm oil can achieve savings of between 60% and 70%. But, and this is a big but, these figures assume these crops are grown on existing agricultural land. If new land needs to be taken into cultivation, there’s typically a large release of carbon. Taking into account the carbon cost of changing land use means that there’s a considerable pay-back time before any greenhouse gas savings arise at all. In the worst cases, this can amount to hundreds of years.
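The arithmetic behind those payback times is simple enough to sketch. A minimal calculation, where all the numbers in the example are hypothetical placeholders rather than figures from the Gallagher Review: a one-off carbon release from converting the land is paid back by the annual saving the biofuel makes over the fossil fuel it displaces.

```python
def payback_years(land_use_carbon, annual_fossil_emissions, saving_fraction):
    """Years before cumulative greenhouse gas savings offset the land-use release.

    land_use_carbon         -- one-off CO2 release from converting the land (t/ha)
    annual_fossil_emissions -- CO2 from the fossil fuel displaced each year (t/ha/yr)
    saving_fraction         -- net lifecycle saving of the biofuel, e.g. 0.35 for 35%
    """
    annual_saving = annual_fossil_emissions * saving_fraction
    if annual_saving <= 0:
        # A negative "saving" (as in the worst corn ethanol case) never pays back.
        return float("inf")
    return land_use_carbon / annual_saving

# Illustrative only: land releasing 300 t CO2/ha, displacing 2 t CO2/ha/yr
# of fossil emissions at a 35% lifecycle saving:
# payback_years(300, 2.0, 0.35) -> about 429 years before any net saving.
```

The point the report makes drops straight out of the division: even a respectable percentage saving is swamped for centuries if the conversion releases a large carbon stock, and a biofuel whose lifecycle saving is negative never pays back at all.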

This raises the linked questions – how much land is available for growing biofuels, and how much can we expect that the competition from biofuel uses of food crops will lead to further increases in food prices? There seems to be a huge amount of uncertainty surrounding these issues. Certainly the situation will be eased if new technologies arise for the production of cellulosic ethanol, but these aren’t necessarily a panacea, particularly if they involve changes in land-use. The degree to which recent food price increases can be directly attributed to the growth in biofuels is controversial, but no-one can doubt that, in a world with historically low stocks of staple foodstuffs, any increase in demand will result in higher prices than would otherwise have occurred. The price of food is already indirectly coupled to the price of oil because modern intensive agriculture demands high energy inputs, but the extensive use of biofuels makes that coupling direct.

It’s easy to be wise in hindsight, but one might wonder how much of this could have been predicted. I wrote about biofuels here two years ago, and re-reading that entry – Driving on sunshine – it seems that some of the drawbacks were easier to anticipate than others. What’s sobering about the whole episode, though, is that it does show how complicated things can get when science, politics and economics get closely coupled in situations needing urgent action in the face of major uncertainties.