Archive for the ‘Nanobusiness’ Category

Why isn’t the UK the centre of the organic electronics industry?

Monday, November 12th, 2012

In February 1989, Jeremy Burroughes, at that time a postdoc in the research group of Richard Friend and Donal Bradley at Cambridge, noticed that a diode structure he’d made from the semiconducting polymer PPV glowed when a current was passed through it. This wasn’t the first time that interesting optoelectronic properties had been observed in an organic semiconductor, but it’s fair to say that it was the resulting Nature paper, which has now been cited more than 8000 times, that really launched the field of organic electronics. The company that they founded to exploit this discovery, Cambridge Display Technology, was floated on the NASDAQ in 2004 at a valuation of $230 million. Now organic electronics is becoming mainstream; a popular mobile phone, the Samsung Galaxy S, has an organic light emitting diode screen, and further mass market products are expected in the next few years. But these products will be made in factories in Japan, Korea and Taiwan; Cambridge Display Technology is now a wholly owned subsidiary of the Japanese chemical company Sumitomo. How is it that, despite an apparently insurmountable academic lead in the field and a successful history of university spin-outs, the UK is likely to end up at best a peripheral player in this new industry?

A billion dollar nanotech spinout?

Monday, March 19th, 2012

The Oxford University spin-out Oxford Nanopore Technologies created a stir last month by announcing that it would be bringing to market this year systems to read out the sequence of individual DNA molecules by threading them through nanopores. It’s claimed that this will allow a complete human genome to be sequenced in about 15 minutes for a few thousand dollars; the company is also introducing a cheap, disposable sequencer which will sell for less than $900. Speculation has now begun about the future of the company, with valuations of $1–2 billion being discussed if they decide to take the company public in the next 18 months.

It’s taken a while for this idea of sequencing a single DNA molecule by directly reading out its bases to come to fruition. The original idea came from David Deamer and Harvard’s Dan Branton in the mid-1990s; from Hagan Bayley, in Oxford, came the idea of using an engineered derivative of a natural pore-forming protein to form the hole through which the DNA is threaded. I’ve previously reported progress towards this goal here, in 2005, and in more detail here, in 2007. The Oxford Nanopore announcement gives us some clues as to the key developments since then. The working system uses a polymer membrane, rather than a lipid bilayer, to carry the pore array, which undoubtedly makes the system much more robust. The pore is still created from a pore-forming protein, though this has been genetically engineered to give greater discrimination between different combinations of bases as the DNA is threaded through the hole. And, perhaps most importantly, an enzyme is used to grab DNA molecules from solution and feed them through the pore. In practice, the system will be sold as a set of modular units containing the electronics and interface, together with consumables cartridges, presumably including the nanopore arrays and the enzymes. The idea is to take single molecule analysis beyond DNA to include RNA and proteins, as well as various small molecules, with a different cartridge being available for each type of experiment. This will depend on the success of their program to develop a whole family of different pores able to discriminate between different types of molecules.
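To make the decoding step concrete, here is a minimal Python sketch of the signal-processing idea: as the enzyme ratchets the DNA through the pore, each short run of bases modulates the ionic current, and base-calling means mapping current levels back to sequence. Everything here is invented for illustration – the current values, the 3-base read length and the noise level are all hypothetical, and real devices face noisy, overlapping k-mer signals that need far more sophisticated statistical decoding.

```python
# Idealized sketch of nanopore base-calling: each k-mer in the pore modulates
# the ionic current, and decoding maps current readings back to bases.
# All numbers are invented for illustration only.

import itertools
import random

K = 3  # suppose the pore "reads" 3 bases at a time (a hypothetical choice)
BASES = "ACGT"

# Hypothetical lookup table: assign each 3-mer a distinct mean current (pA).
KMER_CURRENT = {"".join(kmer): 50.0 + 0.5 * i
                for i, kmer in enumerate(itertools.product(BASES, repeat=K))}

def simulate_trace(sequence, noise=0.1):
    """Simulate one noisy current reading per k-mer as DNA ratchets through."""
    return [random.gauss(KMER_CURRENT[sequence[i:i + K]], noise)
            for i in range(len(sequence) - K + 1)]

def call_bases(trace):
    """Match each reading to the nearest k-mer level, then stitch the
    overlapping k-mers back into a single sequence."""
    kmers = [min(KMER_CURRENT, key=lambda k: abs(KMER_CURRENT[k] - r))
             for r in trace]
    return kmers[0] + "".join(k[-1] for k in kmers[1:])

random.seed(1)
true_seq = "GATTACAGATTACA"
decoded = call_bases(simulate_trace(true_seq))
print(decoded, decoded == true_seq)
```

In this toy version the current levels are well separated relative to the noise, so nearest-level matching suffices; the hard part of the real problem is that neighbouring k-mers share bases and their signals overlap, which is why production base-callers use hidden Markov models or neural networks rather than a lookup table.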

What will the impact of this development be, if everything works as well as is being suggested? (The prudent commentator should stress the if here, as we haven’t yet seen any independent trials of the technology). Much has already been written about the implications of cheap – less than $1000 – sequencing of the human genome, but I can’t help wondering whether this may not actually be the big story here. And in any case, that goal may end up being reached with or without Oxford Nanopore, as this recent Nature News article makes clear. We still don’t know whether the Oxford Nanopore technique will be competitive on accuracy and price with the other contending approaches. I wonder, though, whether we are seeing here something from the classic playbook for a disruptive innovation. The $900 device in particular looks like it’s intended to create new markets for cheap, quick and dirty sequencing, to provide an income stream while the technology is improved further – with better, more selective pores and better membranes (inevitably, perhaps, Branton’s group at Harvard reported using graphene membranes for threading DNA in Nature last year). As computers continue to get faster, cheaper and more powerful, the technology will automatically benefit from these advances too – fragmentary and perhaps imperfect sequence information has much greater value in the context of vast existing sequence libraries and the data processing power to use them. Perhaps applications for this will be found in forensic and environmental science, diagnostics, microbiology and synthetic biology. The emphasis on molecules other than DNA is interesting too; single molecule identification and sequencing of RNA opens up the possibility of rapidly identifying what genes are being transcribed in a cell at a given moment (the so-called “transcriptome”).

The impact on the investment markets for nanotechnology is likely to be substantial. Existing commercialisation efforts around nanotechnology have been disappointing so far, but a company success on the scale now being talked about would undoubtedly attract more money into the area – perhaps it might also persuade some of the companies currently sitting on huge piles of cash that they might usefully invest some of this in a little more research and development. What’s significant about Oxford Nanopore is that it is operating in a sweet spot between the mundane and the far-fetched. It’s not a nanomaterials company, essentially competing in relatively low margin speciality chemicals, nor is it trying to make a nanofactory or nanoscale submarine or one of the other more radical visions of the nanofuturists. Instead, it’s using the lessons of biology – and indeed some of the components of molecular biology – to create a functional device that operates on the true single molecule level to fill real market needs. It also seems to be displaying a commendable determination to capture all the value of its inventions, rather than licensing its IP to other, bigger companies.

Finally, not the least of the impacts of a commercial and technological success on the scale being talked about would be on nanotechnology itself as a discipline. In the last few years the field’s early excitement has been diluted by a sense of unfulfilled promise, especially, perhaps, in the UK; last year I asked “Why has the UK given up on nanotechnology?” Perhaps it will turn out that some of that disillusionment was premature.

Good capitalism, bad capitalism and turning science into economic benefit

Wednesday, October 5th, 2011

Why isn’t the UK more successful at converting its excellent science into wealth creating businesses? This is a perennial question – and one that’s driven all sorts of initiatives to get universities to handle their intellectual property better, to develop closer partnerships with the private sector and to create more spinout companies. Perhaps UK universities shied away from such activities thirty years ago, but that’s not the case now. In my own university, Sheffield, we have some very successful and high profile activities in partnership with companies, such as our Advanced Manufacturing Research Centre with Boeing, shortly to be expanded as part of an Advanced Manufacturing Institute with heavy involvement from Rolls Royce and other companies. Like many universities, we have some interesting spinouts of our own. And yet, while the UK produces many small high tech companies, we just don’t seem to be able to grow those companies to a scale where they’d make a serious difference to jobs and economic growth. To take just one example, the Royal Society’s Scientific Century report highlighted Plastic Logic, a company based on great research from Richard Friend and Henning Sirringhaus at Cambridge University, making flexible displays for applications like e-book readers. It’s a great success story for Cambridge, but the picture for the UK economy is less positive. The company’s head office is in California, its first factory was in Leipzig and its major manufacturing facility will be in Russia – the latter not unrelated to the $150 million that the Russian agency Rusnano invested in the company earlier this year.

This seems to reflect a general problem – why aren’t UK based investors more willing to put money into small technology based companies to allow them to grow? Again, this is something people have talked about for a long time, and there’ve been a number of more or less (usually less) successful government interventions to address the issue. The latest of these was announced by the Chancellor of the Exchequer, George Osborne, in his Conservative party conference speech: “credit easing”, to “help solve that age old problem in Britain: not enough long term investment in small business and enterprise.”

But it’s not as if there isn’t any money in the UK to be invested – so the question to ask isn’t why money isn’t invested in high tech businesses, it is why money is invested in other places instead. The answer must be simple – because those other opportunities offer higher returns, at lower risk, on shorter timescales. The problem is that many of these opportunities don’t support productive entrepreneurship, which brings new products and services to people who need them and generates new jobs. Instead, to use a distinction introduced by economist William Baumol (see, for example, his article Entrepreneurship: Productive, Unproductive, and Destructive, PDF), they support unproductive entrepreneurship, which exploits suboptimal reward structures in an economy to make profits without generating real value. Examples of this kind of activity might include restructuring companies to maximise tax evasion, speculating in financial and property markets when the downside risk is shouldered by the government, exploiting privatisations and public/private partnerships that have been structured to the disadvantage of the tax-payer, and generating capital gains which result from changes in planning and tax law.

Most criticism of this kind of bad capitalism focuses on issues of fairness and equity, and on the damage to the democratic process done by the associated lobbying and influence-peddling. But it causes deeper problems than this – money and effort used to support unproductive entrepreneurship is unavailable to support genuine innovation, to create new products and services that people and society want and need. In short, bad capitalism crowds out good capitalism, and innovation suffers.

Food nanotechnology – their Lordships deliberate

Tuesday, June 30th, 2009

Today I found myself once again in Westminster, giving evidence to a House of Lords Select Committee, which is currently carrying out an inquiry into the use of nanotechnology in food. Readers not familiar with the intricacies of the British constitution need to know that the House of Lords is one of the branches of Parliament, the UK legislature, with powers to revise and scrutinise legislation, and through its select committees, hold the executive to account. Originally its membership was drawn from the hereditary peerage, with a few bishops thrown in; recently as part of a slightly ramshackle program of constitutional reform the influence of the hereditaries has been much reduced, with the majority of the chamber being made up of members appointed for life by the government. These are drawn from former politicians and others prominent in public life. Whatever the shortcomings of this system from the democratic point of view, it does mean that the membership includes some very well informed people. This inquiry, for example, is being chaired by Lord Krebs, a very distinguished scientist who previously chaired the Food Standards Agency.

All the evidence submitted to the committee is publicly available on their website; this includes submissions from NGOs, Industry Organisations, scientific organisations and individual scientists. There’s a lot of material there, but together it’s actually a pretty good overview of all sides of the debate. I’m looking forward to seeing their Lordships’ final report.

Deja vu all over again?

Wednesday, November 12th, 2008

Today the UK’s Royal Commission on Environmental Pollution released a new report on the potential risks of new nanomaterials and the implications of this for regulation and the governance of innovation. The report – Novel Materials in the Environment: The case of nanotechnology – is well-written and thoughtful, and will undoubtedly have considerable impact. Nonetheless, four years after the Royal Society report on nanotechnology, and nearly two years after the Council for Science and Technology’s critical verdict on the government’s response to that report, some of the messages are depressingly familiar. There are real uncertainties about the potential impact of nanoparticles on human health and the environment; to reduce these uncertainties some targeted research is required; this research isn’t going to appear by itself and some co-ordinated programs are needed. So what’s new this time around?

Andrew Maynard picks out some key messages. The Commission is very insistent on the need to move beyond considering nanomaterials as a single class; attempts to regulate solely on the basis of size are misguided and instead one needs to ask what the materials do and how they behave. In terms of the regulatory framework, the Commission was surprisingly (to some observers, I suspect) sanguine about the suitability and adaptability of the EU’s regulatory framework for chemicals, REACH, which, it believes, can readily be modified to meet the special challenges of nanomaterials, as long as the research needed to fill the knowledge gaps gets done.

Where the report does depart from some previous reports is in a rather subtle and wide-ranging discussion of the conceptual basis of regulation for fast-moving new technologies. It identifies three contrasting positions, none of which it finds satisfactory. The “pro-innovation” position calls for regulators to step back and let the technology develop unhindered, pausing only when positive evidence of harm emerges. “Risk-based” approaches allow for controls to be imposed, but only when clear scientific grounds for concern can be stated, and with a balance between the cost of regulating and the probability and severity of the danger. The “precautionary” approach puts the burden of proof on the promoters of new technology to show that it is, beyond any reasonable doubt, safe, before it is permitted. The long history of unanticipated consequences of new technology warns us against the first stance, while the second position assumes that the state of knowledge is sufficient to do these risk/benefit analyses with confidence, which isn’t likely to be the case for most fast moving new technologies. But the precautionary approach falls down, too, if, as the Commission accepts, the new technologies have the potential to yield significant benefits that would be lost if they were to be rejected on the grounds of inevitably incomplete information. To resolve this dilemma, the Commission calls for an adaptive system of regulation that seeks, above all, to avoid technological inflexibility. The key, in their view, is to innovate in a way that doesn’t lead society down paths from which it is difficult to reverse, if new information should arise about unanticipated threats to health or the environment.

The report has generated a substantial degree of interest in the press, and, needless to say, the coverage doesn’t generally reflect these subtle discussions. At one end, the coverage is relatively sober, for example Action urged over nanomaterials, from the BBC, and Tight regulation urged on nanotechnology, from the Financial Times. In the Daily Mail, on the other hand, we have Tiny but toxic: Nanoparticles with asbestos-like properties found in everyday goods. Notwithstanding Tim Harper’s suggestion that some will welcome this sort of coverage if it injects some urgency into the government’s response, this is not a good place for nanotechnology to be finding itself.

Nanocosmetics in the news

Thursday, November 6th, 2008

Uncertainties surrounding the use of nanoparticles in cosmetics made the news in the UK yesterday; this followed a press release from the consumer group Which? – Beauty must face up to nano. This is related to a forthcoming report in their magazine, in which a variety of cosmetic companies were asked about their use of nanotechnologies (I was one of the experts consulted for commentary on the results of these inquiries).

The two issues that concern Which? are some continuing uncertainties about nanoparticle safety and the fact that it hasn’t generally been made clear to consumers that nanoparticles are being used. Their head of policy, Sue Davies, emphasizes that their position isn’t blanket opposition: “We’re not saying the use of nanotechnology in cosmetics is a bad thing, far from it. Many of its applications could lead to exciting and revolutionary developments in a wide range of products, but until all the necessary safety tests are carried out, the simple fact is we just don’t know enough.” Of 67 companies approached for information about their use of nanotechnologies, only 8 replied with useful information, prompting Sue to comment: “It was concerning that so few companies came forward to be involved in our report and we are grateful for those that were responsible enough to do so. The cosmetics industry needs to stop burying its head in the sand and come clean about how it is using nanotechnology.”

On the other hand, the companies that did supply information include many of the biggest names – L’Oreal, Unilever, Nivea, Avon, Boots, Body Shop, Korres and Green People – all of whom use nanoparticulate titanium dioxide (and, in some cases, nanoparticulate zinc oxide). This makes clear just how widespread the use of these materials is (and goes some way to explaining where the estimated 130 tonnes of nanoscale titanium dioxide being consumed annually in the UK is going).

The story is surprisingly widely covered by the media (considering that yesterday was not exactly a slow news day). Many outlets focus on the lack of consumer information, including the BBC, which reports that “consumers cannot tell which products use nanomaterials as many fail to mention it”, and the Guardian, which highlights the poor response rate. The story is also covered in the Daily Telegraph, while the Daily Mail, predictably, takes a less nuanced view. Under the headline The beauty creams with nanoparticles that could poison your body, the Mail explains that “the size of the particles may allow them to permeate protective barriers in the body, such as those surrounding the brain or a developing baby in the womb.”

What are the issues here? There is, if I can put it this way, a cosmetic problem, in that there are some products on the market making claims that seem at best unwise – I’m thinking here of the claimed use of fullerenes as antioxidants in face creams. It may well be that these ingredients are present in such small quantities that there is no possibility of danger, but given the uncertainties surrounding fullerene toxicology, putting products like this on the market doesn’t seem very smart, and is likely to cause reputational damage to the whole industry. There is a lot more data about nanoscale titanium dioxide, and the evidence that these particular nanoparticles aren’t able to penetrate healthy skin looks reasonably convincing. They deliver an unquestionable consumer benefit, in terms of screening out harmful UV rays, and the alternatives – organic small molecule sunscreens – are far from being above suspicion. But, as pointed out by the EU’s Scientific Committee on Consumer Products, there does remain uncertainty about the effect of titanium dioxide nanoparticles on damaged and sun-burned skin. Another issue, recently highlighted by Andrew Maynard, is the degree to which the action of light on TiO2 nanoparticles causes reactive and potentially damaging free radicals to be generated. This photocatalytic activity can be suppressed by the choice of crystalline structure (the rutile form of titanium dioxide should be used, rather than anatase), by the introduction of dopants, and by coating the surface of the nanoparticles. The research cited by Maynard makes it clear that not all sunscreens use grades of titanium dioxide that completely suppress photocatalytic activity.

This poses a problem. Consumers don’t at present have ready access to information as to whether nanoscale titanium dioxide is used at all, let alone whether the nanoparticles in question are in the rutile or anatase form. Here, surely, is a case where, if the companies following best practice provided more information, they might avoid having their reputation damaged by less careful operators.

What’s meant by “food nanotechnology”?

Friday, October 10th, 2008

A couple of weeks ago I took part in a dialogue meeting in Brussels organised by the CIAA, the Confederation of the Food and Drink Industries of the EU, about nanotechnology in food. The meeting involved representatives from big food companies, from the European Commission and agencies like the European Food Safety Authority, together with consumer groups like BEUC, and the campaigning group Friends of the Earth Europe. The latter group recently released a report on food nanotechnology – Out of the laboratory and on to our plates: Nanotechnology in food and agriculture; according to the press release, this “reveals that despite concerns about the toxicity risks of nanomaterials, consumers are unknowingly ingesting them because regulators are struggling to keep pace with their rapidly expanding use.” The position of the CIAA is essentially that nanotechnology is an interesting technology currently in research rather than having yet made it into products. One can get a good idea of the research agenda of the European food industry from the European Technology Platform Food for Life. As the only academic present, I tried in my contribution to clarify a little the different things people mean by “food nanotechnology”. Here, more or less, is what I said.

What makes the subject of nanotechnology particularly confusing and contentious is the ambiguity of the definition of nanotechnology when applied to food systems. Most people’s definitions are something along the lines of “the purposeful creation of structures with length scales of 100 nm or less to achieve new effects by virtue of those length-scales”. But when one attempts to apply this definition in practice one runs into difficulties, particularly for food. It’s this ambiguity that lies behind the difference of opinion we’ve heard about already today about how widespread the use of nanotechnology in foods is already. On the one hand, Friends of the Earth says they know of 104 nanofood products on the market already (and some analysts suggest the number may be more than 600). On the other hand, the CIAA (the Confederation of Food and Drink Industries of the EU) maintains that, while active research in the area is going on, no actual nanofood products are yet on the market. In fact, both parties are, in their different ways, right; the problem is the ambiguity of definition.

The issue is that food is naturally nano-structured, so that too wide a definition ends up encompassing much of modern food science, and indeed, if you stretch it further, some aspects of traditional food processing. Consider the case of “nano-ice cream”: the FoE report states that “Nestlé and Unilever are reported to be developing a nano-emulsion based ice cream with a lower fat content that retains a fatty texture and flavour”. Without knowing the details of this research, what one can be sure of is that it will involve essentially conventional food processing technology in order to control fat globule structure and size on the nanoscale. If the processing technology is conventional (and the economics of the food industry dictates that it must be), what makes this nanotechnology, if anything does, is the fact that analytical tools are available to observe the nanoscale structural changes that lead to the desirable properties. What makes this nanotechnology, then, is simply knowledge. In the light of the new knowledge that new techniques give us, we could even argue that some traditional processes, which it now turns out involve manipulation of the structure on the nanoscale to achieve some desirable effects, would constitute nanotechnology if it were defined this widely. For example, traditional whey cheeses like ricotta are made by creating the conditions for the whey proteins to aggregate into protein nanoparticles. These subsequently aggregate to form the particulate gels that give the cheese its desirable texture.

It should be clear, then, that there isn’t a single thing one can call “nanotechnology” – there are many different technologies, producing many different kinds of nano-materials. These different types of nanomaterials have quite different risk profiles. Consider cadmium selenide quantum dots, titanium dioxide nanoparticles, sheets of exfoliated clay, fullerenes like C60, casein micelles, phospholipid nanosomes – the risks and uncertainties of each of these examples of nanomaterials are quite different and it’s likely to be very misleading to generalise from any one of these to a wider class of nanomaterials.

To begin to make sense of the different types of nanomaterial that might be present in food, there is one very useful distinction. This is between engineered nanoparticles and self-assembled nanostructures. Engineered nanoparticles are covalently bonded, and thus are persistent and generally rather robust, though they may have important surface properties, such as catalytic activity, and they may be prone to aggregate. Examples of engineered nanoparticles include titanium dioxide nanoparticles and fullerenes.

In self-assembled nanostructures, though, molecules are held together by weak forces, such as hydrogen bonds and the hydrophobic interaction. The weakness of these forces renders them mutable and transient; examples include soap micelles, protein aggregates (for example the casein micelles formed in milk), liposomes and nanosomes and the microcapsules and nanocapsules made from biopolymers such as starch.

So what kind of food nanotechnology can we expect? Here are some potentially important areas:

• Food science at the nanoscale. This is about using a combination of fairly conventional food processing techniques supported by the use of nanoscale analytical techniques to achieve desirable properties. A major driver here will be the use of sophisticated food structuring to achieve palatable products with low fat contents.
• Encapsulating ingredients and additives. The encapsulation of flavours and aromas at the microscale to protect delicate molecules and enable their triggered or otherwise controlled release is already widespread, and it is possible that decreasing the lengthscale of these systems to the nanoscale might be advantageous in some cases. We are also likely to see a range of “nutriceutical” molecules come into more general use.
• Water dispersible preparations of fat-soluble ingredients. Many food ingredients are fat-soluble; as a way of incorporating these in food and drink without fat, manufacturers have developed stable colloidal dispersions of these materials in water, with particle sizes in the range of hundreds of nanometres. For example, lycopene, which is familiar as the molecule that makes tomatoes red and which is believed to offer substantial health benefits, is marketed in this form by the German company BASF.

What is important in this discussion is clarity – definitions are important. We’ve seen discrepancies between estimates of how widespread food nanotechnology is in the marketplace now, and these discrepancies lead to unnecessary misunderstanding and distrust. Clarity about what we are talking about, and a recognition of the diversity of technologies we are talking about, can help remove this misunderstanding and give us a sound basis for the sort of dialogue we’re participating in today.

From micro to nano for medical applications

Wednesday, October 1st, 2008

I spent yesterday at a meeting at the Institute of Mechanical Engineers, Nanotechnology in Medicine and Biotechnology, which raised the question of what is the right size for new interventions in medicine. There’s an argument that, since the basic operations of cell biology take place on the nano-scale, that’s fundamentally the right scale for intervening in biology. On the other hand, given that many current medical interventions are very macroscopic, operating on the micro-scale may already offer compelling advantages.

A talk from Glasgow University’s Jon Cooper gave some nice examples illustrating this. His title was Integrating nanosensors with lab-on-a-chip for biological sensing in health technologies, and he began with some true nanotechnology. This involved a combination of fluid handling systems for very small volumes with nanostructured surfaces, with the aim of detecting single biomolecules. This depends on a remarkable effect known as surface enhanced Raman scattering. Raman scattering is a type of spectroscopy that can detect chemical groups, normally with rather low sensitivity. But if the metal surface being illuminated has very sharp asperities, the light field very close to the surface is hugely magnified, increasing sensitivity by a factor of ten million or so. Systems based on this effect, using silver nanoparticles coated so that pathogens like anthrax will stick to them, are already in commercial use. But Cooper’s group uses, not free nanoparticles, but very precisely structured nanosurfaces. Using electron beam lithography, his group creates silver split-ring resonators – horseshoe shapes about 160 nm across. With a very small gap, one can get enhancements of a factor of one hundred billion, and it’s this that brings single molecule detection into prospect.
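A quick back-of-the-envelope check shows how those two numbers relate, on the common approximation that the Raman signal in surface enhanced Raman scattering scales as roughly the fourth power of the local field enhancement (both the incident light and the scattered light are boosted). Assuming the factors quoted above refer to enhancement of the signal rather than of the field itself – an assumption on my part – the implied field enhancements come out as quite modest numbers:

```python
# Back-of-the-envelope SERS arithmetic. Under the usual approximation the
# signal gain G scales as |E/E0|**4, where |E/E0| is the local field
# enhancement, since both excitation and emission are amplified.

def field_ratio_for_signal_gain(signal_gain):
    """Local field enhancement |E/E0| implied by a given signal gain G,
    under the approximate fourth-power scaling G ~ |E/E0|**4."""
    return signal_gain ** 0.25

# Colloid figure (~1e7) vs split-ring-resonator figure (~1e11), as quoted above.
for gain in (1e7, 1e11):
    ratio = field_ratio_for_signal_gain(gain)
    print(f"signal gain {gain:.0e} -> field enhancement ~{ratio:.0f}x")
```

Run this and the ten-million-fold signal gain corresponds to a local field only about 56 times the incident one, while the hundred-billion-fold gain needs a field ratio of about 560 – which is why concentrating the field into a tiny gap makes such a dramatic difference.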

On a larger scale, Cooper described systems to probe the response of single cells – his example involved using a single heart cell (a cardiomyocyte) to screen responses to potential heart drugs. This involved a picolitre-scale microchamber adjacent to an array of micron-sized thermocouples, which allow one to monitor the metabolism of the cell as it responds to a drug candidate. His final example was on the millimetre scale, though its sensors incorporated nanotechnology at some level. This was a wireless device incorporating an electrochemical blood sensor – the idea was that one would swallow this to screen for early signs of bowel cancer. Here’s an example where, obviously, smaller would be better, but how small does one need to go?

Nanoparticles down the drain

Wednesday, September 17th, 2008

With significant amounts of nanomaterials now entering markets, it’s clearly worth worrying about what’s going to happen to these materials after disposal – is there any danger of them entering the environment and causing damage to ecosystems? These are the concerns of the discipline of nano-ecotoxicology; on the evidence of the conference I was at yesterday, on the Environmental effects of nanoparticles, at Birmingham, this is an expanding field.

From the range of talks and posters, there seems to be a heavy focus (at least in Europe) on those few nanomaterials which really are entering the marketplace in quantity – titanium dioxide, of sunscreen fame, and nano-silver, with some work on fullerenes. One talk, by Andrew Johnson, of the UK’s Centre for Ecology and Hydrology at Wallingford, showed nicely what the outline of a comprehensive analysis of the environmental fate of nanoparticles might look like. His estimate is that 130 tonnes of nano-titanium dioxide a year is used in sunscreens in the UK – where does this stuff ultimately go? Down the drain and into the sewers, of course, so it’s worth worrying about what happens to it then.

At the sewage plant, solids are separated from the treated water, and the first thing to ask is where the titanium dioxide nanoparticles go. The evidence seems to be that a large majority end up in the sludge. Some 57% of this treated sludge is spread on farmland as fertilizer, while 21% is incinerated and 17% goes to landfill. There’s work to be done, then, in determining what happens to the nanoparticles – do they retain their nanoparticulate identity, or do they aggregate into larger clusters? One needs then to ask whether those that survive are likely to cause damage to soil microorganisms or earthworms. Johnson presented some reassuring evidence about earthworms, but there’s clearly more work to be done here.

Making a series of heroic assumptions, Johnson made some estimates of how many nanoparticles might end up in the river. Taking a worst case scenario, with a drought and heatwave in the southeast of England (they do happen – I’m old enough to remember), he came up with an estimate of 8 micrograms per litre in the Thames, which is still more than an order of magnitude less than the concentration that has been shown to start to affect, for example, rainbow trout. This is reassuring, but, as one questioner pointed out, one still might worry about the nanoparticles accumulating in sediments to the detriment of filter feeders.
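Johnson’s chain of reasoning is essentially a mass balance, and the arithmetic is easy to lay out explicitly. Here is a sketch in Python: the 130 tonnes/year input and the 57/21/17 per cent sludge split are as reported above, but the fraction captured in sludge and the drought-reduced river flow are my own illustrative assumptions, chosen only to reproduce the order of magnitude of the quoted 8 micrograms per litre – they are not the figures from the talk.

```python
# Rough mass-balance sketch of the fate of nano-TiO2 from sunscreen.
# Input tonnage and the sludge-disposal split are as reported from the talk;
# the sludge-capture fraction and river flow are illustrative assumptions.

TONNES_PER_YEAR = 130.0          # nano-TiO2 in UK sunscreen (as reported)
SLUDGE_CAPTURE = 0.95            # assumed fraction retained in sewage sludge
SLUDGE_ROUTES = {"farmland": 0.57, "incineration": 0.21, "landfill": 0.17}

to_sludge = TONNES_PER_YEAR * SLUDGE_CAPTURE
to_water = TONNES_PER_YEAR - to_sludge

print(f"to sludge: {to_sludge:.1f} t/yr; remaining in effluent: {to_water:.1f} t/yr")
for route, fraction in SLUDGE_ROUTES.items():
    print(f"  sludge to {route}: {to_sludge * fraction:.1f} t/yr")

# Worst-case river concentration: all effluent-borne particles diluted in a
# drought-reduced flow. The flow figure below is purely illustrative.
DROUGHT_FLOW_M3_PER_S = 25.0     # assumed low flow for a stressed river
litres_per_year = DROUGHT_FLOW_M3_PER_S * 1000 * 3600 * 24 * 365
ug_per_litre = to_water * 1e12 / litres_per_year   # tonnes -> micrograms
print(f"worst-case concentration: ~{ug_per_litre:.0f} micrograms/litre")
```

Even written this crudely, the exercise makes the shape of the problem clear: almost all of the material ends up in sludge, so the terrestrial questions (farmland, earthworms, soil microorganisms) dominate the mass flow, while the river concentration depends sensitively on the assumed capture fraction and dilution.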

The mis-measure of uncertainty

Thursday, July 31st, 2008

A couple of pieces in the Financial Times today and yesterday offer some food for thought about the problems of commercialising scientific research. Yesterday’s piece – Drug research needs serendipity (free registration may be required) – concentrates on the pharmaceutical sector, but its observations are more widely applicable. Musing on the current problems of big pharma, with their dwindling pipelines of new drugs, David Shaywitz and Nassim Taleb (author of The Black Swan) identify the problem as a failure to deal with uncertainty: “academic researchers underestimated the fragility of their scientific knowledge while pharmaceuticals executives overestimated their ability to domesticate scientific research.”

They identify two types of uncertainty. First, there’s the negative uncertainty of all the things that can go wrong as one tries to move from medical research to treatments. Underlying this is the simple fact that we know much less about human biology, in all its complexity, than one might think from all the positive headlines and press releases. It’s in response to this negative uncertainty that managers have attempted to impose more structure and focus to make the outcome of research more predictable. But why is this generally in vain? “Answer: spreadsheets are easy; science is hard.” According to Shaywitz and Taleb, this approach isn’t just doomed to fail on its own terms, it’s positively counterproductive. This is because it doesn’t leave any room for another type of uncertainty: the positive uncertainty of unexpected discoveries and happy accidents.

Their solution is to embrace the trend we’re already seeing, for big Pharma to outsource more and more of its functions, lowering the barriers to entry and leaving room for “a lean, agile organisation able to capture, consider and rapidly develop the best scientific ideas in a wide range of disease areas and aggressively guide these towards the clinic.”

But how are things for the small and agile companies that are going to be driving innovation in this new environment? Not great, says Jonathan Guthrie in today’s FT, but nonetheless “There is hope yet for science park toilers”. The article considers, from a UK perspective, the problems small technology companies are having raising money from venture capitalists. It starts from the position that the problem isn’t a shortage of money but a shortage of good ideas; perhaps not the end of the age of innovation, but a temporary lull after the excitement of personal computers, the internet and mobile phones. And, for the part of the problem that lies with venture capitalists, misreading this cycle has contributed to their difficulties. In the wake of the technology bubble, venture capital returns aren’t a good advertisement for would-be investors at the moment – “funds set up after 1996 have typically lost 1.4 per cent a year over five years and 1.8 per cent over 10 years, says the British Private Equity and Venture Capital Association.” All is not lost, Guthrie thinks – as the memory of the dotbomb debacles fades, the spectacular returns enjoyed by the most successful technology start-ups will attract money back into the sector. Where will the advances take place? Not in nanotechnology, at least in the form of the nanomaterials sector as it has been understood up to now: “materials scientists have engineered a UK nanotechnology sector so tiny it is virtually invisible.” Instead Guthrie points to renewable energy and power-saving systems.