On pure science, applied science, and technology

It’s conventional wisdom that science is very different from technology, and that it makes sense to distinguish between pure science and applied science. Largely as a result of thinking about nanotechnology (as I discussed a few years ago here and here), I’m no longer so confident that there’s such a clean break between science and technology, or, for that matter, between pure and applied science.

Historians of science tell us that the origin of the distinction goes back to the ancient Greeks, who distinguished between episteme, which is probably best translated as natural philosophy, and techne, translated as craft. Our word technology derives from techne, but careful scholars remind us that technology literally refers to writing about craft, rather than doing the craft itself. They would prefer to call the actual business of making machines and gadgets technique (in the same way as the Germans call it technik), rather than technology. Of course, for a long time nobody wrote about technique at all, so there was in this literal sense no technology. Craft skills were regarded as secrets, to be handed down in person from master to apprentice, craftsmen who belonged to a lower social class than the literate philosophers considering weightier questions about the nature of reality.

The sixteenth century saw some light being thrown on the mysteries of technique, with books (often beautifully illustrated) being published on topics such as machines and metal mining. But one could argue that the biggest change came with the development of what was then called experimental philosophy, which we now see as the beginnings of modern science. The experimental philosophers certainly had to engage with craftsmen and instrument makers to do their experiments, but what was perhaps more important was the need to commit the experimental details to writing, so that their counterparts and correspondents elsewhere in the country or elsewhere in Europe could reliably replicate the experiments. Complex pieces of scientific apparatus, like Robert Boyle’s air-pump, were certainly among the most advanced (and expensive) pieces of technology of the day. And, conversely, it’s no accident that James Watt, who more than anyone else made the industrial revolution possible with his improved steam engine, learned his engineering as an instrument maker at the University of Glasgow.

But surely there’s a difference between making a piece of experimental apparatus to help unravel the ultimate nature of reality, and making an engine to pump a mine out? In this view, the aim of science is to understand the ultimate fundamental nature of reality, while technology seeks merely to alter the world in some way, with its success judged simply by whether it does its intended job. In actuality, the aspect of science as natural philosophy, with its claims to deep understanding of reality, has always coexisted with a much more instrumental type of science, whose success is judged by the power over nature it gives us (Peter Dear’s book The Intelligibility of Nature is a fascinating reflection on the history of this dual character of science). Even the keenest defenders of science’s claim to make reliable truth-claims about the ultimate nature of reality often resort to entirely instrumental arguments: “if you’re so sceptical about science”, they’ll ask a relativist or social constructionist, “why do you fly in airplanes or use antibiotics?”

It’s certainly true that different branches of science are, to a different degree, applicable to practical problems. But which science is an applied science and which is a pure science depends as much on what problems society, at a particular time and in a particular place, needs solving, as on the character of the science itself. In the sixteenth and seventeenth centuries astronomy was a strategic subject of huge importance to the growing naval powers of the time, and was one of the first recipients of large scale state funding. The late nineteenth and early twentieth centuries were the heyday of chemistry, with new discoveries in explosives, dyes and fertilizers making fortunes and transforming the world only a few years after their discoveries in the laboratory. A contrarian might even be tempted to say “a pure science is an applied science that has outlived its usefulness”.

Another way of seeing the problems of a supposed divide between pure science, applied science and technology is to ask what it is that scientists actually do in their working lives. A scientist building a detector for CERN or writing an image analysis program for some radio astronomy data may be doing the purest of pure science in terms of their goals – understanding particle physics or the distant universe – but what they’re actually doing day to day will look very similar indeed to their applied scientist counterparts designing medical imaging hardware or software for interpreting CCTV footage for the police. Of course, this is the origin of the argument that we should support pure science for the spin-offs it produces (such as the World Wide Web, as the particle physicists continually remind us). A counter-argument would say, why not simply get these scientists to work on medical imaging (say) in the first place, rather than trying to look for practical applications for the technologies they develop in support of their “pure” science? Possible answers to this might point to the fact that the brightest people are motivated to solve deep problems in a way that might not apply to more immediately practical issues, or that our economic system doesn’t provide reliable returns for the most advanced technology developed on a speculative basis.

If it was ever possible to think that pure science could exist as a separate province from the grubby world of application, like Hesse’s “The Glass Bead Game”, that illusion was shattered in the second world war. The purest of physicists delivered radar and the fission bomb, and in the cold war that followed it seemed that the final destiny of the world was going to be decided by the atomic physicists. In the west, the implications of this for science policy were set out by Vannevar Bush. Bush, an engineer and perhaps the pre-eminent science administrator of the war, set out the framework for government funding of science in the USA in his report “Science: the endless frontier”.

Bush’s report emphasised, not “pure” research, but “basic” research. The distinction between basic research and applied research was not to be understood in terms of whether it was useful or not, but in terms of the motivations of the people doing it. “Basic research is performed without thought of practical ends” – but those practical ends do, nonetheless, follow (albeit unpredictably), and it’s the job of applied research to fill in the gaps. It had in the past been possible for a country to make technological progress without generating its own basic science (as the USA did in the 19th century) but, Bush asserted, the modern situation was different, and “A nation which depends upon others for its new basic scientific knowledge will be slow in its industrial progress and weak in its competitive position in world trade”.

Bush thus left us with three ideas that form the core of the postwar consensus on science policy. The first was that basic research should be carried out in isolation from thoughts of potential use – that it should result from “the free play of free intellects, working on subjects of their own choice, in the manner dictated by their curiosity for exploration of the unknown”. The second was that, even though the scientists who produced this basic knowledge weren’t motivated by practical applications, these applications would follow, by a process in which potential applications were picked out and developed by applied scientists, and then converted into new products and processes by engineers and technologists. This one-way flow of ideas from science into application is what innovation theorists call the linear model of innovation. Bush’s third assertion was that a country that invested in basic science would recoup that investment through capturing the rewards from new technologies.

All three of these assertions have subsequently been extensively criticised, though the basic picture has a persistent hold on our thinking about science. Perhaps the most influential critique, from the science policy point of view, came in a book by Donald Stokes called Pasteur’s Quadrant. Stokes argued from history that the separation of basic research from thoughts of potential use often didn’t happen; his key example was Louis Pasteur, who created the new field of microbiology in his quest to understand the spoilage of milk and the fermentation of wine. Rather than thinking about a linear continuum between pure and applied research, he thought in terms of two dimensions – the degree to which research was motivated by a quest for fundamental understanding, and the degree to which it was motivated by applications. Some research was driven solely by the quest for understanding, typified by Bohr, while an engineer like Edison typified a search for practical results untainted by any deeper curiosity. But the example of Pasteur showed us that the two motivations could coexist. He suggested that research in this “Pasteur’s quadrant” – use-inspired basic research – should be a priority for public support.

Where are we now? The idea of Pasteur’s quadrant underlies the idea of “Grand Challenges” inspired by societal goals as an organising principle for publicly supported science. From innovation theory and science and technology studies come new terms and concepts, like technoscience, and Mode 2 knowledge production. One might imagine that nobody believes in the linear model anymore; it’s widely accepted that technology drives science as often as science drives technology. As David Willetts, the UK’s Science Minister, put it in a speech in July this year, “A very important stimulus for scientific advance is, quite simply, technology. We talk of scientific discovery enabling technical advance, but the process is much more inter-dependent than that.” But the linear model is still deeply ingrained in the way policy makers talk – in phrases like “technology readiness levels” and “pull-through to application”. From a more fundamental point of view, though, there is still a real difference between finding evidence to support a hypothesis and demonstrating that a gadget works. Intervening in nature is a different goal to understanding nature, even though the processes by which we achieve these goals are very much mixed up.

Energy, carbon, money – floating rates of exchange

When one starts reading about the future of the world’s energy economy, one needs to get used to making conversions amongst a zoo of energy units – exajoules, millions of tonnes of oil equivalent, quadrillions of British thermal units and the rest. But these conversions are trivial in comparison to a couple of other rates of exchange – the relationship between energy and carbon emissions (using this term as a shorthand for the effect of energy use on the global climate), and the conversion between energy and money.
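Those unit conversions, at least, are mechanical. The sketch below uses the standard defining factors (1 tonne of oil equivalent = 41.868 GJ; 1 British thermal unit ≈ 1055.06 J) to put everything on a common basis:

```python
# Convert common energy-economy units to a common base (joules).
# Standard factors: 1 toe = 41.868 GJ; 1 BTU = 1055.06 J.
EJ = 1e18                      # exajoule, in joules
MTOE = 1e6 * 41.868e9          # million tonnes of oil equivalent, in joules
QUAD = 1e15 * 1055.06          # quadrillion BTU ("quad"), in joules

def to_exajoules(value, unit):
    """Express an energy quantity in exajoules."""
    factors = {"EJ": EJ, "Mtoe": MTOE, "quad": QUAD}
    return value * factors[unit] / EJ

print(to_exajoules(1, "Mtoe"))   # ~0.0419 EJ per Mtoe
print(to_exajoules(1, "quad"))   # ~1.055 EJ per quad
```

So a quad and an exajoule are nearly interchangeable, while a million tonnes of oil equivalent is a much smaller unit; world primary energy use, of order 500 EJ a year, comes out at roughly 12,000 Mtoe.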

On the face of it, it’s easy to see the link between emissions and energy. You burn a tonne of coal, you get 29 GJ of energy out and you emit 2.6 tonnes of carbon dioxide. But, if we step back to the level of a national or global economy, the emissions per unit of energy used depend on the form in which the energy is used (directly burning natural gas vs using electricity, for example) and, for the case of electricity, on the mix of generation being used. And if we want an accurate picture of the impact of our energy use on climate change, we need to look at more than just carbon dioxide emissions. CO2 is not the only greenhouse gas; methane, for example, despite being emitted in much smaller quantities than CO2, is still a significant contributor to climate change, as it is a considerably more potent greenhouse gas than CO2. So if you’re considering the total contribution to global warming of electricity derived from a gas power station, you need to account not just for the CO2 produced by direct burning, but also for the effect of any methane emitted from leaks in the pipes on the way to the power station. Likewise, the effect on climate of the high altitude emissions from aircraft is substantially greater than that of the carbon dioxide alone, for example due to the production of high altitude ozone from NOx emissions. All of these factors can be wrapped up by expressing the effect of emissions on the climate as a “mass of carbon dioxide equivalent”. It’s important to take these additional factors into account, or you end up significantly underestimating the climate impact of much energy use, but this accounting embodies more theory and more assumptions.
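To make the “carbon dioxide equivalent” idea concrete, here is a toy version of the gas power station example. All the factors are illustrative assumptions (roughly 56 kg of CO2 per GJ of gas burned, about 50 MJ of heat per kg of gas, and a 100-year global warming potential of 25 for methane, the IPCC AR4 value); real inventories use more carefully qualified numbers:

```python
# Toy CO2-equivalent accounting for 1 GJ of heat from natural gas,
# including upstream methane leakage. All factors are illustrative.
CO2_PER_GJ_GAS = 56.0    # kg CO2 per GJ of gas burned (approximate)
GWP100_METHANE = 25.0    # 100-year global warming potential of CH4 (IPCC AR4)
ENERGY_PER_KG_GAS = 0.05 # GJ of heat per kg of gas (approx. 50 MJ/kg)

def co2e_per_gj(leak_fraction):
    """kg CO2-equivalent per GJ delivered, for a given upstream leak rate."""
    gas_burned = 1.0 / ENERGY_PER_KG_GAS             # kg of gas per GJ of heat
    leaked = gas_burned * leak_fraction / (1 - leak_fraction)
    return CO2_PER_GJ_GAS + leaked * GWP100_METHANE

# With no leaks only combustion CO2 counts; a few percent leakage
# adds a substantial climate burden. (For comparison, the coal figures
# in the text give about 2600/29, i.e. ~90 kg CO2 per GJ.)
print(co2e_per_gj(0.0))    # 56.0
print(co2e_per_gj(0.03))   # ~71.5
```

The point of the sketch is just that the CO2-equivalent figure is sensitive to a parameter – the leak rate – that sits well outside the power station fence, which is why the accounting “embodies more theory and more assumptions”.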

For a highly accessible and readable account of the complexities of assigning carbon footprints to all sorts of goods and activities, I recommend Mike Berners-Lee’s new book How Bad Are Bananas?: The carbon footprint of everything. This has some interesting conclusions – his insistence on full accounting leads to surprisingly high carbon footprints for rice and cheese, for example (as the title hints, he recommends you eat more bananas). But carbon accounting is in its infancy; what’s arguably most important now is money.

At first sight, the conversion between energy and money is completely straightforward; we have well-functioning markets for common energy carriers like oil and gas, and everyone’s electricity bill makes it clear how much we’re paying individually. The problem is that it isn’t enough to know what the cost of energy is now; if you’re deciding whether to build a nuclear power station or to install photovoltaic panels on your roof, to make a rational economic decision you need to know what the price of energy is going to be over a twenty to thirty year timescale, at least (the oldest running nuclear power reactor in the UK was opened in 1968).
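The sensitivity of such investment decisions to assumed future prices is easy to see in a toy discounted-cash-flow calculation. Everything here – the discount rate, the plant parameters, the price scenarios – is invented for illustration, not taken from any real project:

```python
# Toy net-present-value of a generator selling energy over a 30-year
# life, under different assumed price paths. Illustrative numbers only.
def npv(price_per_mwh, mwh_per_year, upfront_cost, years=30, discount=0.05):
    """NPV of constant annual revenue against an upfront capital cost."""
    revenue = price_per_mwh * mwh_per_year
    pv = sum(revenue / (1 + discount) ** t for t in range(1, years + 1))
    return pv - upfront_cost

# The same plant looks like a good or a terrible investment depending
# on which price scenario you believe:
plant = dict(mwh_per_year=8000, upfront_cost=5_000_000)
print(npv(30, **plant))   # low-price scenario: negative NPV
print(npv(60, **plant))   # high-price scenario: positive NPV
```

A factor-of-two spread in the assumed long-run price flips the sign of the investment case entirely, which is why forecasts with a factor-of-four spread, as discussed below, are of so little help to an investor.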

The record of forecasting energy prices and demand is frankly dismal. Vaclav Smil devotes a whole chapter of his book Energy at the Crossroads: Global Perspectives and Uncertainties to this problem – the chapter is called, simply, “Against Forecasting”. Here are a few graphs of my own to make the point, based on the US Energy Information Administration‘s predictions of future oil prices.

In 2000 the USA’s Energy Information Administration produced this forecast for oil prices (from the International Energy Outlook 2000):

Historical oil prices up to 2000 in 2008 US dollars, with high, low and reference predictions made by the EIA in 2000

After a decade of relatively stable oil prices (solid black line), the EIA has relatively tight bounds between its high (blue line), low (red line) and reference (green line) predictions. Let’s see how this compared with what happened as the decade unfolded:

High, low and reference predictions for oil prices made by the EIA in 2000, compared with the actual outcome from 2000-2010

The EIA, having been mugged by reality in its 2000 forecasts, seems to have learnt from its experience, if the range of the predictions made in 2010 is anything to go by:

Successive predictions for future oil prices made by the USA's EIA in 2000 and 2010, compared to the actual outcome up to 2010

This forecast may be more prudent than the 2000 forecast, but with a variation of nearly a factor of four between the high and low scenarios, it’s also pretty much completely useless. Conventional wisdom in recent years argues that we should arrange our energy needs through a deregulated market. It’s difficult to see how this can work when the information on the timescale needed to make sensible investment decisions is so poor.

David Willetts on Science and Society

The UK’s Minister for Science and Higher Education, David Willetts, made his first official speech about science at the RI on 9 July 2010. What everyone is desperate to know is how big a cut the science budget will take. Willetts can’t answer this yet, but the background position isn’t good. We know that the budget of his department – Business, Innovation and Skills – will be cut by somewhere between 25% and 33%. Science accounts for about 15% of this budget, with Universities accounting for another 29% (not counting the cost of student loans and grants, which accounts for another 27%). So, there’s not going to be a lot of room to protect spending on science and on research in Universities.

Having said that, this is a very interesting speech, in that Willetts takes some very clear positions on a number of issues related to science and innovation and their relationship to society, some of which are rather different from views previously held in government. I met Willetts earlier in the year, and he said a couple of things that struck me. He said that there was nothing in science policy that couldn’t be illuminated by looking at history. He mentioned in particular “The Shock of the Old”, by David Edgerton (which I’ve previously discussed here), and I noticed that at the RS meeting after the election he referred very approvingly to David Landes’s book “The Wealth and Poverty of Nations”. More personally, he referred with pride to his own family origins as Birmingham craftsmen, and he clearly knows the story of the Lunar Society well. His own academic background is as a social scientist, so it is to be expected that he’d have some well-developed views about science and society. Here’s how I gloss the relevant parts of his speech.

More broadly, as society becomes more diverse and cultural traditions increasingly fractured, I see the scientific way of thinking – empiricism – becoming more and more important for binding us together. Increasingly, we have to abide by John Rawls’s standard for public reason – justifying a particular position by arguments that people from different moral or political backgrounds can accept. And coalition, I believe, is good for government and for science, given the premium now attached to reason and evidence.

The American political philosopher John Rawls was very concerned about how, in a pluralistic society, one could agree on a common set of moral norms. He rejected the idea that you could construct morality on entirely scientific grounds, as consequentialist ethical systems like utilitarianism try to, instead looking for a principles-based morality; but he recognised that this was problematic in a society where Catholics, Methodists, Atheists and Muslims all had their different sets of principles. Hence the idea of trying to find moral principles that everyone in society can agree on, even though the grounds on which they approve of these principles may differ from group to group. In a coalition uniting parties including people as different as Evan Harris and Philippa Stroud one can see why Willetts might want to call in Rawls for help.

The connection to science is an interesting one, that draws on a particular reading of the development of the empirical tradition. According, for example, to Shapin and Schaffer (in their book “Leviathan and the Air-Pump”) one of the main aims of the Royal Society in its early days was to develop a way of talking about philosophy – based on experiment and empiricism, rather than doctrine – that didn’t evoke the clashing religious ideologies that had been the cause of the bloody religious wars of the seventeenth century. According to this view (championed by Robert Boyle), in experimental philosophy one should refrain entirely from talking about contentious issues like religion, restricting oneself entirely to discussion of what one measures in experiments that are open to be observed and reproduced by anyone.

You might say that science is doing so well in the public sphere that the greatest risks it faces are complacency and arrogance. Crude reductionism puts people off.

I wonder if he’s thinking of the current breed of scientific atheists like Richard Dawkins?

Scientists can morph from admired public luminaries into public enemies, as debates over nuclear power and GM made clear. And yet I remain optimistic here too. The UK Research Councils had the foresight to hold a public dialogue about ramifications of synthetic biology ahead of Craig Venter developing the first cell controlled by synthetic DNA. This dialogue showed that there is conditional public support for synthetic biology. There is great enthusiasm for the possibilities associated with this field, but also fears about controlling it and the potential for misuse; there are concerns about impacts on health and the environment. We would do well to remember this comment from a participant: “Why do they want to do it? … Is it because they will be the first person to do it? Is it because they just can’t wait? What are they going to gain from it? … [T]he fact that you can take something that’s natural and produce fuel, great – but what is the bad side of it? What else is it going to do?” Synthetic biology must not go the way of GM. It must retain public trust. That means understanding that fellow citizens have their worries and concerns which cannot just be dismissed.

This is a significant passage which seems to accept two important features of some current thinking about public engagement with science. Firstly, that it should be “upstream” – addressing areas of science, like synthetic biology, for which concrete applications have yet to emerge, and indeed in advance of significant scientific breakthroughs like Venter’s “synthetic cell”. Secondly, it accepts that the engagement should be two-way, that the concerns of the public may well be legitimate and should be taken seriously, and that these concerns go beyond simple calculations of risk.

The other significant aspect of Willetts’s speech was a wholesale rejection of the “linear model” of science and innovation, but this needs another post to discuss in detail.

Whose goals should direct goal-directed research?

I’ve taken part in panel discussions at two events with a strong Science and Technology Studies flavour in the last couple of months. “Democratising Futures” was a meeting under the auspices of the Centre for Research in Arts, Social Sciences and Humanities, at Cambridge on 27 May 2010. The Science and Democracy Network’s meeting was held in association with the Royal Society at the Kavli Centre on the 29 June 2010. What follows is a composite of the sorts of things I said at the two meetings.

“There is no alternative” is a phrase with a particular resonance in British politics, but it also expresses a way of thinking about the progress of science and technology. To many people, science and technology represent an autonomous force, driven forward by its own internal logic. In this view, the progress of science and technology cannot be effectively steered, much less restrained. I think this view is both wrong and pernicious.

The reality is that decisions and choices about the directions of science and technology are made in very many places. These include the implicit decisions made by the (international) scientific community, through which the fashionable and timely topics of the day acquire momentum; the much more explicit choices made by funding agencies about which areas to prioritise; and the preferences expressed by a variety of actors in the private sector, whether the beliefs that inform investment decisions by venture capitalists or the strategic decisions made by multinational companies. It’s obvious that these decisions are not always informed by perfect information and rationality – they blend informed but necessarily fallible judgements about how the future might unfold with sectional interests, and they are underpinned by ideology.

To take an example which I don’t think is untypical, in the funding body I know best, the UK’s Engineering and Physical Sciences Research Council (EPSRC), priorities are set by a mixture of top-down and bottom-up pressures. The bottom-up aspect comes from the proposals the council receives, from individual scientists, to pursue those lines of research that they think are interesting. From the top, though, comes increasing pressure from government to prioritise research in line with their broad strategies.

In setting a strategic framework, EPSRC distinguishes between the technical opportunities that the current state of science offers, and the demands of “users” of research in industry and elsewhere. Advice on the former typically comes from practising scientists, who alone have the expertise to know what is possible. This advice won’t be completely objective, of course – it will be subject to the whims of academic fashion and a certain incumbency bias in favour of established, well-developed fields. The industrial scientists who provide advice will of course have a direct interest in science that benefits their own industries and their own companies. Policy demands supporting science that can be translated into the marketplace, but this needs to be balanced against a reluctance to subsidise the private sector directly. Even accepting the desirability of supporting science that can be taken to market quickly, there is an incumbency bias here too. Given that this advice necessarily comes from people representing established concerns, who is going to promote the truly disruptive industries?

So, given these routes by which scientists and industry representatives have explicit mechanisms for influencing the agenda and priorities for publicly funded science, the big outstanding question is how the rest of the population can have some influence. Of course, research councils are aware of the broader societal contexts that surround the research they fund, and the scientists and industry people providing advice will be asked to incorporate these broader issues in their thinking. The danger is that these people are not well equipped to make such judgements. In Arie Rip’s phrase, it’s likely that they will be using “folk social science” – a set of preconceptions and prejudices, unsupported by evidence, about what the wider population thinks about science and technology (one very common example of this in the UK is the proposition that one can gauge probable public reactions to science by reading the Daily Mail).

It might be argued that the proper way for wider societal and ethical issues to be incorporated in scientific priority setting is through the usual apparatus of representative democracy – in the UK system, through Ministers who are responsible to Parliament. This fails in practice for both institutional and practical reasons. There is a formal principle in the UK known as the Haldane principle (like much else in the UK this is probably an invented tradition), which states that science should be governed at one remove from government, with decisions being left to scientists. The funding bodies – the research councils – are not direct subsidiaries of their parent government department, but are free-standing agencies. This doesn’t stop them from being given a strong strategic steer, through both informal and formal routes, but they generally resist taking direct orders from the Minister. But there are more general reasons why science resists democratic oversight through traditional mechanisms – it is at once too big and too small an issue. The long timescales of science and the convoluted routes by which it impacts on everyday life; the poor understanding of science on the part of elected politicians; the lack of immediate feedback from the electorate in the politicians’ postbags – all these factors contribute to science not having a high political profile, despite the deep and fundamental impacts it has on the way people live.

Here, then, is the potential role of public engagement – it should form a key input into identifying what potential goals of science and technology might have broad societal support. It was in recognition of these sorts of issues that EPSRC introduced a Societal Issues Panel into its advisory structure – this is a high-level strategic advice panel on a par with the Technical Opportunities Panel and the User Panel.

Another development in the way people are thinking about scientific priority setting makes these issues even more pointed – this is the growing popularity across the world of the idea of the “Grand Challenge” as a way of organising science. Here, we have an explicit link being made between scientific priorities and societal goals – which leads directly to the question “whose goals?”

Grand Challenges provide a way of contextualising research that goes beyond a rather sterile dichotomy between “applied” and “blue sky” research – they support work that has some goal in mind, but a goal that is more distant than the typical object of applied research, and is often on a larger scale. The “challenge” or context is typically based on some larger societal goal, rather than on a question arising from a scientific discipline. This might be a global problem, such as the need to develop a low carbon energy infrastructure or ensure food security for a growing population, or something that is more local to a particular country or group of countries, such as the problems of ageing populations in the UK and other developed countries. The definition in terms of a societal goal necessarily implies that the work needs to be cross-disciplinary in character, and there is growing recognition in principle of the importance of social sciences.

An example of the way in which public engagement could help steer such a grand challenge programme was given by the EPSRC’s recent Grand Challenge in Nanotechnology for Medicine and Healthcare. Here, a public engagement exercise was designed with the explicit intention of using what emerged as an input, together with expert advice from academic scientists, clinicians and industry representatives, into a decision about how to shape the priorities of the programme.

I’ve written in more detail about this process elsewhere. Here, it’s worth stressing what made this programme particularly suitable for this approach. The proposed research was framed explicitly as a search for technological responses to societal issues, so it was easy to argue that public attitudes and priorities were an important factor to consider. The area is also strongly interdisciplinary; this makes the traditional approaches of relying solely on expert advice less effective. Very few, if any, individual scientists have expertise that crosses the range of disciplines that is necessary to operate in the field of nanomedicine, so technical advice needs to integrate the contributions of people expert in areas as different as colloid chemistry and neuroscience, for example.

The outcome of the public engagement provided rich insights that in some cases surprised the expert advisors. These insights included both specific commentaries on the proposed areas of research that were being considered (such as the use of nanotechnology enabled surfaces to control pathogens) and a more general filter – the idea that a key issue in deciding people’s response to a proposed technology was the degree to which it gave or took away control and empowerment from the individual. Of course, people were concerned about issues of risk and regulation, but the form of the engagement was such that much broader questions than the simple question “is it safe” were discussed.

I believe that this public engagement was very successful, because it concerned a rather concrete and tightly defined technology area, it was explicitly linked to a pending funding decision, and there was complete clarity about how it would contribute, together with more conventional consultations, to that decision – that is, what kind of applications of nanotechnology to medicine and healthcare a forthcoming funding call would prioritise. Of course, there are still many open questions about using public engagement more widely in this sort of priority setting.

The first issue is the question of scope – at what level does one ask the question? For example, in the area of energy research, one could ask: should we have a programme of energy research, and if so how big? Or, taking the answer to that question as given, one could ask whether research in biofuels should form a part of the energy programme. Or one could ask what kind of biofuel we should prioritise. My experience from a variety of public engagement exercises in the area of nanotechnology is that the more specific the question, the easier it is for people to engage with the process. But the criticism of focusing public engagement down in this way is that one can be accused, by focusing on the details, of taking the answers to the big questions as read.

But the big questions are fundamentally questions of politics in its proper sense. They are questions about what sort of world we want to live in and what kinds of lives we want to lead. The inescapable conclusion, for me, is that the explicit linkage of science and this kind of politics – the politics of big questions about society’s future – is both inevitable and desirable.

Many scientists will instinctively recoil from this enmeshing of science and politics. I think this is a mistake. It is less controversial to say we need more science in politics – since so many of the big issues we face have a scientific dimension, most people agree that decisions on these issues need to be informed by science. But we also need more explicit recognition of the political dimensions of science – because the science we do has such potential to shape the way our society will change, we need positive visions of those changes to steer the way science develops. So, we need more science in politics, and more politics in science. And, when it comes to it, we probably need more politics in politics too.

In addition to these more fundamental questions, there are some very practical linked issues related to the scale of the engagement exercises one does, their methodological robustness, and their cost. Social scientists can contribute a great deal to understanding how to make them as reliable as possible, but I believe that a certain pragmatism is called for when one considers their inevitable methodological shortcomings – they need to be seen as one input into a decision-making process that already falls short of perfection. This is inevitable; it is expensive in money and time to do these exercises properly. The UK research councils seem to have settled down to an informal understanding that they will do one or two of these exercises a year on topics that seem to be the most potentially controversial. Following the nanomedicine dialogue, there have been recently completed exercises on synthetic biology and geo-engineering. But we will see how strong the will is to continue in this way in an environment with much less money around.

In addition to practical difficulties, there are people who oppose in principle any use of public engagement in setting scientific priorities. One can identify perhaps three classes of objections. The first will come from those scientists who oppose any infringement of the sovereignty of the “independent republic of science”. The second can be heard from some politicians, who regard the use of direct public engagement as an infringement of the principles of representative democracy. The third will come from free market purists, who will insist that the market provides the route by which informal, non-scientific knowledge is incorporated in decisions about how technology is developed. I don’t think any of these objections is tenable, but that’s the subject for a much longer discussion.

Is debt putting British science at risk?

This was my opening statement at a debate at the Cheltenham Science Festival. This piece also appears as a guest blog on the Times’s Science blog “Eureka Zone”; see also Mark Henderson’s commentary on the debate as a whole.

The question we are posed is “Is debt putting British science at risk?” The answer to this question is certainly yes – we are all aware of the need to arrest the growth in the nation’s debt, and the science budget looks very vulnerable. There is a moral case against excessive debt – it is those in the next generations, our children, who will be paying higher taxes to service this debt. But we can leave a positive inheritance for future generations as well. The legacy we leave them comes from the science we do now. It’s this science that will underpin their future prosperity. We also know that future generations will have to face some big problems – problems that may be so big that they even threaten their way of life. How will we adapt to the climate change we know is coming? How will we get the energy we need to run our energy-dependent society without further adding to that climate change, when the cheap oil we’ve relied on may be a distant memory? How will we feed a growing population? How will we make sure that we can keep our aging population well? These are the problems that we have left future generations to deal with, so we owe it to them to do the science that will provide the solutions.

It’s worth reminding ourselves about the legacy we inherited – what’s happened as a result of the science done in the 1970’s, 80’s and 90’s. I’m going to give just two examples. The first is in the area of health. Many people know the story of how monoclonal antibodies were invented by Cesar Milstein in the Cambridge MRC lab in 1975, a discovery for which he won the Nobel prize in 1984. Further developments took place, notably the method of “humanising” mouse antibodies invented by Greg Winter, also at the MRC lab. This is now the basis of a $32 billion market; one third of all new pharmaceutical treatments are based on this technology, including new treatments for breast cancer, arthritis, asthma and leukemia. And, contrary to the stereotype that the UK is good at science but bad at making money from it, this technology is now licensed to 50 companies, earning £300 million in royalties for MRC. The two main spin-out companies were sold for a total of £932 million, one to AstraZeneca and one to GlaxoSmithKline, and these large companies are continuing to generate value for the UK from them. So this is a very clear example of a single invention that led to a new industry.

Often the situation is much more complicated than this; rather than a single invention one has a whole series of linked breakthroughs in science, technology and business. Like many other people, I’m delighted with my new smartphone; this is a symbol of a vast new sector of the economy based on information and communication technology. Many people know that the web as we now know it was made possible by the work of Sir Timothy Berners-Lee at CERN, a spin-off from the high energy physics effort there; perhaps fewer know about the way the hardware of the web depends on optical fibre, in which so much work was done at Southampton. The basics of how to run a wireless network were developed by the company Racal, the spin-out from which, Vodafone, became a global giant in its own right. The display on my smartphone uses liquid crystals, invented at Hull, while newer e-book readers are starting to use e-ink displays reliant on the technology of Plastic Logic, a spin-out based on the plastic electronics work done in the Cavendish Lab in Cambridge in the 1990’s. So there’s a whole web of invention – an international effort, certainly, but one in which the UK has made a disproportionately large contribution, with economic value generated in all kinds of ways. It’s having a strong science base that allows one to benefit from this kind of web of innovation.

The case for science is made in the excellent Royal Society report “The Scientific Century – securing our future prosperity”. This had input from two former science ministers (one Conservative, one Labour) – Lords Sainsbury and Waldegrave, outstanding science leaders like Sir Paul Nurse and Mark Walport, a few rank-and-file scientists like myself, and was put together by the excellent Science Policy team at the Royal Society. I think it’s thoughtful, evidence-based and compelling.

I’d like to highlight three reasons why we should keep our science base strong.

Firstly, it will underpin our future prosperity. The transformation of science into products through spin-out companies is important, but the role of science in underpinning the economy goes much deeper than this. It’s through the trained people that come out of the science enterprise and its connections with existing industry that the so-called “absorptive capacity” of the economy is underpinned – the ability of an economy to make the most of the opportunities that science and technology will bring.

Secondly, it will give us the tools to solve the big problems we know we are going to face. Tough times are coming – the Government’s Chief Scientific Advisor, Sir John Beddington, talks of the “perfect storm” we face, when continuing population pressure, climate change and the end of cheap energy all come together from 2020 onwards. It is science that will give us the tools to get through this time and prosper. We don’t know what will work in advance, so we need to support many different approaches. In my own area of nanotechnology, I’m particularly excited by the prospects for new kinds of solar cells that will be much cheaper and made on a much larger scale than current types, allowing solar energy to make a real contribution to our energy needs. And some of my colleagues are developing new ways of delivering drugs that can cross the blood-brain barrier and help us deal with those intractable neurodegenerative diseases like Alzheimer’s that are exacting such high and growing human and economic costs on our aging society. But these are just two from many promising lines of attack on our growing problems, and it’s vital to maintain science in its diversity. To cut back on science now, in the face of these coming threats, would amount to unilateral disarmament.

Thirdly, we should support science in the UK because we’re very good at it. The “Scientific Century” report quotes the figures that with 1% of the world’s population, and 3% of the world’s spending on science, we produce 7.9% of the world’s scientific papers. The impact of these papers is measured by the fact that they attract 11.8% of the citations that other scientific papers make; of the most highly cited papers – the ones that have the biggest impact – the UK produces 14.4%. Arguably, we produce more top quality science for less money than anyone else. And despite myths to the contrary, we are effective at translating science into economic benefit – our universities are now more focused on exploiting what they do than ever before, and as good at this as anywhere in the world. Our success in science is a source of advantage to us in a very competitive world, and a cause of envy in other countries that are investing significantly to try and match our performance.

So if debt is the problem we leave to future generations, science is the legacy we leave them; we owe it to them not to damage our science success story now.

Digital vitalism

The DNA that Venter’s team inserted into a bacterium, in his recently reported break-through in synthetic biology, was entirely synthetic – “Our cell has been totally derived from four bottles of chemicals”, he is quoted as saying. It’s this aspect that underlies the comment from Arthur Caplan that I quoted in my last post, that “Venter’s achievement would seem to extinguish the argument that life requires a special force or power to exist. This makes it one of the most important scientific achievements in the history of mankind.” Well, this is one view. But the idea that some special quality separates matter of biological origin from synthetic chemicals – chemical vitalism – is more usually assumed to have been killed by Wöhler’s synthesis of urea in 1828.

But while Venter is putting a stake through the heart of the long-dead doctrine of chemical vitalism, I wonder whether he’s allowed another kind of vitalism to slip in through the back door, as it were. The idea that his cells are entirely synthetic depends on a particular view of the flow of information – we have the sequence of his genome stored on his computer, this information is given physical realisation through the synthesis of the information carrying molecule DNA, and it is this information, when inserted into the lifeless husk, the shell of a bacterium whose own DNA has been removed, that sparks that cell into life, re-animating the cell under the control of the new DNA. In language Venter and others often use, the cell is “booted up”, as a dead computer with a corrupted operating system is restored to life with a new system disk. This idea that the spark of life is imparted by the information of the DNA seems perilously close to another kind of vitalism – let’s call it “digital vitalism”.

But does DNA control the cell, or does the cell control the DNA? Certainly, until Venter’s DNA molecule is introduced into its bacterial host, it is simply a lifeless polymer. It’s the machinery of the cell that reads the DNA and synthesises the protein molecules whose sequences are encoded within it. In many cases, it’s the regulatory apparatus of the cell that controls when that reading and synthesis is done – an enzyme is a tool, so there’s no point making it unless it is needed. Here the DNA seems less like a controller directing the operation of the cell, and more like a resource for the cell to draw on when necessary. And, it seems, bacteria endlessly swap bits of DNA with each other, allowing the fast spread of particularly useful tools, like resistance to antibiotics. This isn’t to deny that DNA is absolutely central to life of all sorts – without it the cell can’t renew itself, much less reproduce – but perhaps the relationship between the DNA and the rest of the cell is less asymmetric and more entangled than this talk of control implies.

How much do we need to worry about a few arguable metaphors? Here, more than usually, because it is these ideas of complete control and the reduction of biology to the digital domain that are so central in investing the visions of synthetic biology with such power.

Speculative bioethics as an engine of hype

Looking back on the reporting of the paper from Craig Venter’s team reporting the successful insertion of a synthetic genome into a bacterium, one thing strikes me – the commentators who were talking up the potential and significance of the experiment the most weren’t the scientists, but the bioethicists.

As one might expect, the Daily Mail took a hysterical view – “one mistake in a lab could lead to millions being wiped out by a plague” sums up their tone. But they were able to back up their piece with expert opinion, from Julian Savulescu, of the Oxford Uehiro Centre for Practical Ethics. He says of Venter “he is not merely copying life artificially or modifying it by genetic engineering. He is going towards the role of God: Creating artificial life that could never have existed.” Even in the sober pages of the Financial Times, we have Arthur Caplan, bioethics professor at the University of Pennsylvania, saying “Venter’s achievement would seem to extinguish the argument that life requires a special force or power to exist. This makes it one of the most important scientific achievements in the history of mankind.” There’s a very marked contrast with the generally much more sceptical comments from scientists, for example those quoted in a NY Times article. The Nobel Laureate David Baltimore, for example, says “To my mind Craig has somewhat overplayed the importance of this… He has not created life, only mimicked it”.

One might almost suspect that there is a symbiosis going on here, between those scientists anxious to maximise the significance of their work, and bioethicists in search of an issue to raise their own profile. After all, if a piece of science is worth worrying about, it must be important. It’s not that I don’t think that these developments have potentially important societal and ethical implications – but it seems to me that these would be better considered from a standpoint that was a little more critical.

On Descartes and nanobots

A couple of weeks ago I was interviewed for the Robots podcast special on 50 years of robotics, and predictions for the next half century. My brief was nanorobots, and you can hear the podcast here. My pitch was that on the nanoscale we’d be looking to nature for inspiration, exploiting design principles such as self-assembly and macromolecular shape change; as a particularly exciting current development I singled out progress in DNA nanotechnology, and in particular the possibility of using this to do molecular logic. As it happens, last week’s edition of Nature included two very interesting papers reporting further developments in this area – Molecular robots guided by prescriptive landscapes from Erik Winfree’s group in Caltech, and A proximity-based programmable DNA nanoscale assembly line from Ned Seeman’s group in NYU.

The context and significance of these advances is well described in a News and Views article (full text); the references to nanorobots and nanoscale assembly lines have led to considerable publicity. James Hayton (who reads the Daily Mail so the rest of us don’t have to), in his 10e-9 blog comments very pertinently on the misleading use of classical nanobot imagery to illustrate this story. The Daily Mail isn’t the only culprit here – even the venerable Nature uses a still from the film Fantastic Voyage to illustrate their story, with the caption “although such machines are still a fantasy, molecular ‘robots’ made of DNA are under development.”

What’s wrong with these illustrations is that they are graphic representations of bad metaphors. DNA nanotechnology falls squarely in the soft nanotechnology paradigm – it depends on the weak interactions by which complementary sequences are recognised to enable the self-assembly of structures whose design is coded within the component molecules themselves, and macromolecular shape changes under the influence of Brownian motion to effect motion. Soft machines aren’t mechanical engineering shrunk, as I’ve written about at length on this blog and elsewhere.

But there’s another, more subtle point here. Our classical conception of a robot is something with sensors feeding information into a central computer, which responds to this sensory input by a computation, which is then effected by the communication of commands to the actuators that drive the robot’s actions. This separation of the “thinking” function of the robot from its sensing and action is something that we find very appealing; we are irresistibly drawn to the analogy with the way we have come to think about human beings since Descartes – as machines animated by an intelligence largely separate from our bodies.

What is striking about these rudimentary DNA robots is that what “intelligence” they possess – their capacity to sense the environment and process this information to determine which of a limited set of outcomes will be effected – arises from the molecules from which the robot is made and their interaction with a (specially designed) environment. There’s no sense in which the robot’s “program” is loaded into it; the program is implicit in the construction of the robot and its interaction with the environment. In this robot, “thought” and “action” are inseparable; the same molecules both store and process information and drive its motion.

In this, these proto-robots operate on similar general principles to bacteria, whose considerable information processing power arises from the interaction of many individual molecules with each other and with their physical environment (as beautifully described in Dennis Bray’s book Wetware: a computer in every living cell). Is this the only way to build a nanobot with the capacity to process and act on information about the environment? I’m not sure, but for the moment it seems to be the direction we’re moving in.

Science, Engineering and Innovation Summit at the Royal Society

I’ve been at the Science, Engineering and Innovation summit at the Royal Society this evening. This was an attempt to build on the reports on science and innovation published before the election (as discussed in my last post) with the new government. It was a packed meeting, involving just about anyone in the UK with any interest in science policy. Here are my notes, as taken at the meeting. My apologies for any inaccuracies, and to anyone in the questions whose name I didn’t catch.

Martin Rees welcomes a packed audience – David Willetts is late.

James Wilsdon takes the chair.

Aims – an opportunity for the new minister to set out the direction of travel in the new spending round, and to make sure the weighty set of reports published before the election are not forgotten about.

Martin Taylor – (Chair, The Scientific Century)

The recent election saw science with a higher profile than usual, partly as a result of the many reports we’re talking about. Of course science was still marginal, but our key arguments did register. This is partly because of the uncertainty we’re in – we anticipate tempestuous times. The next spending review will be the most important for a generation, not only setting national finances on an even keel but also setting the tone for many years to come. There is a lot that unites the flurry of reports about science we’re talking about – one common theme is the need for stability. Three big themes in the Scientific Century. 1. Science and Innovation must be at the heart of any strategy for economic growth. 2. UK science is a marvellous asset for the UK, but there is a new global competitive environment for science, with major new investments in France, USA and elsewhere. 3. If we don’t continue to invest, we’ll lose our place at science’s top table.

We must show the new government how science and engineering can help the government both overcome immediate challenges and their long term aspirations. There will be difficult decisions ahead – the Royal Society is ready to offer help and advice. But short-term decisions must not undermine the ability of science to help meet the long term global challenges we face, and the health of the nation.

John Browne (President of Royal Academy of Engineering, ex CEO of BP, author of forthcoming Browne review into the financing of higher education).

What can be done to retool the British economy for growth and innovation? We’re the world’s 6th biggest manufacturer, with leading areas in satellites, aerospace, pharmaceuticals and design-led manufacturing. But we don’t always turn ideas into business. Decisions about budget cuts must be made with an eye on the future. Businesses remain the main vehicle for wealth creation, but governments can help. This isn’t about picking winners, but supporting strategic sectors. Seven areas should be concentrated on:
1. ensuring we have the people with the right skills
2. keeping ideas flowing by funding the best researchers – then a debate about what other research we can afford
3. systems to bridge the gap between science and the market
4. stable environment with stable regulatory framework
5. incentives for small and large companies
6. government’s influence as a customer, with public procurement used as a tool for innovation
7. all the above to be put into a coherent framework, measured and assessed
Government doesn’t have to do everything – national academies, professional societies etc. are ready to help.

We must all take a firmer lead to communicate the excitement of science and engineering and the careers it leads to.

Janet Finch – (Chair, CST Vision for UK research)

The CST report had one headline message – the UK’s research is a great success story but it’s under threat because of global competition. This is the most important message, for the scientific community, to government and to business. The “Rising above the gathering storm” report from the USA made this point strongly in the US context, emphasising the growing importance of China. So the UK’s strong position has been admirable for the last decade but we can’t assume it will stay there. What we need to do is:
1. Government should adopt a clear long term vision – we need stable policy and stable policy directions – particularly to encourage private sector investment. In the current environment attracting private sector investment is going to be crucial, but we need government funding for upstream research, creative discovery based research across a range of disciplines (including social sciences).
2. We need to invest in people, more than projects. This includes both home grown people and attracting the best from abroad. We can’t predict the future, but if we have the best people they will adapt and respond to new challenges as they arise.
3. We need an ambitious innovation strategy which is directed at major global and social challenges. There need to be more incentives for collaboration, both in the UK and internationally. Our highly competitive funding environment has served us well, but we need to balance competition with collaboration. This country is always going to be small compared to major world economies so we need to bring people together.

Hermann Hauser (Amadeus Capital Partners, author of Hauser review)

Let me make the case for Maxwell centres. The UK is second only to the USA in producing quality papers, and in papers per pound we’re number one. But we don’t use this excellence in research well to make new companies and to make our large companies more successful. It used to be the case that researchers produced papers, then industrial scientists read them and a few years later might do something with them. But things aren’t so leisurely any more, and it’s a race to commercialise new ideas. The Americans are derogatory about Europe using too much state support, but DARPA is a gigantic government support mechanism for Silicon Valley. ITRI in Taiwan, the Fraunhofers in Germany, and others have all created new industries and supported existing ones. Fraunhofers have funding split 1/3 state, 1/3 private sector, 1/3 project based, and this is a good template.

In what areas should we establish a few such centres in the UK? Three criteria:
1. market size in the billions to tens of billions
2. demonstrable academic lead
3. a plan to keep most of the value in the UK – though we do need to recognise that in a global environment there will be international partners.

We should set up a small number of such centres, each funded at a level of 100 million over ten years, and we should support them with government procurement.

(David Willetts arrives)

Iain Gray (CEO, Technology Strategy Board)

A thought inspired by the portrait of Darwin: It’s not the strongest species that survives, nor the most intelligent, but the one that can adapt to change.

Science and research produces the ideas for the future, and the exploitation of these ideas produces the innovation that provides economic growth. Innovation is the key to recovering from the recession, and many countries across the world are investing heavily in this area.

Key points – the TSB budget is tiny compared to the government procurement budget – this is a huge opportunity for driving innovation. The low carbon vehicle programme is an exemplar of how strategic investment can help overcome some of the big challenges society is going to face. Nissan’s investment in the UK was because of the science budget and the support for innovation. Other examples of strong businesses – the satellite business, with new business models. Ceres Power, through support in innovation, got a key contract with British Gas. Innovation is crucial to economic growth. Regenerative medicine could be a game-changer for UK plc.

We need to provide the right ammunition to get the arguments across that innovation is what’s needed for the short and long term growth of the UK economy.

Helga Nowotny (President, European Research Council)

Research – Innovation – Education: the knowledge triangle is still valid, but we see some adjustments taking place. Innovation becomes the flagship of the plans for Europe – but Europe needs changes to increase the speed with which discoveries are taken to market. We know how to do this. We need
1. the spirit of entrepreneurship both inside and outside academia – intellectual property, venture capital, and public procurement. Less often talked about: every technological innovation needs social innovation. Not all innovation is based on research. But the kinds of innovation that will take us further are science based. As de Gennes has said, “By improving the candle, we are not going to invent electricity”.
2. So to research – the ERC will continue to be driven by excellence, bottom-up approach, the researchers know best. The UK is very successful in the ERC – the UK is a winner, but so is Europe, because this develops healthy competition and a raising of standards of evaluation right across Europe. The ERC trusts in people and their talents – but we need the third arm of the triangle.
3. Education begins long before the University. Many countries have a leaky pipeline of talented youngsters, so in the national context this pipeline should be fixed.

We hear a great deal about the grand challenges, of energy, climate, etc. There is one grand challenge that needs to be addressed first – how to integrate the three arms of the knowledge triangle.

David Willetts (Minister for Science and Higher Education)

I think fondly of a visit to the RS a couple of years ago when Martin Rees let me handle first editions of Principia and Origin of Species. This is the excitement of science which we should never forget. I’m not the only minister here – Pauline Neville-Jones (Minister for Security) is also in the audience.

I have dual responsibility for Universities and Science – I think this is a very exciting connection. What does this imply for dual support system? Firstly it means there is clear responsibility – the research councils can’t pass off responsibility to HEFCE, and vice versa.

Impact – Martin Rees is eloquent on some of these issues. Most academics do hope for and aspire to work that has an impact – researchers in medicine want to make a drug that will save lives; if you are a historian you hope your work will change the perceptions of the nation. The issue, then, isn’t impact per se – we all agree that research needs impact – but the “impact agenda”. I am wary of clunky attempts to measure and fund impact through the REF; the impact agenda needs to be methodologically sound and to command widespread support. Blue skies research is still very important.

Innovation – this is often wrongly reduced to R&D. Coming from Birmingham, I start with Joseph Priestley and the Lunar Society. I would like to apologise retrospectively for the Tory riots in 1791 that burnt his house down! He discovered oxygen, but it was Lavoisier who created the theoretical understanding, and the Swiss businessman Schweppe who made money out of it! I find the concept of the cluster a valuable way of thinking how innovation arises, much better than the sausage machine idea where science goes in one end and innovation comes out the other. We need low risk environments for doing high risk ventures. One idea for strengthening the cluster agenda is the idea of reproducing the Fraunhofers – I am struck by the similarity of the Hauser and Dyson recommendations. Of course money is tight and some people will say you shouldn’t be thinking about making new institutions, but this is a very important area that I will be studying carefully.

Universities – Browne’s report on making the sector financially sustainable is very important, and the European agenda is very important here. Important arguments in Landes’s book – “The Wealth and Poverty of Nations” – point to the diversity of Europe compared to the monolithic nature of China as important in promoting innovation in Europe. Looking at the UK’s Nobel prize winners, so many of them had a moment of crisis or disjunction, moving disciplines or moving cultures. These shocks can create true intellectual greatness.

Questions now:

Imran Khan CASE – two key messages – it would be a false economy to cut support for science now, and we need a long-term plan

Someone from Imperial – medical research will die unless we pare back regulation

Alan Thorpe (CEO, NERC) – There’s much evidence on the economic benefits to the UK, showing factors like 15 in returns to the economy, and many case studies on innovation deriving from fundamental research. Research Councils are proud of their “excellence with impact” theme. Is our evidence persuasive? What else should we do?

David Willetts – the evidence is powerfully set out. The absorptive capacity of the economy allows you to benefit from advances around the world. There is a cash constraint, and that means that some things that are desirable are not affordable. I will make the arguments about the role of science in the economy and civilisation. But I won’t be a shop steward – I understand the arguments and will do my best to convey them. I am here to learn from the panellists, and to serve this community.

Martin Taylor. Please put science and innovation at the heart of your plans for the economy – the figures for foreign investment, money coming in with foreign students. Please develop a plan for science that has ambition and vision, and give it stability.

Mary Phillips, UCL. Pleasing to see the role of social sciences highlighted – the social sciences, arts and humanities shouldn’t be neglected.

David Cope, POST. You emphasised diversity at the European level. But you can scale this down to the national level. Diversity is important, but this is in tension with concentrating resources on…

James Woodhouse. You were much less robotic than your predecessors – would you favour research on robots in the home and the hospital? What is your attitude to nuclear power, and will you spend more money on carbon capture?

ANO. Food security – estimates that we will need lots more food, but I don’t see a new green revolution on the way.

DW. A common thread runs through these questions – the argument of John Beddington that there are a set of global challenges from which we are not exempt. We are dumping serious problems on the next generation (see The Pinch!), but we have a repairing lease on the planet. The resource that scientists offer is invaluable. We must revisit agricultural research; energy is crucial; on robotics, Japan is instructive, as robots seem to be their solution to the ageing population. And social science is very important – these challenges are about human behaviour. And I won’t say anything in the middle of a spending review about allocations to different institutions!

Janet Finch. The global challenges are getting much more severe; as China and India grow, we need to see much more collaboration between universities, and this is much more important than having a debate about concentrating funding.

John Browne. The review on HE is taking public evidence this week – one question is trying to understand the difference between a set of world class institutions and a world class system. Carbon capture and storage will be debated for too long and action will be smaller than we expect. It’s a possibility but as cost and scale mounts alternatives will intervene. The discovery of unconventional natural gas will defer the need for CCS.

Hermann Hauser. You should have as much diversity as possible when it comes to blue sky research, but for exploitation there are only a few sectors in which the UK can be world-class.

Mark Walport. When times get tight the temptation is to slash and burn. We must maintain excellence at critical mass. We need a stable policy environment if industry is going to innovate. With all the talk of big society, you don’t have a stable environment – that needs strong central…

Chi Onwurah (Labour MP for Newcastle Central). I am a chartered engineer, Parliament at least is the most diverse environment I have worked in, having been a (black, female) engineer for the last 20 years. We need to attract a very wide range of people into science. How can we attract less well-off people to professional jobs like this?

Joyce Tait (Edinburgh) Enormous opportunities for innovation in life sciences if we can adapt the regulatory environment – a small change in the regulatory system could yield big benefits.

Helga Nowotny. Diversity is a source of creativity. But as Hermann says, you have to look at what stage you mean. And diversity can turn into fragmentation. We need more gender diversity – more women in science, and to ensure that those we have don’t leave. We have a majority of female students, but many leave, as the postdoc lifestyle demands mobility and an insecurity inconsistent with family life. Too much measurement means that people become cynical and learn the rules of the game, at the expense of creativity. For the ERC we see a growing number of applicants. Women make up 14% of professors/advanced grant holders, but things are better for younger women.

Iain Gray. The Big Bang science fair in Manchester brought in many underprivileged children. Regulation is a hugely important area. Maybe there are some special factors in Scotland we could look at.

DW. Stable policy environment – we are trying to operate on the basis of a strong coalition government to last five years. The PM made it clear that he didn’t want to reorganise Whitehall – so we have an opportunity to provide stability. I agree about diversity. On regulation, let’s have some concrete proposals. Many exciting discussions about the difference between risk and hazard and the regulation thereof remain!

Science in the British election

It’s now clear that our election has produced no winners, least of all science. But it’s worth reflecting on what’s worked and what’s not worked in the various efforts there’ve been to raise the profile of science in an election that, in the end, was always going to be dominated by other issues.

The Campaign for Science and Engineering did a great job in extracting statements about science from each of the main parties, which have been published on their excellent blog The Science Vote. The New Scientist blog The S Word has been another excellent source of information and commentary on the campaign to raise science’s profile in the election. Predictably, the parties’ commitments to science have been notably short on detail, particularly on commitments to maintain current levels of science spending, but it’s progress even to have some warm words.

The background has been set with a few heavyweight reports earlier in the year. In March, the Royal Society released its contribution – The Scientific Century (I was on the advisory group for this, which was a fascinating experience), while the Government’s own highest level advisory body, the Council for Science and Technology, produced their own Vision for UK Research (PDF). Three big themes emerged from these reports: the excellence of the UK science base and of the best individual researchers within it; the importance of science and technology for economic growth and our future prosperity; and the need for science to solve the pressing problems the whole world faces – dealing with climate change, moving to a low carbon economy, and keeping a growing population healthy and fed.

Predictably, it has been the economic argument that’s gained the most political traction; the Labour government produced the Hauser review, calling for translational research centres along the lines of Germany’s Fraunhofer institutes, and the Conservatives have their Dyson review, with remarkably similar conclusions. Though the emphasis of both of these contributions is on near-market research, they both at least pay lip-service to the importance of having a strong science base.

We shall see, of course, how much these warm words translate into action. One has to worry, after an election campaign in which all sides have conspicuously failed to confront the really hard choices that a government will face in dealing with the deficit, that the science budget is going to be seen as a soft target, politically, compared to areas such as health, education or defence.

Is there a significant constituency for science, that might impose any political price on cutting science budgets? This election has seen high hopes for social media as a way of mobilising a science voting bloc – see #scivote on Twitter. Looking at this, one sees something that looks very much like an attempt to develop an identity politics for science – the idea that there might be a “science vote”, in the way that people talk (correctly or not) about a “gay vote” or a “Christian vote”. There’s a sense of a community of right-minded people, with leaders from politics and the media, and clear dividing lines from the forces of unreason. What’s obvious, though, is that this strategy hasn’t worked – a candidate standing on a single issue science platform ended up with 197 votes, which compares unfavourably with the 228 votes the Monster Raving Loony Party got in my own, nearby constituency. And Evan Harris, the Liberal Democrat science spokesman and #scivote favourite, lost his own seat.

I think that science is much too important to be treated as a sectional interest; identity politics will never work for science, simply because a serious interest in science for its own sake will only ever be shown by a minority. Instead, support for science must be built from a coalition of people with many different interests and outlooks. For some, the intrinsic wonder of science will be reason enough to support it strongly, but for many others it will be the role of science in the economy, the appeal of medical research, or the importance of science for making the transition to a low carbon economy that persuades them to take the subject seriously.

My congratulations to Dr Julian Huppert, elected Liberal Democrat MP for Cambridge. He’s a research scientist in the Cavendish Laboratory, who will now have a little less time to spend thinking about theoretical biophysics, and a bit more time worrying about science policy, and, I’m sure, many other pressing issues.