Surely there’s more to science than money?

June 15th, 2014

How can we justify spending taxpayers’ money on science when there is so much pressure to cut public spending, and so many other popular things to spend the money on, like the National Health Service? People close to the policy-making process tend to stress that if you want to persuade HM Treasury of the need to fund science, there’s only one argument they will listen to – that science spending will lead to more economic growth. Yet the economic instrumentalism of this argument grates for many people. Surely it must be possible to justify the elevated pursuit of knowledge in less mercenary, less meretricious terms? If our political economy were different, perhaps it would be possible. But in a system in which money is increasingly seen as the measure of all things, it’s difficult to see how things could be otherwise. If you don’t like this situation, it’s not science, but broader society, that you’ve got to change.

The relentless focus on the economic justification of science is relatively recent, but that doesn’t mean that what went before was a golden age. The dominant motivation for state support of science in the twentieth century wasn’t to make money, but to win wars.

Spin-outs and venture capital won’t fill the pharma R&D gap

May 31st, 2014

Now that Pfizer has, for the moment, been rebuffed in its attempt to take over AstraZeneca, it’s worth reflecting on the broader issues this story raised about the pharmaceutical industry in particular and technological innovation more generally. The political attention focused on the question of industrial R&D capacity was very welcome; this was the subject of my last post – Why R&D matters. Less has been said about the broader problems of innovation in the pharmaceutical industry, which I discussed in an earlier post – Decelerating change in the pharmaceutical industry. One of the responses I had to my last post argued that we shouldn’t worry about declining R&D in the pharmaceutical industry, because that represented an old model of innovation that was being rapidly superseded. In the new world, nimble start-ups, funded by far-seeing venture capitalists, are able to translate the latest results from academic life sciences into new clinical treatments in a much more cost-effective way than the old industry behemoths. It’s an appealing prospect that fits in with much currently fashionable thinking about innovation, and one can certainly find a few stories about companies founded that way that have brought useful treatments to market. The trouble, though, is that if we look at the big picture, there is no evidence at all that this new approach is working.

A recent article by Matthew Herper in Forbes – The Cost Of Creating A New Drug Now $5 Billion, Pushing Big Pharma To Change – sets out pharma’s problems very starkly.

Why R&D matters

May 9th, 2014

The takeover bid for the UK/Swedish pharmaceutical company AstraZeneca by US giant Pfizer has given rare political prominence to the issue of UK-based research and development capacity. Underlying much opposition to the deal is the fear that the combined entity will seek to cut costs, and that R&D expenditure will be first in the firing line. This fear is entirely well-founded; since Pfizer took over Wyeth in 2009 it has reduced its total R&D spend from $11bn to $6.7bn, and in the UK Pfizer’s cost-cutting reputation was sealed by the closure of its Sandwich R&D facility in 2011. Nor is the importance of AstraZeneca to UK R&D capacity overstated. In the latest EU R&D scoreboard, of the top 100 companies in the world by R&D expenditure, only two are British: one is AstraZeneca, the other GSK. And if the deal goes ahead and does result in a significant reduction in UK R&D capacity, it wouldn’t be an isolated event. It would be the culmination of a 30-year decline in UK business R&D intensity, which has taken the UK from being one of the most R&D-intensive economies in the developed world to one of the least.

My recent paper “The UK’s Innovation Deficit and How to repair it” analysed this decline in detail and related it to changes in the wider political economy. One response I’ve had to the paper was to regard this decline in R&D intensity as something to be welcomed. In this view, R&D is a legacy of an earlier era of heavy industry and monolithic corporations, now obsolete in a world of open innovation, where valuable intellectual property is more likely to be a brand identity than a new drug or a new electronic device.

I think this view is quite wrong. That doesn’t mean that the kinds of innovation which arise without formal research and development are unimportant; innovations in the way we organise ourselves, to give one example, can create enormous value. Of course, R&D in its modern sense is just such a social innovation.

The economics of innovation stagnation

May 3rd, 2014

What would an advanced economy look like if technological innovation began to dry up? Economic growth would begin to slow, and we’d expect the shortage of opportunities for new, lucrative investments to lead to a period of persistently lower rates of return on capital. The prices of existing income-yielding assets would rise, and as wealth-holders hunted out increasingly rare higher yielding investment opportunities we’d expect to see a series of asset price bubbles. As truly transformative technologies became rarer, when new technologies did come along we might see them being associated with hype and inflated expectations. Perhaps we’d also begin to see growing inequality, as a less dynamic economy cemented the advantages of the already wealthy and gave fewer opportunities to talented outsiders. It’s a picture, perhaps, that begins to remind us of the characteristics of the developed economies now – difficulties summed up in the phrase “secular stagnation”. Could it be that, despite the widespread belief that technology continues to accelerate, innovation stagnation, at least in part, underlies some of our current economic difficulties?

[Figure: growth in real GDP per capita across the G7 nations]
Growth in real GDP per person across the G7 nations. GDP data and predictions from the IMF World Economic Outlook 2014 database; population estimates from the UN World Population Prospects 2012. The solid line is the best fit to the 1980–2008 data of a logistic function of the form A/(1+exp(-(T-T0)/B)); the dotted line represents constant annual growth of 2.6%.
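
The two trend lines in the caption can be sketched numerically. The functional forms below are those given in the caption, but the parameter values are invented purely for illustration – the actual fitted values of A, T0 and B are not quoted in the post.

```python
import math

def logistic(T, A, T0, B):
    """The S-curve form fitted to the 1980-2008 data: A/(1+exp(-(T-T0)/B))."""
    return A / (1.0 + math.exp(-(T - T0) / B))

def constant_growth(T, y0=10000.0, start=1980, rate=0.026):
    """The dotted comparison line: constant 2.6% annual growth from a 1980 base."""
    return y0 * (1.0 + rate) ** (T - start)

def annual_growth(f, T):
    """Year-on-year fractional growth implied by curve f at year T."""
    return f(T + 1) / f(T) - 1.0

# Invented parameters, chosen only to show the qualitative behaviour:
curve = lambda T: logistic(T, A=40000.0, T0=1995.0, B=12.0)

# The point of the fit: growth implied by the S-curve slows over time,
# while the exponential line grows at the same 2.6% every year.
early, late = annual_growth(curve, 1985), annual_growth(curve, 2008)
```

Whatever the parameters, a logistic curve implies growth that decays towards zero, which is exactly the behaviour the post contrasts with the constant-growth assumption.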

The data is clear that growth in the richest economies of the world, the economies operating at the technological leading edge, was slowing down even before the recent financial crisis.

New Dawn Fades?

April 23rd, 2014

Before K. Eric Drexler devised and proselytised for his particular, visionary, version of nanotechnology, he was an enthusiast for space colonisation, closely associated with another, older, visionary for that hypothetical technology – the Princeton physicist Gerard O’Neill. A recent book by historian Patrick McCray – The Visioneers: How a Group of Elite Scientists Pursued Space Colonies, Nanotechnologies, and a Limitless Future – follows this story, setting its origins in the context of its times, and argues that O’Neill and Drexler are archetypes of a distinctive type of actor at the interface between science and public policy – the “Visioneers” of the title. McCray’s visioneers are scientifically credentialed and frame their arguments in technical terms, but they stand at some distance from the science and engineering mainstream, and attract widespread, enthusiastic – and sometimes adulatory – support from broader mass movements, which take their ideas in directions that the visioneers themselves do not always endorse or welcome.

It’s an attractive and sympathetic book, with many insights about the driving forces which led people to construct these optimistic visions of the future.

What’s the best way of harvesting the energy of the sun?

April 7th, 2014

This is another post inspired by my current first year physics course, The Physics of Sustainable Energy (PHY123). Calculations are all rough, order-of-magnitude estimates – if you don’t believe them, try doing them for yourself.

We could get all the energy we need from the sun, in principle. Even from our cloudy UK skies an average of 100 W arrives at the surface per square metre. Each person in the UK uses energy at an average rate of 3.4 kW, so if each of us could harvest the sun from a mere 34 square metres with 100% efficiency, that would do the job. For all 63 million of us, that’s just a bit more than 2,000 square kilometres out of the UK’s total area of 242,900 km2 – less than 1%. What would it take to turn that “in principle” into “in practice”? Here are the problems we have to overcome, in some combination: we need higher efficiencies (to reduce the land area needed), lower costs, the ability to deploy at scale, and the ability to store the energy for when the sun isn’t shining.
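
The arithmetic above can be written out as a quick sanity check; all the numbers are the ones given in the text, and the calculation assumes 100% conversion efficiency, as stated.

```python
insolation_w_m2 = 100.0      # average solar power at the UK surface, per square metre
power_per_person_w = 3.4e3   # average UK per-capita rate of energy use
population = 63e6
uk_area_km2 = 242_900

# Area each person would need at 100% conversion efficiency
area_per_person_m2 = power_per_person_w / insolation_w_m2

# Total area for the whole population, converted to square kilometres
total_area_km2 = area_per_person_m2 * population / 1e6

# Fraction of the UK's land area that this represents
fraction_of_uk = total_area_km2 / uk_area_km2
```

This gives 34 square metres per person, a little over 2,000 square kilometres in total, and under 1% of the UK’s land area – consistent with the figures in the text.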

There are at least four different technological approaches we could use. The most traditional is to use the ability of plants to convert the sun’s energy into fuel molecules; this is cheap, deployable at scale, and provides the energy in easily storable form, but it is not very efficient and so needs a lot of land. The most technologically sophisticated is the solar cell. Solar cells achieve high efficiencies (though still not generally more than about 20–25%), but they cost too much, they are currently deployed at scales orders of magnitude too small, and they produce energy in the hard-to-store form of electricity. Another method is to concentrate the sun’s rays until they can heat a working fluid directly, a technology already in use in sunny places like California and Spain. Further in the future, the prospect of copying nature by using sunshine to synthesise fuel molecules directly – solar fuels – is attractive. How do these technologies compare, and what are their future prospects?
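
One way to compare these approaches is by the land area they imply: at a given conversion efficiency, the area each person needs scales inversely with efficiency. A rough sketch – the ~20% solar-cell figure is from the text, but the 0.5% figure for biomass is my illustrative assumption, not a number from the post:

```python
def area_per_person_m2(efficiency, demand_w=3.4e3, insolation_w_m2=100.0):
    """Land area per person needed to meet average UK demand (3.4 kW) at a
    given sun-to-useful-energy conversion efficiency."""
    return demand_w / (insolation_w_m2 * efficiency)

pv_area = area_per_person_m2(0.20)        # solar cells at ~20% efficiency
biomass_area = area_per_person_m2(0.005)  # biomass at an assumed ~0.5%
```

On these assumptions solar cells need around 170 square metres per person, while biomass needs tens of times more – which is why the text says the plant route “needs a lot of land”.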

We can get a useful baseline by thinking about the most traditional of these technologies – growing firewood.

On universities and economic growth

March 21st, 2014

I wrote this short piece for the online magazine The Conversation as a comment on the government’s response to the Witty Review on universities and economic growth. It was published there as Budget 2014: cash for research set against an overall story of long-term decline; as the new title suggests it was edited to give more prominence to the new science-related announcements in the Budget. Here’s the original version.

Current UK innovation policy has taken on a medieval cast; no sooner do we have “Catapult Centres” for translational research established, than there is a call for “Arrow Projects”. This is the headline recommendation of a report to government by Sir Andrew Witty on the role of universities in driving economic growth. The tip of the arrow, in Witty’s metaphor, is world-class research from our leading universities – behind this tip we should mobilise research institutes and private sector partners to develop new technologies that would drive new economic growth, involving British companies, big and small, in new supply chains.

Last week saw a government response to this report, which warmly welcomed its recommendations while making few actual new commitments to support them. But last week also saw the publication of the latest set of national research and development statistics. Total R&D expenditure – in the private sector, in government laboratories and in the universities – has fallen in both cash and real terms, and in proportion to the size of our economy it is now substantially lower than in established economic rivals such as France, Germany, the USA and Japan, and in emerging economic powers such as Korea and China.

Our continuing economic problems, with stagnating productivity and a continuing inability to produce enough tradable goods to pay our way in the world, suggest that we should worry about how effective our innovation system is for translating science into economic growth.

What should we do about climate change? Two opposing views, and they’re both wrong

March 6th, 2014

In the last 250 years, humanity has become completely dependent on fossil fuel energy. This dependence on fossil fuels has materially changed our climate; these changes will continue and intensify in the future. While uncertainty remains about the future extent and consequences of climate change, there is no uncertainty about the causal link between burning fossil fuel, increasing carbon dioxide concentrations in the atmosphere, and a warming world. This summarises my previous two long posts, about the history of our fossil fuel dependence, and the underlying physics of climate change. What should we do about it? From two ends of the political spectrum, there are two views, and I think they are both wrong.

For the environmental movement, the only thing that stops us moving to a sustainable energy economy right away is a lack of political will. Opposing the “environmentalists” are free-market loving “realists” who (sometimes) accept the reality of human-induced climate change, but balk at the costs of current renewable energy. For them, the correct course of action is to do nothing now (except, perhaps, for some shift from coal to gas), but wait for better technology to come along before making significant moves to address climate change.

The “environmentalists” are right about the urgency of the problem, but they underestimate the degree to which society currently depends on cheap energy, and they overestimate the capacity of current renewable energy technologies to provide cheap enough energy at scale. The “realists”, on the other hand, are right about the degree of our dependence on cheap energy, and about the shortcomings of current renewable technologies. But they underplay the risks of climate change, and their neglect of the small but significant chance of much worse outcomes than the consensus forecasts takes wishful thinking to the point of recklessness.

But the biggest failure of the “realists” is that they don’t appreciate how slowly innovation in energy technology is currently proceeding. This arises from two errors. Firstly, there’s a tendency to believe that technology is a single thing that is accelerating at a uniform rate, so that from the very visible rapid rate of innovation in information and communication technologies we can conclude that new energy technologies will be developed similarly quickly. But this is a mistake: innovation in the realm of materials, of the kind that’s needed for new energy technologies, is much more difficult and slower, and takes more resources, than innovation in the realm of information. While we have accelerating innovation in some domains, in others we have innovation stagnation. Related to this is the second error, which is to imagine that progress in technology happens autonomously: given a need, a technology will automatically emerge to meet that need. But developing new large-scale material technologies needs resources and a collective will, and recently the will to deploy those resources at the necessary scale has been lacking. There’s been a worldwide collapse in energy R&D over the last thirty years; to develop the new technologies we need, we will have not only to reverse this collapse but also to make up the lost ground.

So I agree with the “environmentalists” on the urgency of the problem, and with the “realists” about the need for new technology. But the “realists” need to get realistic about what it will take to develop that new technology.

Climate change: what do we know for sure, and what is less certain?

March 2nd, 2014

In another post inspired by my current first year physics course, The Physics of Sustainable Energy (PHY123), I suggest how a physicist might think about climate change.

The question of climate change is going up the political agenda again; in the UK, recent floods have once again raised the question of whether recent extreme weather can be directly attributed to human-created climate change, or whether such events are likely to be more frequent in the future as a result of continuing human-induced global warming. One UK Energy Minister – Michael Fallon – described the climate change argument as “theology” in a recent interview. Of course, theology is exactly what it’s not. It’s science, based on theory, observation and modelling; some of the issues are very well understood, and some remain more uncertain. There’s an enormous amount of material in the 1,536 pages of the IPCC’s 5th assessment report. But how should we navigate these very complex arguments in a way which makes clear what we know for sure, and what remains uncertain? Here’s my suggestion for a route-map.

My last post talked about how, after 1750 or so, we became dependent on fossil fuels. Since that time we have collectively burned about 375 gigatonnes of carbon – what effect has burning all that carbon had on the environment? The straightforward answer is that there is now a lot more carbon dioxide in the atmosphere than there was in pre-industrial times. For the thousand years before the industrial revolution, the carbon dioxide content of the atmosphere was roughly constant at around 280 parts per million. Since the 19th century it has been increasing significantly; it’s currently just a couple of ppm short of 400, and is still rising by about 2 ppm per year.
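
The size of that rise is easy to check. (The 398 ppm figure below is my reading of “just a couple of ppm short of 400”, so treat it as an assumption.)

```python
preindustrial_ppm = 280.0  # roughly constant for the millennium before industrialisation
current_ppm = 398.0        # assumed, from "a couple of ppm short of 400" in early 2014

# Fractional increase over the pre-industrial level
fractional_increase = (current_ppm - preindustrial_ppm) / preindustrial_ppm
# Comes out at about 0.42 - i.e. the roughly 40% increase referred to in the text.
```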

This 40% increase in carbon dioxide concentration is not in doubt. But how can we be sure it’s associated with burning fossil fuels?

How did we come to depend so much on fossil fuels?

February 23rd, 2014

This is another post inspired by my current first year physics course, The Physics of Sustainable Energy (PHY123).

Each inhabitant of the UK is responsible for consuming, on average, the energy equivalent of 3.36 tonnes of oil every year. 88% of this energy is in the form of fossil fuels (about 35% each for gas and oil, with the rest from coal). This dependence on fossil fuels is something new; premodern economies were powered entirely by the sun. Heat came from firewood, which stores the solar energy collected by photosynthesis for at most a few seasons. Work was done by humans themselves, again using energy that ultimately comes from plant foods, or by draught animals. The transition from traditional, solar-powered economies to modern, fossil-fuel-powered economies was sudden in historical terms – it was probably not until the late 19th century that fossil fuels overtook biomass as the world’s biggest source of energy. The story of how we came to depend on fossil fuels is essentially the story of how modernity developed.
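
That per-capita figure can be turned into an average rate of energy use. The conversion uses the IEA definition of a tonne of oil equivalent (41.868 GJ); treating the 3.36-tonne figure as continuous primary power is my framing, not the post’s.

```python
TOE_JOULES = 41.868e9            # one tonne of oil equivalent, IEA definition
per_capita_toe_per_year = 3.36   # from the text
seconds_per_year = 365.25 * 24 * 3600

# Average continuous power per person, in watts
average_power_w = per_capita_toe_per_year * TOE_JOULES / seconds_per_year

# The 88% fossil-fuel share of that power
fossil_power_w = 0.88 * average_power_w
```

This works out at roughly 4.5 kW of continuous power per person, of which about 3.9 kW comes from fossil fuels.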

The relatively late date of the world’s transition to a fossil fuel based energy economy doesn’t mean that there were no innovations in the way energy was used in premodern times. On the contrary, the run-up to the industrial revolution saw a series of developments that greatly increased the accessibility of energy.