July 18th, 2014
The UK’s innovation system is currently under-performing; the amount of resource devoted to private sector R&D has been too low compared to competitors for many years, and the situation shows no sign of improving. My last post discussed the changes in the UK economy that have led us to this situation, which contributes to the deep-seated problems of the UK economy of very poor productivity performance and persistent current account deficits. What can we do to improve things? Here I suggest three steps.
1. Stop making things worse.
Firstly, we should recognise the damage that has been done to the country’s innovative capacity by the structural shortcomings of our economy and stop making things worse. R&D capacity – including private sector R&D – is a national asset, and we should try to correct the perverse incentives that lead to its destruction.
June 24th, 2014
What’s wrong with the UK’s innovation system is not that we don’t have a strong science base, or even that there isn’t the will to connect the science base to the companies and entrepreneurs who might want to use its outputs. The problem is that our economy isn’t assigning enough resource to pulling through the fruits of the science base into technological innovations – the innovations that will create new products and services, bring economic growth, and help solve some of the biggest social problems we face. The primary symptom of the problem is the UK’s very poor performance in business-funded research and development (R&D). This is the weak link in the UK’s national innovation system, and it is part of a bigger picture of short-termism and under-investment which underlies the UK economy’s serious long-term problems.
For context, it’s worth highlighting two particular features of the UK economy. The first is its very poor productivity growth: on one measure (annualised six-year growth in productivity), we’re currently seeing the worst peace-time performance of the last 150 years. Without productivity growth, there will be no growth in average living standards, and that’s going to lead to an increasingly sour political scene.
The second is the huge current account deficit, which at 5.4% of GDP is worse than in the crisis years of the mid-1970s. Now, as then, the UK is unable to pay its way in the world. Unlike the 1970s, though, there’s no immediate political crisis, no humiliating appeals to the IMF for a bail-out. This time round, overseas investors are happy to finance this deficit by buying UK assets. But this isn’t cost-free. An influx of overseas capital is what is currently driving a price bubble for domestic and commercial property in London, severely unbalancing the economy and leading to a growing gulf between the capital and the regions. The assets being bought include the nation’s key infrastructure in energy and transport; there will be an inevitable loss of control and sovereignty as more of this infrastructure falls into overseas ownership. Chinese money will be paying for any new generation of nuclear power stations that will be built; that will give the UK very little leverage in insisting that some of that investment is spent to create jobs in the UK, and it will be paid for by what will effectively be a tax on everyone’s electricity bills, guaranteed for 35 years.
These are long-term problems, and so is the decline in business R&D intensity. The last thirty years have seen this drop from 1.48% of GDP in 1981 to 1.09% now.
June 15th, 2014
How can we justify spending taxpayers’ money on science when there is so much pressure to cut public spending, and so many other popular things to spend the money on, like the National Health Service? People close to the policy-making process tend to stress that if you want to persuade HM Treasury of the need to fund science, there’s only one argument they will listen to – that science spending will lead to more economic growth. Yet the economic instrumentalism of this argument grates for many people. Surely it must be possible to justify the elevated pursuit of knowledge in less mercenary, less meretricious terms? If our political economy was different, perhaps it would be possible. But in a system in which money is increasingly seen as the measure of all things, it’s difficult to see how things could be otherwise. If you don’t like this situation, it’s not science, but broader society, that you’ve got to change.
The relentless focus on the economic justification of science is relatively recent, but that doesn’t mean that what went before was a golden age. The dominant motivation for state support of science in the twentieth century wasn’t to make money, but to win wars.
May 31st, 2014
Now that Pfizer has, for the moment, been rebuffed in its attempt to take over AstraZeneca, it’s worth reflecting on the broader issues this story raised about the pharmaceutical industry in particular and technological innovation more generally. The political attention focused on the question of industrial R&D capacity was very welcome; this was the subject of my last post – Why R&D matters. Less has been said about the broader problems of innovation in the pharmaceutical industry, which I discussed in an earlier post – Decelerating change in the pharmaceutical industry. One of the responses I had to my last post argued that we shouldn’t worry about declining R&D in the pharmaceutical industry, because that represented an old model of innovation that was being rapidly superseded. In the new world, nimble start-ups, funded by far-seeing venture capitalists, are able to translate the latest results from academic life sciences into new clinical treatments in a much more cost-effective way than the old industry behemoths. It’s an appealing prospect that fits in with much currently fashionable thinking about innovation, and one can certainly find a few stories about companies founded that way that have brought useful treatments to market. The trouble is, though, if we look at the big picture, there is no evidence at all that this new approach is working.
A recent article by Matthew Herper in Forbes – The Cost Of Creating A New Drug Now $5 Billion, Pushing Big Pharma To Change – sets out pharma’s problems very starkly.
May 9th, 2014
The takeover bid for the UK/Swedish pharmaceutical company AstraZeneca by US giant Pfizer has given rare political prominence to the issue of UK-based research and development capacity. Underlying much opposition to the deal is the fear that the combined entity will seek to cut costs, and that R&D expenditure will be first in the firing line. This fear is entirely well-founded; since Pfizer took over Wyeth in 2009 it has reduced total R&D spend from $11bn to $6.7bn, and in the UK Pfizer’s cost-cutting reputation was sealed by the closure of its Sandwich R&D facility in 2011. Nor is the importance of AstraZeneca to UK R&D capacity overstated: in the latest EU R&D scoreboard, only two of the world’s top 100 companies by R&D expenditure are British – AstraZeneca and GSK. And if the deal goes ahead and does result in a significant reduction in UK R&D capacity, it wouldn’t be an isolated event. It would be the culmination of a 30-year decline in UK business R&D intensity, which has taken the UK from being one of the most R&D-intensive economies in the developed world to one of the least.
My recent paper “The UK’s Innovation Deficit and How to repair it” analysed this decline in detail and related it to changes in the wider political economy. One response I’ve had to the paper was to regard this decline in R&D intensity as something to be welcomed. In this view, R&D is a legacy of an earlier era of heavy industry and monolithic corporations, now obsolete in a world of open innovation, where valuable intellectual property is more likely to be a brand identity than a new drug or a new electronic device.
I think this view is quite wrong. This doesn’t mean that I think that those kinds of innovation that arise without formal research and development are not important; innovations in the way we organise ourselves, to give one example, can create enormous value. Of course, R&D in its modern sense is just such a social innovation.
April 7th, 2014
This is another post inspired by my current first-year physics course, The Physics of Sustainable Energy (PHY123). Calculations are all rough, order-of-magnitude estimates – if you don’t believe them, try doing them for yourself.
We could get all the energy we need from the sun, in principle. Even from our cloudy UK skies, an average of 100 W arrives at the surface per square meter. Each person in the UK uses energy at an average rate of 3.4 kW, so if we each could harvest the sun from a mere 34 square meters with 100% efficiency, that would do the job. For all 63 million of us, that’s just a bit more than 2,000 square kilometres out of the UK’s total area of 242,900 km2 – less than 1%. What would it take to turn that “in principle” into “in practice”? Here are the problems we have to overcome, in some combination: we need higher efficiencies (to reduce the land area needed), lower costs, the ability to deploy at scale, and the ability to store the energy for when the sun isn’t shining.
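In the spirit of checking these estimates for yourself, here is a minimal sketch of the arithmetic above, using only the figures given in the text (100 W/m² average insolation, 3.4 kW per person, 63 million people, 242,900 km² of land):

```python
# Order-of-magnitude check of the solar arithmetic above.
insolation = 100.0          # W per square metre, UK average
power_per_person = 3400.0   # W, average per-person rate of energy use
population = 63e6
uk_area_km2 = 242_900.0

# Area needed per person at 100% conversion efficiency
area_per_person = power_per_person / insolation        # square metres
# Total area for the whole population, converted to km^2
total_area_km2 = area_per_person * population / 1e6
fraction_of_uk = total_area_km2 / uk_area_km2

print(f"Per person: {area_per_person:.0f} m^2")        # 34 m^2
print(f"Total: {total_area_km2:.0f} km^2")             # 2142 km^2
print(f"Fraction of UK area: {fraction_of_uk:.1%}")    # 0.9%
```

The result matches the figure in the text: a bit more than 2,000 km², under 1% of the UK's land area.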
There are at least four different technological approaches we could use. The most traditional is to use the ability of plants to convert the sun’s energy into fuel molecules; this is cheap, deployable at scale, and provides the energy in easily storable form, but it’s not very efficient and so needs a lot of land. The most technologically sophisticated is the solar cell. These achieve high efficiencies (though still not generally more than about 20-25%), but they cost too much, they are only available at scales that are still orders of magnitude too small, and produce energy in the hard-to-store form of electricity. Other methods include concentrating the sun’s rays to the extent that they can be used to heat up a working fluid directly, a technology already in use in sunny places like California and Spain, while in the future, the prospect of copying nature by using sunshine to synthesise fuel molecules directly – solar fuels – is attractive. How do these technologies compare and what are their future prospects?
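To see how strongly the land requirement depends on conversion efficiency, the same baseline can be rerun at different efficiencies. Note that only the ~20% figure for solar cells is quoted above; the ~1% for biomass and ~15% for concentrating solar are illustrative round-number assumptions, not figures from the text:

```python
# Land area needed to meet total UK demand as a function of
# conversion efficiency. Baseline as before: 100 W/m^2, 3.4 kW
# per person, 63 million people.
insolation = 100.0          # W per square metre
demand_w = 3400.0 * 63e6    # total UK demand, watts

# Efficiencies are illustrative assumptions, except the ~20%
# quoted in the text for solar cells.
efficiencies = {
    "biomass (assumed ~1%)": 0.01,
    "concentrating solar (assumed ~15%)": 0.15,
    "solar cells (~20%)": 0.20,
}

for tech, eff in efficiencies.items():
    # usable power per m^2 is insolation * eff; divide demand by that
    area_km2 = demand_w / (insolation * eff) / 1e6
    print(f"{tech}: {area_km2:,.0f} km^2")
```

At a biomass-like 1% efficiency the required area (~214,000 km²) approaches the UK's entire land surface, which is why the text says plants "need a lot of land", while a 20%-efficient solar cell brings it down to roughly 10,000 km².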
We can get a useful baseline by thinking about the most traditional of these technologies – growing firewood.