Reflections on the UK’s new Innovation Strategy

The UK published an Innovation Strategy last week; rather than a complete summary and review, here are a few of my reflections on it. It’s a valuable and helpful document, though I don’t think it’s really a strategy yet, if we expect a strategy to give a clear sense of a destination, a set of plans to get there and some metrics by which to measure progress. Instead, it’s another milestone in a gradual reshaping of the UK’s science landscape, following last year’s R&D Roadmap, and the replacement of the previous administration’s Industrial Strategy – led by the Department for Business, Energy and Industrial Strategy – by a Treasury-driven “Plan for Growth”.

The rhetoric of the current government places high hopes on science as a big part of the UK’s future – a recent newspaper article by the Prime Minister promised that “We want the UK to regain its status as a science superpower, and in so doing to level up.” There is a pride in the achievements of UK science, not least in the recent Oxford Covid vaccine. And yet there is a sense of potential not fully delivered. Part of this is down to investment – or the lack of it: as the PM correctly noted: “this country has failed for decades to invest enough in scientific research, and that strategic error has been compounded by the decisions of the UK private sector.”

Last week’s strategy focused, not on fundamental science, but on innovation. As the old saying goes, “Research is the process of turning money into ideas, innovation is turning ideas into money” – and, it should be added, other desirable outcomes for the nation and society – the necessary transition to zero carbon energy, better health outcomes, and the security of the realm in a world that feels less predictable. But the strategy acknowledges that this process hasn’t been working – we’ve seen a decline in productivity growth that’s unprecedented in living memory.

This isn’t just a UK problem – the document refers to an apparent international slowing of innovation in pharmaceuticals and semiconductors. But the problem is worse in the UK than in comparator nations, and the strategy doesn’t shy away from connecting that with the UK’s low R&D intensity, both public and private: “One key marker of this in the UK is our decline in the rate of growth in R&D spending – both public and private. In the UK, R&D investment declined steadily between 1990 and 2004, from 1.7% to 1.5% of GDP, then gradually returned to be 1.7% in 2018. This has been constantly below the 2.2% OECD average over that period.”

One major aspiration that the government is consistent about is the target to increase total UK investment in R&D (public and private) to reach 2.4% of GDP by 2027, from its current value of about 1.7%. As part of this there is a commitment to increase public spending from £14.9 bn this year to £22 bn – by a date that’s not specified in the Innovation Strategy. An increase of this scale should prompt one to ask whether the institutional landscape where research is done is appropriate, and the document announces a new review of that landscape.
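
A back-of-envelope calculation shows what these commitments imply for the private sector. The sketch below is mine, not the strategy’s: it assumes a UK GDP of roughly £2.2 trillion and, for simplicity, ignores GDP growth between now and 2027.

```python
# Rough arithmetic behind the 2.4% R&D intensity target.
# Assumptions (mine, not from the Innovation Strategy): UK GDP ~£2,200bn,
# held constant to 2027 for simplicity.

GDP_BN = 2200.0

current_total = 0.017 * GDP_BN           # ~£37bn: total R&D at ~1.7% of GDP
current_private = current_total - 14.9   # public spend £14.9bn (quoted)

target_total = 0.024 * GDP_BN            # ~£53bn: total R&D at 2.4% of GDP
target_private = target_total - 22.0     # public spend £22bn (quoted)

print(f"private R&D now: ~£{current_private:.0f}bn")       # ~£22bn
print(f"private R&D at target: ~£{target_private:.0f}bn")  # ~£31bn

# Even with public spending at £22bn, private R&D must rise by more than
# a third - and by more still once GDP growth to 2027 is factored in.
```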

Currently the UK’s public research infrastructure is dominated by universities to a degree that is unusual amongst comparator nations. I’m glad to see that the Innovation Strategy doesn’t indulge in what seems to be a widespread urge in other parts of government to denigrate the contribution of HE to the UK’s economy, noting that “in recent years, UK universities have become more effective at attracting investment and bringing ideas to market. Their performance is now, in many respects, competitive with the USA in terms of patents, spinouts, income from IP and proportion of industrial research.” But it is appropriate to ask whether other types of research institution, with different incentive structures and funding arrangements, might be needed in addition to – and to make the most of – the UK’s academic research base.

But there are a couple of fundamentally different types of non-university research institutions. On the one hand, there are institutions devoted to pure science, where investigators have maximum freedom to pursue their own research agendas. Germany’s Max Planck Institutes offer one model, while the Howard Hughes Medical Institute’s Janelia Research Campus, in the USA, has some high profile admirers in UK policy circles. On the other hand, there are mission-oriented institutes devoted to applied research, like the Fraunhofer Institutes in Germany, the Industrial Technology Research Institute in Taiwan, and IMEC (the Interuniversity Microelectronics Centre) in Belgium. The UK has seen a certain amount of institutional evolution in the last decade already, with the establishment of the Turing Institute, the Crick Institute, the Henry Royce Institute, the Rosalind Franklin Institute, the network of Catapult Centres, to name a few. It’s certainly timely to look across the landscape as it is now to see the extent to which these institutions’ missions and the way they fit together in a wider system have crystallised, as well as to ask whether the system as a whole is delivering the outcomes we want as a society.

There is one inescapable factor about the institutional landscape that is seriously underplayed: what we have now is a function of the wider political and economic landscape, and of the way that’s changed over the decades. For example, there’s a case study in the Innovation Strategy of Bell Laboratories in the USA. This was certainly a hothouse of innovation in its heyday, from the 1940’s to the 1980’s – but that reflected its unique position as a private sector laboratory sustained by the monopoly rents of its parent. That changed with the break-up of the Bell System in the 1980’s, itself a function of the deregulatory turn in US politics at the time, and the institution is now a shadow of its former self. Likewise, it’s impossible to understand the drastic scaling back of government research laboratories in the UK in the 1990’s without appreciating the dramatic policy shifts of governments in the 80’s and 90’s. A nation’s innovation landscape reflects wider trends in political economy, and that needs to be understood better and the implications made more explicit.

Alongside the Innovation Strategy was published an “R&D People and Culture Strategy”. This contains lots of aspirations that few would disagree with, but not much in the way of concrete measures to fix things. To connect this with the previous discussion, I would have liked to have seen much more discussion of the connection between the institutional arrangements we have for research, the incentive structure produced by those arrangements, and the culture that emerges. It’s reasonable to complain that people don’t move as easily from industry to academia and back as they used to, but it needs to be recognised that this is because the two have drifted apart; with only a few exceptions, the short-term focus of industry – and the high pressure on academics to publish – makes this mobility more difficult. From this perspective, one question we should ask about our institutional landscape is whether it is the right one to allow the people in the system to flourish and fulfil their potential.

We shouldn’t just ask in what kind of institutions research is done, but also where those institutions are situated geographically. The document contains a section on “Levelling Up and innovation across the UK”, reasserting as a goal that “we need to ensure more places in the UK host world-leading and globally connected innovation clusters, creating more jobs, growth and productivity in those areas.” In the context of the commitment to increase the R&D intensity of the economy, “we are reviewing how we can increase the proportion of total R&D investment, public and private, outside London, the South East, and East of England.”

The big news here, though, is that the promised “R&D and Place Strategy” has been postponed and rolled into the forthcoming “Levelling Up” White Paper, expected in the autumn. If this does take the opportunity of considering in a holistic way how investments in transport, R&D, skills and business support can be brought together to bring about material changes in the productivity of cities and regions that currently underperform, that is not a bad thing. I was a member of the advisory group for the R&D and Place strategy, so I won’t dwell further on this issue here, beyond saying that I recognise many of the issues and policy proposals which that body has discussed, so I await the final “Levelling Up” White Paper with interest.

A strategy does imply some prioritisation, and there are a number of different ways in which one might define priorities. The Coalition Government defined 8 Great Technologies; the 2017 Industrial Strategy was built around “Grand Challenges” and “Sector Deals” covering industrial sectors such as Automotive and Aerospace. The current Innovation Strategy introduces seven “technology families” and a new “Innovation Missions Programme”.

It’s interesting to compare the new “seven technology families” with the old “eight great technologies”. For some, the carry-over is fairly direct, albeit with some wording changes reflecting shifting fashions – robotics and autonomous systems becomes robotics and smart machines, energy and its storage becomes energy and environment technologies, advanced materials and nanotechnology becomes advanced materials and manufacturing, synthetic biology becomes engineering biology. At least two of the original 8 Great Technologies always looked more like industry sectors than technologies – satellites and commercial applications of space, and agri-science. Big data and energy-efficient computing has evolved into AI, digital and advanced computing, reflecting a genuine change in the technology landscape. Regenerative medicine looks like it’s out of favour, replaced in the biomedical area by bioinformatics and genomics. Quantum technology was appended to the “8 great” a year or two later, and this is now expanded to electronics, photonics and quantum.

Interesting though the shifts in emphasis may be, the key issue is the degree to which these high-level priorities are translated into different outcomes in institutions and funding programmes. How, for example, are these priority technology families reflected in advisory structures at the level of UKRI and the research councils? And, most uncomfortable of all, a decision to emphasise some technology families must imply, if it has any real force, a corresponding decision to de-emphasise others.

One suspects that organisation through industrial sectors is out of favour in the new world where HM Treasury is in the driving seat; for HMT a focus on sectors is associated with incumbency bias, with newer fast-growing industries systematically under-represented, and producer capture of relevant government departments and agencies, leading to a degree of policy attention that reflects a sector’s lobbying effectiveness rather than its importance to the economy.

Despite this colder new environment, the ever-opportunistic biomedical establishment has managed to rebrand its sector deal as a “Life Sciences Vision”. The sector lens remains important, though, because industrial sectors do face their own individual issues, all the more so at a time of rapid change. Successfully negotiating the transition to electric vehicles represents an existential challenge to the automotive sector. For the persistently undervalued chemicals sector, withdrawal from the EU regulatory framework – REACH – threatens substantial extra costs and frictions, while the transition to net zero presents both a challenge for this energy-intensive industry and a huge set of new potential markets as the supply chain for new clean-tech industries like batteries is developed.

One very salutary clarification has emerged as a side-effect of the pandemic. The vaccination programme can be held up as a successful exemplar of an “innovation mission”. This emphasises that a “mission” shouldn’t just be a vague aspiration, but a specific engineering project with a product at the end of it – with a matching social infrastructure developed to ensure that the technology is implemented to deliver the desired societal outcome. Thought of this way, a mission can’t just be about discovery science – it may need the development of new manufacturing capacity, new ICT systems, repurposing of existing infrastructures. Above all, a mission needs to be executed with speed, decisiveness, and a willingness to spend money in more than homeopathic quantities, characteristics that aren’t strongly associated with recent UK administrations.

What further innovation missions can we expect? It isn’t characterised in these terms, but the project to build a prototype fusion power reactor – the “Spherical Tokamak for Energy Production” – could be thought of as another one. By no means guaranteed to succeed, it would be a significant development if it did work, and in the meantime it will probably support the spinning out of a number of potentially important technologies for other applications, such as new materials for extreme environments, and further developments in robotics.

Who will define future “innovation missions”? The answer seems to be the new National Science and Technology Council, to be chaired by the Prime Minister and run by the government’s Chief Scientific Advisor, Sir Patrick Vallance, given an expanded role and an extra job title – National Technology Adviser. In the words of the Prime Minister, “It will be the job of the new National Science and Technology Council to signal the challenges – perhaps even to specify the breakthroughs required – and we hope that science, both public and commercial, will respond.”

But there’s a lot to fill in here about the mechanisms through which this will work. How will the NSTC make its decisions – and who will be informing those discussions? How will those decisions be transmitted to the wider innovation ecosystem – government departments and their delivery agencies like UKRI, with its component research councils and the innovation agency Innovate UK? There is a new system emerging here, but the way it will be wired is as yet far from clear.

Fighting Climate Change with Food Science

The false claim that US President Biden’s Climate Change Plan would lead to hamburger rationing has provided a predictably useful attack line for his opponents. But underlying this further manifestation of the polarisation of US politics there is a real issue – producing the food we eat generates substantial greenhouse gas emissions, and a disproportionate share of those emissions comes from eating the meat of ruminants like cattle and sheep.

According to a recent study, US emissions from the food system amount to 5 kg a person a day, and 47% of this comes from red meat. Halving the consumption of animal products would reduce the USA’s greenhouse gas emissions by about 200 million tonnes of CO2 equivalent, a bit more than 3% of the total. In the UK, the official Climate Change Committee recommends that red meat consumption should fall by 20% as part of the trajectory towards net zero greenhouse gas emissions by 2050, with a 50% decrease necessary if progress isn’t fast enough in other areas. At the upper end of the range of possibilities, a complete global adoption of completely animal-free – vegan – diets has been estimated to reduce total global greenhouse gas emissions by 14%.
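
These figures hang together on a quick back-of-envelope check, sketched below. The 5 kg/person/day, the 47% red-meat share and the ~200 Mt saving are quoted from the study above; the US population and total US emissions are round numbers I have assumed, not figures from the article.

```python
# Sanity check of the food-emissions figures quoted above.

US_POPULATION = 330e6          # people (my assumption)
FOOD_KG_PER_PERSON_DAY = 5.0   # kg CO2e per person per day (quoted)
RED_MEAT_SHARE = 0.47          # fraction of food emissions (quoted)
US_TOTAL_MT = 6600.0           # total US emissions, Mt CO2e/yr (my assumption)
QUOTED_SAVING_MT = 200.0       # saving from halving animal products (quoted)

# Total food-system emissions; 1 Mt = 1e9 kg.
food_total_mt = US_POPULATION * FOOD_KG_PER_PERSON_DAY * 365 / 1e9
print(f"US food-system emissions: ~{food_total_mt:.0f} Mt CO2e/yr")    # ~600

print(f"of which red meat: ~{food_total_mt * RED_MEAT_SHARE:.0f} Mt")  # ~280

# The quoted saving as a share of the assumed national total:
print(f"saving is ~{QUOTED_SAVING_MT / US_TOTAL_MT:.1%} of US emissions")  # ~3%
```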

The political reaction to the false story about Biden’s climate change plan illustrates why a global adoption of veganism isn’t likely to happen any time soon, whatever its climate and other advantages might be. But we should be trying to reduce meat consumption, and it’s worth asking whether the development of better meat substitutes might be part of the solution. We are already seeing “plant-based” burgers in the supermarkets and fast food outlets, while more futuristically there is excitement about using tissue culture techniques to produce in vitro, artificial or lab-grown meat. Is it possible that we can use technology to keep the pleasure of eating meat while avoiding its downsides?

I think that simulated meat has huge potential – but that this is more likely to come from the evolution of the currently relatively low-tech meat substitutes rather than the development of complex tissue engineering approaches to cultured meat [1]. As always, economics is going to determine the difference between what’s possible in principle and what is actually likely to happen. But I wonder whether relatively small investments in the food science of making meat substitutes could yield real dividends.

Why is eating meat important to people? It’s worth distinguishing three reasons. Firstly, meat provides an excellent source of nutrients (though with potential adverse health effects if eaten to excess). Secondly, it’s a source of sensual pleasure, with a huge accumulated store of knowledge and technique about how to process and cook it to produce the most delicious results. Finally, eating meat is freighted with cultural, religious and historical significance. What kind of meat one’s community eats (or indeed, whether it eats meat at all), and when families eat or don’t eat particular meats – all of these have deep historical roots. In many societies access to abundant meat is a potent signifier of prosperity and success, both at the personal and national level. It’s these factors that make calls for people to change their diets so politically sensitive to this day.

So is it realistic to imagine replacing meat with a synthetic substitute? The first issue is easy – replacing meat with foods of plant origin of equivalent nutritional quality is straightforward. The third issue is much harder – cultural change is difficult, and some obvious ways of eliminating meat run into cultural problems. A well-known vegetarian cookbook of my youth was called “Not just a load of old lentils” – a telling, but not entirely successful, attempt to counteract an unhelpful stereotype head-on. So perhaps the focus should be on the second issue. If we can produce convincing simulations of meat that satisfy the sensual aspects and fit into the overall cultural preconceptions of what a “proper” meal looks like – in the USA or the UK, burger and fries, or a roast rib of beef – maybe we can meet the cultural issue halfway.

So what is meat, and how can we reproduce it? Lean meat consists of about 75% water, 20% protein and 3% fat. If it was just a question of reproducing the components, synthetic meat would be easy. An appropriate mixture of, say, wheat protein and pea protein (a mixture is needed to get all the necessary amino acids), some vegetable oil, and some trace minerals and vitamins, dispersed in water would provide all the nutrition that meat does. This would be fairly tasteless, of course – but given the well developed modern science of artificial flavours and aromas, we could fairly easily reproduce a convincing meaty broth.
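
To make the point concrete, here is a toy mass balance for such a mixture, scaled to the composition of lean meat given above. The 50:50 wheat/pea split and the choice of rapeseed oil are illustrative assumptions of mine, not a real product formulation.

```python
# A toy formulation matching the macronutrient profile of lean meat
# (roughly 75% water, 20% protein, 3% fat by mass, per the text; the
# remaining ~2% stands in for minerals and vitamins).

TARGET = {"water": 0.75, "protein": 0.20, "fat": 0.03, "other": 0.02}

def formulate(batch_g: float) -> dict:
    """Ingredient masses for one batch, splitting the protein 50:50
    between wheat and pea protein to cover the full amino acid profile."""
    protein_g = TARGET["protein"] * batch_g
    return {
        "water_g": TARGET["water"] * batch_g,
        "wheat_protein_g": protein_g / 2,
        "pea_protein_g": protein_g / 2,
        "rapeseed_oil_g": TARGET["fat"] * batch_g,
        "minerals_vitamins_g": TARGET["other"] * batch_g,
    }

print(formulate(1000))  # ingredient masses for a 1 kg batch
```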

But this, of course, misses out the vital importance of texture. Meat has a complex, hierarchical structure, and the experience of eating it reflects the way that structure is broken down in the mouth and the time profile of the flavours and textures it releases. Meat is made from animal muscle tissue, which develops to best serve what that particular muscle needs to do for the animal in its life. The cells in muscle are elongated to make fibres; the fibres bundle together to create the grain that’s familiar when we cut meat, but they also need to incorporate the connective tissue that allows the muscle to exert forces on the animal’s bones, and the blood-carrying vascular system that conveys oxygen and nutrients to the working muscle fibres. All of this influences the properties of the tissue when it becomes meat. The connective tissue is dominated by the protein material collagen, which consists of long molecules tightly bound together in triple helices.

Muscles that do a lot of work – like the lower leg muscles that make up the beef cuts known as shin or leg – have a lot of connective tissue. These cuts of meat are very tough, but after long cooking at low temperatures the collagen breaks down; the triple helices come apart, and the separated long molecules give a silky texture to the gravy, enhanced by the partial reformation of the helical junctions as it cools. In muscles that do less work – like the underside of the loin that forms the fillet in beef – there is much less connective tissue, and the meat is very tender even without long cooking.

High temperature grilling creates meaty flavours through a number of complex chemical reactions known as Maillard reactions, which are enhanced in the presence of carbohydrates in the flour and sugar that are used for barbecue marinades. Other flavours are fat soluble, carried in the fat cells characteristic of meat from well-fed animals that develop “marbling” of fat layers in the lean muscle. All of these characteristics are developed in the animal reflecting the life it leads before slaughter, and are developed further after butchering, storage and cooking.

In “cultured” meat, individual precursor cells derived from an animal are grown in a suitable medium, using a “scaffold” to help the cells organise to form something resembling natural muscle tissue. There are a couple of key technical issues with this. The first is the need to provide the right growth medium for the cells – an energy source, other nutrients, and the growth factors that mimic the chemical communications between cells in whole organisms.

In the cell culture methods that have been developed for biomedical applications, the starting point for these growth media has been sera extracted from animal sources like cows. These are expensive – and obviously can’t produce an animal-free product. Serum-free growth media have been developed, but they too are expensive; optimising them, scaling them up and reducing their cost represent key barriers to be overcome to make “cultured meat” viable.

The second issue is reproducing the vasculature of real tissue, the network of capillaries that conveys nutrients to the cells. It’s this that makes it much easier to grow a thin layer of cells than a thick, steak-like piece. Hence current proofs of principle of cultured meat are more likely to produce minced meat for burgers than whole cuts.

I think there is a more fundamental problem in making the transition from cells, to tissue, to meat. One can make a three-dimensional array of cells using a “scaffold” – a network of some kind of biopolymer that the cells can attach to, and which guides their growth in the way that a surface does in a thin layer. But we know that the growth of cells is influenced strongly by the mechanical stimuli they are exposed to. This is obvious at the macroscopic scale – muscles that do more work, like leg muscles, grow in a different way than ones that do less – hence the difference between shin of beef and fillet steak. I find it difficult to see how, at scale, one could reproduce these effects in cell culture in a way that produces something that looks like a textured piece of meat rather than a vaguely meaty mush.

I think there is a simpler approach, which builds on the existing plant-based substitutes for meat already available in the supermarket. Start with a careful study of the hierarchical structures of various meats, at scales from the micron to the millimetre, before and after cooking. Isolate the key factors in the structure that produce a particular hedonic response – e.g. the size and dispersion of the fat particles, and their physical state; the arrangement of protein fibres, the disposition of tougher fibres of connective tissue, the viscoelastic properties of the liquid matrix and so on. Simulate these structures using plant derived materials – proteins, fats, gels with different viscoelastic properties to simulate connective tissue, and appropriate liquid matrices, devising processing routes that use physical processes like gelation and phase separation to yield the right hierarchical structure in a scalable way. Incorporate synthetic flavours and aromas in controlled release systems localised in different parts of the structure. All this is a development and refinement of existing food technology.

At the moment, attempting something like this, we have start-ups like Impossible Foods and Beyond Meat, with new ideas and some distinct intellectual property. There are established food multinationals, like Unilever, moving in with their experience in branding and distribution and their deep food science expertise. We already have products, many of which are quite acceptable in the limited market niches they are aiming at (typically minced meat for burgers and sauces). We need to move now to higher-value and more sophisticated products, closer to whole cuts of meat. To do this we need some more basic food science research, drawing on the wide academic base in the life sciences, and integrating this with the chemical engineering needed to make soft matter systems with complex heterogeneous structures at scale, often by non-equilibrium self-assembly processes.

Food science is currently rather an unfashionable area, with little funding and few institutions focusing on it (for example, the UK’s former national Institute of Food Research in Norwich has pivoted away from classical food science to study the effect of the microbiome on human health). But I think the case for doing this is compelling. The strong recent rise in veganism and vegetarianism creates a large and growing market. But it does need public investment, because I don’t think intellectual property in this area will be very easy to defend, and for this reason large R&D investments by individual companies alone may be difficult to justify. Instead we need consortia bringing together multinationals like Unilever and players further downstream in the supply chain, like the manufacturers of ready meals and suppliers to fast food outlets, together with a relatively modest increase in public sector applied research. Food science may not be as glamorous as a new approach to nuclear fusion, but it may turn out to be just as important in the fight against climate change.

[1]. See also this interesting article by Alex Smith and Saloni Shah – The Government Needs an Innovation Policy for Alternative Meats – which makes the case for an industrial strategy for alternative meats, but is more optimistic about the prospects for cell culture than I am.

The Prime Minister’s office asserts control over UK science policy

The Daily Telegraph published a significant article from the Prime Minister about science and technology this morning, to accompany a government announcement “Prime Minister sets out plans to realise and maximise the opportunities of scientific and technological breakthroughs”.

Here are a few key points I’ve taken away from these pieces.

1. There’s a reassertion in the PM’s article of the ambition to raise government spending on science from its current value of £14.9 billion to a new target of £22bn (though no date is attached to this target), together with recognition that this needs to lever in substantially more private sector R&D spending to meet the overall goal of total R&D spending – public and private – reaching 2.4% of GDP. The £22bn spending goal was promised in the March 2020 budget, but had since disappeared from HMT documents.

2. But there’s a strong signal that this spending will be directed to support state priorities: “It is also the moment to abandon any notion that Government can be strategically indifferent”.

3. A new committee, chaired by the Prime Minister, will be set up – the National Science and Technology Council. This will establish those state priorities: “signalling the challenges – perhaps even to specify the breakthroughs required”. This could be something like the ministerial committee recommended in the Nurse Review, which it was proposed would coordinate the government’s response to science and technology challenges right across government.

4. There is an expanded role for the Government Chief Scientific Advisor, Sir Patrick Vallance, as National Technology Advisor, in effect leading the National Science and Technology Council.

5. A new Office for Science and Technology Strategy is established to support the NSTC. This is based in the Cabinet Office – emphasising its whole-of-government remit. Presumably this supersedes, and/or incorporates, the existing Government Office for Science, which is now based in BEIS.

6. There is a welcome recognition of some of the current weaknesses of the UK’s science and innovation system – the article talks about “restoring Britain’s status as a science superpower” (my emphasis), after decades of failure to invest, both by the state and by British industry: “this country has failed for decades to invest enough in scientific research, and that strategic error has been compounded by the decisions of the UK private sector”. The article highlights the UK’s loss of capacity in areas like vaccine manufacture and telecoms.

7. The role of the new funding agency ARIA is defined as looking for “Unknown unknowns”, while NSTC sets out priorities supporting missions like net zero, cyber threats and medical issues like dementia. There is no mention of the UK’s current main funder of upstream research – UKRI – but presumably its role is to direct the more upstream science base to support the missions as defined by NSTC.

8. The role of science and technology in creating economic growth remains important, with an emphasis on scientifically led start-ups and scale-ups, and a reference to “Levelling up” by spreading technology led economic growth outside the Golden Triangle to the whole country.

As always, the effectiveness with which a reorganised structure delivers meaningful results will depend on funding decisions made in the Autumn’s spending review – and thus the degree to which HM Treasury is convinced by the arguments of the NSTC, or compelled by the PM to accept them.

What next for UK Industrial Strategy?

The UK’s industrial strategy landscape was overturned again in the March budget, with the previous strategy (as described in the 2017 White Paper from Greg Clark, Business Secretary in the May government) superseded by a Treasury document: “Build back better: our plan for growth”. Is this merely a “rebranding”, or a more substantial repudiation of the very idea of industrial strategy?

From what I can deduce, it is neither of these extremes – instead it reflects some unresolved tension inside government between two views of how industrial policy should be framed. In one view – traditionally associated with HM Treasury – the government should restrict itself to general measures thought to promote productivity growth across the whole economy, resisting any measures that selectively favour one sector of the economy over another. This is often called “horizontal” industrial policy, in contrast to so-called “vertical” industrial strategy, in which sectors of the economy thought to be of particular importance are singled out for special support. The 2017 White Paper did signal some return to “vertical” industrial strategy, though we can see recent precursors for this going back to Mandelson’s return to the Department for Business, Innovation and Skills (the forerunner of BEIS, the Department for Business, Energy and Industrial Strategy) in 2008, and in the continuing support for sectors such as aerospace, automotive and life sciences since then. It seems that the Treasury “Plan for Growth” marks a swing of the pendulum back towards a focus on “horizontal” industrial policy, though the signals remain somewhat mixed.

The biggest signal of a change of direction following the March budget was the abolition of the Industrial Strategy Council. This was a non-statutory body set up by BEIS to monitor and provide advice about the implementation of the Industrial Strategy, chaired by the Bank of England’s Chief Economist, Andy Haldane, and featuring a stellar array of economists and business people. The Industrial Strategy Council’s final annual report gives a great outline of what an industrial strategy should be – “a programme of supply-side policies to drive prosperity in and across the economy”, whose key ingredients should be “scale, longevity and policy co-ordination”. The regional dimensions of industrial strategy, they say, should be co-created with businesses and regional actors (as we’ve seen in the development of local industrial strategies). The report calls for the use of clear metrics to judge success by, but also for looking “beyond these “traditional” drivers of productivity to measures of social, human, and natural capital, as well as broader welfare impacts”. Naturally, the Industrial Strategy Council thinks it’s a good idea to have an independent – and preferably statutory – body to provide monitoring and advice. There’s a good summary in this FT article by Andy Haldane – UK industrial strategy is dead, long may it live.

Leaving aside the signals that winding up the Industrial Strategy Council might be sending, what’s the substance in the new Treasury document “Build back better: our plan for growth”?

I don’t find a lot to argue with in the diagnosis of the problems. The UK’s poor productivity performance since the global financial crisis is placed front and centre. A telling graph highlights the growing gap in productivity between the UK and France, Germany and the USA, while another graph (shown below) makes it clear that this isn’t just an abstract issue of economics – the stagnation of wages and living standards the UK has seen since the financial crisis closely tracks the productivity slow-down. The UK’s persistent regional disparities in productivity, about which I’ve written at length in the past, are highlighted too, with the problem identified (correctly, in my view) as arising from “cities outside London not fully capturing the benefits of their size”. The analysis of the causes of these issues is somewhat more sketchy, with the Treasury ascribing the problem primarily to persistent low investment in physical capital and skills.


A lost decade. UK Labour productivity and real wages since 2000. From HM Treasury’s Build back better: our plan for growth. Open Government License.

The new framework for Treasury industrial strategy is built on three “pillars for growth” – infrastructure, skills and innovation. This is classical “horizontal” industrial strategy, without a focus on any particular sectors. But there are priorities – three goals, each of rather different character.

The first of these is “levelling up” – a (commendable) commitment to “ensure the benefits of growth are spread to all corners of the UK”, tackling regional disparities in health and education outcomes, supporting struggling towns, and ensuring that “every region and nation of the UK [has] at least one globally competitive city, acting as hotbeds of innovation and hubs of high value activity”. The second is the 2050 net zero greenhouse gas target, where the stress is laid on the number of “green jobs” this will produce. The third priority is the post-Brexit one of “taking advantage of the opportunities that come with our new status as a fully sovereign trading nation” – as “Global Britain”.

The plans for building on these three pillars and three priorities remain vague – some existing commitments are reasserted, such as the plan to increase public infrastructure spending, to meet a total R&D spending target of 2.4% of GDP, to deliver the FE White Paper, to introduce the new science funding agency “ARIA”, and to introduce “Freeports”. Further details are promised later, including an Innovation Strategy and the R&D Places Strategy.

The reduction of emphasis in this document on the sectors that were so prominent in the previous industrial strategy – such as aerospace, automotive, and life sciences – has clearly caused some anxiety in business circles. There’s been a response to this, in the shape of a joint letter from BEIS Secretary of State Kwarteng & Chancellor of the Exchequer Sunak. This emphasises continuity with the previous industrial strategy, asserting that the new plan “builds on the best of the Industrial Strategy from 2017 and makes the most of our strengths right across the economy”. They promise that “this government remains committed to its industrial sectors” and that the existing sector deals (e.g. for Aerospace, Automotive and Life Sciences) will be honoured. And there is the promise of more in the future – “we will follow up the plan for growth with an Innovation Strategy, as well as strategies for net zero, hydrogen and space” and “we will also develop a vision for high-growth sectors and technologies.”

There’s another indication that the words “industrial strategy” may not yet be completely unspeakable in the current government – shortly after the budget, the Ministry of Defence published their “Defence and Security Industrial Strategy”. I think this is positive – another way of creating some priorities in industrial strategy, without entirely going down the sector route, is for the government to focus on strategically important, long-term goals of the state, and systematically to evaluate what innovation is required and what industrial capacity needs to be built to deliver those goals.

What other goals should be pursued besides defence? The obvious two are net zero and healthcare. As a matter of urgency, the government should be developing a long-term Net Zero Industrial Strategy, to accompany a more detailed road-map for the huge job of transforming the UK’s energy economy. And as we recover from the pandemic, there needs to be a refocused Healthcare Industrial Strategy, building on the successes of the old “Life Sciences Strategy” but focusing more on population health, and learning both the positive and negative lessons from the way the UK’s health and life sciences sector responded to the pandemic. The lately departed Industrial Strategy Council produced a very helpful paper on the lessons that industrial strategy should learn from the state’s involvement in the development of the Oxford/AstraZeneca Covid-19 vaccine.

What would worry me most if I were in government is time. “Developing visions” is all well and good, but budgets are now set until 2022, and there are suggestions of funding “pauses” in some parts of the existing industrial strategy, such as the industry research and development supported by the Aerospace Technology Institute. If new programmes are to begin in 2022, they will take time to ramp up. Meanwhile other dates will be creeping up – 2027 is the date for the 2.4% R&D target, which needs the private sector to commit substantial extra funds to business R&D in response to any increase in government R&D. And although 2050 seems far away now for the net zero greenhouse gas target, the scale of the transition, the lifetime of the assets, and the need for innovation to bring down the cost of the transition mean that the next ten years are crucial.

Not least, the latest date the government can hold an election is the end of 2024. Having repealed the Fixed-term Parliaments Act, the government will probably want to use the regained flexibility to hold the election as much as a year early. There are some who say that this is a government that likes to mark its own homework. Ultimately, though, the homework will be marked by the voters. The government has raised high expectations about a return to economic growth and a levelling up of living standards, especially in the so-called “Red Wall” seats of the Midlands and the North. There’s not a lot of time to demonstrate that the country has even started on that journey, let alone made any substantial progress on it. So whatever the government has decided is the future of industrial strategy, it needs to get on with it.

Novavax – another nanoparticle Covid vaccine

The results for the phase III trial of the Novavax Covid vaccine are now out, and the news seems very good – an overall efficacy of about 90% in the UK trial, with complete protection against severe disease and death. The prospects now look very promising for regulatory approval. What’s striking about this is that we now have a third, completely different class of vaccine that has demonstrated efficacy against COVID-19. We have the mRNA vaccines from BioNTech/Pfizer and Moderna, the viral vector vaccine from Oxford/AstraZeneca, and now Novavax, which is described as “recombinant nanoparticle technology”. As I’ve discussed before (in Nanomedicine comes of age with mRNA vaccines), the Moderna and BioNTech/Pfizer vaccines both crucially depend on a rather sophisticated nanoparticle system that wraps up the mRNA and delivers it to the cell. The Novavax vaccine depends on nanoparticles, too, but it turns out that these are rather different in their character and function to those in the mRNA vaccines – and, to be fair, are somewhat less precisely engineered. So what are these “recombinant nanoparticles”?

All three of these vaccine classes – mRNA, viral vector and Novavax – are based around raising an immune response to a particular protein on the surface of the coronavirus – the so-called “spike” protein, which binds to receptors on the surface of target cells at the start of the process through which the virus makes its entrance. The mRNA vaccines and the viral vector vaccines both hijack the mechanisms of our own cells to get them to produce analogues of these spike proteins in situ. The Novavax vaccine is less subtle – the protein itself is used as the vaccine’s active ingredient. It’s synthesised in bioreactors by using a genetically engineered insect virus to infect a culture of cells from a moth caterpillar. The infected cells are harvested and the spike proteins collected and formulated. It’s this stage that, in the UK, will be carried out in the Teesside factory of the contract manufacturer Fujifilm Diosynth Biotechnologies.

The protein used in the vaccine is a slightly tweaked version of the molecule in the coronavirus. The optimal alteration was found by Novavax’s team, led by the scientist Nita Patel, who quickly tried out 20 different versions before hitting on the variety that is most stable and immunologically active. The protein has two complications compared to the simplest molecules studied by structural biologists – it’s a glycoprotein, which means that it has short polysaccharide chains attached at various points along the molecule, and it’s a membrane protein (which means that its structure has to be determined by cryo-transmission electron microscopy, rather than X-ray diffraction). It has a hydrophobic stalk, which sticks into the middle of the lipid membrane that coats the coronavirus, and an active part, the “spike”, attached to this, sticking out into the water around the virus. For the protein to work as a vaccine, it has to have exactly the same shape as the spike protein has when it’s on the surface of the virus. Moreover, that shape changes when the virus approaches the cell it is going to infect – so for best results the protein in the vaccine needs to look like the spike protein at the moment when it’s armed and ready to invade the cell.

This is where the nanoparticle comes in. The spike protein is formulated with a soap-like molecule called Polysorbate 80 (aka Tween 80). This consists of a hydrocarbon tail – essentially the tail group of oleic acid – attached to a sugar-like molecule – sorbitan – to which are attached short chains of ethylene oxide. The whole thing is what’s known as a non-ionic surfactant. It’s like soap, in that it has a hydrophobic tail group and a hydrophilic head group. But unlike soap or common synthetic detergents, the head group, although water soluble, is uncharged. The net result is that in water Polysorbate 80 self-assembles into nanoscale droplets – micelles – in which the hydrophobic tails are buried in the core and the hydrophilic head groups cover the surface, interacting with the surrounding water. The shape and size of the micelles are set by the length of the tail group and the area of the head group; for these molecules the optimum shape is a sphere, probably a few tens of nanometers in diameter.
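
The link between molecular geometry and micelle shape can be made quantitative with the standard surfactant packing argument – a textbook sketch I’m adding here, not something taken from the Novavax papers. For a surfactant whose hydrophobic tail has volume $v$ and fully extended length $l_c$, and whose head group prefers an interfacial area $a_0$, the critical packing parameter is

$$ p = \frac{v}{a_0\, l_c} . $$

For $p < 1/3$ the favoured aggregate is a spherical micelle; for $1/3 < p < 1/2$, cylindrical micelles; for $p \approx 1$, flat bilayers. A single-tailed surfactant with a bulky, strongly hydrated head group, like Polysorbate 80, has small $v$ and large $a_0$, hence small $p$ and spherical micelles – while the double-tailed lipids of cell membranes have $p \approx 1$ and form bilayers.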

As far as the spike proteins are concerned, these somewhat squishy nanoparticles look a bit like the membrane of the virus, in that they have an oily core that the stalks can be buried in. When the proteins, having been harvested from the insect cells and purified, are mixed with a Polysorbate 80 solution, they end up stuck into the sphere like a bunch of whole cloves stuck into a mandarin orange. Typically each nanoparticle will have about 14 spikes. It has to be said that, in contrast to the nanoparticles carrying the mRNA in the BioNTech and Moderna vaccines, neither the component materials nor the process for making the nanoparticles is particularly specialised. Polysorbate 80 is a very widely used, and very cheap, chemical, extensively used as an emulsifier in convenience food and an ingredient in cosmetics, as well as in many other pharmaceutical formulations, and the formation of the nanoparticles probably happens spontaneously on mixing (though I’m sure there are some proprietary twists and tricks to get it to work properly; there usually are).

But the recombinant protein nanoparticles aren’t the only nanoparticles of importance in the Novavax vaccine. It turns out that simply injecting a protein as an antigen doesn’t usually provoke a strong enough immune response to work as a good vaccine. In addition, one needs to use one of the slightly mysterious substances called “adjuvants” – chemicals that, through mechanisms that are probably still not completely understood, prime the body’s immune system and provoke it to make a stronger response. The Novavax vaccine uses as an adjuvant another nanoparticle – a complex of cholesterol and phospholipid (major components of our own cell membranes, widely available commercially) together with molecules called saponins, which are derived from the Chilean soap-bark tree.

Similar systems have been used in other vaccines, both for animal diseases (notably foot and mouth) and human ones. The Novavax adjuvant technology was developed by a Swedish company, Isconova AB, which was bought by Novavax in 2013, and consists of two separate fractions of Quillaja saponins, separately formulated into 40 nm nanoparticles and mixed together. The Chilean soap-bark tree is commercially cultivated – the raw extract is used, for example, in the making of the traditional US soft drink, root beer – but production will need to be stepped up (and possibly redirected from fizzy drinks to vaccines) if these vaccines turn out to be as successful as it now seems they might.

Sources: This feature article on Novavax in Science is very informative, though I believe the cartoon of the nanoparticle is unlikely to be accurate – it depicts the particle as cylindrical when it is much more likely to be spherical, and as based on double-tailed lipids rather than the single-tailed non-ionic surfactant that is in fact used in the formulation. This is the most detailed scientific article from the Novavax scientists describing the vaccine and its characterisation. The detailed nanostructure of the vaccine protein in its formulation is described in this recent Science article. The “Matrix-M” adjuvant is described here, while the story of the Chilean soap-bark tree and its products is told in this very nice article in The Atlantic Magazine.

Rubber City Rebels

I’m currently teaching a course on the theory of what makes rubber elastic to Materials Science students at Manchester, and this has reminded me of two things. The first is that this is a great topic through which to introduce a number of the most central concepts of polymer physics – the importance of configurational entropy, the universality of the large-scale statistical properties of macromolecules, the role of entanglements. The second is that the city of Manchester has played a recurring role in the history of the development of this bit of science, which, as always, interacts with technological development in interesting and complex ways.

One of the earliest quantitative studies of the mechanical properties of rubber was published by that great Manchester physicist, James Joule, in 1859. As part of his investigations of the relationship between heat and mechanical work, he measured the temperature change that occurs when rubber is stretched. As anyone can find out for themselves with a simple experiment, rubber is an unusual material in this respect. If you take an elastic band (or, better, a rubber balloon folded into a narrow strip), hold it close to your upper lip, suddenly stretch it and then put it to your lip, you can feel that it significantly heats up – and then, if you release the tension again, it cools down again. This is a crucial observation for understanding how it is that the elasticity of rubber arises from the reduction in entropy that occurs when a randomly coiled polymer strand is stretched.
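
The thermodynamics behind this observation can be set out in a few lines – a standard textbook argument, in modern notation rather than Joule’s. For a strip of rubber of length $L$ held at tension $f$, we have $dU = T\,dS + f\,dL$, so with the Helmholtz free energy $F = U - TS$,

$$ f = \left(\frac{\partial F}{\partial L}\right)_T = \left(\frac{\partial U}{\partial L}\right)_T - T\left(\frac{\partial S}{\partial L}\right)_T . $$

For an ideal rubber the internal energy barely changes on stretching, so the tension is almost purely entropic: $f \approx -T(\partial S/\partial L)_T$, positive because stretching reduces the entropy of the coiled chains. Both of the historical observations follow at once. Stretching adiabatically, $(\partial T/\partial L)_S = -(T/C_L)(\partial S/\partial L)_T = f/C_L > 0$: the band warms, as the lip test shows. And the Maxwell relation $(\partial f/\partial T)_L = -(\partial S/\partial L)_T = f/T > 0$ means the tension of a held strip rises with temperature – equivalently, a strip under constant load contracts on heating, which is Gough’s other observation.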

But this wasn’t the first observation of the effect – Joule himself referred to an 1805 article by John Gough, in the Memoirs of the Manchester Literary and Philosophical Society, drawing attention to this property of natural rubber, and the related property that a strand of the material held under tension would contract on being heated. Gough himself was a fascinating figure – a Quaker from Kendal, a town on the edge of England’s Lake District. Blind as a result of a childhood illness, he made a living as a mathematics tutor, and was a friend of John Dalton, the Manchester-based pioneer of the atomic hypothesis. All of this is a reminder of the intellectual vitality of that time in the fast-industrialising provinces, truly an “age of improvement”, while the universities of Oxford and Cambridge had slipped into the torpor of qualifying the dim younger offspring of the upper classes to become Anglican clergymen.

Joule’s experiments were remarkably precise, but there was another important difference from Gough’s pioneering observation. Joule was able to use a much improved version of the raw natural rubber (or caoutchouc) that Gough used; the recently invented process of vulcanisation produced a much stronger, stabler material than the rather gooey natural precursor. The original discovery of the process of vulcanisation was made by the self-taught American inventor Charles Goodyear, who found in 1839 that rubber could be transformed by being heated with sulphur. It wasn’t for nearly another century that the chemical basis of this process was understood – the sulphur creates chemical bridges between the long polymer molecules, forming a covalently bound network. Goodyear’s process was rediscovered – or possibly reverse engineered – by the industrialist Thomas Hancock, who obtained the English patents for it in 1843 [2].

Appropriately for Manchester, the market that Hancock was serving was for improved raincoats. The Scottish industrialist Charles Macintosh had created his eponymous garment from a waterproof fabric consisting of a sandwich of rubber between two textile sheets; Hancock, meanwhile, had developed a number of machines and technologies for processing natural rubber, so it was natural for the two to enter into partnership, with their Manchester factory making waterproof fabric. Their firm prospered; Goodyear, though, failed to make money from his invention and died in poverty (the Goodyear tire company was named after him, but only some years after his death).

At that time, rubber was a product of the Amazonian rain forest, harvested from wild trees by indigenous people. In a well-known story of colonial adventurism, 70,000 seeds of the rubber tree were smuggled out of Brazil by the explorer Henry Wickham and successfully cultivated at Kew Gardens, with the plants exported to the British colonies of Malaya and Ceylon to form the basis of a new plantation rubber industry. This expansion and industrialisation of the cultivation of rubber came at an opportune time – the invention of the pneumatic tyre and the development of the automobile industry led to a huge new demand for rubber around the turn of the century, which the new plantations were in a position to meet.

Wild rubber was also being harvested to meet this demand in the Belgian Congo, involving an atrocious level of violent exploitation of the indigenous population by the colonisers. But most of the rubber being produced to meet the new demand came from the British Empire plantations; this cultivation may not have been accompanied by the atrocities committed in the Congo, but the competitive prices at which plantation rubber could be produced reflected not just the capital invested and high productivity achieved, but also the barely subsistence wages paid to the workforce, imported from India and China.

Back in England, in 1892 the Birmingham-based chemist William Tilden had demonstrated that rubber could be synthesised from turpentine [3]. But this invention created little practical interest in England – and why would it, given that the natural product was of a very high quality, and the British Empire had successfully secured ample supplies through its colonial plantations? The process was rediscovered by the Russian chemist Kondakov in 1901, and taken up by the German chemical company Bayer in time for the synthetic product to play a role in the First World War, when German access to plantation rubber was blocked by the allies. At this time the quality of the synthetic product was much worse than that of natural rubber; nonetheless German efforts to improve synthetic rubber continued in the 1920’s and 30’s, with important consequences in the Second World War.

It’s sobering[4] to realise that by 1919 the rubber industry constituted a global industry with an estimated value of £250 million (perhaps £12 billion in today’s money), on the cusp of a further massive expansion driven by the mass adoption of the automobile – and yet scientists were completely ignorant, not just of the molecular origins of rubber’s elasticity, but even of the very nature of its constituent molecules. It was the German chemist Hermann Staudinger who, in 1920, suggested that rubber was composed of very long, linear molecules – polymers. Obvious though this may seem now, it was a controversial suggestion at the time, creating bitter disputes in the community of German chemists – disputes that gained a political tinge with the rise of the Nazi regime. Staudinger remained in Germany throughout the Second World War, despite being regarded as deeply ideologically suspect.

Staudinger was right about rubber being made up of long-chain molecules, but he was wrong about the form those molecules would take, believing that they would naturally adopt the form of rigid rods. The Austrian scientist Herman Mark, who was working for the German chemical combine IG Farben on synthetic rubber and other early polymers, realised that these long molecules would be very flexible and take up a random coil conformation. Mark’s father was Jewish, so he left IG Farben, first for Austria, and then after the Anschluss he escaped to Canada. At the University of Vienna in the 1930’s, Mark developed, with Eugene Guth, the statistical theory that explains the elastic behaviour of rubber in terms of the entropy changes in the chains as they are stretched and unstretched. This, at last, provided the basic explanation for the effect Gough discovered more than a century before, and that Joule quantified – the rise of temperature that occurs when rubber is stretched.
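
In modern notation, the core of the Mark–Guth theory is compact enough to state here (a textbook summary, not their original formulation). The number of configurations of an ideal chain of $N$ links of length $b$ with end-to-end distance $r$ is Gaussian, so its entropy is

$$ S(r) = S_0 - \frac{3 k_B r^2}{2 N b^2} , $$

and a purely entropic tension $f = -T\,\partial S/\partial r$ gives

$$ f = \frac{3 k_B T}{N b^2}\, r . $$

Each chain behaves as a linear spring whose stiffness is proportional to temperature – which is why stretched rubber warms and a loaded strip contracts on heating. Applied to a network of $n$ strands per unit volume deforming affinely, the same argument yields the classic stress–strain relation for a uniaxial stretch $\lambda$: nominal stress $\sigma = n k_B T (\lambda - \lambda^{-2})$.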

By the start of the Second World War, both Mark and Guth found themselves in the USA, where the study of rubber was suddenly to become very strategically important indeed. The entry of Japan into the war and the fall of British Malaya cut off allied supplies of natural rubber, leading to a massive scale-up of synthetic rubber production. Somewhat ironically, this was based on the pre-war discovery by IG Farben of a version of synthetic rubber that offered a great improvement in properties over previous versions – styrene-butadiene rubber (Buna-S). Standard Oil of New Jersey had an agreement with IG Farben to codevelop and market Buna-S in the USA.

The creation, almost from scratch, of a massive synthetic rubber industry in the USA was, of course, just one dimension of the USA’s World War 2 production miracle, but its scale is still astonishing [5]. The industry scaled up, under government direction, from producing 231 tons of general purpose rubber in 1941, to a monthly output of 70,000 tons in 1945. 51 new plants were built to produce the massive amounts of rubber needed for aircraft, tanks, trucks and warships. The programme was backed up by an intensive R&D effort, involving Mark, Guth, Paul Flory (later to win the Nobel prize for chemistry for his work on polymer science) and many others.

There was no significant synthetic rubber programme in the UK in the 1920’s and 1930’s. The British Empire was at its widest extent, providing ample supplies of natural rubber, as well as new potential markets for the material. That didn’t mean that there was no interest in improving scientific understanding of the material – on the contrary, the rubber producers in Malaya first sponsored research at Cambridge and Imperial College, then collectively created a research laboratory in England, led by a young physical chemist from near Manchester, Geoffrey Gee. Gee, together with Leslie Treloar, applied the new understanding of polymer physics to understand and control the properties of natural rubber. After the war, realising that synthetic rubber was no longer just an inferior substitute, but a major threat to the markets for natural rubber, Gee introduced a programme of standardisation of rubber grades which helped the natural product maintain its market position.

Gee moved to the University of Manchester in 1953, and some time later Treloar moved to the neighbouring institution, UMIST, where he wrote the classic textbook on rubber elasticity. Manchester in the 1950’s and 60’s was a centre of research into rubber and networks of all kinds. Perhaps the most significant new developments were made in theory, by Sam Edwards, who joined Manchester’s physics department in 1958. Edwards was a brilliant theoretical physicist, who had learnt the techniques of quantum field theory with Julian Schwinger as a postdoc at Harvard. Edwards, whose interest in the fundamental problems of polymer physics had been sparked by Gee, realised that there are some deep analogies between the mathematics of polymer chains and the quantum mechanical description of the behaviour of electrons. He was able to rederive, in a much more rigorous way that demonstrated the universality of the results, some of the fundamental predictions of polymer physics that had been postulated by Flory, Mark, Guth and others, before going on to results of his own of great originality and importance.

Edwards’s biggest contribution to the theory of rubber elasticity was to introduce methods for dealing with the topological constraints that occur in dense, cross-linked systems of linear chains. Polymer chains are physical objects that can’t cross each other, something that the classical theories of Guth and Mark completely neglect. But it was by then obvious that the entanglements of polymer molecules could themselves behave as cross-links, even in the absence of the chemical cross linking of vulcanisation (in fact, this is already suggested by Gough’s original 1805 observations, which were made on raw, unvulcanised, rubber). Edwards introduced the idea of a “tube” to represent those topological constraints. Combined with the insight of the French physicist Pierre-Gilles de Gennes, this led not just to improved models for rubber elasticity taking account of entanglements, but to a complete molecular theory of the complex viscoelastic behaviour of polymer melts [6].
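The headline consequence of the tube idea is easiest to state as a scaling law – a standard result of the Edwards–de Gennes theory, not something spelled out in the sources I cite here. A chain confined to its tube can only relax by sliding (“reptating”) along its own contour, so for entangled chains of N units the longest relaxation time and the melt viscosity grow steeply with chain length,

\[ \tau_{\mathrm{rep}} \sim N^{3}, \qquad \eta \sim N^{3}, \]

with experiments giving an exponent closer to 3.4 – near enough that the tube model has remained the foundation of polymer melt rheology ever since.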

Another leading physicist who emerged from this Manchester school was Julia Higgins, who learnt about polymers while she was a research fellow in the chemistry department in the 1960’s. Higgins subsequently worked in Paris, where in 1974 she carried out, with Cotton, des Cloizeaux, Benoit and others, what I think might be one of the most important single experiments in polymer science. Using a neutron source to study the scattering from a melt of polymer molecules, some of which were deuterium labelled, they were able to show that even in the dense, entangled environment of a polymer melt, a single polymer chain still behaves as a classical random walk. This is in contrast with the behaviour of polymers in solution, where the chains are expanded by a so-called “excluded volume” interaction – arising from the fact that two segments of a single polymer chain can’t be in the same place at the same time. This result had been anticipated by Flory, in a rather intuitive and non-rigorous way, but it was Edwards who proved it rigorously.
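In scaling terms, what the Paris experiment settled can be put like this (textbook results, my addition): a random-walk chain of N segments of length b has a size growing as the square root of N, while excluded volume swells an isolated chain in good solvent to the larger Flory exponent,

\[ R_{\mathrm{melt}} \approx b N^{1/2}, \qquad R_{\mathrm{good\ solvent}} \approx b N^{3/5}. \]

Finding the N^{1/2} law in the melt was direct evidence that the excluded volume interaction is screened out when a chain is surrounded by chemically identical neighbours – just as Flory had guessed, and Edwards had shown.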

[1] My apologies for the rather contrived title. No-one calls Manchester “Rubber City” – it is traditionally a city built on cotton. The true Rubber City is, of course, Akron Ohio. Neither can anyone really describe any of the figures I talk about here as “rebels” (with the possible exception of Staudinger, who in his way is rather a heroic figure). But as everyone knows [7], Akron was a centre of musical creativity in the mid-to-late 1970s, producing bands such as Devo, Pere Ubu, and the Rubber City Rebels, whose eponymous song has remained a persistent earworm for me since the late 1970’s, and from which I’ve taken my title.
[2] And I do mean “English” here, rather than British or UK – it seems that Scotland had its own patent laws then, which, it turns out, influenced the subsequent development of the rubber boot industry.
[3] It’s usually stated that Tilden succeeded in polymerising isoprene, but a more recent reanalysis of the original sample of synthetic rubber has revealed that it is actually poly(2,3-dimethylbutadiene) (https://www.sciencedirect.com/science/article/pii/S0032386197000840).
[4] At least, it’s sobering for scientists like me, who tend to overestimate the importance of having a scientific understanding to make a technology work.
[5] See “U.S. Synthetic Rubber Program: National Historic Chemical Landmark” – https://www.acs.org/content/acs/en/education/whatischemistry/landmarks/syntheticrubber.html
[6] de Gennes won the 1991 Nobel Prize for Physics for his work on polymers and liquid crystals. Many people, including me, strongly believed that this prize should have been shared with Sam Edwards. It has to be said that both men, who were friends and collaborators, dealt with this situation with great grace.
[7] “Everyone” here meaning those people (like me) born between 1958 and 1962 who spent too much of their teenage years listening to the John Peel show.

How does the UK rank as a knowledge economy?

Now that the UK has withdrawn from the European single market, it will need to rethink its current and potential future position in the world economy. Some helpful context is provided, perhaps, by statistics summarising the value added from knowledge and technology intensive industries, taken from the latest edition of the USA’s National Science Board Science and Engineering Indicators 2020.

The plot shows the changing share of world value added in a set of knowledge & technology intensive industries, as defined by an OECD industry classification based on R&D intensity. This includes five high R&D intensive industries: aircraft; computer, electronic, and optical products; pharmaceuticals; scientific R&D services; and software publishing. It also includes eight medium-high R&D intensive industries: chemicals (excluding pharmaceuticals); electrical equipment; information technology (IT) services; machinery and equipment; medical and dental instruments; motor vehicles; railroad and other transportation; and weapons. It’s worth noting that, in addition to high value manufacturing sectors, it includes some knowledge intensive services. But it does exclude public knowledge intensive services in education and health care, and, in the private sector, financial services and those business services outside R&D and IT services.

From this plot we can see that the UK is a small but not completely negligible part of the world’s advanced economy. This is perhaps a useful perspective from which to view some of the current talk of world-beating “global Britain”. The big story is the huge rise of China, and in this context it is inevitable that the rest of the world’s share of the advanced economy has fallen. But the UK’s fall is larger than that of its competitors (-46%, cf -19% for the USA and -13% for the rest of the EU).

The absolute share tells us about the UK’s overall relative importance in the world economy, and should be helpful in stressing the need, in developing industrial strategy, for some focus. Another perspective is provided if we normalise the figures by population, giving a sense of the knowledge intensity of the economy, which might in turn give a pointer to prospects for future productivity growth. The table shows a rank ordered list by country of value added in knowledge & technology intensive industries per head of population in 2002 and 2018. The values for Ireland & possibly Switzerland may be distorted by transfer pricing effects.

Measuring up the UK Government’s ten-point plan for a green industrial revolution

Last week saw a major series of announcements from the government about how they intend to set the UK on the path to net zero greenhouse gas emissions. The plans were trailed in an article (£) by the Prime Minister in the Financial Times, with a full document published the next day – The ten point plan for a green industrial revolution. “We will use Britain’s powers of invention to repair the pandemic’s damage and fight climate change”, the PM says, framing the intervention as an innovation-driven industrial strategy for post-covid recovery. The proposals are patchy, insufficient by themselves – but we should still welcome them as beginning to recognise the scale of the challenge. There is a welcome understanding that decarbonising the power sector is not enough by itself. Emissions from transport, industry and domestic heating are all recognised as important, and there is a nod to the potential for land-use changes to play a significant role. The new timescale for the phase-out of petrol and diesel cars is really significant, if it can be made to stick. So although I don’t think the measures yet go far enough or fast enough, one can start to see the outline of what a zero-emission economy might look like.

In outline, the emerging picture seems to be of a power sector dominated by offshore wind, with firm power provided either by nuclear or fossil fuels with carbon capture and storage. Large scale energy storage isn’t mentioned much, though possibly hydrogen could play a role there. Vehicles will predominantly be electrified, and hydrogen will have a role for hard to decarbonise industry, and possibly domestic heating. Some hope is attached to the prospect for more futuristic technologies, including fusion and direct air capture.

To move on to the ten points, we start with a reassertion of the Manifesto commitment to achieve 40 GW of offshore wind installed by 2030. How much is this? At a load factor of 40%, this would produce 140 TWh a year; for comparison, in 2019, we used a total of 346 TWh of electricity. Even though this falls a long way short of what’s needed to decarbonise power, a build out of offshore wind on this scale will be demanding – it’s a more than four-fold increase on the 2019 capacity. We won’t be able to expand the capacity of offshore wind indefinitely using current technology – ultimately we will run out of suitable shallow water sites. For this reason, the announcement of a push for floating wind, with a 1 GW capacity target, is important.
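It’s worth making that arithmetic explicit – a minimal back-of-envelope sketch in Python, using only the figures quoted above:

    # Back-of-envelope check on the 40 GW offshore wind target,
    # using only the figures quoted in the text.
    capacity_gw = 40.0       # installed capacity target for 2030
    load_factor = 0.40       # assumed average load factor for offshore wind
    hours_per_year = 8760
    demand_2019_twh = 346    # total UK electricity use in 2019

    output_twh = capacity_gw * load_factor * hours_per_year / 1000  # GWh -> TWh
    print(f"Annual output: {output_twh:.0f} TWh")                       # ~140 TWh
    print(f"Share of 2019 demand: {output_twh / demand_2019_twh:.0%}")  # ~41%

Note that the load factor is an assumption; newer, larger turbines in deeper water may do somewhat better.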

On hydrogen, the government is clearly keen, with the PM saying “we will turn water into energy with up to £500m of investment in hydrogen”. Of course, even this government’s majority of 80 isn’t enough to repeal the laws of thermodynamics; hydrogen can only be an energy store or vector. As I’ve discussed in an earlier post (The role of hydrogen in reaching net zero), hydrogen could have an important role in a low carbon energy system, but one needs to be clear about how the hydrogen is made in a zero-carbon way, and how it is used, and this plan doesn’t yet provide that clarity.

The document suggests the first use will be in a natural gas blend for domestic heating, with a hint that it could be used in energy intensive industry clusters. The commitment is to create 5 GW of low carbon hydrogen production capacity by 2030. Is this a lot? Current hydrogen production amounts to 3 GW (27 TWh/year), used in industry and (especially) for making fertiliser, though none of this is low carbon hydrogen – it is made from natural gas by steam methane reforming. So this commitment could amount to building another steam methane reforming plant and capturing the carbon dioxide – this might be helpful for decarbonising industry, on Deeside or Teesside perhaps. To give a sense of scale, total natural gas consumption in industry and homes (not counting electricity generation) equates to 58 GW (512 TWh/year), so this is no more than a pilot. In the longer term, making hydrogen by electrolysis and/or process heat from high temperature fission is more likely to be the scalable and cost-effective solution, and it is good that Sheffield’s excellent ITM Power gets a namecheck.
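The same conversion makes the hydrogen numbers comparable – a sketch that treats each quoted GW figure as continuous output over a full year:

    # Convert the quoted capacities into annual energy (TWh/year),
    # treating each GW figure as continuous output.
    def gw_to_twh_per_year(gw):
        return gw * 8760 / 1000

    current_h2 = gw_to_twh_per_year(3)    # ~26 TWh/yr, today's fossil-derived hydrogen
    target_h2 = gw_to_twh_per_year(5)     # ~44 TWh/yr, the 2030 low carbon target
    gas_demand = gw_to_twh_per_year(58)   # ~508 TWh/yr, gas burned in industry and homes

    print(f"2030 target: {target_h2:.0f} TWh/yr, "
          f"or {target_h2 / gas_demand:.0%} of current gas demand")  # ~9%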

On nuclear power, the paper does lay out a strategy, but is light on the details of how this will be executed. For more detail on what I think has gone wrong with the UK’s nuclear strategy, and what I think should be done, see my earlier blogpost: Rebooting the UK’s nuclear new build programme. The plan here seems to be for one last heave on the UK’s troubled programme of large scale nuclear new build, followed up by a possible programme implementing a light water small modular reactor, with research on a new generation of small, high temperature, fourth generation reactors – advanced modular reactors (AMRs). There is a timeline – large-scale deployment of small modular reactors in the 2030’s, together with a demonstrator AMR around the same timescale. I think this would be realistic if there were a wholehearted push to make it happen, but all that is promised here is a research programme, at the level of £215m for SMRs and £170m for AMRs, together with some money for developing the regulatory and supply chain aspects. This keeps the programme alive, but hardly supercharges it. The government must come up with the financial commitments needed to start building.

The most far-reaching announcement here is in the transport section – a ban on sales of new diesel and petrol cars after 2030, with hybrids permitted until 2035, after which only fully battery electric vehicles will be on sale. This is a big deal – a major effort will be required to create the charging infrastructure (£1.3 bn is ear-marked for this), and there will need to be potentially unpopular decisions on tax or road charging to replace the revenue from fuel tax. For heavy goods vehicles the suggestion is that we’ll have hydrogen vehicles, but all that is promised is R&D.

For public transport the solutions are fairly obvious – zero-emission buses, bikes and trains – but there is a frustrating lack of targets here. Sometimes old technologies are the best – there should be a commitment to electrify all inter-city and suburban lines as fast as feasible, rather than the rather vague statement that “we will further electrify regional and other rail routes”.

In transport, though, it’s aviation that is the most intractable problem. Three intercontinental trips a year can double an individual’s carbon footprint, but it is very difficult to see how one can do without the energy density of aviation fuel for long-distance flight. The solutions offered look pretty unconvincing to me – “we are investing £15 million into FlyZero – a 12-month study, delivered through the Aerospace Technology Institute (ATI), into the strategic, technical and commercial issues in designing and developing zero-emission aircraft that could enter service in 2030.” Maybe it will be possible to develop an electric aircraft for short-haul flights, but it seems to me that the only way of making long-distance flying zero-carbon is by making synthetic fuels from zero-carbon hydrogen and carbon dioxide from direct air capture.
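The chemistry of that last route is well established, even if the economics are daunting – reverse water-gas shift followed by Fischer–Tropsch synthesis (my illustration; the plan itself doesn’t specify a route):

\[ \mathrm{CO_2} + \mathrm{H_2} \rightarrow \mathrm{CO} + \mathrm{H_2O}, \qquad n\,\mathrm{CO} + 2n\,\mathrm{H_2} \rightarrow (\mathrm{CH_2})_n + n\,\mathrm{H_2O}. \]

Overall, each CH2 unit of fuel consumes three molecules of hydrogen, which is why the cost of such synthetic fuels is dominated by the cost of the zero-carbon hydrogen itself.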

It’s good to see the attention on the need for greener buildings, but here the government is hampered by indecision – will the future of domestic heating be hydrogen boilers or electric powered heat pumps? The strategy seems to be to back both horses. But arguably, even more important than the way buildings are heated is to make sure they are as energy-efficient as possible in the first place, and here the government needs to get a grip on the mess that is our current building regulation regime. As the Climate Change Committee says, “making a new home genuinely zero-carbon at the outset is around five times cheaper than retrofitting it later” – the housing people will be living in in 2050 is being built today, so there is no excuse for not ensuring the new houses we need now – not least in the neglected social housing sector – are built to the highest energy efficiency standards.

Carbon capture, usage and storage is the 8th of our 10 points, and there is a commendable willingness to accelerate this long-stalled programme. The goal here is “to capture 10Mt of carbon dioxide a year by 2030”, but without a great deal of clarity about what this is for. The suggestion that the clusters will be in the North East, the Humber, North West, and in Scotland and Wales suggests a goal of decarbonising energy intensive sectors, which in my view is the best use of this problematic technology (see my blogpost: Carbon Capture and Storage: technically possible, but politically and economically a bad idea). What’s the scale proposed here – is 10 Mt of carbon dioxide a year a lot or a little? Compared to the total CO2 emissions for the UK – 350 Mt in 2019 – it isn’t much, but on the other hand it is roughly in line with the total emissions of the iron and steel industry in the UK, so as an intervention to reduce the carbon intensity of heavy industry it looks more viable. The unresolved issue is who bears the cost.
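Put as a proportion – a trivial sketch, but it makes the point:

    # The 10 Mt/year capture goal as a share of total UK CO2 emissions.
    ccs_target_mt = 10      # CO2 captured per year by 2030
    uk_co2_2019_mt = 350    # total UK CO2 emissions in 2019

    print(f"{ccs_target_mt / uk_co2_2019_mt:.1%} of UK emissions")  # ~2.9%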

There’s a nod to the effects of land-use changes, in the section on protecting the natural environment. There are potentially large gains to be had here in projects to reforest uplands and restore degraded peatlands, but the scale of ambition is relatively small.

Finally, the tenth point concerns innovation, with the promise of a “£1 billion Net Zero Innovation Portfolio” as part of the government’s aspiration to raise the UK’s R&D intensity to 2.4% of GDP by 2027. The R&D is to support the goals in the 10 point plan, with a couple of more futuristic bets – on direct air capture, and on commercial fusion power through the Spherical Tokamak for Energy Production project.

I think R&D and innovation are enormously important in the move to net zero. We urgently need to develop zero-carbon technologies to make them cheaper and deployable at scale. My own somewhat gloomy view (see this post for more: The climate crisis now comes down to raw power) is this: taking a global view, one that incorporates the entirely reasonable aspiration of the majority of the world’s population to enjoy the same high energy lifestyle found in the developed world, the only way we will effect a transition to a zero-carbon economy across the world is if the zero-carbon technologies are cheaper – without subsidies – than fossil fuel energy. If those cheap, zero-carbon technologies can be developed in the UK, that will make a bigger difference to global carbon budgets than any unilateral action that affects the UK alone.

But there is an important counter-view, expressed cogently by David Edgerton in a recent article: Cummings has left behind a No 10 deluded that Britain could be the next Silicon Valley. Edgerton describes a collective credulity in the government about Britain’s place in the world of innovation, which overstates the UK’s ability to develop these new technologies, and underestimates the degree to which the UK will be dependent on innovations developed elsewhere.

Edgerton is right, of course – the UK’s political and commentating classes have failed to take on board the degree to which the country has, since the 1980’s, run down its innovation capacity, particularly in industrial and applied R&D. In energy R&D, according to recent IEA figures, the UK spends about $1.335 billion a year – some 4.3% of the world total, eclipsed by the contributions of the USA, China, the EU and Japan.

Nonetheless, $1.3 billion is not nothing, and in my opinion this figure ought to increase substantially, both in absolute terms and as a fraction of rising public investment in R&D. But the UK will need to focus its efforts in those areas where it has unique advantages, while in other areas international collaboration may be a better way forward.

Where are those areas of unique advantage? One such area is probably offshore wind, where the UK’s Atlantic location gives it a lot of sea and a lot of wind. The UK currently accounts for about 1/3 of all offshore wind capacity, so it represents a major market. Unfortunately, the UK has allowed a situation to develop where the prime providers of its offshore wind technology are overseas. The plan suggests more stringent targets for local content, which does make sense; there is also a strong argument that UK industrial strategy should try to ensure that more of the value of the new technology of deepwater floating wind is captured in the UK.

While offshore wind is being deployed at scale right now, fusion remains speculative and futuristic. The government’s strategy is to “double down on our ambition to be the first country in the world to commercialise fusion energy technology”. While I think the barriers to developing commercial fusion power – largely in materials science – remain huge, I do believe the UK should continue to fund it, for a number of reasons. Firstly, there is a possibility that it might actually work, in which case it would be transformative – it’s a long odds bet with a big potential payoff. But why should the UK be the country making the bet? My answer would be that, in this field, the UK is genuinely internationally competitive; it hosts the Joint European Torus, and the sponsoring organisation UKAEA retains a capacity, rare in the UK, for very complex engineering at scale. Even if fusion doesn’t deliver commercial power, the technological spillovers may well be substantial.

The situation in nuclear fission is different. The UK dramatically ran down its research capacity in civil nuclear power, and chose instead to develop a new nuclear build programme on the basis of entirely imported technology. This was initially the French EPR currently being built at Hinkley Point, with another type of pressurised water reactor, from Toshiba, to be built in Cumbria, and a third type of reactor, a boiling water reactor from Hitachi, in Anglesey. That hasn’t worked out so well, with only the EPRs now looking likely to be built. The current strategy envisages a reset, with a new programme of light water small modular reactors – that is to say, a technologically conservative PWR designed with an emphasis on driving its capital cost down – followed by work on a next generation fission reactor. These “advanced modular reactors” would be relatively small, high temperature reactors. The logic for the UK to be the country to develop this technology is that it is the only country to have run an extensive programme of gas cooled reactors, but it will still probably need to collaborate with other like-minded countries.

How much emphasis should the UK put into developing electric vehicles, as opposed to simply creating the infrastructure for them and importing the technology? The automotive sector remains an important source of added value for the UK, having made an impressive recovery from its doldrums in the 90’s and 00’s. Jaguar Land Rover, though owned by the Indian conglomerate Tata, is still essentially a UK based company, and it has an ambitious development programme for electric vehicles. But even with its R&D budget of £1.8 bn a year, it is a relative minnow by world standards (Volkswagen’s R&D budget is €13bn, and Toyota’s only a little less); for this reason it is developing a partnership with BMW. The government should support the UK industry’s drive to electrify, but care will be needed to identify where UK industry can find the most value in global supply chains.

A “green industrial strategy” is often sold on the basis of the new jobs it will create. It will indeed create more jobs, but this is not necessarily a good thing. If it takes more people, more capital, more money to produce the same level of energy services – houses being heated, iron being smelted, miles driven in cars and lorries – then that amounts to a loss of productivity across the economy as a whole. Of course this is justified by the huge costs that burning fossil fuels impose on the world as a whole through climate change, costs which are currently not properly accounted for. But we shouldn’t delude ourselves. We use fossil fuels because they are cheap, convenient, and easy to use, and we will miss them – unless we can develop new technologies that supply the same energy services at a lower cost, and that will take innovation. New low carbon energy technologies need to be developed, and existing technologies made cheaper and more effective.

To sum up, the ten point plan is a useful step forward. The contours of a zero-emissions future are starting to emerge, and it is very welcome that the government has overcome its aversion to industrial strategy. But more commitment and more realism are required.

Nanomedicine comes of age with mRNA vaccines

There have been few scientific announcements that have made as big an impact as the recent news that a vaccine, developed in a collaboration between the German biotech company BioNTech and the pharmaceutical giant Pfizer, has been shown to be effective against covid-19. What’s even more striking is that this vaccine is based on an entirely new technology. It’s an mRNA vaccine; rather than injecting weakened or dead virus materials, it harnesses our own cells to make the antigens that prime our immune system to fight future infections, exactly where those antigens are needed. This is a brilliantly simple idea with many advantages over existing technologies that rely on virus material – but like most brilliant ideas, it takes lots of effort to make it actually work.

Here I want to discuss just one aspect of these new vaccines – how the mRNA molecule is delivered to the cells where we want it to go, and then caused to enter those cells, where it does its job of making the virus proteins that trigger the chain of events leading to immunity. This relies on packaging the mRNA molecules inside nanoscale delivery devices. These packages protect the mRNA from the body’s defence mechanisms, carry it undamaged into the interior of a target cell through the cell’s protective membrane, and then open up to release the bare mRNA molecules to do their job. This isn’t the first application of this kind of nanomedicine in the clinic – but if the vaccine lives up to expectations, it will unquestionably make the biggest impact. In this sense, it marks the coming of age of nanomedicine.

Other mRNA vaccines are in the pipeline too. One, being developed by the US company Moderna with the National Institute of Allergy and Infectious Diseases (part of the US Government’s NIH), is also in phase 3 clinical trials, and it seems likely that we’ll see an announcement about that soon too. Another, from the German biotech company CureVac, is one step behind, in phase 2 trials. All of these use the same basic idea, delivering mRNA which encodes a protein antigen. A couple of other potential mRNA vaccines use a twist on this simple idea; a candidate from Arcturus with the Duke-National University of Singapore, and another from Imperial College, use “self-amplifying RNA” – RNA which doesn’t just encode the desired antigen, but which also carries the instructions for the machinery to make more of itself. The advantage in principle is that it requires less RNA to produce the same amount of antigen.

But all of these candidates have had to overcome the same obstacle: how to get the RNA into the human cells where it is needed. The problem is that, even before the RNA reaches one of its target cells, the human body is very effective at identifying any stray bits of RNA it finds wandering around and destroying them. All of the RNA vaccine candidates use more or less the same solution, which is to wrap up the vulnerable RNA molecule in a nanoscale shell made of the same sort of lipid molecules that form the cell membrane.

The details of this technology are complex, though. I believe the BioNTech, CureVac and Imperial vaccines all use the same delivery technology, working in partnership with the Canadian biotech company Acuitas Therapeutics. The Moderna vaccine delivery technology comes from that towering figure of nanomedicine, MIT’s Robert Langer. The details in each case are undoubtedly proprietary, but from the literature it seems that both approaches use the same ingredients.

The basic membrane components are a phospholipid analogous to that found in cell membranes (DSPC – distearoylphosphatidylcholine), together with cholesterol, which makes the bilayer more stable and less permeable. Added to that is a lipid to which is attached a short chain of the water-soluble polymer poly(ethylene oxide) (PEO). This provides the nanoparticle with a hairy coat, which probably helps the nanoparticle avoid some of the body’s defences by repelling the approach of any macromolecules (artificial vesicles decorated in this way are sometimes known as “stealth liposomes”), and perhaps also controls the shape and size of the nanoparticles. Finally, perhaps the crucial ingredient is another lipid, with a tertiary amine head group – an ionisable lipid. This is what chemists call a weak base – like ammonia, it can accept a proton to become positively charged (a cation). Crucially, its charge state depends on the acidity or alkalinity of its environment.
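That pH dependence is just the textbook Henderson–Hasselbalch relation – my gloss, and the pKa below is a typical literature value for ionisable lipids, not a figure disclosed for these particular formulations. The fraction of the lipid carrying a positive charge is

\[ f_{+} = \frac{1}{1 + 10^{\,\mathrm{pH} - \mathrm{p}K_a}}, \]

so for a pKa of around 6.5 the lipid is almost fully charged in a mildly acidic buffer at pH 4–5, but mostly neutral at the blood’s pH of 7.4 – charged enough to bind the RNA during assembly, neutral enough not to shred cell membranes on the way to the target cell.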

To make the nanoparticles, these four components are dissolved in ethanol, while the RNA is dissolved in a mildly acidic solution in water. Then the two solutions are mixed together, and out of that mixture, by the marvel of self-assembly, the nanoparticles appear, with the RNA safely packaged up inside them. Of course, it’s more complicated than that simple statement makes it seem, and I’m sure there’s a huge amount of knowledge that goes into creating the right conditions to get the particles you need. But in essence, what I think is going on is something like this.

When the ionisable lipid sees the acidic environment, it becomes positively charged – and, since the RNA molecule is negatively charged, the ionisable lipid and the RNA start to associate. Meanwhile, the other lipids will be self-organising into sheets two molecules thick, with the hydrophilic head groups on the outside and the oily tails in the middle. These sheets will roll up into little spheres, at the same time incorporating the ionisable lipids with their associated mRNA, to produce the final nanoparticles, with the RNA encapsulated inside them.

When the nanoparticles are injected into the patient’s body, their hairy coating, from the PEO grafted lipids, will give them some protection against the body’s defences. When they come into contact with the membrane of a cell, the ionisable lipid is once again crucial. Some of the natural lipids that make up the membrane coating the cell are negatively charged – so when they see the positively charged head-group of the ionisable lipids in the nanoparticles, they will bind to them. This has the effect of disrupting the membrane, creating a gap to allow the nanoparticle in.

This is a delicate business – cationic surfactants like CTAB use a similar mechanism to disrupt cell membranes, but they do that so effectively that they kill the cell – that’s why we can make disinfectants out of them. The cationic lipid in the nanoparticle must have been chosen so that it disrupts the membrane enough to let the nanoparticle in, but not so much as to destroy it. Once inside the cell, the conditions must be different enough that the nanoparticle, which is only held together by relatively weak forces, breaks open to release its RNA payload.

It’s taken a huge amount of work – over more than a decade – to devise and perfect a system that produces nanoparticles that successfully envelop the RNA payload, that can survive in the body long enough to reach a cell, and that can deliver that payload through the cell membrane and then release it. What motivated this work wasn’t the idea of making an RNA vaccine.

One of the earliest clinical applications of this kind of technology was for the drug Onpattro, produced by the US biotech company Alnylam. This uses a different RNA based technology – so called small interfering RNA (siRNA) – to silence a malfunctioning gene in liver cells, to control the rare disease transthyretin amyloidosis. More recently, research has been driven by the field of cancer immunotherapy – this is the area for which the Founder/CEO of BioNTech, Uğur Şahin, received substantial funding from the European Research Council. Even for quite translational medical research, the path from concept to clinical application can take unexpected turns!

We all have to hope that the BioNTech/Pfizer vaccine lives up to its promise, and that at least some of the other vaccine candidates – both RNA based and more conventional – are similarly successful; it will undoubtedly be good to have a choice, as each vaccine will have its own strengths and weaknesses. The big question now must be how quickly production can be scaled up to the billions of doses needed to address a world pandemic.

One advantage of the mRNA vaccines is that the vaccine can be made in a chemical process, rather than having to culture viruses in cell culture, making scale up faster. Of course there will be potential bottlenecks. These can be as simple as the vials needed to store the vaccine, or the facilities needed to transport and store the doses – a problem especially acute for the BioNTech/Pfizer vaccine, which needs to be stored at -80 °C.

There are also some quite specialised chemicals involved. I don’t know what will be needed for scaling up RNA synthesis; for the lipids to make the nanoparticles, I believe that the Alabama-based firm Avanti Polar Lipids has the leading position. This company was recently bought, in what looks like a very well-timed acquisition, by the Yorkshire based speciality chemicals company Croda, which I am sure has the capacity to scale up production effectively. Students of industrial history might appreciate that Croda was originally founded to refine Yorkshire wool grease into lanolin, so their involvement in this most modern application of nanotechnology, which nonetheless rests on fat-like molecules of biological origin, seems quite appropriate.

References.

The paper describing the BioNTech/Pfizer vaccine is: Phase I/II study of COVID-19 RNA vaccine BNT162b1 in adults.

The key reference this paper gives for the mRNA delivery nanoparticles is: Expression kinetics of nucleoside-modified mRNA delivered in lipid nanoparticles to mice by various routes.

The process of optimising the lipids for such delivery vehicles is described here: Rational design of cationic lipids for siRNA delivery.

A paper from the Robert Langer group describes the (very similar) kind of delivery technology that I presume underlies the Moderna vaccine: Optimization of Lipid Nanoparticle Formulations for mRNA Delivery in Vivo with Fractional Factorial and Definitive Screening Designs

UK Industrial Strategy’s three horsemen: COVID, Brexit and trade wars (and a fourth horseman of my own)

A couple of weeks ago, on 9 October 2020, I took part in a seminar for the Tony Blair Institute for Global Change, called “UK Industrial Strategy’s three horsemen: COVID, Brexit and trade wars”. The speakers were me, the economist Dame Kate Barker, and Anand Menon (Director of “UK in a Changing Europe” at King’s College London), with Ian Mulheirn in the chair. There is a YouTube video of the event here. Here is a slightly tidied up version of what I said.

It’s a real pleasure to be speaking at this event – and especially to be sharing a virtual platform with Kate Barker, from whom I learnt so much as a colleague working on the Industrial Strategy Commission back in 2017. Our final report then was intended to inform the discussion around the 2017 White Paper on industrial strategy. Now industrial strategy is back on the agenda – we read that the government is planning to “rip up” the 2017 strategy, producing a new document with a heavy focus on science and technology.

Despite everything that’s happened since 2017, I agree with Kate that the principles we laid down do stand the test of changing times. Since then, the focus of my own work has been on the link between R&D, innovation and productivity, and the way regional imbalances in economic performance reflect regional imbalances in state spending on R&D.

But how is this all changed by the three horsemen of the apocalypse – COVID, Brexit and trade wars – that we’re asked to discuss?

Brexit

We can talk about the link between Brexit and industrial strategy both in terms of cause and effect. Failures of industrial strategy contributed to the political conditions that led to Brexit, and the changes that Brexit will force on the UK’s economy will demand a different industrial strategy for the new economic model that the country will have to adopt.

Much has been written on the connection between “left behind communities” and the Brexit vote, and the relationship is possibly more complicated than simple accounts might suggest. But, as the economic geographer Philip McCann has demonstrated in his analysis of “geographies of discontent”, the UK is an outlier amongst developed countries in the scale of its economic imbalances. The greater Southeast looks like a prosperous Northern European country. The rest of the country looks like East Germany, Southern Italy or Portugal. In fact, East Germany has recovered from 40 years of communism faster than the North of England has recovered from the deindustrialisation of the 1980’s.

These regional imbalances are reflected in living standards and other measures of prosperity, including life expectancy and health outcomes. But at the root of the issue is a huge imbalance in productivity. In fact, the imbalances show up more strongly in productivity than in living standards, because the UK runs an effective transfer union – money is transferred from the greater Southeast – the only part of the country to run a surplus on the government current account – to the rest of the country.

But the paradox is that while we transfer money to cover current spending, we concentrate the investments that build productivity growth in the already prosperous greater Southeast. My own focus has been on research and development: nearly half of public spending on R&D is concentrated in London and the two subregions containing Oxford and Cambridge. R&D isn’t the only thing that matters in driving productivity growth, but it is associated with high value companies operating at the technological frontier and innovative start-ups, which anchor strong regional innovation ecosystems and produce highly prosperous knowledge intensive economies like those around Cambridge and Oxford.

Even more paradoxical is the fact that public spending on R&D is even more geographically concentrated than private sector spending. This means that potential spillovers from private sector R&D spending are being left uncaptured. We have regions like the North West, with highly productive, R&D intensive industries such as chemicals and pharmaceuticals, and the East Midlands, with its strong private sector innovation in automotive and aerospace, where the innovation potential of the regional economies isn’t being exploited to the full because the public money doesn’t follow the private.

On the other hand, there are some places that don’t have enough R&D of any kind, public or private. In Wales and the Northeast, for example, low investment in R&D leads to weak innovation economies, with poor productivity performance and low demand for skills.

It is failures of industrial strategy that have led to a divided country, and those divisions have brought us sour politics.

Trade Wars

Moving on to the effects of Brexit – and the wider sense of a retreat from globalisation – we’re likely to find that the economic model the UK has chosen is particularly threatened. Some of our highest productivity and most export oriented sectors have succeeded by becoming highly integrated in transnational value chains, and it is these sectors that are most at risk from the trade dislocations that Brexit threatens.

The UK’s automobile industry is a prime example. This has made a remarkable comeback from a low-point in the mid-2000’s – perhaps an unsung success of a modern industrial strategy which began with Mandelson’s time in the Business department at the end of the new Labour period, and persisted into the Coalition, with considerable policy continuity. But a finished car that rolls off a production line in Sunderland or Solihull combines components and parts that have been shuffled back and forth from a network of suppliers all across the world. This leads to efficient car production, but it’s going to be very difficult to adapt to a post-Brexit world where there are likely to be detailed rules of origin for the export of vehicles to the EU.

Brexit isn’t the only event that’s likely to give us hard lessons about technological self-sufficiency. We can see the effects of a much colder attitude to China in the USA in the pressure on the UK to lessen the involvement of Huawei in the 5G network, and the exclusion of the UK from the EU’s Galileo satellite positioning project has resulted in a scramble for a UK alternative. I suspect that politicians and policy makers severely underestimate the degree to which the UK has lost technological self-sufficiency in a whole range of sectors. I also wonder whether a wider sense of loss of what one might call technological sovereignty has itself contributed to the anxiety that culminated in Brexit.

I believe that the UK will need to rebuild some of its technological capacity if it is to remain a prosperous country, and that this will need to be a central ingredient of a more activist industrial policy. But that leaves some big questions. How much, in what sectors? The UK is a relatively small country in a big world economy. We will need to think very deeply about our place in the evolving trading systems of a world that might look very different to the post-cold-war, globalising world that policy makers have grown up in. An industrial strategy does need to be founded on a clear view about what kind of economy the UK wants – and can realistically hope – to become.

COVID-19

We’ve known for some time that increasing globalisation puts the world at greater risk of a pandemic, and with COVID-19 those fears have been realised. The closer entanglement of natural ecosystems with human society leads to more pathogens crossing from the animal world into people; then the worldwide traffic of business people and tourists spreads the disease across the world before health systems have a chance to respond, as we have seen with such tragic consequences over the last year.

It’s too soon to unpick all the effects COVID-19 will have on the economy, and the implications that those effects have for the UK’s industrial strategy, but we can already start to see some themes emerging. Different sectors have been affected in different ways, with an obvious severe (but hopefully time-limited) blow to hospitality and tourism, and perhaps more far-reaching effects on commercial real estate as some pandemic induced changes in working practices are permanently adopted.

One very important question for the UK concerns the shape of the future civil aerospace industry. It’s difficult to know how future patterns of international mobility will change, but any permanent reduction will have a serious impact on one of the UK’s highest productivity industries. As a specific example, Rolls-Royce is one of the UK’s few world class innovative engineering companies of any size, but its dependence on long-haul air traffic makes the company – and the cities like Derby that depend on it – very vulnerable.

Rolls-Royce has been bailed out by the government once before, following its bankruptcy in 1971. I believe that letting Rolls-Royce fail now should be unthinkable, because of the dissipation of concentrations of high level skilled people, and the loss of innovation capacity in areas like the East Midlands that would follow. But what form should any bail-out take – and how should it take into account bigger imperatives such as the net zero greenhouse gas target?

What can we learn about our industrial strengths and weaknesses from the experience of our response to the pandemic? We went into the pandemic thinking we had the advantage of a world-class life sciences sector, but after the event we can’t say the UK has excelled in its response.

It’s certainly true that parts of our life sciences sector are excellent, and if the Oxford group or the Imperial group produce an effective vaccine against covid-19, and if the pharmaceutical industry is successful in rapidly scaling up its manufacture, that will be a huge achievement and an invaluable contribution to world health.

But in other important areas – in public health, diagnostics, the care sector – the UK’s weaknesses have been savagely exposed. We have learnt about weaknesses in supply chains for basic supplies like PPE and generic pharmaceuticals.

I think we made a category error in thinking that the “life sciences sector” is a single sector at all. We have a strong pharmaceutical industry which has historically been enormously productive (though not without recent difficulties), exploiting the UK’s excellent biomedical science base to produce drugs for the most lucrative world markets (particularly those of the USA). But we have done much worse in driving and implementing the kinds of innovation that serve the health needs of the UK’s own population.

The fourth horseman

There is, of course, a fourth horseman of the industrial strategy apocalypse, that is more important than any of the three we have been asked to discuss. That is climate change and the huge economic transition that the need to decarbonise our energy economy requires. It is an entirely positive development that the government has committed to a target of net zero greenhouse gas emissions by 2050, and that there is a wide political consensus in support of this target, or indeed a more ambitious goal. But I’m not convinced that policy makers and the public fully understand the magnitude of the task.

We need not just to decarbonise the electricity sector as it stands now; as we electrify other forms of energy use, we will need to at least double generating capacity. We need to decarbonise transport and domestic heating, probably using hydrogen as an energy store and vector, especially for hard to decarbonise industries like steel. We will need as much offshore wind as we can get (probably including new technologies like floating wind), we will need new nuclear build, possibly including new high temperature designs to make hydrogen from process heat.

This energy transition will be a huge dislocation. We have to do it, but we shouldn’t expect it to be without cost. People rightly talk about the new “green” jobs this transition will produce – but that’s not an unmixed blessing. If the new energy systems need to employ more people than our current fossil fuel based system, that implies a drop in productivity. We will have to apply more resources to achieve the same energy benefits, and those resources won’t be available to satisfy other wants and needs. Innovation, to create new zero carbon technologies and improve existing ones, will be urgently needed to drive down those costs, and that innovation – carried out in parallel with the deployment of existing technologies – should be a priority of industrial strategy.

The energy transition does have potential benefits for regional economic inequality, though. Much of the innovation and deployment of low carbon technologies should happen outside the prosperous Southeast – for example in Teesside and the Humber, in Cumbria and the Wirral. This should be an important part of the “levelling up” agenda.

An industrial strategy for our times

To sum up, the ravages of these four horsemen mean that our economy will need to be transformed. That transformation needs to be driven by innovation, and it needs to be informed by a clear view of the enduring challenges the UK faces and a realistic assessment of the UK’s place in the world. As Kate stressed, the challenges are obvious: climate change, weak wage growth, the cost and effectiveness of health and social care, failing places. The need for a new start does give us a chance to spread the benefits of innovation more widely across the country, and we should seize that opportunity.