Nano on the Today program

The BBC’s morning radio news show – Today – ran a couple of items about nanotechnology this morning, and I made a brief appearance myself. The occasion was the launch of the nano-task force that I wrote about on Friday. The highlights of the program can be downloaded as an mp3 file.

The coverage was relatively positive in tone, focusing on the potential importance of nanotechnology in areas like sustainable energy, but pointing out the strength of the competition from other countries. But it has to be said that (as Tim Harper notes) it wasn’t hugely clear after the interviews what people thought actually needed to be done. To be honest, this question wasn’t that much clearer at the meeting in the Houses of Parliament itself; the discussion didn’t really find much of a focus.

I was slightly surprised to get another call this evening from the Today program, wondering whether to run a further item about nanotechnology tomorrow, in connection with the Demos/NEG launch. But it looks like nanotechnology is going to be displaced by coverage of the rain and floods that have afflicted the country today. I’m not surprised; I can’t remember rain so heavy, and it certainly made my journey down to London today very painful: what’s normally a 2 hour train ride took me 6 hours.

Nanotechnology in the UK news next week

Some high profile events in London next week mean that nanotechnology may move a little way up the UK news agenda. On Monday, there’s an event at the Houses of Parliament: Nano Task Force Conference: Nanotechnology – is Britain leading the way? The Nano Task Force in question is a ginger group set up by Ravi Silva, at the University of Surrey, with political support from Ian Gibson MP. Gibson is a Labour Member of Parliament, one of the rare breed of legislators with a science PhD, and a reputation for being somewhat independent minded.

On Tuesday, public engagement is the theme, with an all-day event “All Talk? Nanotechnologies and public engagement” at the Institute of Physics. This is a joint launch; the thinktank Demos and the Nanotechnology Engagement Group are both launching reports. The Demos report is on a series of public engagement exercises, The Nanodialogues, while the Nanotechnology Engagement Group’s final report is an overview of the lessons learnt from all the engagement activities around nanotechnology conducted so far in the UK. The keynote speaker is Sir David King, the government’s chief scientific advisor.

I’m involved in both, giving a talk on the potential of nanotechnology for sustainable energy on Monday, and on Tuesday chairing one session and sitting on the panel for another. Other participants include Sheila Jasanoff from Harvard, David Edgerton, the author of the recently published book “The Shock of the Old”, Ben Goldacre, the writer of the Guardian’s entertaining ‘Bad science’ column, Andy Stirling, from Sussex, James Wilsdon and Jack Stilgoe from Demos, Doug Parr from Greenpeace, and David Guston, the Director of the Center for Nanotechnology in Society at Arizona State University. It promises to be a fascinating day.

The Kroto Research Institute

You wait for years for an interdisciplinary nanoscience and nanotechnology centre to be opened somewhere in the English Midlands or south Yorkshire, and then two come along at once. Having spent yesterday 40 miles south of Sheffield, in Nottingham, at the official opening of the Nottingham nanoscience and nanotechnology centre, today I’m back home at Sheffield for the official opening of the Kroto Research Institute and Centre for Nanoscale Science and Technology. As yesterday, the man doing the opening was the Nobel Laureate Sir Harry Kroto (Harry is an alumnus of Sheffield).

Actually, the Kroto centre covers a little more than just nanotechnology. It houses the UK’s national facility for fabricating nanostructures from III-V semiconductors, a well-equipped microscopy facility, which will soon commission an aberration corrected high resolution electron microscope capable of chemical analysis at the single-atom level, and a tissue engineering centre which spans the range from surface analysis to putting cultured skin onto patients. But there’s also a centre for computational biology, one for environmental engineering, and one for virtual reality.

Having talked about the Nottingham centre, it’s worth talking about the ways in which our two operations complement each other. Nottingham has what’s probably the best department of pharmacy in the country; they have long operated at the nanoscale, and have been leaders in applying surface science and scanning probe techniques to look at systems of biological and biomedical interest. And when they talk about nanomedicine, they have the strong links with the pharmaceutical industry that are needed to turn ideas into therapies. They’ve been successful in collaborating with the Department of Physics, whose interest in applying physical techniques to biological systems goes back to the discovery there of magnetic resonance imaging. Like Sheffield, they have real strength in semiconductor nanotechnology, and they also have the UK’s leaders in single molecule manipulation using scanning probe techniques.

There are already some major collaborations between Nottingham and Sheffield. These include the Nanorobotics project, which aims to combine nanoscale actuator technology with live electron microscopy observation, each at a resolution of down to 0.1nm. The Snomipede project, also including Glasgow and Manchester, aims to combine near-field scanning probe microscopy as a way of patterning molecules with massive parallelisation of the kind familiar from the IBM millipede technology. There is undoubtedly room for more collaboration between the two universities in this area. One should probably never regret all those failed research proposals one has put in, but back in 2000 we did put together a joint bid, together with Leeds, to host one of the two Interdisciplinary Research Collaborations in Nanotechnology that were being funded then. The money went to Oxford and Cambridge, and I don’t want to cast aspersions on the good work that’s come out of both places, but I’m sure we would have done a good job.

The Nottingham nanotechnology and nanoscience centre

Today saw the official opening of the Nottingham nanotechnology and nanoscience centre, which brings together some existing strong research areas across the University. I’ve made the short journey down the motorway from Sheffield to listen to a very high quality program of talks, with Sir Harry Kroto, co-discoverer of buckminsterfullerene, taking the top of the bill. Also speaking were Don Eigler, from IBM (the originator of perhaps the most iconic image in all nanotechnology, the IBM logo made from individual atoms), Colin Humphreys, from the University of Cambridge, and Sir Fraser Stoddart, from UCLA.

There were some common themes in the first two talks (common, also, with Wade Adams’s talk in Norway described below). Both talked about the great problems of the world, and looked to nanotechnology to solve them. For Colin Humphreys, the solutions to the problems of sustainable energy and clean water are to be found in the material gallium nitride, or, more precisely, in the compounds of aluminium, indium and gallium nitride, which allow one to make not just blue light emitting diodes, but LEDs that can emit light of any wavelength between the infra-red and the deep ultra-violet. Gallium nitride based blue LEDs were invented as recently as 1996 by Shuji Nakamura, but this is already a $4 billion market, and everyone will be familiar with torches and bicycle lights using them.

How can this help the problem of access to clean drinking water? We should remind ourselves that 10% of world child mortality is directly related to poor water quality, and that half the hospital beds in the world are occupied by people with water-related diseases. One solution would be to use deep ultraviolet light to sterilise contaminated water. Deep UV works well for sterilisation because biological organisms never developed a tolerance to these wavelengths, which don’t penetrate the atmosphere. UV at a wavelength of 270 nm does the job well, but existing lamps are not practical because they need high voltages, are inefficient, and in some cases contain mercury. AlGaN LEDs work well, and in principle they could be powered by solar cells at 4 V, which might allow every household to sterilise its water supply easily and cheaply. The problem is that the efficiency is still too low to treat flowing water. At blue wavelengths (400 nm) efficiency is very good, at 70%, but it drops precipitously at shorter wavelengths, and this is not yet understood theoretically.

The contribution of solid state lighting to the energy crisis arises from the efficiency of LEDs compared to tungsten light bulbs. People often underestimate the amount of energy used in lighting domestic and commercial buildings. Globally, it accounts for 1,900 megatonnes of CO2; this is 70% of the total emissions from cars, and three times the amount due to aviation. In the UK, it amounts to 20% of electricity generated, and in Thailand, for example, it is even more, at 40%. But tungsten light bulbs, which account for 79% of sales, have an efficiency of only 5%. There is much talk now of banning tungsten light bulbs, but the replacement, fluorescent lights, is not perfect either. Compact fluorescents have an efficiency of 15%, which is an improvement, but what is less well appreciated is that each bulb contains 4 mg of mercury. This would lead to tonnes of mercury ending up in landfills if tungsten bulbs were replaced by compact fluorescents.
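The mercury figure is easy to turn into a landfill estimate. A back-of-envelope sketch in Python, in which the billion-bulb figure is my own illustrative assumption, not a number from the talk:

```python
mercury_per_bulb_mg = 4.0            # per-bulb mercury content quoted in the talk
bulbs_in_landfill = 1_000_000_000    # illustrative assumption, not from the talk

# 1 tonne = 1e9 mg, so the conversion factor cancels neatly
total_mercury_tonnes = mercury_per_bulb_mg * bulbs_in_landfill / 1e9
print(f"{total_mercury_tonnes:.0f} tonnes of mercury")  # → 4 tonnes
```

Even at a few milligrams per bulb, every billion discarded compact fluorescents adds tonnes of mercury to landfill.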

Could solid-state lighting do the job? Currently what you can buy are blue LEDs (made from InGaN) which excite a yellow phosphor. The colour balance of these leaves something to be desired, and soon we will see blue or UV LEDs exciting red/green/blue phosphors, which will have a much better colour balance (you could also use a combination of red, green and blue LEDs, but currently green efficiencies are too low). The best efficiency in a commercial white LED is 30% (from Seoul Semiconductor), but the best in the lab (Nichia) is currently 50%. The target is an efficiency of 50-80% at high drive currents, which would make LEDs more efficient than the current most efficient light source, the sodium lamp, whose familiar orange glow converts electricity at 45% efficiency. This target would make them 10 times more efficient than filaments and 3 times more efficient than compact fluorescents, with no mercury. In the US, replacing 50% of filament bulbs would save 41 GW; in the UK, 100% replacement would save 8 GW of power station capacity. The problem at the moment is cost, but the rapidity of progress in this area means that Humphreys is confident that within a few years costs will fall dramatically.

Don Eigler also talked about societal challenges, but with a somewhat different emphasis. His talk was entitled “Nanotechnology: the challenge of a new frontier”. The questions he asked were “What challenges do we face as a society in dealing with this new frontier of nanotechnology, and how should we as a society make decisions about a new technology like nanotechnology?”

There are three types of nanotechnology, he said: evolutionary nanotechnology (historically larger technologies that have been shrunk to nanoscale dimensions), revolutionary nanotechnology (entirely new nanometer-scale technologies) and natural nanotechnology (cell biology, offering inspirations for our own technologies). Evolutionary nanotechnologies include semiconductor devices and nanoparticles in cosmetics. Revolutionary nanotechnologies include carbon nanotubes, for potential new logic structures that might supplant silicon, and the IBM millipede data storage system. Natural nanotechnologies include bacterial flagellar motors.

Nanohysteria comes in different varieties too. Type 1 nanohysteria is represented by greed driven “irrational exuberance”, and is based on the idea that nanotechnology will change everything very soon, as touted by investment tipsters and consultants who want to take people’s money off them. What’s wrong with this is the absence of critical thought. Type 2 nanohysteria is the opposite – fear driven irrational paranoia, exemplified by the grey goo scenario of out of control self-replicating molecular assemblers or nanobots. What’s wrong with this is, again, the absence of critical thought. Prediction is difficult, but Eigler thinks that self-replicating nanobots are not going to happen any time soon, if ever.

What else do people fear about nanotechnology? Eigler recently met a young person with strong views: nanotech is scary, it will harm the biosphere, it will create new weapons, it is being driven by greedy individuals and corporations – in summary, it is not just wrong, it is evil. Where did these ideas come from? If you look on the web, you see talk of superweapons made from molecular assemblers. What you don’t find on the web are statements like “My grandmother is still alive today because nanotechnology saved her life”. Why is this? Nanotechnology has not yet provided a tangible benefit to grandmothers!

Some candidates include gold nanoshell cancer therapy, as developed by Naomi Halas at Rice. This particular therapy may not work in humans, but something similar will. Another example is the work of Sam Stupp at Northwestern, making nanofibers that cause neural progenitor cells to turn into new neurons rather than scar tissue, holding out the hope of regenerative medicine to repair spinal cord damage.

As an example of how easily one can jump to wrong conclusions, consider that Eigler’s group made the smallest logic circuit – 12 nm by 17 nm – from carbon monoxide molecules. But carbon monoxide is a deadly poison – shouldn’t we worry about this? Let’s do the sum: 18 CO molecules are needed for one transistor. The context is that I breathe in 2 billion trillion molecules a day, so every day I breathe enough to make 160 million computers.

What could the green side of nanotechnology be? We could have better materials, that are lighter, stronger and more easily recyclable, and this will reduce energy consumption. Perhaps we can use nanotechnology to reduce consumption of natural resources and to help recycling. We can’t prove yet that these good benefits will follow, but Eigler believes they are likely.

There is a real risk of nanotechnology, if it is used without evaluating the consequences. The widespread introduction of nanoparticulates into the environment would be an example of this. So how do we know if something is safe? We need to think it through, but we can’t guarantee that anything is absolutely safe. The principles should be that we eliminate fantasies, understand the different motivations that people have, and honestly assess risk and benefit. We need informed discussion, that is critical, creative, inclusive and respectful. We need to speak with knowledge and respect, and listen with zeal. Scientists have not always been good at this and we need to get much better. Our best weapons are our traditions of rigorous honesty and our tolerance for diverse beliefs.

A new strategy for UK Nanotechnology

It was announced this morning that the Engineering and Physical Sciences Research Council, the lead government agency for funding nanotechnology in the UK, has appointed a new Senior Strategic Advisor for Nanotechnology. This forms part of a new strategy, published (in a distinctly low key way) earlier this year. The strategy announces some relatively modest increases in funding from the current level, which amounts to around £92 million per year, much of which will be focused on some large-scale “Grand Challenge” projects addressing areas of major societal need.

An editorial (subscription required) in February’s issue of Nature Nanotechnology lays out the challenges that will face the new appointee. By a number of measures, the UK is underperforming in nanotechnology relative to its position in world science as a whole. Given the relatively small sums on offer, focusing on areas of existing UK strength – both academically and in existing industry – is going to be essential, and it’s clear that the pharmaceutical and health-care sectors are strong candidates. Nature Nanotechnology’s advice is clear: “Indeed, getting the biomedical community— including companies — to buy into a national strategy for nanotechnology and health care should be a top priority for the nano champion.”

Optimism and pessimism in Norway

I’m in Bergen, Norway, at a conference, Nanomat 2007, run by the Norwegian Research Council. The opening pair of talks – from Wade Adams, of Rice University and Jürgen Altmann, from Bochum, presented an interesting contrast of nano-optimism and nano-pessimism. Here are my notes on the two talks, hopefully more or less reflecting what was said without too much editorial alteration.

The first talk was from Wade Adams, the director of Rice University’s Richard E. Smalley Institute, with the late Richard Smalley’s message “Nanotechnology and Energy: Be a scientist and save the world”. Adams gave the historical background to Smalley’s interest in energy, which began with a talk from a Texan oilman explaining how rapidly oil and gas were likely to run out. Thinking positively, if one has cheap, clean energy, most of the problems of the world – lack of clean water, food supply, the environment, even poverty and war – are soluble. This was the motivation for Smalley’s focus on clean energy as the top priority for a technological solution. It’s interesting that climate change and greenhouse gases were not a primary motivation for him; on the other hand he was strongly influenced by M. King Hubbert and his theory of peak oil. Of course, the peak oil theory is controversial (see a recent article in Nature – That’s oil, folks, subscription needed – for an overview of the arguments), but whether oil production has already peaked, as the doomsters suggest, or the peak is postponed to 2030, it’s a problem we will face at some time or other. On the pessimistic side, Adams cited another writer – Matt Simmons – who maintains that oil production in Saudi Arabia – usually considered the reserve of last resort – has already peaked.

Meanwhile on the demand side, we are looking at increasing pressure. Currently 2 billion people have no electricity, 2 billion people rely on biomass for heating and cooking, the world’s population is still increasing and large countries such as India and China are industrialising fast. One should also remember that oil has more valuable uses than simply to be burnt – it’s the vital feedstock for plastics and all kinds of other petrochemicals.

Summarising the figures, the world (in 2003) consumed energy at a rate of 14 terawatts, the majority in the form of oil. By 2050, we’ll need between 30 and 60 terawatts. This can only happen if there is a dramatic change – for example renewable energy stepping up to deliver serious (i.e. measured in terawatts) amounts of power. How can this happen?

The first place to look is probably efficiencies. In the United States, about 60% of energy is currently simply wasted, so simple measures such as using low energy light bulbs and having more fuel-efficient cars can take us a long way.

On the supply side, we need to be hard-headed about evaluating the claims of various technologies in the light of the quantities needed. Wind is probably good for a couple of terawatts at most, and capacity constraints limit the contribution nuclear can make. To get 10 terawatts of nuclear by 2050 we need roughly 10,000 new plants – that’s one built every two days for the next 40 years, which in view of the recent record of nuclear build seems implausible. The reactors would in any case need to be breeders to avoid the consequent uranium shortage. The current emphasis on the hydrogen economy is a red herring, as it is not a primary fuel.

The only remaining solution is solar power. 165,000 TW hits the earth in sunlight. The problem is that the sunlight doesn’t arrive in the right places. Smalley’s solution was a new energy grid system, in which energy is transmitted through wires rather than in tankers. To realise this you need better electrical conductors (either carbon nanotubes or superconductors), and electrical energy storage devices. Of course, Rice University is keen on the nanotube solution. The need is to synthesise large amounts of carbon nanotubes which all have the same structure – the structure with metallic, rather than semiconducting, properties. Rice had been awarded $16 million from NASA to develop the scale-up of their process for growing metallic nanotubes by seeded growth, but this grant was cancelled amidst the recent redirection of NASA’s priorities.

Ultimately, Adams was optimistic. In his view, technology will find a solution and it’s more important now to do the politics, get the infrastructure right, and above all to enthuse young people with a sense of mission to become scientists and save the world. His slides can be downloaded here (8.4 MB PDF file).

The second, much more pessimistic, talk was from Jürgen Altmann, a disarmament specialist from Ruhr-Universität Bochum. His title was “Nanotechnology and (International) Society: how to handle the new powerful technologies?” Altmann is a physicist by original training, and is the author of a book, Military nanotechnology: new technology and arms control.

Altmann outlined the ultimate goal of nanotechnology as the full control of the 3-d position of each atom – the role model is the living cell, but the goal goes well beyond this, moving from systems optimised for aqueous environments to those that work in vacuum, high pressure, space etc., limited only by the laws of nature. Altmann alluded to the controversy surrounding Drexler’s vision of nanotechnology, but insisted that no peer-reviewed publication had succeeded in refuting it.

He mentioned the extrapolations of Moore’s law due to Kurzweil, with the prediction that we will have a computer with a human being’s processing power by 2035. He discussed new nanomaterials, such as ultra-strong carbon nanotubes making the space elevator conceivable, before turning to the Drexler vision of mechanosynthesis, leading to a universal molecular assembler, and discussing consequences like space colonies and brain downloading, before highlighting the contrasting utopian and dystopian visions of the outcome – on the one hand, infinitely long life, wealth without work and a clean environment; on the other, the consumption of all organic life by proliferating nanorobots (grey goo).

He connected these visions to transhumanism – the idea that we could and should accelerate human evolution by design, and the perhaps better accepted notion of converging technologies – NanoBioInfoCogno – which has taken up somewhat different connotations either side of the Atlantic (Altmann was on the working group which produced the EU document on converging technologies). He foresaw the benefits arising on a 20 year timescale, notably direct broad-band interfaces between brain and machines.

What, then, of the risks? There is the much discussed issue of nanoparticle toxicity. How might nanotechnology affect developing countries – will the advertised benefits really arise? We have seen a mapping of nanotechnology benefits onto the Millennium Development Goals by the Meridian Institute. But this has been criticised, for example by N. Invernizzi (Nanotechnology Law and Business Journal 2, 101 (2005)): high productivity will mean less demand for labour, there might be a tendency to neglect non-technological solutions, and there might be a lack of qualified personnel. He asked what will happen if India and China succeed with nano – will that simply increase internal rich-poor divisions within those countries? The overall conclusion is that socio-economic factors are just as important as technology.

With respect to military nanotechnology, there are many potential applications, including smaller and faster electronics and sensors, lighter and faster armour and armoured vehicles, miniature satellites, including offensive ones. Many robots will be developed, including nano-robots, including biotechnical hybrids – electrode controlled rats and insects. Medical nanobiotechnology will have military applications – capsules for controlled release of biological and chemical agents, mechanisms for targeting agents to specific organs, but also perhaps to specific gene patterns or proteins, allowing chemical or biological warfare to be targeted against specific populations.

Military R&D for nano is mostly done in the USA, where it accounts for between a quarter and a third of federal nanotechnology funding. At the moment, the USA spends 4-10 times as much as the rest of the world, but perhaps we can shortly expect other countries with the necessary capacity, like China and Russia, to begin to catch up.

The problem of military nanotechnology from an arms control point of view is that limitation and verification is very difficult – much more difficult than the control of nuclear technology. Nano is cheap and widespread, much more like biotechnology, with many non-military uses. Small countries and non-state actors can use high technology. To control this will need very intrusive inspection and monitoring – anytime, anyplace. Is this compatible with military interest in secrecy and the fear of industrial espionage?

So, Altmann asks, is the current international system up to this threat? Probably not, he concludes, so we have two alternatives: increasing military and terrorist threats and marked instability, or the organisation of global security in another way, involving some kind of democratic superstate, in which existing states voluntarily accept reduced sovereignty in return for greater security.

Coherent “atoms” in (fairly) warm solids

In 2001, Eric Cornell, Wolfgang Ketterle and Carl Wieman won the Nobel prize for physics for demonstrating the phenomenon of Bose-Einstein condensation in a system of trapped ultra-cold atoms. Bose-Einstein condensation is a remarkable quantum phenomenon in which a system of particles all occupy the same quantum state. In this condition they are identical and indistinguishable – in effect the individual atoms have lost their identities and coalesced into a single coherent quantum blob. Now researchers have demonstrated the same phenomenon in a different type of particle, polaritons, confined in a semiconductor nanostructure, at a temperature of 4.2 K. This is not exactly ambient, but it is much more convenient than the temperature of 20 nanokelvin needed for the atom experiments.

The experiments, reported in this article in Science (abstract, subscription required for full article), were done by grad students Ryan Balili and Vincent Hartwell in David Snoke’s group at the University of Pittsburgh, in collaboration with Loren Pfeiffer and Kenneth West from Bell Labs. The basic structure consisted of a semiconductor quantum well sandwiched between a pair of reflectors, each made up of alternating dielectric layers, rather like the one shown in the picture in this earlier post. If a laser is shone into the structure, pairs of electrons and holes are generated; these pairs of charges are bound together by the electrostatic interaction and behave like particles called excitons. Meanwhile, light bounces back and forth between the two mirrors, forming standing wave modes. Energy passes back and forth between these standing wave photons and the excitons, and the combination forms a quasi-particle called a polariton.

How on earth can one compare an entity that is composed of a complicated set of interactions between light and matter with something simple and elementary like an atom? The answer to this is rather interesting, and relies on a principle of solid state physics that is fundamental to the subject, but little known outside the field. Simple theory tells us how to understand systems composed of entities that don’t interact with each other very much; the first theory of electrons in solids one gets taught simply assumes that the electrons don’t interact with each other at all, which on the face of it is absurd, because they are charged objects which strongly repel each other. It turns out that you can often lump together the basic entity with all its associated interactions as a “quasi-particle”, which behaves just like a simple, quantum mechanical particle. The quasi-particle is characterised by an “effective mass” which, in the case of these polaritons, is very much smaller than that of a real atom. It is this very small mass which allows them to form a Bose-Einstein condensate at (relatively) high temperatures.

This is another great example of how being able to make precisely specified semiconductor nanostructures allows one to tune the interaction between light and matter to produce remarkable new effects. What use could this have in the future? Peter Littlewood, from the Cavendish Laboratory in Cambridge, writes in a commentary in Science (subscription required):

“These objects are, on the one hand, a new kind of low-threshold laser, but the fact that they consist of coherent quantum objects (unlike a regular laser) puts them potentially in the class of quantum devices. A rash speculation is that a small polariton condensate could become the basis for an elementary quantum computer, but the easy coupling to light might simplify the wiring issues that many quantum information technologies find challenging.”

Pierre-Gilles de Gennes 1932-2007

I was sorry to hear that Pierre-Gilles de Gennes, the great French theoretical physicist, died a week ago last Friday, following a long struggle with cancer. De Gennes, who won the Nobel Prize for Physics in 1991, created much of our modern understanding of liquid crystals, colloids and polymers, essentially founding the field of soft condensed matter by recognising the common features of these soft systems characterised by interaction energies comparable to thermal energies and dominated by Brownian motion.

This obituary in Le Monde has a good account of his life and work. My first introduction to his work was at the very beginning of my PhD. When I asked my supervisor what I should do to begin my studies, he told me to go to the bookshop, buy a copy of de Gennes’s book Scaling Concepts in Polymer Physics, and come back when I had read it. I did this, and very good advice it turned out to be; it’s a book I still refer to. Soon after, I had the chance of meeting the man himself, when he listened with absolute attention and politeness to what this insignificant graduate student had to say.

De Gennes was an erudite, deeply cultured and utterly charming man. One of his passions outside physics was art, and he used art history to illustrate how he saw the role of the theoretical physicist evolving in a time when computer simulations are becoming ever more powerful. Just as the invention of photography meant that artists no longer felt the obligation to strive for simple verisimilitude, and could seek to capture the essence of their subject in increasingly impressionistic and abstract ways, so the fact that systems of great complexity could now be simulated on a computer left theorists with the job of sketching a description of these systems in a way that puts insight and transparency ahead of perfect accuracy. As the attention of physicists turns more and more towards complex and difficult systems (including living things, the most difficult systems of all) this insistence on cutting through the thicket of detail to focus on the essentials becomes ever more important.

In praise of Vaclav Smil

In my efforts to educate myself about how new technologies might impact on our economy and society, the author from whom I’ve learnt the most is unquestionably Vaclav Smil. Smil is a Professor in the Department of Environment and Geography at the University of Manitoba, but his writings cover the whole sweep of the interaction of technology and society. What I appreciate about his books is their emphasis on rigorous quantification, their long historical perspective and global span (Smil is an expert on China, among many other things), and their grounding in the things that matter – how we get the food we eat and the energy that underlies our lifestyles.

My introduction to Smil’s work came when I needed a rapid introduction to energy economics. His 2003 book Energy at the Crossroads: global perspectives and uncertainties does this job in an admirably clear-headed and realistic way. It has a particularly sobering view of the poor record of energy forecasting in the past, and of the evolving linkages between economic output and energy inputs. Enriching the Earth: Fritz Haber, Carl Bosch, and the Transformation of World Food Production takes a historical view of the linkage between energy and food. Few people nowadays stop to think about the importance of artificial nitrogen fixation, powered by fossil fuels, in feeding the world. Yet it is clear that without artificial fertilizers more than half of the current population of the earth would not be alive today. We are effectively surviving by eating oil. This theme is developed in Feeding the World: A Challenge for the Twenty-First Century, which asks the fundamental question: just how many people could the world feed? After a period of plentiful and cheap food, at least in the West, we’ve forgotten about some of the more apocalyptic visions of mass famine. Yet the world food supply equation is probably more fragile than we’d like to think. This is likely to get worse, as climate change, water shortages and environmental degradation put pressure on yields, and increasing demand for biofuels diverts crops to non-food uses.

Many of these themes are brought together, with many other trends, in two of Smil’s most recent books, Creating the Twentieth Century: Technical Innovations of 1867-1914 and Their Lasting Impact and Transforming the Twentieth Century: Technical Innovations and Their Consequences. Taken together, these two volumes offer the best overview of how the world we live in now has developed that I know of. At one level, this is simply a narrative history of modern technology, albeit one that takes a holistic view of the way in which many different inventions come together to make important innovations possible. In this sense, it’s the story of accelerating change, in which one technological development facilitates another. But he is explicitly dismissive of those who are too quick to plot exponential curves and extrapolate from them. The title of his first book makes it clear that in Smil’s view, the true technological revolution took place in the last part of the 19th century, and what we have seen since then is largely the unfolding of the developments that were initiated in this great saltation. And he is by no means certain that the rapid change will continue, noting the degree to which it has been built on a massive, and probably unsustainable, growth in energy consumption. His agnostic outlook is summed up in the last chapter, where he asks:

“have the last six generations of great technical innovations and transformations merely been the beginning of a new extended era of unprecedented accomplishments and spreading and sustained affluence – or have they been a historically ephemeral aberration that does not have any realistic chance of continuing along the same, or a similar trajectory, for much longer?”