A new laboratory for semiconductor nanotechnology at Sheffield

A suite of refurbished laboratories in my department (Physics and Astronomy, University of Sheffield) was formally opened yesterday by Dame Julia Higgins, chair of EPSRC and Vice President of the Royal Society. The labs were refurbished with money from the Wolfson Foundation and the Royal Society and now house Maurice Skolnick’s work in semiconductor nanotechnology, as well as my own group’s labs. We marked the occasion with a set of scientific seminars.

It’s always interesting to get an update on what one’s colleagues are up to, and Maurice’s talk had some stunning examples of recent progress in semiconductor nanotechnology. I’ll show just one example.
Microresonator with quantum dots
The picture shows (left) a very small diameter photonic micropillar – one can make out a central enclosure, the cavity, sandwiched between two distributed Bragg reflectors (DBRs). These are multilayers of different semiconductors which behave as near-perfect mirrors for light; photons generated inside the cavity are essentially trapped by the mirrors and the edge of the pillar.
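The near-perfect reflectivity of a quarter-wave DBR at its design wavelength follows from a standard interference argument, and there is a well-known closed-form expression for a lossless stack. The Python sketch below evaluates it; the refractive indices are rough textbook values for a GaAs/AlAs mirror, chosen for illustration, not the parameters of the Sheffield structures.

```python
# Peak reflectivity of a quarter-wave distributed Bragg reflector (DBR).
# Illustrative sketch only: the indices are rough textbook values for
# GaAs/AlAs near 950 nm, not taken from the Sheffield micropillars.

def dbr_reflectivity(n_hi, n_lo, n_in, n_sub, pairs):
    """Closed-form peak reflectivity of a lossless quarter-wave stack
    with `pairs` high/low index layer pairs, incident medium n_in,
    substrate n_sub."""
    ratio = (n_sub / n_in) * (n_lo / n_hi) ** (2 * pairs)
    return ((1 - ratio) / (1 + ratio)) ** 2

for n in (5, 10, 20, 30):
    r = dbr_reflectivity(3.5, 3.0, 1.0, 3.5, n)
    print(f"{n:2d} pairs: R = {r:.6f}")
```

Even with this modest index contrast the reflectivity climbs rapidly towards unity as pairs are added, which is why mirrors of this kind are built from tens of layer pairs.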

Simply to make these intricately structured micropillars is enough of an achievement (these were made at Sheffield by A Tahraoui and P W Fry). But within the cavity there is a further level of control. The pictures on the right show individual quantum dots, grown by self-assembly. These are incorporated within the cavity of the structure on the left (the transmission electron micrograph, labelled TEM, comes from Hopkinson and Cullis at Sheffield; the scanning tunnelling micrograph, labelled STM, from Skolnick’s collaborator P.M. Koenraad at Eindhoven). The resulting structure simultaneously exploits the quantum effects that occur when electrons are confined within the quantum dots and the optical confinement effects that occur when photons are trapped within the cavity. This allows simultaneous control both of the energies of electronic states in the quantum dot and of the way transitions between electronic states are coupled to the emission of light.

Quantum dots of this kind are already used to make solid state lasers for use in optical communications. What is really exciting Maurice and his colleagues, though, is the possibility that this kind of structure might be used as the basis for a quantum computer. Quantum computers, if one could get them to work, offer the possibility of massively parallel computing of a power far beyond anything achievable with our current CMOS technologies. The problem is that one has to keep the quantum states sufficiently isolated from the environment to work their quantum magic, while still retaining the ability to interact with them enough to provide some kind of input and output to the computations. This kind of structure, with its very close control both of the states themselves and, via the photonic control, of their interactions with the outside world, may just possibly do the trick.

Nanoparticles everywhere

Even the cleanest environments in the world are contaminated by nanoparticles; these are the product, not of the nascent nanotechnology industry, but of natural processes. In the southern ocean between Cape Horn and the Antarctic, and on the west coast of Ireland, swept by Atlantic gales, significant numbers of particles in the 10 nm – 100 nm size range can be detected. A recent article in Nature (subscription required) provides an interesting analysis of natural nanoparticles from a sampling site in Ireland. These nanoparticles include, as you would expect, some made out of sea-salt. As is also already well-known, another major contribution to natural nanoparticles comes from sulfates, whose origin is probably the oxidation of a chemical called DMS which is generated by plankton. This process has attracted a lot of interest because of its potential importance as a climate feedback mechanism of the kind demanded by the Gaia hypothesis. The recent Nature paper adds a third class of materials to the mix – miscellaneous organic chemicals, which, depending on the season, can comprise more than 60% of the mass of sub-micron particles. These, too, probably have their origin in the plankton near the sea’s surface.

Sampling sites closer to urban life predictably show greater concentrations of organic nanoparticles, arising from volatile organic compounds emitted from vehicle exhausts and other manmade sources. Even here, one surprise is the potential importance of gaseous hydrocarbons of natural origin – isoprene and terpenes – as contributors to the total VOC load. There’s a good brief discussion of the nanoparticle exposure that arises as a result of pollution in the Royal Society Report.

None of this diminishes the need to do good toxicological studies on new nanoparticles if there’s any danger of human or environmental exposure to them. But it does emphasise that a great deal is known about the behaviour of nanoparticles in the environment. The trouble is that the knowledge arises in a field – atmospheric chemistry – that seems at first sight to be far removed from the interests of nanotechnology. Nomenclature differences may seem trivial, but they take on a bigger significance in today’s world of computer searches. None of the literature on this subject would show up if you did a search on terms like “nanoparticles in the environment”; to atmospheric chemists, nanoparticles aren’t “nanoparticles”, they’re “Aitken-mode aerosols”.

My thanks to Brian Davison for taking me to The Moon for an in-depth briefing on all this.

Most people aren’t engineers

If you ask a materials scientist to choose a material, the first things they will think about are properties like strength, fracture toughness and stiffness – the fundamental mechanical properties that characterise the material. A materials technologist will consider these properties too, but into the equation will also go how much the material costs and how easy it is to manufacture. But when a consumer is deciding whether to buy a product made from the material, it’s not numbers like the fracture toughness that swing the decision. It’s much more intangible qualities – the way the material looks and feels, and the way the design integrates the properties of the component materials with the form of the object – that determine whether the purchase is made, the price the product can command, and in many cases the pleasure that the consumer gets from owning and using the artefact. We can create new materials with controlled nanostructures, designing combinations of properties like strength and toughness to order. But who’s thinking about how to design those human-centred properties that are so important in giving value to materials? Only engineers care about fracture toughness, and most people aren’t engineers.

These reflections arise after a day spent in the London offices of the design house, the Conran Partnership. A small group of scientists, on the one hand, and designers, on the other, met to talk about industrial design, what’s good and bad about plastics, and whether there’s any way in which one could relate the emotional response of a consumer to a material to some scientific description. Some things are obvious – the heft that comes from high density, the apparent coldness of metal that comes from its high thermal conductivity. Some are less obvious, though – why is the anodised aluminium finish of my Apple Powerbook quite so desirable? It’s clearly something to do with roughness and texture, but what, exactly? And what about the time dependence of these qualities – what is it about leather, hardwoods and natural stone that make them age so gracefully?

The promise of nanotechnology – even the incremental kind that is a natural development of the last fifty years of materials science – is that it will allow us to design materials with properties to order. Because materials development is done by scientists and engineers, the properties that we tend to concentrate on are the physical ones like strength and stiffness. Now we need to understand the other factors that make an object desirable, so we can design materials that fulfil those needs.

The future of nanotechnology; Drexler and Jones exchange letters

The current edition of “Physics World” carries a letter from K. Eric Drexler, written in response to my article in the August edition, “The future of nanotechnology“. There is also a response from me to Drexler’s letter. Since the letters section of Physics World is not published online, I reproduce the letters here. The text here is as the authors wrote it; in the printed version the letters were lightly edited to conform with Physics World house style.

From Dr K. Eric Drexler to Physics World.

I applaud Physics World for drawing attention to the emerging field of artificial molecular machine systems. Their enormous productive potential is illustrated in biology and nature, where we observe molecular machine systems constructing molecular machinery, electronics, and digital information storage systems at rates measured in billions of tons per year. To understand the future potential of fabrication technologies (the foundation of all physical technology) we must examine the productive potential of artificial molecular machine systems. This field of enquiry has been a focus of my research since (Drexler 1981), which explored directions first suggested by (Feynman 1959).

I was surprised to find that Professor Richard Jones, in describing “flaws in Drexler’s vision,” ignores my physical analysis of productive molecular machine systems. He instead criticizes the implied hydrodynamics of an artist’s fantastic conception of a “nanosubmarine” – a conception not based on my work. It is, I think, important that scientific criticisms address the scientific literature, not artistic fantasies.

Professor Jones then offers a discussion of nanoscale surface forces, thermal motion, and friction that could easily leave readers with the impression that these present dire problems, which he implies have been ignored. But ignoring surface forces or thermal motion in molecular engineering would be like ignoring gravity or air in aeronautics, and physical analysis shows that well-designed molecular bearing interfaces can have friction coefficients far lower than those in conventional machines. These issues (and many others) are analyzed in depth, using the methods of applied physics, in Chapters 3, 5, and 10 of my book Nanosystems (Drexler 1992). Professor Jones fails to cite this work, noting instead only my earlier, popular book written for a general audience.

I agree with Professor Jones regarding the importance of molecular machine systems and the value of learning from and imitating biological systems at this stage of the development of our field. Where we part company is in our judgment of the future potential of the field, of the feasibility of molecular machine systems that are as far from the biological model as a jet aircraft is from a bird, or a telescope is from an eye. I invite your readers to examine the physical analysis that supports this understanding of non-biological productive molecular machine systems, and to disregard the myths that have sprung up around it. (One persistent myth bizarrely equates productive molecular machines with gooey nanomonsters, and then declares these to be impossible contraptions that grab and juggle atoms using fat, sticky fingers.)

There are many interesting research questions to address and technology thresholds to cross before we arrive at advanced artificial molecular machine systems. The sooner we focus on the real physics and engineering issues, building on the published literature, the sooner progress can be made. To focus on artists’ conceptions and myths does a disservice to the community.

K. Eric Drexler, PhD
Molecular Engineering Research Institute

K E Drexler 1981 Molecular Engineering: An approach to the development of general capabilities for molecular manipulation. Proc. Nat. Acad. Sci. (USA) 78:5275–5278
K E Drexler 1992 Nanosystems: Molecular Machinery, Manufacturing, and Computation (New York: Wiley/Interscience)
R Feynman 1959 There’s Plenty of Room at the Bottom, in D Gilbert (ed) 1961 Miniaturization (New York: Reinhold)

From Dr Richard A.L. Jones to Physics World.

I am pleased that Dr Drexler finds so much to agree with in my article. Our goals are the same; our research aims to understand how to make molecular scale machines and devices. Where we differ is how best to achieve that goal. The article was necessarily very brief in its discussion of surface forces, friction and thermal motion, and my book [1] contains a much fuller discussion, which does explicitly refer to Drexler’s book “Nanosystems”. No-one who has read “Nanosystems” could imagine that Drexler is unaware of these problems, and it was not my intention in the article to imply that he was. Absurd images like the nanosubmarine illustration I used are widely circulated in popular writings about Drexlerian nanotechnology; they well illustrate the point that naïve extrapolations of macro-scale engineering to the nanoscale won’t work, but I’m happy to agree that Drexler’s own views are considerably more sophisticated than this. The point I was making was that the approach Drexler describes in detail in Nanosystems (which he himself describes in the words: “molecular manufacturing applies the principles of mechanical engineering to chemistry”) works within a paradigm established in macroscopic engineering and seeks to find ways to engineer around the special features of the nanoworld. In contrast to this, the design principles adopted by cell biology turn these special features to advantage and actively exploit them, using concepts such as self-assembly and molecular shape change that have no analogue in macroscopic engineering. Again, Dr Drexler and I are in agreement that in the short term biomimetic nanotechnology will be very fruitful and should be strongly pursued. We differ about the likely long term trajectory of the technology, but here, experiment will decide. Such is the unpredictable nature of the development of technology that I rather suspect that the outcome will surprise us both.

[1] Soft Machines, R.A.L. Jones, OUP (2004)

See-through Science

See-through Science is the title of a pamphlet by James Wilsdon and Rebecca Willis, from the left-of-centre UK think-tank Demos. It’s a thoughtful reflection on how one ought to engage the public in the development of new technologies, with the emerging debate on nanotechnology taken as its focus. I particularly like its undogmatic tone. The report is very clear on the failings of the old “deficit model” of public engagement, which unconvincingly and patronisingly maintained that it is sufficient to educate the public in the wise ways of science to convince them of its benefits. The report also warns against the danger that all debates about new technology get twisted round to focus narrowly on risk assessment. This is particularly timely in view of the way the nanotechnology debate has unfolded, with its excessive focus on the one issue of nanoparticle toxicity.

In terms of recommendations, perhaps for me the most telling point is the way the report highlights the almost complete absence of those parts of industry involved in nanotechnology from this debate. I think this situation needs to change, and very soon.

Then worms ‘ll come and eat thee oop

I had a late night in the prosperous, liberal, Yorkshire town of Ilkley last night, doing a talk and question and answer session on nanotechnology at the local Cafe Philosophique. An engaged and eclectic audience kept the discussion going well past the scheduled finish time. Two points particularly struck me. One recurring question was whether it was ever realistic to imagine that we can relinquish technological developments with negative consequences – “if it can be done, it will be done” was the comment made more than once. I really don’t like this conclusion, but I’m struggling to find convincing arguments against it. A more positive comment concerned the idea of regulation; we are used to thinking of this idea entirely in terms of narrow prohibitions – don’t release these nanoparticles into the environment, for example. But we need to work out how to make regulation a positive force that steers the technology in a desirable direction, rather than simply trying to sit on it.

(Non-British readers may need to know that the headline is a line from a rather odd and morbid folk-song called “On Ilkley Moor baht ’at”, sung mostly by drunken Yorkshiremen.)

Bad reasons to oppose nanotechnology: part 1

An article in this month’s The Ecologist Magazine, by the ETC group‘s Jim Thomas, makes an argument against nanotechnology that combines ignorance and gullibility with the risk of causing real harm to some of the world’s poorest people. What, he says, will happen to the poor cotton farmers and copper miners of the third world when nanotechnology-based fabric treatments, like those sold by the Nanotex Corp, make cotton obsolete, and carbon nanotubes replace copper as electrical conductors? This argument is so wrong on so many levels that it’s difficult to know where to start.

To start with, is there any development economist who would actually argue that basing a third world economy on an extractive industry like copper mining is a good way to get sustained economic development and good living standards for the population as a whole? Likewise, it’s difficult to see that the industrial-scale farming of cotton, with its huge demand for water, can be anything other than an ecological disaster. Zambia is not exactly the richest country in Africa, and Kazakhstan is not a great advertisement for the environmental benefits of cotton growing.

And is the premise even remotely realistic? I wrote below about how novel the Nanotex fibre treatments actually are; in fact there are many fibre treatments available now, some carrying a “nano” label, some not, which change the handling and water resistance properties of a variety of textiles, both natural and artificial. These are just as likely to increase markets for natural fibres as for artificial ones. And as for nanotubes replacing copper, at a current cost of £200 a gram this is not going to happen any time soon. What this argument demonstrates is that, unfortunately for a campaigning group, ETC is curiously gullible, with a propensity to mistake corporate press releases and the most uncritical nano-boosterism for reality.

This matters for two reasons. Firstly, on the positive side, nanotechnology really could benefit the environment and the world’s poor. Cheap ways of providing clean water and new sources of renewable energy are realistic possibilities, but they won’t happen automatically and there’ll be real debates about how to set priorities in a way which makes the technology bring benefits to the poor as well as the rich. What these debates are going to need from their participants is some degree of economic and scientific literacy. Secondly, there are some real potential downsides that might emerge from the development of nanotechnology; we need a debate that’s framed in a way that recognises the real risks and doesn’t waste energy and credibility on silly side-issues.

Luckily, there is at least one NGO that is demonstrating a much more sophisticated, subtle and intelligent approach – Greenpeace. The contribution of their chief scientist, Doug Parr, to a recent debate on nanotechnology held by the Royal Society in the light of their recent report, is infinitely more effective at focusing on the real issues.

Feel the vibrations

The most convincing argument that it must be possible to make sophisticated nanoscale machines is that life already does it – cell biology is full of them. But whereas the machines proposed by Drexler are designed from rigid materials drawing on the example of human-scale mechanical engineering, nature uses soft and flexible structures made from proteins. At the temperatures at which protein machines operate, random thermal fluctuations – Brownian motion – cause the structures to be constantly flexing, writhing and vibrating. How is it possible for a mechanism to function when its components are so wobbly?

It’s becoming more and more clear that the internal flexibility of proteins and their constant Brownian random vibration is actually vital to the way these machines operate. Some fascinating evidence for this view was presented at a seminar I went to yesterday by Jeremy Smith, from the University of Heidelberg.

Perhaps the most basic operation of a protein-based machine is the binding of another molecule – a ligand – to a specially shaped site in the protein molecule. The result of this binding is often a change in shape of the protein. It is this shape change, which biologists call allostery, that underlies the operation both of molecular motors and of protein signalling and regulation.

It’s easy to imagine ligand binding as being like the interaction between a lock and a key, and that image is used in elementary biology books. But since both ligand and protein are soft it’s better to think of it as an interaction between hand and glove; both ligand and protein can adjust their shape to fit better. But even this image doesn’t convey the dynamic character of the situation; the protein molecule is flexing and vibrating due to Brownian motion, and the different modes of vibration it can sustain – its harmonics, to use a musical analogy – are changed when the ligand binds. Smith was able to show for a simple case, using molecular dynamics simulations, that this change in the possible vibrations of the protein molecule plays a major role in driving the ligand to bind. Essentially, what happens is that, with the ligand bound, the low frequency collective vibrations are lowered further in frequency – the molecule becomes effectively softer. This leads to an increase in entropy, which provides a driving force for the ligand to bind.
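The entropy argument can be made quantitative for a single mode: the vibrational entropy of a harmonic oscillator rises as its frequency falls, so softening a low-frequency collective mode on binding contributes favourably to the free energy. The Python sketch below uses the standard quantum harmonic oscillator entropy formula; the ~1 THz mode frequency and 10% softening are illustrative numbers of my own, not figures from Smith’s simulations.

```python
import math

kB = 1.380649e-23     # Boltzmann constant, J/K
hbar = 1.054571817e-34  # reduced Planck constant, J s

def ho_entropy(omega, T=300.0):
    """Entropy, in units of kB, of a quantum harmonic oscillator of
    angular frequency omega (rad/s) at temperature T (K)."""
    x = hbar * omega / (kB * T)
    return x / (math.exp(x) - 1) - math.log(1 - math.exp(-x))

# A low-frequency collective protein mode near 1 THz, softening by
# 10% on ligand binding (both numbers purely illustrative).
omega_free = 2 * math.pi * 1.0e12
omega_bound = 0.9 * omega_free

dS = ho_entropy(omega_bound) - ho_entropy(omega_free)
print(f"Entropy gain per mode: {dS:.3f} kB")
print(f"Contribution -T*dS at 300 K: {-300 * dS * kB / 1.602e-19 * 1e3:.2f} meV")
```

The gain per mode is modest, but a protein has many such collective modes, so the total entropic contribution to the binding free energy can be significant.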

A highly simplified theoretical model of allosteric binding, solved by my colleague up the road in Leeds, Tom McLeish, has just been published in Physical Review Letters (preprint, abstract; subscription required for full published article). This supports the notion that the entropy inherent in thermally excited vibrations of proteins plays a big role in ligand binding and allosteric conformational changes. As it’s based on rather a simple model of a protein, it may offer food for thought for how one might design synthetic systems using the same principles.

There’s some experimental evidence for these ideas. Indirect evidence comes from the observation that if you lower the temperature of a protein far enough there’s a temperature – a glass transition temperature – at which these low frequency vibrations stop working. This temperature coincides with the temperature at which the protein stops functioning. More direct evidence comes from rather a difficult and expensive technique called quasi-elastic neutron scattering, which is able to probe directly what kinds of vibrations are happening in a protein molecule. One experiment Smith described directly showed just the sort of softening of vibrational modes on binding that his simulations predict. Smith’s seminar went on to describe some other convincing, quantitative illustrations of the principle that flexibility and random motion are vital for the operation of other machines such as the light driven proton pump bacteriorhodopsin and one of the important signalling proteins from the Ras GTPase family.

The important emerging conclusion from all this is this: it’s not that protein-based machines work despite their floppiness and their constant random flexing and vibrations, they work because of it. This is a lesson that designers of artificial nanomachines will need to learn.

What is this thing called nanotechnology? Part 2. Nanoscience versus Nanotechnology

In the first part of my attempt to define nanotechnology terms, I discussed definitions of the nanoscale. Now I come to the important and underappreciated distinction between nanoscience and nanotechnology.

Nanoscience describes the convergence of physics, chemistry, materials science and biology to deal with the manipulation and characterisation of matter on the nanoscale.

Many subfields of these disciplines have been dealing with nanoscale phenomena for many years. A very non-exhaustive list of relevant sub-fields, with examples of topics in nanoscience, would include:

  • Colloid science. The characterisation and control of forces between sub-micron particles to control the stability of dispersions.
  • Metallurgy. The control of nanoscale structure to optimise mechanical and other properties – e.g. particle and precipitate hardening.
  • Molecular biology and biophysics. Structural characterisation at atomic resolution first of complex biomolecules, now of assemblies of macromolecules which function as nanomachines.
  • Polymer science. Systems such as block copolymers which self-assemble to form complex nanoscale structures, new architectures like hyperbranched polymers and dendrimers.
  • Semiconductor physics. Nanoscale low dimensional structures like multilayers, wires and dots exploiting quantum effects for new electronic and optoelectronic devices like light emitting diodes and lasers.
  • Supramolecular chemistry. The use of non-covalent interactions to create self-assembled nanoscale structures from molecular components.
The distinguishing feature of nanoscience is that increasingly we find methods and techniques from more than one of these existing subfields combined in novel ways.

Nanotechnology is an engineering discipline which combines methods from nanoscience with the disciplines of economics and the market to create usable and economically viable products.

Nanoscience and nanotechnology need to be distinguished. Without nanoscience, nanotechnology will not be possible. On the other hand, if you invest money in a nanoscience venture under the impression that it is nanotechnology, you are sure to be disappointed.

In the next instalment, I’ll discuss the various kinds of nanotechnology, from incremental technologies such as shampoos and textile treatments to the more radical visions.

Will molecular electronics save Moore’s Law?

Mark Reed, from Yale, was another speaker at a meeting I was at in New Jersey last week. He gave a great talk about the promise and achievement of molecular electronics which I thought was both eloquent and well-judged.

The context for the talk is provided by the question marks hanging over Moore’s law, the well-known observation that the number of transistors per integrated circuit, and thus available computer power, has grown exponentially since 1965. There are strong indications that we are approaching the time when this dramatic increase, which has done so much to shape the way the world’s economy has changed recently, is coming to an end.
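The exponential trend is easy to state numerically. The sketch below extrapolates transistor counts from the oft-quoted figure of roughly 2,300 transistors for the 1971 Intel 4004, with a doubling time of two years; these are well-known round numbers used for illustration, not industry roadmap data.

```python
# Illustrative Moore's-law extrapolation: transistor counts doubling
# roughly every two years. Base figures are round numbers for the
# Intel 4004 (1971), used for illustration only.

def transistors(year, base_year=1971, base_count=2300, doubling_years=2.0):
    """Extrapolated transistor count per chip in the given year."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1981, 1991, 2001, 2004):
    print(f"{year}: ~{transistors(year):,.0f} transistors per chip")
```

Thirty-three years of doubling every two years is a factor of about 90,000, which is why even a small slow-down in the doubling time has such large long-term consequences.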

The semiconductor industry is approaching a “red brick wall”. This phrase comes from the International Technology Roadmap for Semiconductors, an industry consensus document which sets out the technical barriers that need to be overcome in order to maintain the projected growth in computer power. In the technical tables, cells which describe technical problems with no known solution are coloured red, and by 2007-8 these red cells proliferate to the point of becoming continuous – hence the red brick wall.

A more graphic illustration of the problems the industry faces was provided in a plot that Reed showed of surface power density as a function of time. This rather entertaining plot showed that current devices have long surpassed the areal power density of a hot-plate, are not far away from the values for a nuclear reactor, and somewhere around the middle of the next decade will surpass that of the surface of the sun. Now I find the warm glow from my Powerbook quite comforting on my lap, but carrying a small star around with me is going to prove limiting.

So the idea that molecular electronics might help overcome these difficulties is quite compelling. In this approach, individual molecules are used as the components of integrated circuits, as transistors or diodes, for example. This provides the ultimate in miniaturisation.

The good news is that (despite the Schön debacle) there are some exciting and solid results in the field. The simplest devices, like diodes, have two terminals, and there is no doubt that single molecule two-terminal devices have been convincingly demonstrated in the lab. Three terminal devices, like transistors, seem to be vital to make useful integrated circuits, though, and there progress has been slower. It’s difficult enough to wire up two connections to a single molecule, but gluing a third one on is even harder. This feat has been achieved for carbon nanotubes.

What’s the downside? The carbon nanotube transistors have a nasty and underpublicised secret – the connections between the nanotubes and the electrodes are not, in the jargon, Ohmic – that means that electrons have to be given an extra push to get them from the electrode into the nanotube. This makes it difficult to scale them down to the small sizes that would be needed to make them competitive with silicon. And the single molecule devices have the nasty feature that every one is different. Conventional microelectronics works because every one of the tens of millions of transistors on something like a Pentium is absolutely identical. If the characteristics of each of the components were to vary randomly, the whole way we currently do computing would need to be rethought.
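A toy calculation shows how quickly random device variation bites at scale. In the Python sketch below (an illustration of the statistics only, not a real device model), each “transistor” has a switching threshold drawn from a Gaussian, and a chip only works if every device lands within a fixed tolerance; the all-devices-pass probability collapses as the device count grows.

```python
import random

# Toy yield model: every device on a chip must have its (Gaussian-
# distributed) threshold within +/- tolerance of nominal for the chip
# to work. Purely illustrative; not a real device model.

random.seed(0)

def chip_yield(n_devices, sigma, tolerance, trials=2000):
    """Fraction of simulated chips on which all n_devices fall
    within +/- tolerance of the nominal threshold."""
    good = 0
    for _ in range(trials):
        if all(abs(random.gauss(0.0, sigma)) < tolerance
               for _ in range(n_devices)):
            good += 1
    return good / trials

for n in (10, 100, 1000):
    print(f"{n:5d} devices: yield = {chip_yield(n, 1.0, 3.0):.3f}")
```

Even with each device individually inside a three-sigma tolerance 99.7% of the time, a chip with thousands of such devices rarely has every one in spec, and real chips have tens of millions; this is the scale of the uniformity problem that single-molecule devices would face.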

So it’s clear to me that molecular electronics remains a fascinating and potentially valuable research field, but it’s not going to deliver results in time to prevent a slow-down in the growth of computer power that’s going to begin in earnest towards the end of this decade. That’s going to have dramatic and far-reaching effects on the world economy, and it’s coming quite soon.