Printing devices

I spent a couple of days earlier this week at a conference in Manchester called “Printing of Functional Materials”. The premise of this meeting was the growing realisation that printing technologies, both traditional, like silk-screen and gravure, and modern, like ink-jet, offer scalable, cheap and flexible ways of precisely depositing small quantities of materials on surfaces. Traditional inks are just vehicles for pigments to create static images, but there’s no reason why you can’t use printing to deposit materials that are conductors or semiconductors of electricity, which are electro-luminescent, or which have some biological functionality. Indeed, as one of the organisers of the conference has shown, one can even use ink-jet printing to deposit living human cells, with potential applications in tissue engineering.

The degree of commercial interest in these technologies was indicated by the fact that, unusually for an academic conference, more than a third of the attendees were from the commercial sector. Many of these were from the cluster of small and medium companies developing ink-jet technologies from around Cambridge, but European and American concerns were well represented too. My impression is that the sector closest to maturity in this area is electrically functional devices, where there’s a great deal of momentum to drive down the cost of RFID and to develop cheap, flexible displays. But there are still many materials issues to solve. It’s not easy to get a complex fluid to flow in the right way to form the tiny, well-defined droplets needed for good ink-jet printing, and formulating the ink so that it dries to give the best properties is awkward too. Silver inks illustrate the problems – commercial inks to write conducting lines sometimes use silver nanoparticles. Making the silver particles very small is helpful in making them coalesce well to make a continuous silver layer; the melting point of materials is lowered when they are in nanoparticulate form, making them sinter at lower temperatures. But then you have to work hard to stop the particles aggregating in the ink (it’s particularly undesirable, of course, if they aggregate in the ink-jet nozzle and block it up). To stabilise them, you need to coat them with surfactants or polymer molecules. But then this organic coating needs to be driven off by a heating step to get good conduction, and this compromises your ability to print on paper and plastics, which can’t take much heating. It seems to me that this technology has a huge amount of promise, but there’s a lot of materials science and colloid science to be done before it can fulfill its potential.
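The size-dependent melting mentioned above can be roughly estimated with the Gibbs–Thomson relation. The sketch below is illustrative only; the material parameters are approximate literature values for silver, taken as assumptions rather than from anything in this post:

```python
# Rough Gibbs-Thomson estimate of melting-point depression for silver
# nanoparticles: T_m(r) = T_bulk * (1 - 2*sigma_sl / (H_f * rho_s * r)).
# All material parameters below are approximate assumed values.

T_BULK = 1235.0     # bulk melting point of silver, K
SIGMA_SL = 0.184    # solid-liquid interfacial energy, J/m^2 (assumed)
H_F = 1.04e5        # latent heat of fusion, J/kg (assumed)
RHO_S = 10490.0     # density of solid silver, kg/m^3

def melting_point(radius_m: float) -> float:
    """Estimated melting point (K) of a silver particle of given radius."""
    return T_BULK * (1.0 - 2.0 * SIGMA_SL / (H_F * RHO_S * radius_m))

for r_nm in (2, 5, 10, 50):
    print(f"r = {r_nm:3d} nm -> T_m ~ {melting_point(r_nm * 1e-9):.0f} K")
```

Even with these crude numbers, the trend is clear: the depression only becomes dramatic for particles a few nanometres across, which is why such small particles sinter at temperatures paper and plastic substrates might survive.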

On my nanotechnology bookshelf

Following my recent rather negative review of a recent book on nanotechnology, a commenter asked me for some more positive recommendations about books on nanotechnology that are worth reading. So here’s a list of nanotechnology books old and new with brief comments. The only criterion for inclusion on this list is that I have a copy of the book in question; I know that there are a few obvious gaps. I’ll list them in the order in which they were published:

Engines of Creation, by K. Eric Drexler (1986). The original book which launched the idea of nanotechnology into popular consciousness, and still very much worth reading. Given the controversy that Drexler has attracted in recent years, it’s easy to forget that he’s a great writer, with a very fertile imagination. What Drexler brought to the idea of nanotechnology, which then was dominated, on the one hand by precision mechanical engineering (this is the world that the word nanotechnology, coined by Taniguchi, originally came from), and on the other by the microelectronics industry, was an appreciation of the importance of cell biology as an exemplar of nanoscale machines and devices and of ultra-precise nanoscale chemical operations.

Nanosystems: Molecular Machinery, Manufacturing, and Computation, by K. Eric Drexler (1992). This is Drexler’s technical book, outlining his particular vision of nanotechnology – “the principles of mechanical engineering applied to chemistry” – in detail. Very much in the category of books that are often cited, but seldom read – I have, though, read it, in some detail. The proponents of the Drexler vision are in the habit of dismissing any objection with the words “it’s all been worked out in ‘Nanosystems'”. This is often not actually true; despite the deliberately dry and textbook-like tone, and the many quite complex calculations (which are largely based on science that was certainly sound at the time of writing, though there are a few heroic assumptions that need to be made), many of the central designs are left as outlines, with much detail left to be filled in. My ultimate conclusion is that this approach to nanotechnology will turn out to have been a blind alley, though in the process of thinking through the advantages and disadvantages of the mechanical approach we will have learned a lot about how radical nanotechnology will need to be done.

Molecular Devices and Machines: A Journey into the Nanoworld, by Vincenzo Balzani, Alberto Credi and Margherita Venturi (2003). The most recent addition to my bookshelf; I’ve not finished reading it yet, but it’s good so far. This is a technical (and expensive) book, giving an overview of the approach to radical nanotechnology through supramolecular chemistry. This is perhaps the part of academic nanoscience that is closest to the Drexler vision, in that the explicit goal is to make molecular scale machines and devices, though the methods and philosophy are rather different from the mechanical approach. A must, if you’re fascinated by cis-trans isomerisation in azobenzene and intermolecular motions in rotaxanes (and if you’re not, you probably should be).

Bionanotechnology: Lessons from Nature, by David Goodsell (2004). I’m a great admirer of the work of David Goodsell as a writer and illustrator of modern cell biology, and this is a really good overview of the biology that provides both inspiration and raw materials for nanobiotechnology.

Soft Machines: Nanotechnology and Life, by Richard Jones (2004). Obviously I can’t comment on this, apart from to say that three years on I wouldn’t have written it substantially differently.

Nanotechnology and Homeland Security: New Weapons for New Wars, by Daniel and Mark Ratner (2004). I still resent the money I spent on this cynically titled and empty book.

Nanoscale Science and Technology, eds Rob Kelsall, Ian Hamley and Mark Geoghegan (2005). A textbook at the advanced undergraduate/postgraduate level, giving a very broad overview of modern nanoscience. I’m not really an objective commentator, as I co-wrote two of the chapters (on bionanotechnology and macromolecules at interfaces), but I like the way this book combines the hard (semiconductor nanotechnology and nanomagnetism) and the soft (self-assembly and bionano).

Nanofuture: What’s Next For Nanotechnology, by J. Storrs Hall (2005). Best thought of as an update of Engines of Creation, this is an attractive and well-written presentation of the Drexler vision of nanotechnology. I entirely disagree with the premise, of course.

Nano-Hype: The Truth Behind the Nanotechnology Buzz, by David Berube (2006). A book, not about the science, but about nanotechnology as a social and political phenomenon. I reviewed it in detail here. I’ve been referring to it quite a lot recently, and am increasingly appreciating the dry humour hidden within its rather complete historical chronicle.

The Dance of Molecules: How Nanotechnology is Changing Our Lives, by Ted Sargent (2006). Reviewed by me here; it’s probably fairly clear that I didn’t like it much.

The Nanotech Pioneers: Where Are They Taking Us?, by Steve Edwards (2006). In contrast to the previous one, I did like this book, which I can recommend as a good, insightful and fairly nanohype-free introduction to the area. I’ve written a full review of this, which will appear in “Physics World” next month (and here also, copyright permitting).

Magnetic racetrack memories

It’s sobering to think that the hard disk drive is now 50 years old, and yet as a data storage technology it is still about 100 times more cost-effective than its competitors. But its drawbacks are pretty apparent – hard drives are inherently slow and unreliable, and because of the mechanical nature of the device this is not likely to change fundamentally. Meanwhile various solid state memories, like SRAM, DRAM and flash memory, are very fast-growing market segments. Yet even with the continual shrinking of circuit dimensions, these technologies will not be able to match the raw storage capacity of hard drives. The search is on, then, for memory devices that have the capacity of hard drives but the robustness and speed of access of solid state devices. One candidate for such a device is the magnetic racetrack, invented by IBM’s Stuart Parkin. One needs to take Parkin’s views on magnetic data storage devices very seriously; as the inventor of the giant magnetoresistance based read head, it’s his work that has permitted the miniaturisation of hard disks that we see today and which makes possible devices like the video iPod.

I heard Parkin speak about his invention at last week’s Condensed Matter and Materials Physics meeting of the Institute of Physics, where he was one of the plenary lecturers. A version of this lecture, which Parkin recently gave at UCSB, can be downloaded from here – this includes some very helpful videos. Parkin’s invention is disclosed in this US patent; the basic idea is that data is stored in a pattern of magnetic domain walls in a nanowire of a magnetic material, which needs to be about 10 microns long and less than 100 nanometers wide. Rather than reading the pattern of magnetic domains by moving a read-head along the wire, in the magnetic racetrack the wire is held stationary and the magnetic domains are swept past a stationary read-head by applying a current. There’s a lot of fascinating physics in the way a current can move the domain walls, but the attraction of this arrangement from a practical point of view is that there are no moving parts, and the data density can be very high. Parkin envisages an array of these nanowires, each bent in a U-shape, with the bottom of the U held against a read head and a write head which are laid down by conventional planar silicon technology. It’s this compatibility with existing manufacturing technology that Parkin sees as a compelling advantage of his idea, as compared with other proposals for high density data storage devices depending on more exotic schemes using, for example, carbon nanotubes.
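Functionally, each nanowire is a magnetic shift register: current pulses sweep the stored pattern of domains past a fixed read/write element at the bottom of the U. The toy model below captures only that shift-register behaviour – there is no physics in it, and all the names and sizes are my own invention:

```python
from collections import deque

class RacetrackWire:
    """Toy model of one racetrack nanowire: a shift register of magnetic
    domains, with a fixed read/write head at position 0. Purely
    illustrative -- real devices move domain walls with current pulses."""

    def __init__(self, n_domains: int):
        self.domains = deque([0] * n_domains, maxlen=n_domains)

    def shift(self, steps: int = 1):
        """A current pulse sweeps the whole domain pattern past the head."""
        self.domains.rotate(steps)

    def read(self) -> int:
        return self.domains[0]

    def write(self, bit: int):
        self.domains[0] = bit

# Store the bit pattern 1, 0, 1, 1, then read it back by shifting.
wire = RacetrackWire(8)
for bit in (1, 0, 1, 1):
    wire.write(bit)
    wire.shift()
wire.shift(-4)  # reverse the current to bring the first bit back to the head
readout = []
for _ in range(4):
    readout.append(wire.read())
    wire.shift()
print(readout)  # -> [1, 0, 1, 1]
```

The point of the model is simply that the data moves while the head stays still – the inverse of a hard drive, where the medium is fixed and the head flies over it.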

Hollow centre: Nanotechnology is a discipline in the throes of an existential crisis

This review of “The Dance of Molecules: How Nanotechnology is Changing Our Lives”, by Ted Sargent, is published in today’s edition of Nature. The published version differs slightly from this unedited text, which is reproduced here by permission of Nature, and should not be further reproduced.

Every field needs its founding genius; the appropriately mythic figure for many in nanotechnology is Richard Feynman, on the strength of his 1960 lecture “There’s plenty of room at the bottom”. Yet this particular canonization is entirely retrospective; there’s little evidence that this lecture made much impact at the time, and Feynman rarely returned to the topic to develop his thoughts further. Perhaps a better candidate to be considered nanotechnology’s father figure is President Clinton, whose support of the USA’s National Nanotechnology Initiative converted overnight many industrious physicists, chemists and materials scientists into nanotechnologists. In this cynical (though popular) view, the idea of nanotechnology did not emerge naturally from its parent disciplines, but was imposed on the scientific community from outside. As a consequence, nanotechnology is a subject with an existential crisis – is there actually any firm core to this subject at all, any consensus as to what, at heart, defines nanotechnology?

This is the problematic territory that Sargent has tried to map out for the popular reader in “The Dance of Molecules”. How then, does Sargent deal with this tricky question of definition? “Nanotechnologists”, he says, “have as their goal to design and build matter to order, specified by a functional requirement”. This is fine, but it may leave followers of an earlier new discipline – Materials Science – puzzled, as this was their slogan too. He begins by maintaining the centrality of quantum mechanics, but really this is just an assertion of the centrality of chemistry. The title “The Dance of Molecules” might suggest the idea of Brownian motion, but this idea isn’t pursued. In the end he is forced to conclude that nanotechnology’s central theme isn’t scientific, but sociological – a new culture of interdisciplinarity that searches for convergence between increasingly atomized scientific fields.

There is one version of nanotechnology that does have clarity – K. Eric Drexler’s vision of scaled-down mechanical engineering. It is this revolutionary vision that underlies much of the popular image of nanotechnology, promoted through science fiction, films and computer games. Yet very few scientists take this version of nanotechnology seriously. This leaves a problem for popularizers who wish to reflect the scientific consensus. One can rebut these claims in detail, or one can simply dismiss them with appeals to the authority of distinguished scientists like the late Richard Smalley. Sargent takes a third course; he simply does not mention them. This seems to me to be the most unsatisfactory approach of all; if one thinks the Drexler vision is wrong, one should say so, otherwise the reading public, who are extensively exposed to these ideas, will be left very confused.

Lacking a strong science core, the book is written thematically, as a tour of application areas in health, environment and information. Quantum dots make frequent appearances, and there’s quite a good description of molecular electronics, which is duly circumspect about the balance of potential and practical difficulty. The descriptions of bionanotechnology carry less conviction – the description of molecular motors seems particularly misleading. Many people will find the rather overwrought style irritating – perhaps the oddest of the many strange and strained metaphors and similes is his description of photolithography as being “like crop circles formed when the sun blazes through round partings in the English permacloud”.

Nanotechnology, above all an applied science, has been the subject of a possibly unprecedented push for early consideration of social, environmental and ethical impacts. Here the rhetoric is overwhelmingly positive, and the need for public engagement seen solely in terms of defusing possible opposition. We’re promised an end to cancer, the restoration of sight to the blind, and, via unconventional solar cells and the hydrogen economy, an end to our dependence on fossil fuels for energy. The possible downsides are largely limited to the potential toxicity of some nanoparticles. Even in military applications, the emphasis is on defensive applications and on the possibility that nanotechnology will make it much easier for the west to wage a “clean war”, in which combatants are easily distinguished from non-combatants. I don’t think you need to be a radical anti-technology activist to greet this claim with some scepticism.

The current difficulties of nanotechnology include its incompletely formed disciplinary identity and lack of clear definition, the overselling of its immediate potential economic and societal impacts, and its association with extreme utopian and dystopian futuristic visions. A good popular book could contribute to overcoming these difficulties by setting out a clear set of core scientific principles that underpin nanotechnology, making realistic claims for what applications and impacts will be possible on what timescale, and presenting a more sophisticated understanding of the relationships between science, the economy and society. This book does not fulfill this need.

A quiet policy shift for UK nanotechnology

The centrepiece of the UK’s publicly funded nanotechnology effort has been the Department of Trade and Industry’s Micro and Nanotechnology manufacturing initiative (MNT). This had a high profile launch in July 2003 in a speech by the science minister, Lord Sainsbury, with an initial commitment of £90 million. When, last year, the Secretary of State for Trade and Industry announced an increase of DTI nanotechnology funding to £200 million, the future of the MNT program seemed assured. But a close reading of recent announcements from the DTI makes it clear that whatever extra funding they may be putting into nanotechnology, it’s not going into the MNT program.

Technology and innovation policy at the DTI is now informed by a Technology Strategy Board, made up largely from figures from industry and venture capital. This board’s first annual report (PDF) was published in November 2005, and contained this recommendation:
“We also recommend incorporating nanotechnology in the competitions for underpinning technologies, such as advanced materials, to avoid confusion. It is important, however, that the DTI keeps track of expenditure on nano-projects to be able to honour its commitments to Parliament in this area.”

It’s now clear that this recommendation has been followed. The spring competition for collaborative R&D, announced here, and to be formally launched on April 26th, does not include a separate micro- and nano- theme. Instead, the call is based around what the DTI calls innovation platforms – societal challenges which many technologies can be combined to address. Undoubtedly, some of these areas will call for nano- enabled solutions. Novel Technologies For Low-Cost, High Efficiency Electronics And Lighting Systems mentions plastic electronics and light emitting diodes as potential technologies of interest, while Low Carbon Energy Technologies talks about the need for novel solar cells.

This is an interesting shift of emphasis. The MNT program had few friends in the world of academic nanoscience and technology; it always seemed happier with the micro- than the nano- , and the insistence that programs be business-oriented seemed on occasion to shade into a positive antipathy to academic nanoscience and led to the perception that the program was considerably friendlier to consultants than either technologists or scientists. On the other hand, the idea of building an applied research program around problems to be solved, rather than technological solutions looking for problems, seems one that is well worth trying.

What needs to happen for this to work? Firstly, the ongoing MNT program needs to become much more effective at connecting the best parts of the UK nanoscience base to potential users of the new technologies, and it needs to appear a little more forward-looking in the technologies it’s sponsoring. Then the Technology Program is going to have to work hard to make sure that the right scientists are engaged and nanotechnology gets an appropriate share of the resources, meeting the very specific commitment to a certain level of spending on nanotechnology made by the Minister.

It’s going to be important to get this right. As I discussed a couple of weeks ago, there’s growing evidence of an external perception of the UK nanotechnology program as being diffuse, unfocused and ineffective. Given the general strength of the UK science base, the UK should be in a much better position; there’s a real danger that this could turn into a big missed opportunity.

Printing silicon

The main driving force for developing plastic electronics – the use of semiconducting polymers to make logic devices, light emitting diodes and displays, and solar cells – is the hope that these materials can be processed very cheaply. Because these materials are soluble, devices can be made by processes like ink-jet printing or screen printing. Compared to standard silicon-based electronics, the performance of these devices is often not very good, but the fact that you won’t need the massive capital expenditure of a conventional silicon fab tilts the economics towards the plastic materials, at least for applications where cost is more important than performance. But Nature this week reports an interesting twist – a group from the Seiko-Epson labs in Japan report a new way of printing silicon directly from solution (see also the Epson press release here).

The method is based on using polysilane as a precursor material. Polysilane is essentially the silicon based analogue of the well-known polymer polyethylene, consisting of a long chain of silicon atoms, each of which has two hydrogen atoms attached. But unlike polyethylene, polysilane is both unstable and very difficult to work with, being insoluble in most common solvents. The Japanese group got over this problem by starting with a five-membered ring version of the polysilane molecule – cyclo-pentasilane (this is the silicon analogue of cyclopentane). They found that polysilane was soluble in solutions of this precursor, and these solutions could be ink-jet printed and converted into pure silicon layers by a simple heating step.

The silicon layers formed this way are amorphous, not crystalline, and their electronic properties are not very good compared to silicon films prepared by more conventional routes (though they are better than most polymer semiconductors). Plastic electronics still has some advantages over this new process, which requires temperatures too high for the use of plastic substrates. The printing step is also complicated by the need to exclude water and oxygen. Nonetheless, it’s an important step forward towards the development of low-cost electronics for applications like large area displays and cheap solar cells.

The best of both worlds – organic semiconductors in inorganic nanostructures

Today’s picture is a scanning electron micrograph of a hybrid structure in which organic light emitters are confined in a micropillar by a pair of inorganic multilayer mirrors. These hybrid organic/inorganic structures have interesting photonic properties that may have applications in quantum cryptography and quantum computing; this work comes from my colleagues in the physics department here at Sheffield in collaboration with some of our electrical engineers.

SEM image of a micropillar
Image by Wen-Chang Hung, image post-treatment by Andy Eccleston.

This structure is made by laying down, by chemical vapour deposition, 12 pairs of alternate layers of silicon oxide and silicon nitride, each exactly one quarter of a light wavelength in thickness. This is coated by a 240 nm (half a wavelength) thick layer of the polymer polystyrene, doped with an organic dye called Lumogen red, which in turn is coated by another 12 pairs of layers, this time of tellurium oxide and lithium fluoride, thermally evaporated. The pillar is carved out of the resulting layer cake structure by using a focused ion beam.
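For readers wondering where the layer thicknesses come from: a quarter-wave layer has optical thickness λ/4, which corresponds to a physical thickness of λ/4n, where n is the refractive index of that material. The sketch below uses an assumed design wavelength and round, assumed refractive indices purely for illustration – none of these numbers are taken from the paper:

```python
# Physical thickness of a quarter-wave layer: d = lambda / (4 * n).
# The design wavelength and all refractive indices below are assumed
# round values for illustration, not figures from the actual device.

WAVELENGTH_NM = 650.0  # assumed design (vacuum) wavelength

indices = {
    "silicon oxide": 1.46,      # assumed
    "silicon nitride": 2.0,     # assumed
    "lithium fluoride": 1.39,   # assumed
}

thicknesses = {m: WAVELENGTH_NM / (4.0 * n) for m, n in indices.items()}

for material, d in thicknesses.items():
    print(f"{material}: ~{d:.0f} nm per quarter-wave layer")
```

The higher-index material in each mirror pair needs the thinner layer – the alternation of high and low index at exactly this optical thickness is what makes the reflections from every interface add up in phase.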

The multilayers act as perfect mirrors. Imagine putting a light source in between two parallel mirrors – you’d get an infinite (if the mirrors are perfect) series of reflections of the light. In our situation the dye molecule is the light source; when it emits a single photon, that photon is going to interfere with its ghostly counterparts emitted from the reflections, which are all in phase with each other. This makes it a very efficient producer of single photons – potentially these could be used for quantum cryptography or quantum computing.

All of this has already been demonstrated using quantum dots – tiny particles of inorganic semiconductor – as the light emitter. What’s the advantage of using an organic dye instead? In these devices, photons are emitted when an electron and a hole annihilate. These electron-hole pairs – called excitons – are very weakly bound in ordinary semiconductors, which means that these devices only work at rather low temperatures, about 50 K. In organic molecules the charges distort the structure of the molecule itself, which means that the exciton is bound much more strongly and the device will work at room temperature. Room-temperature operation would be a significant practical advantage for any quantum information technology. To be fair, though, the organic materials have disadvantages, too – they are susceptible to being bleached by bright light.

The work is a collaboration between my colleagues in physics, Ali Adawi, Ashley Cadby, Daniele Sanvitto, Liam Connolly and Richard Dean, who are postdocs and grad students in the groups of David Lidzey, Mark Fox, and Maurice Skolnick. Device fabrication was done with the help of Wen-Chang Hung and Abbes Tahraoui, in Tony Cullis’s group in our Electronic and Electrical Engineering Department. It’s reported in the current edition of Advanced Materials here (subscription required).

Taking the high road to large scale solar power

In principle there’s more than enough sunlight falling on the earth to meet all our energy needs in a sustainable way, but the prospects for large scale solar energy are dimmed by a dilemma. We have very efficient solar cells made from conventional semiconductors, but they are too expensive and difficult to manufacture in very large areas to make a big dent in our energy needs. On the other hand, there are prospects for unconventional solar cells – Graetzel cells or polymer photovoltaics – which can perhaps be made cheaply in large areas, but whose efficiencies and lifetimes are too low. In an article in this month’s Nature Materials (abstract, subscription required for full article, see also this press release), Imperial College’s Keith Barnham suggests a way out of the dilemma.

The efficiencies of the best solar cells available today exceed 30%, and there is every reason to suppose that this figure can be substantially increased with more research. These solar cells are based, not on crystalline silicon, like standard solar cell modules, but on carefully nanostructured compound semiconductors like gallium arsenide (III-V semiconductors, in the jargon). By building up complex layered structures it is possible efficiently to harvest the energy of light of all wavelengths. The problem is that these solar cells are expensive to make, relying on sophisticated techniques for building up different semiconductor layers, like molecular beam epitaxy, and currently are generally only used for applications where cost doesn’t matter, such as on satellites. Barnham argues that the cost disadvantage can be overcome by combining these efficient solar cells with low-cost systems for concentrating sunlight – in his words “our answer to this particular problem is ‘Smart Windows’, which use small, transparent plastic lenses that track the sun and act as effective blinds for the direct sunlight, when combined with innovative light collectors and small 3rd-generation cells,” and he adds “Even in London a system like this would enable a typical office behind a south-facing wall to be electrically self-sufficient.”

Even with conventional technologies, Barnham calculates that if all roofs and south-facing walls were covered in solar cells this would represent three times the total generating capacity of the UK’s current nuclear program – that is, 36 GW. This represents a really substantial dent in the energy needs of the UK, and if we believe Barnham’s calculation that his system would deliver about three times as much energy as conventional solar cells, this represents pretty much a complete solution to our energy problems. What is absent from the article, though, is an estimate of the total production capacity that’s likely to be achievable; Barnham merely observes that the UK semiconductor industry has substantial spare capacity after the telecoms downturn. This is the missing calculation that needs to be done before we can accept Barnham’s optimism.
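The arithmetic behind those headline figures is easy to check. In the sketch below, the ~12 GW figure for installed UK nuclear capacity is my own assumption (roughly right for the mid-2000s); the two factors of three come from the article itself:

```python
# Sanity check of the headline capacity figures. The UK nuclear
# capacity is an assumed value; the factors of three are the
# article's claims, not independent estimates.

uk_nuclear_gw = 12.0                         # assumed installed UK nuclear capacity
conventional_pv_gw = 3 * uk_nuclear_gw       # "three times ... nuclear" = 36 GW
barnham_system_gw = 3 * conventional_pv_gw   # concentrator system: ~3x again

print(f"conventional PV on all roofs/south walls: {conventional_pv_gw:.0f} GW")
print(f"Barnham's concentrator system:            {barnham_system_gw:.0f} GW")
```

Note that these are peak capacities, not delivered energy – capacity factors for UK solar are low, which is one more reason the missing production-capacity calculation matters.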

Nanoscience in the European Research Area

Most research in Europe, in nanotechnology or any other field, is not funded by the European Union. Somewhere between 90% and 95% of research funding comes from national research agencies, working with their own procedures, to their own national priorities. This bothers some people, who see this as yet another example of the way in which Europe doesn’t get its act together and thus fails to live up to its potential. In research, the European Commission fears that, compared to rivals in the USA or the far east, European efforts suffer from fragmentation and duplication. Their solution is the concept of the “European Research Area”, in which different national funding agencies work to create a joint approach to funding, as well as doing what they can to ensure free movement of researchers and ideas across the continent. As part of this initiative, national research agencies have come together to form thematic networks. Nanoscience has such a network, and it is meeting this week in Amsterdam to finalise the details of a joint funding call on the theme of singly addressable nanoscale objects.

Another way of looking at the many different approaches used in funding nanoscience across Europe is that it gives us a laboratory, a kind of controlled experiment in science funding models. Yesterday’s meeting was devoted to a series of overviews of the national nanoscience landscape in each country. The contrasts were instructive; among the large countries one had the German approach, with major groups across the country being supported with really substantial infrastructure. The French had the most logical and comprehensive overall plan, while the talk describing the British effort (given by me) couldn’t entirely hide its ad-hoc and largely unplanned character. The presentations from smaller countries varied from really rather impressive displays of focused activities (from the Netherlands, Finland and Austria in particular), to more aspirational talks from countries like Portugal and Slovakia.

How do the European nations rank in nanoscience? The undisputed leader is clearly Germany, with France and the UK vying for second place. Readers of this blog will know that I’m suspicious of bibliometric measures, but some interesting data placed France second and the UK third by total numbers of nanoscience papers, with that order reversed when only highly cited papers were considered. But the efforts of the rich, smaller European countries are very significant; these are countries with high per-capita GDP which typically spend a higher proportion of GDP on research than larger countries. They combine this with a very focused and targeted approach to the science they support. The Netherlands, in particular, looks very strong indeed in those areas that it has chosen to concentrate on.

Another draft nano-taxonomy

It’s clear to most people that the term nanotechnology is almost impossibly broad, and that to be useful it needs to be broken up into subcategories. In the past I’ve distinguished between incremental nanotechnology, evolutionary nanotechnology and radical nanotechnology, on the basis of the degree of discontinuity with existing technologies. I’ve been thinking again about classifications, in the context of the EPSRC review of nanotechnology research in the UK; here one of the things we want to be able to do is to classify the research that’s currently going on. In this way it will be easier to identify gaps and weaknesses. Here’s an attempt at providing such a classification. This is based partly on the classification that EPSRC developed last time it reviewed its nanotechnology portfolio, 5 years ago, and it also takes into account the discussion we had at our first meeting and a resulting draft from the EPSRC program manager, but I’ve re-ordered it in what I think is a logical way and tried to provide generic definitions for the sub-headings. Most pieces of research would, of course, fit into more than one category.

Enabling science and technology
1. Nanofabrication
Methods for making materials, devices and structures with dimensions less than 100 nm.
2. Nanocharacterisation and nanometrology
Novel techniques for characterisation, measurement and process control for dimensions less than 100 nm.
3. Nano-modelling
Theoretical and numerical techniques for predicting and understanding the behaviour of systems and processes with dimensions less than 100 nm.
4. Properties of nanomaterials
Size-dependent properties of materials that are structured on dimensions of 100 nm or below.
Devices, systems and machines
5. Bionanotechnology
The use of nanotechnology to study biological processes at the nanoscale, and the incorporation of nanoscale systems and devices of biological origin in synthetic structures.
6. Nanomedicine
The use of nanotechnology for diagnosing and treating injuries and disease.
7. Functional nanotechnology devices and machines
Nanoscale materials, systems and devices designed to carry out optical, electronic, mechanical and magnetic functions.
8. Extreme and molecular nanotechnology
Functional devices, systems and machines that operate at, and are addressable at, the level of a single molecule, a single atom, or a single electron.
Nanotechnology, the economy, and society
9. Nanomanufacturing
Issues associated with the commercial-scale production of nanomaterials, nanodevices and nanosystems.
10. Nanodesign
The interaction of individuals and society with nanotechnology. The design of products based on nanotechnology that meet human needs.
11. Nanotoxicology and the environment
Distinctive toxicological properties of nanoscaled materials; the behaviour of nanoscaled materials, structures and devices in the environment.

All comments gratefully received!