There’s a good review in Nature Reviews Cancer (with free access) about the ways in which nanotechnology could help the fight against cancer – Cancer Nanotechnology: Opportunities and Challenges. The article, by Ohio State University’s Mauro Ferrari, concentrates on two themes – how nanotechnologies can help diagnose and monitor cancer, and how they could lead to more effective targeting and delivery of anti-cancer agents to tumours.
The extent to which we urgently need better ways of wrapping up therapeutic molecules and getting them safely to their targets is highlighted by a striking figure quoted in the article: if you inject monoclonal antibodies and monitor how many of these molecules reach a target within an organ, the fraction is less than 0.01%. The rest are wasted, which is bad news if these molecules are expensive and difficult to make, and even worse news if, like many anti-cancer drugs, they are highly toxic. How can we make sure that every one of these drug molecules gets to where it is needed? One answer is to stuff them into a nanovector, a nanoscale particle that protects the enclosed drug molecules and delivers them to where they are needed. The simplest example of this approach uses a liposome – a bag made from a lipid bilayer. Liposome-encapsulated anti-cancer drugs are now used clinically in the treatment of Kaposi’s sarcoma and breast and ovarian cancers.

But lots of work remains to make nanovectors that are more robust, more resistant to non-specific protein adsorption, and, above all, specifically targeted to the cells they need to reach. Such specific targeting could be achieved by coating the nanovectors with antibodies whose molecular recognition properties are specific for groups on the surface of the cancer cells. The article cites one cautionary tale illustrating that this is all more complicated than it looks – a recent simulation suggests that targeting a drug precisely to a tumour can actually make the situation worse, by causing the tumour to break up. It may be necessary not just to target the drug carriers to a tumour, but to make sure that the spatial distribution of the drug through the tumour is right.
The future will probably see complex nanovectors engineered to perform multiple functions: protecting the drugs, getting them through all the barriers and pitfalls that lie between the point at which the drug is administered and the part of the body where it is needed, and releasing them at their target. The recently FDA-approved breast cancer drug Abraxane is a step in the right direction; one can think of it as a nanovector that combines two functions. The core of the nanovector consists of a nanoparticulate form of the drug itself; dispersing it so finely dispenses with the need for toxic solvents. And bound to the drug nanoparticle are protein molecules which help the nanoparticles get across the cells that line blood vessels. It’s clear that as more and more functions are designed into nanovectors, there’s huge scope for increases in drug effectiveness – increases that could amount to orders of magnitude.
Nanoscale Science and Technology is a new, graduate-level interdisciplinary textbook which has just been published by Wiley. It’s based on the Masters course in Nanoscale Science and Technology that we run jointly between the Universities of Leeds and Sheffield.
The book covers most aspects of modern nanoscale science and technology. It ranges from “hard” nanotechnologies, like the semiconductor nanotechnologies that underlie applications such as quantum dot lasers, and applications of nanomagnetism like giant magnetoresistance read-heads, via semiconducting polymers and molecular electronics, through to “soft” nanotechnologies such as self-assembling systems and bio-nanotechnology. I co-wrote a couple of chapters, but the heaviest work was done by my colleagues Mark Geoghegan, at Sheffield, and Ian Hamley and Rob Kelsall at Leeds, who, as editors, have done a great job of knitting together the contributions of a number of authors with different backgrounds to make a coherent whole.
There’s a nice piece in Slate by Paul Boutin reporting on his trip round the Hewlett-Packard labs in Palo Alto. The opening stresses the evolutionary, short-term character of the research going on there: these projects only get funded if they are going to make a fast return for the company, usually within five years. The first projects he mentions concern RFID (radio frequency identification), and these are discussed in terms of Walmart, supply chains and keeping track of your pallets. I can relate to this because my wife used to be a production planner. She used to wake up in the night worrying about whether there were enough plastic overcaps in the warehouse to pack the next week’s production, but she knew that the only way to find out for sure, despite all their smart SAP systems, was to walk down to the warehouse and look. But despite these mundane immediate applications, the technologies that will underlie RFID are also the ones with such uncomfortable implications for a universal surveillance society.
The article moves on to talk about HP’s widely reported recent development of crossbar latches as a key component for molecular electronic logic circuits (see for example this BBC report, complete with a good commentary from Soft Machines’ frequent visitor, Philip Moriarty). The author rightly highlights the need to develop new, defect-tolerant computer architectures if these developments in molecular electronics are to be converted into useful products. This nicely illustrates the point I made below: in nanotechnology you may well need to develop systems architectures that accommodate the physical realities of the nanoscale, rather than designing the architecture first and hoping that you’ll be able to find low-level operations that will suit your preconceived notions.
Nanoscientists would love to have an instrument that would allow them to see what they were doing while they picked individual molecules up and moved them around. At the moment researchers can manipulate individual molecules with scanning probe microscopy techniques, while high resolution transmission electron microscopy allows structures to be visualised at better than atomic resolution. A major grant has recently been awarded to a team of UK scientists to combine these technologies, developing instrumentation that integrates nanoscale actuators with high resolution electron microscopy. The result should be a new tool for manipulating single atoms and molecules while they are being imaged, with atomic resolution, in three dimensions.
The £2.3 million ($4.4 million) grant comes from the UK government’s Basic Technology Programme. It is led by Beverley Inkson and Guenter Moebus here at the University of Sheffield, and also involves the nanoscience group at the University of Nottingham.
The image shows a simulated interaction between an electron beam and a surface, with the electron beam drawn to scale against the atoms making up the surface. The immediate uses foreseen for this technology are mostly as a nanoscale research tool, with applications to research in nanoscale electronic, magnetic and electromechanical devices, the manipulation of fullerenes and nanoparticles, nanoscale friction and wear, biomaterials, and systems for carrying out quantum information processing.
More details can be found in this one-page PDF.