Nanotechnology gets complex

The theme of my book Soft Machines is that the nanomachines of biology operate under quite different design principles from those we are familiar with at the macroscale. These design principles exploit the different physics of the nanoworld, rather than trying to engineer around it. The combination of Brownian motion – the relentless shaking and jostling that’s ubiquitous in the nanoworld, at least at normal temperatures – and strong surface forces is exploited in the principle of self-assembly. Brownian motion and the floppiness of small-scale structures are exploited in the principle of molecular shape change, which is the way our muscles work. We are well on our way to exploiting both these principles in synthetic nanotechnology. But there’s another design principle that is extensively used in Nature, and that nanotechnologists have not yet exploited at all. This is the idea of chemical computing – processing information by using individual molecules as logic gates, and transmitting messages through space using the random motion of messenger molecules, driven to diffuse by Brownian motion. These are the mechanisms that allow bacteria to swim towards food and away from toxins, but they also underlie the intricate way in which cells in higher organisms like mammals interact and differentiate.
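A back-of-the-envelope calculation shows why message-passing by diffusion makes sense at the nanoscale and nowhere else. The sketch below uses illustrative numbers (a small messenger molecule in water has a diffusion coefficient of roughly 10⁻⁹ m²/s); the point is just the scaling:

```python
def diffusion_time(length_m, diff_coeff_m2_s, dims=3):
    """Mean time for a molecule to diffuse a distance L in `dims` dimensions,
    using the random-walk result <r^2> = 2 * dims * D * t."""
    return length_m ** 2 / (2 * dims * diff_coeff_m2_s)

# Illustrative numbers: a small messenger molecule in water, D ~ 1e-9 m^2/s.
# Crossing a 1 micrometre bacterium takes a fraction of a millisecond...
t_cell = diffusion_time(1e-6, 1e-9)
# ...but crossing 1 mm takes minutes. The time grows as L^2, so diffusive
# signalling is fast at the nanoscale and hopeless at the macroscale.
t_macro = diffusion_time(1e-3, 1e-9)
```

Since the diffusion time scales as the square of the distance, shrinking a system by a factor of a thousand speeds up diffusive communication by a factor of a million.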

One argument that holders of a mechanical conception of radical nanotechnology sometimes use against trying to copy these control mechanisms is that they are simply too complicated to deal with. But there’s an important distinction to make here. These control systems and signalling networks aren’t just complicated – they’re complex. Recent theory of the statistical mechanics of these multiply connected, evolving networks is beginning to yield fascinating insights (see, for example, Albert-László Barabási’s website). It seems likely that these biological signalling and control networks have some generic features in common with other complex networks, such as the internet, and even, perhaps, free market economies. Rather than being the hopelessly complicated result of billions of years of aimless evolutionary accretion, we should perhaps think of these networks as being optimally designed for robustness in the noisy and unpredictable nanoscale environment.
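The kind of growing network Barabási studies can be sketched in a few lines. The toy model below (my own minimal version, with made-up parameters) grows a network by preferential attachment – new nodes link preferentially to already well-connected ones – and produces the hallmark of such networks: a few highly connected hubs alongside a majority of sparsely connected nodes:

```python
import random

def barabasi_albert(n, m, seed=0):
    """Grow a network by preferential attachment: each new node links to m
    existing nodes, chosen with probability proportional to their current
    degree. Returns the network as a dict of adjacency sets."""
    rng = random.Random(seed)
    # Start from a small fully connected core of m + 1 nodes.
    adj = {i: set(range(m + 1)) - {i} for i in range(m + 1)}
    # 'targets' lists each node once per edge end, so sampling uniformly
    # from it is the same as sampling proportional to degree.
    targets = [i for i in adj for _ in adj[i]]
    for new in range(m + 1, n):
        chosen = set()
        while len(chosen) < m:
            chosen.add(rng.choice(targets))
        adj[new] = chosen
        for t in chosen:
            adj[t].add(new)
        targets.extend(chosen)
        targets.extend([new] * m)
    return adj

net = barabasi_albert(2000, 2)
degrees = sorted((len(neighbours) for neighbours in net.values()), reverse=True)
# Heavy-tailed degree distribution: the biggest hub has far more
# connections than the typical (median) node.
```

The robustness argument in the paragraph above is tied to exactly this hub-dominated structure: such networks keep functioning when randomly chosen nodes fail, because a random failure almost always hits an unimportant node.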

It seems to me that if we are going to have nanoscale systems of any kind of complexity, we are going to have to embrace these principles. Maintaining rigid, central control of large-scale systems always seems superficially attractive, but such control systems are often brittle, and fail to adapt to unpredictability, change and noise. The ubiquity of noise in the nanoscale world offers a strong argument for using complex, evolved control systems. But we still lack some essential tools for doing this. In particular, biological signalling relies on allostery. This principle underlies the operation of the basic logic gates in chemical computing; the idea is that when a messenger molecule binds to a protein, it subtly changes the shape of the protein and so affects its ability to carry out a chemical operation. Currently, synthetic analogues for this crucial function are very thin on the ground (see this abstract for something that seems to be going the right way). It would be good to see more effort put into this difficult, but exciting, direction.
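To make the logic-gate idea concrete, here is a deliberately simplified model of my own (a hypothetical construction, not a description of any real protein): suppose a protein is only in its catalytically active shape when two different messenger molecules are bound at once, with each binding site filling according to a simple binding isotherm. The result behaves as a chemical AND gate:

```python
def occupancy(conc, kd):
    """Equilibrium fraction of time a binding site is occupied
    (simple Langmuir binding isotherm): p = [L] / ([L] + Kd)."""
    return conc / (conc + kd)

def and_gate_activity(conc_a, conc_b, kd=1.0):
    """Toy allosteric AND gate: the protein is assumed to switch into its
    active conformation only when both effector sites are occupied, so the
    active fraction is the product of the two occupancies.
    (A hypothetical model, not any specific protein.)"""
    return occupancy(conc_a, kd) * occupancy(conc_b, kd)

# Reading concentrations well above Kd as logic 1 and well below as logic 0,
# the activity is close to 1 only when both inputs are high.
high, low = 100.0, 0.01
for a in (low, high):
    for b in (low, high):
        print(a > 1.0, b > 1.0, round(and_gate_activity(a, b), 3))
```

Real allosteric logic is richer than this – binding at one site typically shifts the affinity of the other, giving sharper, more switch-like responses – but even this crude product-of-occupancies model shows how a shape-changing molecule can compute.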

5 thoughts on “Nanotechnology gets complex”

  1. One of the interesting differences between your approach and that of the mechanosynthesis crowd is that you talk about strong surface forces, while Drexler emphasizes the absence of friction as a significant factor in a well-designed nanomachine. His bearings and other sliding interfaces are supposedly nearly frictionless.

  2. Yes, indeed, this is an interesting issue. Actually, we know a lot more about nanoscale friction than we did when Nanosystems was written. It turns out that the mechanism that dominates observed wearless friction between clean surfaces – an atomic stick-slip process described by the Tomlinson model – was not included in the list of possible dissipation mechanisms in Nanosystems. As it happens, I had a bit of a discussion about this with Ralph Merkle yesterday, in which he maintained that this wasn’t relevant, because “well designed” by definition means that it must be designed so this mechanism isn’t operative. Well, maybe, but if you don’t consider it as a possibility you don’t know just how severe the constraints need to be (light loads, incommensurate surfaces) to make sure it doesn’t happen. The one place I’ve found where you can make a direct comparison between a Nanosystems dissipation estimate for a “well designed” bearing and the results of a serious MD simulation shows that the Nanosystems estimate is substantially too low.

  3. “Rather than being the hopelessly complicated result of billions of years of aimless evolutionary accretion, we should perhaps think of these networks as being optimally designed for robustness in the noisy and unpredictable nanoscale environment.”

    You’re not joining the ranks of the “Intelligent Design” crowd, are you, Dr Jones? I can already see the headline: “Nano-scientist supports Intelligent Design and rejects Evolution”.

    Now son,
    If you want to repent this heresy and rejoin The Community,
    you will need to do a slight penance,
    First, you must write a blog post on how the process of evolution is being used to solve problems.
    Second, you must write an essay, “Darwin and D’Arcy Thompson: understanding biology at the nano-scale”.

    I am sure (with time) we will remember this as a simple failure to communicate accurately, a writer carried away with a metaphor not an endorsement of non-science over science.

    Vaya con Dios

    Father Charlie McDarwin
    Office of the Inquisition
    Woods Hole, Mass

  4. Gosh, that will teach me to be more careful in my choice of words! In the hope of lenient treatment, I refer you to Soft Machines (the book) which has a great deal about how effective evolution is as a tool for hard optimisation problems in multidimensional spaces.

  5. I have been looking at ‘Drexlerian’ nanotech for some time, and I have come to the conclusion that there may be a halfway house between the ‘statistical optimisation’ crowd and the ‘mechanical synthesis’ crowd.

    I believe that good old-fashioned cybernetics (Wiener analysis) could do the trick! Essentially, rather than mechanical devices at the nanoscale (which would most probably be too sticky to move around at room temperature), one could have molecular blobs which are bathed in a sea of communication molecules.

    The divergence from nature, however, is that these blobs could be put together to make machine-like structures!

    Looking at extremely simple 1-D linear models, it appears that the main problem with this idea is that the blobs would interact nonlinearly over large transport volumes. Tantalizing, though.

    An amateur mathematician
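The atomic stick-slip mechanism discussed in comment 2 – the (Prandtl-)Tomlinson model – is simple enough to sketch numerically. In the one-dimensional version, a tip is dragged through a sinusoidal surface potential by a spring whose support moves at constant speed. The sketch below uses purely illustrative parameters, not values for any real surface, and shows the qualitative point: strong corrugation relative to the spring stiffness gives dissipative stick-slip motion, while weak corrugation gives smooth, nearly frictionless (“superlubric”) sliding:

```python
import math

def tomlinson_friction(u0, k, a=1.0, v=0.01, gamma=1.0, dt=0.01, steps=200000):
    """Overdamped 1-D Prandtl-Tomlinson model: a tip is dragged through a
    sinusoidal surface potential of amplitude u0 and period a by a spring
    of stiffness k whose support moves at constant speed v. Returns the
    time-averaged lateral (friction) force transmitted through the spring.
    All parameters are illustrative, not fitted to any real surface."""
    x = 0.0
    total = 0.0
    for i in range(steps):
        support = v * i * dt
        spring_force = k * (support - x)
        substrate_force = -(2.0 * math.pi * u0 / a) * math.sin(2.0 * math.pi * x / a)
        x += dt * (spring_force + substrate_force) / gamma  # overdamped Euler step
        total += spring_force
    return total / steps

# Strong corrugation, soft spring: atomic stick-slip, large mean friction.
f_stickslip = tomlinson_friction(u0=1.0, k=1.0)
# Weak corrugation, stiff spring: smooth, nearly frictionless sliding.
f_smooth = tomlinson_friction(u0=0.05, k=5.0)
```

The crossover between the two regimes is controlled by the ratio of the corrugation stiffness to the spring stiffness; whether the light-load, incommensurate-surface conditions needed for the frictionless regime can be guaranteed in a “well designed” bearing is exactly the point at issue in the exchange above.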

Comments are closed.