Consider the H2O molecules in a water droplet. The molecules respond to intermolecular forces that tend to align them relative to their neighbors. At the same time, they undergo rapid thermal motion in the form of intramolecular vibrations and fluctuations of position and orientation within cages formed by neighboring molecules. Thermal motion introduces randomness, which counteracts the ordering effect of intermolecular forces. As the droplet is cooled, thermal motion decreases. Eventually, at a low enough temperature, randomness is suppressed; nucleation and the growth of an ice crystal follow. The phase transition is reversible: as the ice heats, water molecules intensify their motion and the ice melts. The equilibrium phase of water is the solid below 0°C (32°F) and the liquid above that temperature.
The balance between intermolecular forces and thermal motion in water is captured by the energy E and the entropy S. At each temperature T and for a given phase, fluid or solid, both quantities can be measured or calculated. They combine to form the free energy F = E – TS. From that formula follows the minimum free energy principle: the thermodynamically stable phase is the one that minimizes F. The phase behavior of a system at equilibrium is thus governed by an extremal principle.
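The phase selection described above can be sketched in a few lines of code. The E and S values below are illustrative placeholders chosen so the crossover lands near the freezing point, not measured data for real water.

```python
# Sketch of the minimum free energy principle: the stable phase at
# temperature T is the one with the lower free energy F = E - T*S.
# E and S values are hypothetical, not measured properties of water.

def free_energy(E, S, T):
    """Free energy F = E - T*S (E in J/mol, S in J/(mol*K), T in K)."""
    return E - T * S

# The solid has lower energy (stronger bonding); the liquid has higher
# entropy (more disorder). Low T favors the solid, high T the liquid.
phases = {
    "solid":  {"E": -50_000.0, "S": 40.0},
    "liquid": {"E": -44_000.0, "S": 62.0},
}

def stable_phase(T):
    """Return the phase that minimizes F at absolute temperature T."""
    return min(phases, key=lambda p: free_energy(phases[p]["E"],
                                                 phases[p]["S"], T))

# The crossover solves E_s - T*S_s = E_l - T*S_l, giving
# T* = (E_l - E_s) / (S_l - S_s) = 6000 / 22 ~ 273 K for these numbers.
```

Below the crossover temperature the energy term dominates and the solid wins; above it the entropy term dominates and the liquid wins.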
Static self-assembly at or near equilibrium leads to the formation of static structures, such as the ice crystal described above. Although static structures have many practical applications, most prominently in the synthesis of materials, there are limits: once equilibrium has been reached, no further evolution takes place. Dynamic self-assembly, by contrast, occurs far from equilibrium. To keep a system out of equilibrium, energy must be added continuously. That energy is absorbed, used to perform the desired function, and returned to the environment in a different form.
Life is a sophisticated example of dynamic self-assembly. Energy is provided to the system in the form of sunlight, or, more specifically, the dissipation of electromagnetic radiation. Sunlight is absorbed in the atmosphere, converted multiple times, and eventually emitted back into space as infrared radiation. The dissipation of sunlight involves the conversion of a smaller number of high-energy photons into a larger number of low-energy photons. As a result of this process, the entropy of the universe is increased. All dissipative processes follow the same general principle: energy originally contained in a few degrees of freedom—low entropy—is spread over many degrees of freedom—high entropy.
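The photon bookkeeping behind that dissipation can be made concrete with a back-of-the-envelope calculation. The wavelengths below are representative choices (green sunlight at 500 nm, thermal infrared at 10 µm), not precise spectral averages.

```python
# Why dissipating sunlight raises entropy: the same energy that arrives
# in a few high-energy visible photons leaves as many more low-energy
# infrared photons, spread over many more degrees of freedom.

h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s

def photon_energy(wavelength_m):
    """Energy of a single photon, E = h*c / lambda."""
    return h * c / wavelength_m

E_visible = photon_energy(500e-9)   # one green sunlight photon
E_infrared = photon_energy(10e-6)   # one thermal infrared photon

# Infrared photons needed to carry off one visible photon's energy:
ratio = E_visible / E_infrared      # = 10 um / 500 nm = 20
```

For these representative wavelengths, each absorbed visible photon is re-emitted as roughly 20 infrared photons: low-entropy input becomes high-entropy output.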
In 1977, Ilya Prigogine received the Nobel Prize in Chemistry for his work on dissipative structures.1 Prigogine postulated a principle of minimum entropy production, but it applies only near thermodynamic equilibrium and describes only transient behavior. Those are severe limitations. While the minimum free energy principle ensures that static self-assembly is well understood, there appears to be no general extremum principle that governs the evolution of systems far from equilibrium.2 Dynamic self-assembly cannot be predicted completely, if only because it is often chaotic.
Dynamic self-assembly often reaches a stationary state if energy input persists for long enough. A system is in a stationary state if, observing it at two different times, one cannot tell which observation came first. Stationary states can be very diverse; the simplest are either time-invariant or oscillatory. A house heated in winter will eventually reach a steady state of constant temperature once energy loss to the environment balances energy input.
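The heated house is simple enough to simulate directly. A minimal sketch, using Newton's law of cooling with a constant heat input and illustrative parameter values:

```python
# Heated house as a time-invariant stationary state:
#   dT/dt = P - k * (T - T_env)
# Heating at constant rate P is balanced by loss proportional to the
# temperature difference; the steady state is T* = T_env + P / k.

T_env = -5.0   # outdoor temperature, deg C
P = 2.0        # heating power, expressed as deg C of rise per hour
k = 0.1        # heat-loss rate constant, per hour

def simulate(T0, hours, dt=0.01):
    """Integrate the heating/cooling ODE with forward Euler."""
    T = T0
    for _ in range(int(hours / dt)):
        T += dt * (P - k * (T - T_env))
    return T

T_steady = T_env + P / k           # = -5 + 20 = 15 deg C
T_final = simulate(T0=-5.0, hours=200.0)
# After many time constants (1/k = 10 h), the simulated temperature
# has settled at the steady state, where input balances loss.
```

After 200 hours, 20 time constants, the transient has decayed completely and the indoor temperature sits at the predicted 15 °C regardless of its starting value.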
The Belousov–Zhabotinsky (BZ) reaction is a classic example of an oscillatory stationary state. Several chemical reactions occur simultaneously in solution, causing the solution to oscillate multiple times between two colors.3 Some of the BZ reactions are autocatalytic; others catalyze each other.4 Once energy input ceases, the stationary state falls back to equilibrium: in the case of the BZ reaction, the oscillation dies down once all the chemical energy stored in the reactants is used up.
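The emergence of sustained oscillation from coupled autocatalytic reactions can be illustrated with the Brusselator, a minimal two-species model of an oscillating reaction network. It is a generic textbook model, not the actual BZ mechanism, which involves many more species.

```python
# Brusselator: a minimal autocatalytic reaction model.
#   dx/dt = a - (b + 1) * x + x^2 * y
#   dy/dt = b * x - x^2 * y
# For b > 1 + a**2 the fixed point is unstable and the concentrations
# settle into a limit cycle -- sustained chemical oscillation.

a, b = 1.0, 3.0     # feed parameters; b = 3 > 1 + a**2 = 2, so it oscillates

def simulate(t_end=100.0, dt=0.001):
    """Forward-Euler integration; returns the trajectory of x."""
    x, y = 1.0, 1.0
    xs = []
    for _ in range(int(t_end / dt)):
        dx = a - (b + 1.0) * x + x * x * y
        dy = b * x - x * x * y
        x += dt * dx
        y += dt * dy
        xs.append(x)
    return xs

xs = simulate()
late = xs[len(xs) // 2:]            # discard the initial transient
amplitude = max(late) - min(late)   # a large swing signals a limit cycle
```

Long after the transient, the concentration of x still swings between high and low values rather than settling down, the same qualitative behavior as the color oscillations of the BZ reaction.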
Growing evidence stresses the importance of dissipation in controlling self-assembly far from equilibrium. In their paper, Ghaith Makey et al. investigated groups of particles across sizes and materials—CdTe (cadmium telluride) quantum dots, polystyrene colloids, gram-positive cocci, gram-negative bacilli, yeast cells, and mammalian cells—suspended in a fluid.5 The experimental setup extends prior work by the authors at Bilkent University’s Simply Complex Lab, in which a femtosecond laser heats the fluid locally with ultrafast pulses.6
The paper describes two main results. The first concerns the degree of control over particle activity. The experiment introduces energy into the motion of particles with high spatial and temporal precision. By controlling where and when the particles aggregate, it becomes possible to write “Hi!” and “Bye!” as well as to form triangular, rectangular, circular, and star shapes. Such manipulation is not only fascinating but also practically relevant for systems as diverse as cells, microorganisms, and fluorescent quantum dots. Because the particles are dragged along with the fluid, their size and material composition matter little for the manipulation to work.
The combination of bottom-up and top-down manipulation demonstrated by Makey et al. cleverly exploits the nonlinear multiphoton absorption of ultrafast pulses. By contrast, optical tweezers, which can hold and move microscopic objects, use the radiation pressure of a tightly focused continuous-wave laser.7 Optical tweezers have found applications in biology, medicine, nanoengineering, and nanochemistry. In 2018, Arthur Ashkin was awarded the Nobel Prize in Physics for the invention. The experimental setup employed by Makey et al. does not apply forces to individual particles directly but instead drives part of the system out of equilibrium in a controlled manner.
The experiments described in the paper are limited to particles constrained between two glass plates in a quasi-two-dimensional system. Further confinement is accomplished in some cases by means of a cavitation bubble that nucleates spontaneously due to localized boiling of the solution. Determining whether a similar degree of control of convective fluid flow is possible in a three-dimensional setup requires further experimentation.
The second result of the work by Makey et al. concerns universal growth and fluctuation statistics. The authors measured how the filling ratio, the fraction of a selected imaging area covered by particles or organisms, evolved during the growth of aggregates. They found that the filling ratio follows the same sigmoidal curve regardless of the type of particle used. They also showed that fluctuations of the growing interface exhibit nontrivial Tracy–Widom (TW) statistics, a marked deviation from the conventional Gaussian statistics that the central limit theorem predicts for systems approaching equilibrium. TW statistics are increasingly observed in physically diverse systems.
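The sigmoidal shape of such a growth curve is easy to visualize. The logistic function below is a generic choice of sigmoid for illustration; the measured curve in the paper need not be exactly logistic, and the parameters are arbitrary.

```python
# Sketch of a sigmoidal filling-ratio curve using the logistic function:
# growth starts slowly, accelerates, then saturates as the area fills up.

import math

def filling_ratio(t, t_half=10.0, rate=0.8):
    """Logistic sigmoid: 0 at early times, 1/2 at t_half, saturating at 1."""
    return 1.0 / (1.0 + math.exp(-rate * (t - t_half)))

curve = [filling_ratio(t) for t in range(0, 21)]

# Characteristic sigmoid features:
# - starts near 0 and saturates near 1
# - passes through 1/2 exactly at t = t_half
# - monotonically increasing throughout
```

The universality claim is that, once time and amplitude are rescaled, all the particle types collapse onto one such curve.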
The existence of universality is an interesting theoretical problem. Universal scaling is common in static self-assembly near phase transitions, which has motivated the search for universal behavior in systems far from equilibrium. Because of the diversity of the systems studied by Makey et al., their data analysis presents some challenges, and some of the results are difficult to interpret. The filling ratio is not a convenient measure of growth because it does not increase linearly, even in static self-assembly. The extraction of TW statistics relies on high-pass filtering in the form of a temporal span analysis. Agreement of the data up to the eighth moment of the distribution is certainly a stringent test. Even so, the analysis required choosing the temporal span parameter within a certain window and was performed for only a single particle type.
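A classic setting where TW statistics can be seen numerically, unrelated to the growth experiment itself, is the largest eigenvalue of a large random symmetric (GOE) matrix. The sketch below checks only the well-known leading-order scaling; the TW distribution describes the fluctuations around it.

```python
# The largest eigenvalue of an n x n GOE random matrix concentrates near
# 2*sqrt(n); its O(n**(-1/6)) fluctuations, suitably rescaled, follow the
# Tracy-Widom distribution. Here we verify the leading-order scaling.

import numpy as np

rng = np.random.default_rng(0)

def largest_eigenvalue(n):
    """Largest eigenvalue of a GOE matrix with off-diagonal variance 1."""
    a = rng.standard_normal((n, n))
    h = (a + a.T) / np.sqrt(2.0)        # symmetrize
    return np.linalg.eigvalsh(h)[-1]    # eigvalsh returns ascending order

n = 200
samples = [largest_eigenvalue(n) for _ in range(30)]
mean_scaled = np.mean(samples) / np.sqrt(n)   # should be close to 2
```

That the same TW fluctuation law shows up both at random-matrix spectral edges and at growing interfaces is exactly the kind of universality at stake in the paper.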
How can researchers purposefully create and control more diverse and complex behavior? In recent years, chemists have made tremendous progress in synthesizing nanoparticle building blocks that can be assembled into intricate functional materials.8 Self-assembly far from equilibrium has also been advancing rapidly in macromolecular chemistry.9 Such processes bear a close resemblance to biological systems and are therefore an excellent starting point.
Chemical reaction networks could turn out to be ideal self-assembling systems. Five guiding principles have been proposed as key design blueprints for these networks: molecular recognition, maintenance of nonequilibrium conditions, feedback loops, reaction–diffusion processes, and, finally, compartmentalization and communication.10 These principles are employed repeatedly in biological systems to build and operate nanoscale machinery far from equilibrium. A complicating factor here is that highly interconnected systems, such as chemical reaction networks, are almost impossible to understand from first principles.
The task of creating self-assembling systems far from equilibrium can be usefully compared with another field in which many degrees of freedom are coupled nonlinearly. Neural networks are essentially black boxes that can be trained efficiently, yet exactly how they encode information is not well understood. Nonetheless, neural networks may eventually inspire techniques to train a chemical reaction network, whether through artificial evolution or simulated annealing, to perform a desired function or form desired structures.
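Simulated annealing, one of the training strategies mentioned above, can be sketched on a toy problem. The "network" here is just a hypothetical vector of rate parameters scored by a made-up cost function; a real application would score simulated network behavior against a target instead.

```python
# Simulated annealing: random perturbations are always accepted when they
# improve the cost, and accepted with probability exp(-delta/T) when they
# worsen it, with the temperature T lowered over time. The cost function
# below is a placeholder, not a model of any real reaction network.

import math
import random

random.seed(0)

def cost(params):
    """Toy stand-in for 'distance of network behavior from the target'."""
    return sum((p - 0.7) ** 2 for p in params)

def anneal(params, steps=5000, t_start=1.0, t_end=1e-3):
    best = list(params)
    current, current_cost = list(params), cost(params)
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)  # geometric cooling
        candidate = [p + random.gauss(0.0, 0.1) for p in current]
        delta = cost(candidate) - current_cost
        if delta < 0 or random.random() < math.exp(-delta / t):
            current, current_cost = candidate, current_cost + delta
            if current_cost < cost(best):
                best = list(current)
    return best

initial = [0.0, 1.0, 2.0, -1.0]
tuned = anneal(initial)   # parameters after annealing, near the optimum
```

The early high-temperature phase lets the search escape poor local minima; the late low-temperature phase refines the best solution found, a plausible template for tuning interconnected systems that resist first-principles analysis.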