The Edge of Stability
- Ethan Smith
- Jun 4
- 6 min read

Many facets of our universe seem to gravitate towards unstable, chaotic equilibria.
One such example is criticality, the threshold at which a system balances on the knife's edge between two distinct phases. The critical point is a narrow, fragile state: think of the limbo at which water transitions from a solid to a liquid. At criticality, the substance exists as neither solid nor liquid, and even minuscule changes can tip the entire system decisively in either direction.
Despite this fragility, many lifeforms have evolved to exploit the fickleness of this equilibrium, playing right up at the edge of stability. In doing so, they can harness small inputs that cascade into large-scale, coordinated responses across the entire system.
The Ising model provides a toy example of this phenomenon, describing a system of particles with magnetic spins pointing either up or down. The system tries to minimize its energy, which it does by aligning each spin with its neighbors. Working against this, environmental noise causes particles to randomly flip their spins, and the odds of a flip increase with the temperature of the system (not always a physical temperature; more generally, its vulnerability to external randomness).

At low temperatures, the system reliably aligns all of the spins. Random flips are quickly undone, because the pull toward matching neighboring spins restores the flipped particle to its surroundings.
At higher temperatures, the system is more vulnerable to thermal noise: random flips occur so frequently that they overwhelm any attempt at organization, leaving the system in perpetual disorder.
At the critical temperature, something extraordinary emerges. The system becomes "scale-invariant," spontaneously generating intricate, self-similar patterns that repeat at every scale of observation. Zoom in on any region and you'll find the same organizational motifs. This fractal-like structure enables long-range correlations. The influence of each particle extends infinitely throughout the system. A single spin flip in one corner can ripple across the entire lattice, demonstrating how microscopic events at criticality can orchestrate global transformations across the whole system.
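To make the three temperature regimes concrete, here is a minimal sketch of the 2D Ising model with Metropolis updates. The lattice size, temperatures, and sweep counts are illustrative choices of mine, not anything canonical.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 32

def sweep(spins, temperature):
    """One Metropolis sweep over an N x N lattice with periodic boundaries."""
    for _ in range(N * N):
        i, j = rng.integers(N, size=2)
        # Sum of the four neighbors, wrapping around at the edges.
        nb = (spins[(i + 1) % N, j] + spins[(i - 1) % N, j]
              + spins[i, (j + 1) % N] + spins[i, (j - 1) % N])
        dE = 2 * spins[i, j] * nb                      # energy cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < np.exp(-dE / temperature):
            spins[i, j] *= -1
    return spins

# Start fully aligned: low temperature preserves the order, high temperature destroys it,
# and the critical temperature (~2.27 in these units) sits in between with large fluctuations.
for T in (1.0, 2.27, 4.0):
    spins = np.ones((N, N), dtype=int)
    for _ in range(100):
        spins = sweep(spins, T)
    print(T, abs(spins.mean()))                        # |magnetization| per site
```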

One such piece of nature that operates at criticality is the brain. In a healthy state, the brain balances excitatory and inhibitory signals to maintain criticality. It embodies neither the nearly static order of low temperatures nor the complete disorder of high temperatures, but rather a semi-ordered critical state, allowing a single neuron's firing to trigger neural avalanches that affect many downstream neurons. The word avalanche draws an analogy to a small snowball rolling down a hill, or a small rock shifting on a mountain, setting off a chain reaction that leads to a massive avalanche.
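A common toy picture of these avalanches (my own illustration, not something taken from the neuroscience literature the post alludes to) is a branching process: each firing neuron activates a random number of downstream neurons, and the average number it activates, the branching ratio, plays the role of temperature. A ratio of one is the critical point.

```python
import numpy as np

rng = np.random.default_rng(0)

def avalanche_size(branching_ratio, max_size=10_000):
    """Grow one avalanche: each active unit spawns a Poisson number of new active units."""
    active, total = 1, 1
    while active and total < max_size:
        active = rng.poisson(branching_ratio * active)   # offspring of the current generation
        total += active
    return total

for ratio in (0.8, 1.0, 1.2):
    sizes = [avalanche_size(ratio) for _ in range(2000)]
    # Subcritical avalanches die out quickly, supercritical ones routinely hit the cap,
    # and the critical case produces a heavy-tailed mix of tiny and very large events.
    print(ratio, np.mean(sizes), np.max(sizes))
```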
This is not exclusive to biological or physical systems; it also pertains to our symbolic and mathematical models. Namely, neural networks exhibit a similar preference for living at the edge of stability.
One example is the optimization process itself. A paper on the "edge of stability" found that neural network training gradually increases the sharpness of the loss landscape, measured by the largest eigenvalue of the loss Hessian, until it reaches a threshold beyond which any further sharpening would destabilize training and cause gradient norms to explode. What makes this phenomenon special is that the threshold depends on the learning rate, yet the climb toward it happens regardless of what the learning rate is set to.
Smaller learning rates have a higher threshold for destabilization (for plain gradient descent, roughly 2 divided by the learning rate), while larger learning rates destabilize sooner. Regardless of which is picked, the sharpness still increases until it meets this threshold.
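A tiny sketch of why such a threshold exists, using a one-dimensional quadratic loss whose curvature stands in for sharpness; the learning rate and curvature values below are purely illustrative.

```python
import numpy as np

def run_gd(sharpness, lr, steps=50, x0=1.0):
    """Gradient descent on f(x) = sharpness/2 * x^2, whose gradient is sharpness * x."""
    x = x0
    for _ in range(steps):
        x -= lr * sharpness * x
    return abs(x)

lr = 0.1                               # stability threshold is 2 / lr = 20
for s in (5.0, 19.0, 21.0):
    print(s, run_gd(s, lr))            # sharpness below 20 shrinks toward 0; above 20 it blows up
```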

Another example is self-supervised representation learning. Naive representation learning runs the risk of collapsing to a trivial solution: regardless of what you feed into the encoder, all output embeddings are identical. The loss drops to zero, indicating that training was "successful," but nothing was learned. Every SSL method addresses this differently, whether by enforcing reconstruction, using teacher networks that are momentum copies of the original network, or using contrastive objectives with negative pairs to push points away from each other, among other tricks. Striking this balance to maintain variation in the representations is key to success.
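As a rough sketch of one of those tricks, here is an InfoNCE-style contrastive loss in NumPy. This is a generic illustration rather than any particular paper's implementation; the batch size, embedding dimension, and temperature are arbitrary.

```python
import numpy as np

def info_nce(z1, z2, temperature=0.1):
    """z1, z2: (batch, dim) embeddings of two augmented views of the same batch."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)   # unit-normalize embeddings
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature                       # similarity of every cross-view pair
    logits -= logits.max(axis=1, keepdims=True)            # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))                    # matching pairs are the positives

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))
print(info_nce(z, z + 0.01 * rng.normal(size=(8, 16))))    # low loss: views agree, embeddings vary
# If the encoder collapses (all embeddings identical), every row of logits is constant,
# so the loss gets stuck at log(batch_size) instead of reaching zero.
print(info_nce(np.ones((8, 16)), np.ones((8, 16))))        # collapsed case: loss = log(8)
```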
Then there's the well-known GAN equilibrium. In GAN training, we run the risk of the generator outpacing the discriminator, leading to mode collapse, or the discriminator outpacing the generator, which kills the useful feedback signal and halts learning. Effective learning occurs in the sweet spot where the generator and discriminator oppose each other roughly equally.
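For concreteness, here is a minimal sketch of the alternating updates on a toy one-dimensional Gaussian dataset using PyTorch; the architectures, learning rates, and data distribution are illustrative stand-ins, not a recipe for stable GAN training.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

G = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))   # generator: noise -> sample
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))   # discriminator: sample -> logit
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(2000):
    real = torch.randn(64, 1) * 0.5 + 2.0                 # "real" data ~ N(2, 0.5)
    noise = torch.randn(64, 1)

    # Discriminator step: push real logits up, fake logits down.
    fake = G(noise).detach()
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: try to fool the discriminator into labeling fakes as real.
    fake = G(noise)
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

# If the two stay roughly balanced, generated samples should drift toward the real mean of 2.0.
print(G(torch.randn(256, 1)).mean().item())
```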
It can also be seen in systems of humans and social phenomena modeled with game theory. One example that comes to mind is capitalism. Capitalism is a competition over resources and profit: while all entities act in self-interest, the competition can benefit consumers as businesses fight tooth and nail to attract customers. However, it is an imperfect competition. If there were perfect competition, with every profitable niche already occupied, the system would remain static; there would be no new companies and no room for growth. On the other hand, if one company managed to outcompete all others and drive them to extinction, leaving a single monolithic firm, the system would also have no space for competition. It is the imperfections, as well as government regulations, that leave windows of opportunity for substantial impact within the system.

Democracies aim for a similar balance, one that requires a lot of work to maintain but is seemingly among the most robust governmental systems when all goes right. Healthy democracies sit between authoritarian control (where dissent is crushed) and anarchic fragmentation (where no collective decisions can be made). The tension between competing political forces creates responsiveness to public needs and changing times.
There's an argument to be made for our scientific fields of study as well. As Thomas Kuhn observed, scientific fields operate most productively when established theories face just enough anomalies to drive innovation without completely undermining existing knowledge frameworks. In other words, we should always have open questions that lead to new discoveries, even if it can be unsettling (though possibly rewarding) when conflicting information arrives and forces us to rethink everything we know.
Ecosystems are no stranger to unstable dynamics either. A few famous examples come to mind. Predator-prey dynamics rarely settle into static population sizes or constant growth. Instead, the two populations oscillate and self-regulate. A surplus of predators causes a rapid decline in the prey population due to hunting, which in turn causes predators to die off for lack of food. The decline in predators then allows the prey population to recover, and the whole process repeats, tracing out a limit cycle. The simplified mathematical model ignores a lot of the real-life variables at play, but it illustrates the phenomenon well overall.
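The classic textbook toy for these oscillations is the Lotka-Volterra system; below is a minimal sketch of it with crude Euler integration, good enough for a qualitative picture. The parameter values are purely illustrative rather than fit to any real ecosystem.

```python
import numpy as np

alpha, beta, delta, gamma = 1.0, 0.1, 0.075, 1.5   # prey growth, predation, conversion, predator death
prey, pred = 10.0, 5.0
dt, steps = 0.01, 5000
trajectory = []

for _ in range(steps):
    d_prey = alpha * prey - beta * prey * pred      # prey reproduce and get eaten
    d_pred = delta * prey * pred - gamma * pred     # predators grow by eating, otherwise die off
    prey += d_prey * dt
    pred += d_pred * dt
    trajectory.append((prey, pred))

# Peaks in the predator series lag peaks in the prey series as the two populations keep cycling.
prey_series = [p for p, _ in trajectory]
print(min(prey_series), max(prey_series))           # wide swings rather than a steady population
```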

Another visualization is the chaotic population growth of rabbits. Past a certain growth rate we observe massive oscillations between generations. As we marginally increase the growth rate further, the period of the oscillations doubles, producing the bifurcations on the chart, and somehow it even ties back to the Mandelbrot set. This video explains it better than I ever will be able to.
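The standard toy model behind this story is the logistic map; here is a quick sketch showing the period doubling as the growth rate increases. The specific growth rates sampled below are just illustrative.

```python
def logistic_orbit(r, x0=0.2, warmup=500, keep=8):
    """Iterate x -> r * x * (1 - x), discard transients, return the long-run behavior."""
    x = x0
    for _ in range(warmup):
        x = r * x * (1 - x)
    orbit = []
    for _ in range(keep):
        x = r * x * (1 - x)
        orbit.append(round(x, 4))
    return orbit

for r in (2.8, 3.2, 3.5, 3.9):
    print(r, logistic_orbit(r))      # fixed point -> period 2 -> period 4 -> chaos
```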

Human-scale urban planning experiences this too. Thriving cities exist between sterile over-planning, which kills organic growth, and chaotic sprawl, which creates dysfunction. The most vibrant neighborhoods often emerge from this critical balance of structure and spontaneity. It's a pervasive dilemma across human endeavors: balancing strict structures that impose order against freeform growth that risks instability, much like nimble startups compared to monolithic companies.
Instability also appears in the speciation of ecosystems, which balance between sterile monocultures and chaotic species explosions. Moderate disturbances (like forest fires or grazing) create diversity by preventing any single species from dominating while avoiding complete ecosystem collapse.
The more microscopic ecosystems participate here as well. Your immune response operates at criticality—too weak and pathogens overwhelm you, too strong and you develop autoimmune disorders. The system maintains delicate vigilance, ready to mount massive responses to genuine threats while avoiding attacking your own tissues.
Gene regulatory networks in cells balance between rigid lock-in (where genes can't respond to environmental changes) and chaotic noise (where gene expression becomes random). Critical gene networks allow cells to differentiate into specialized types while maintaining plasticity.
On one hand, it's surprising how often nature prefers "risk"; on the other hand, time and time again, this regime seems to deliver the greatest payoff per resource expended, and these balancing acts sit at the heart of many of the complex systems we have.
