In the 20th century, we gained a deep understanding of the physical world using equations and the mathematics of continuous variables as the chief source of insights. A continuous variable varies smoothly across space and time. Unlike a rocket, whose trajectory follows Newton's laws of motion, a tree has no comparably simple description. In the 21st century, we are making progress in understanding the nature of complexity in computer science and biology based on the mathematics of algorithms, whose variables are often discrete rather than continuous. An algorithm is a step-by-step recipe that you follow to achieve a goal, not unlike baking a cake.

Self-similar fractals grow out of simple recursive algorithms that create patterns resembling bushes and trees. The construction of a real tree is also an algorithm, driven by a sequence of decisions that turn genes on and off as cells divide. The construction of brains, perhaps the most demanding construction project in the universe, is also guided by algorithms embedded in the DNA, which orchestrate the development of connections between thousands of different types of neurons in hundreds of different parts of the brain.
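The recursive step behind such fractal patterns can be made concrete with a short sketch. The branching angle, shrink factor, and function name below are illustrative choices, not anything specified in the text; each branch simply spawns two smaller copies of itself:

```python
import math

def branch(x, y, angle, length, depth):
    """Recursively generate the line segments of a binary fractal tree.

    Returns a list of ((x0, y0), (x1, y1)) segments. The angle offset
    (0.4 rad) and shrink factor (0.7) are arbitrary illustrative values.
    """
    if depth == 0:
        return []
    x2 = x + length * math.cos(angle)
    y2 = y + length * math.sin(angle)
    segments = [((x, y), (x2, y2))]
    # The self-similar step: each branch sprouts two shorter branches.
    segments += branch(x2, y2, angle + 0.4, length * 0.7, depth - 1)
    segments += branch(x2, y2, angle - 0.4, length * 0.7, depth - 1)
    return segments

# A trunk growing straight up, recursed five levels deep.
tree = branch(0.0, 0.0, math.pi / 2, 1.0, 5)
```

Plotting these segments produces a bush-like shape; deepening the recursion or jittering the angles makes it look more like a real tree.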

Learning and memory in brains are governed by algorithms that change the strengths of synapses between neurons according to the history of their activity. Learning algorithms have also recently been used to train deep neural network models to recognize speech, translate between languages, caption photographs and play the game of Go at championship levels. These are surprising capabilities that emerge from applying the same simple learning algorithms to different types of data.
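The idea of activity-dependent synaptic change can be sketched with the simplest such rule, Hebbian learning, in which a synapse strengthens in proportion to the joint activity of the neurons it connects. This is a minimal toy illustration, not a specific model from the text; the function name and learning rate are assumptions:

```python
def hebbian_update(weights, pre, post, lr=0.1):
    """One Hebbian learning step for a single postsynaptic neuron.

    weights[i] is the strength of the synapse from presynaptic neuron i;
    pre[i] is that neuron's activity and post is the postsynaptic
    activity. Synapses between co-active neurons strengthen.
    """
    return [w + lr * p * post for w, p in zip(weights, pre)]

# Only the synapse from the active input strengthens; the silent
# input's synapse is unchanged.
w = [0.0, 0.0]
w = hebbian_update(w, pre=[1.0, 0.0], post=1.0)
```

The learning algorithms used to train deep networks are more elaborate (they follow error gradients rather than raw co-activity), but they share this core feature: weights change as a function of the activity passing through them.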

How common are algorithms that generate complexity? The Game of Life is a cellular automaton that generates objects that seem to have lives of their own. Stephen Wolfram wanted to know the simplest cellular automaton rule that could lead to complex behaviors, so he set out to search through all of them. The first twenty-nine rules produced patterns that would always revert to boring behaviors: all the cells would end up with the same value, fall into an endlessly repeating sequence, or change chaotically forever. But rule 30 dazzled with continually evolving complex patterns. It was later even possible to prove that a related rule, rule 110, is capable of universal computation; that is, it has the power of a Turing machine, which can compute any computable function.
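An elementary cellular automaton like rule 30 is only a few lines of code: each cell in a row of 0s and 1s updates based on itself and its two neighbors. Rule 30's update can be written as left XOR (center OR right); the wraparound boundary below is a simplifying assumption of this sketch:

```python
def rule30_step(cells):
    """One update of Wolfram's rule 30 on a row of 0/1 cells.

    new cell = left XOR (center OR right), which is exactly the rule
    whose eight outputs spell 30 in binary (00011110). The row wraps
    around at the edges for simplicity.
    """
    n = len(cells)
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n])
            for i in range(n)]

# Start from a single live cell and iterate; printing each row as it
# evolves reveals the famous irregular triangular pattern.
row = [0] * 7
row[3] = 1
for _ in range(3):
    row = rule30_step(row)
```

Running this on a wider row for more steps, and printing each generation, shows the mix of order and apparent randomness that made rule 30 famous; rule 30 is in fact used inside Mathematica as a random number generator.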

One of the implications of this discovery is that the remarkable complexity we find in nature could have evolved by sampling the simplest space of chemical interactions between molecules. Complex molecules should be expected to emerge from evolution and not be considered a miracle. However, cellular automata may not be a good model for early life, and it remains an open question which simple chemical systems are capable of creating complex molecules. It might be that only special biochemical systems have this property, which would help narrow the possible set of interactions from which life could have originated. Francis Crick and Leslie Orgel suggested that RNA might have these properties, which led to the concept of an RNA world before DNA appeared early in evolution.

How many algorithms are there? Imagine the space of all possible algorithms. Every point in the space is an algorithm that does something. Some are amazingly useful and productive. In the past these useful algorithms were handcrafted by mathematicians and computer scientists working as artisans. In contrast, Wolfram found cellular automata that produced highly complex patterns by automated search. Wolfram's law states that you don't have to travel far in the space of algorithms to find one that solves an interesting class of problems. This is like sending bots onto the Internet to play games like StarCraft, trying all possible strategies. According to Wolfram's law, algorithms that can win the game should be found somewhere nearby in the universe of algorithms.

Wolfram focused on the simplest algorithms in the space of cellular automata, a small subspace in the space of all possible algorithms. We now have confirmation of Wolfram's law in the space of neural networks, which are some of the most complex algorithms ever devised. Each deep learning network is a point in the space of all possible algorithms and was found by automated search. For a large network and a large set of data, learning from different starting points can generate a virtually unlimited number of networks that are roughly equally good at solving the problem. Each data set generates its own galaxy of algorithms, and data sets are proliferating.
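The claim that different starting points yield different but equally good networks can be illustrated with a toy overparameterized model, a deliberately simplified stand-in for a deep network. In the sketch below (the model and function name are assumptions, not from the text), the prediction is (w1 + w2) * x and the target slope is 2, so every pair of weights summing to 2 is a perfect solution:

```python
import random

def fit(seed, steps=200, lr=0.1):
    """Gradient descent on the overparameterized model y = (w1 + w2) * x,
    fit to the target y = 2 * x, from a seeded random initialization.

    Because only the sum w1 + w2 matters, every run converges to an
    equally good solution, but different seeds land on different weights.
    """
    rng = random.Random(seed)
    w1, w2 = rng.uniform(-1, 1), rng.uniform(-1, 1)
    for _ in range(steps):
        err = (w1 + w2) - 2.0  # residual on the target slope
        w1 -= lr * err         # both weights feel the same gradient,
        w2 -= lr * err         # so their difference never changes
    return w1, w2
```

Running `fit` with several seeds gives weight pairs that all satisfy w1 + w2 ≈ 2 yet differ from one another, a miniature version of the "galaxy" of equally good networks that training a deep network from different initializations produces.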

Who knows what the universe of algorithms holds for us? There may be whole galaxies of useful algorithms that humans have not yet discovered but can be found by automated discovery. The 21st century has just begun.