Hormesis Is Redundancy
Nature is the master statistician and probabilist. It follows a certain logic based on layers of redundancy as a central risk-management approach. Nature builds with extra spare parts (two kidneys) and extra capacity in many, many things (lungs, the neural system, the arterial apparatus, etc.), while human design tends to be spare, overoptimized, and to have the opposite attribute of redundancy, that is, leverage—we have a historical track record of engaging in debt, which is the reverse of redundancy (fifty thousand in extra cash in the bank or, better, under the mattress, is redundancy; owing the bank an equivalent amount is debt).
Now, remarkably, the mechanism called hormesis is a form of redundancy, and one that is statistically sophisticated in ways human science has (so far) failed to match.
Hormesis is when a bit of a harmful substance, or stressor, in the right dose or with the right intensity, stimulates the organism and makes it better, stronger, healthier, and prepared for a stronger dose at the next exposure. That's the reason we go to the gym, engage in intermittent fasting or caloric deprivation, or overcompensate for challenges by getting tougher. Hormesis lost scientific respect, interest, and practice after the 1930s, partly because some people mistakenly associated it with homeopathy. The association was unfair, as the mechanisms are extremely different. Homeopathy relies on other principles, such as the notion that minute, highly diluted parts of the agents of a disease (so small they can hardly be perceived, hence cannot cause hormesis) could help medicate against the disease itself. Homeopathy has shown little empirical backing and belongs today to alternative medicine, while hormesis, as an effect, has ample scientific evidence behind it.
Now it turns out that the logic of redundancy and that of overcompensation are the same—as if nature had a simple, elegant, and uniform style of doing things. If I ingest, say, fifteen milligrams of a poisonous substance, my body will get stronger, preparing for twenty or more. Stressing my bones (karate practice or carrying water on my head) will cause them to prepare for greater stress by getting denser and tougher. A system that overcompensates is necessarily in overshooting mode, building extra capacity and strength in anticipation of a worse outcome, in response to information about the possibility of a hazard. This is a very sophisticated way of discovering probabilities via stressors. And of course such extra capacity or strength becomes useful in itself, opportunistically, as it can be put to some benefit even in the absence of the hazard. Redundancy is an aggressive, not a defensive, approach to life.
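The overshooting logic can be sketched in a few lines of code. This is a hypothetical toy model, not anything from the text: each survived stressor triggers a rebuild of capacity to a level beyond the stress itself, with an assumed safety margin, so the system is always prepared for a dose it has never seen.

```python
def overcompensate(stresses, margin=1.3, capacity=16.0):
    """Toy model of hormetic overcompensation.

    After each survived stressor, capacity is rebuilt to `margin` times
    the stress, i.e., the system overshoots rather than merely repairs.
    The margin and starting capacity are illustrative assumptions.
    """
    history = []
    for s in stresses:
        if s > capacity:
            raise RuntimeError(f"stressor {s} exceeded capacity {capacity}")
        # Overshoot: prepare for a dose larger than anything yet observed.
        capacity = max(capacity, margin * s)
        history.append(capacity)
    return history

# Fifteen milligrams prepares the body for twenty or more:
print(overcompensate([15, 18, 20]))  # capacity stays ahead of each dose
```

The design choice worth noticing is that repair alone (`capacity = max(capacity, s)`) would only ever match the past; the `margin` factor is what makes the system extrapolative rather than intrapolative.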
Alas, our institutional risk-management methods are vastly different. Current practice is to look to the past for the worst-case scenario (the so-called "stress test") and adjust accordingly, never imagining that, just as the past experienced a large deviation that had no predecessor, such a deviation might itself prove insufficient. For instance, current systems take the worst historical recession, the worst war, the worst historical move in interest rates, the worst point in unemployment, etc., as an anchor for the worst future outcome. Many of us have been frustrated—very frustrated—by the method of stress testing, in which people never go beyond what has happened before, and have even had to face the usual expression of naive empiricism, "do you have evidence?," when suggesting that we need to consider worse.
And, of course, these systems don't do the recursive exercise of seeing the obvious: that the worst past event itself did not have a predecessor of equal magnitude, and that someone using the past worst case in Europe before the Great War would have been surprised. I've called this the Lucretius underestimation, after the Latin poet-philosopher who wrote that the fool believes the tallest mountain in existence should equal the tallest one he has observed. Danny Kahneman wrote, using the works of Howard Kunreuther as backup, that "protective actions, whether by individuals or by governments, are usually designed to be adequate to the worst disaster actually experienced (...) Images of even worse disaster do not come easily to mind." For instance, in Pharaonic Egypt, scribes tracked the high-water mark of the Nile and used it as their worst-case scenario. No economist had tested the obvious: do extreme events conform to the past? Alas, backtesting says: no, sorry.
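That backtest is easy to run. A hedged simulation (illustrative, not the author's own test, with an assumed Pareto tail standing in for "extreme events"): anchor a stress test to the worst event in the first half of a heavy-tailed sample, then check how often, and by how much, the second half breaks it.

```python
import random

random.seed(7)

def backtest_worst_case(trials=500, n=2000, alpha=1.5):
    """Return (fraction of runs where the past worst is exceeded,
    average ratio of the new worst to the past worst in those runs)."""
    exceeded, ratios = 0, []
    for _ in range(trials):
        xs = [random.paretovariate(alpha) for _ in range(n)]
        past_worst = max(xs[: n // 2])      # the "stress test" anchor
        future_worst = max(xs[n // 2 :])    # what actually comes next
        if future_worst > past_worst:
            exceeded += 1
            ratios.append(future_worst / past_worst)
    return exceeded / trials, sum(ratios) / len(ratios)

frac, ratio = backtest_worst_case()
# By symmetry, the past worst is beaten about half the time; with a
# heavy Pareto tail, the new worst is a multiple of the old one.
print(f"past worst exceeded in {frac:.0%} of runs, by {ratio:.1f}x on average")
```

Note that exceeding the past maximum about half the time holds for any i.i.d. sequence; what the heavy tail adds is the size of the exceedance, which is exactly why anchoring to the historical record is reckless.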
The same dangerous recklessness can be seen in the methodology used for the Fukushima nuclear reactor, built to withstand the worst past outcome rather than imagining, and extrapolating to, something much worse. Nature, unlike risk engineers, prepares for what has not happened before, assuming worse harm is possible.
So if humans fight the last war, nature fights the next war. Of course, there is a biological limit to our overcompensation.
This form of redundancy remains vastly more extrapolative than our minds, which are intrapolative.
The great Benoit Mandelbrot, now gone for a year, saw the same fractal self-similarity in nature and in probabilities of historical and economic events. It is thrilling to see how the two domains unite under the notion of fractal-based redundancy.
p.s. The word "fitness" in common scientific discourse does not appear to be precise enough. I am unable to figure out whether what is called "Darwinian fitness" is merely intrapolative adaptation to the current environment, or whether it contains an element of statistical extrapolation. In other words, there is a significant difference between robustness (not being harmed by a stressor) and what I've called antifragility (i.e., gaining from stressors).