3-Sentence Summary:
Antifragility is the previously unnamed (and therefore somewhat less visible) idea that describes the opposite of fragility, where fragility refers to things that suffer in the presence of chaos, robustness refers to things that remain unaffected by chaos, and antifragility refers to things that benefit from chaos.
Fragile things tend to be more exposed to the negative effects of unexpected events (a.k.a. Black Swans), even though they often appear less volatile up to that point (like a turkey in the weeks before Christmas); by contrast, things that are visibly volatile can be adaptable in times of uncertainty (like a taxi driver whose income fluctuates daily but whose livelihood remains stable over time, since he carries no risk of being fired).
Organic things tend to be more antifragile than non-organic/mechanical/man-made/constrained things, since they have the capacity to adapt to and learn from stressors and volatility.
Notes:
Antifragility is the previously unnamed (and therefore somewhat less visible) idea that describes the opposite of fragility.
Fragility refers to things that suffer in the presence of chaos, and robust things remain unaffected by chaos. Antifragility refers to things that benefit from chaos.
Domain-dependence refers to our propensity to notice a concept in one domain but forget it in another domain.
The effect of antifragility occurs via overcompensation - when we lift weights, the body adapts by getting stronger, above and beyond what is needed to lift that same weight. The same applies to drug tolerance.
This can be seen as redundancy - Nature likes to over-insure itself.
The Lucretius problem refers to believing that the tallest mountain we've seen is the tallest mountain there is. Always assume that worse can happen!
Some jobs can be fragile to criticism, but others are antifragile to the point where criticism is beneficial. e.g. banned books become popular, and rock stars become more famous for doing crazy things.
There are 2 categories - organic (complex) things, which tend to be antifragile and prefer acute stressors with recovery over chronic stressors, and non-organic (mechanical/non-complex) things, which degrade over time without a mechanism for self-healing and strengthening.
Antifragility of the group often comes at the expense of fragility to individual elements that make up the group. e.g. evolution requires randomness and mutations (often negative) in individuals, but also produces positive mutations that get passed on and improve the species overall. e.g. Entrepreneurs (individuals) can be fragile, but the group benefits from their failures (as well as successes).
More variability does not necessarily mean more risk. Small regular ups and downs (e.g. entrepreneurs) show volatility but allow for learning (hence antifragility), but manmade smoothing of volatility can create less volatility short-term, but fragility to black swans (extreme, unexpected events) such as getting fired (employees).
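A hypothetical sketch of this contrast (the income figures and the firing day are invented for illustration, not taken from the book): the employee's income has zero visible variance right up until the black swan, while the taxi driver's visibly volatile income has no single point of failure.

```python
import random
import statistics

random.seed(42)  # reproducible illustration

DAYS = 1000
FIRING_DAY = 900  # the employee's black swan

# Taxi driver: visibly volatile day to day, but no single point of failure.
taxi = [max(0.0, random.gauss(100, 30)) for _ in range(DAYS)]

# Employee: perfectly smooth income, until suddenly fired.
employee = [100.0 if day < FIRING_DAY else 0.0 for day in range(DAYS)]

# Before the black swan, the employee looks "safer" (zero variance)...
print(statistics.stdev(employee[:FIRING_DAY]))  # 0.0
print(statistics.stdev(taxi[:FIRING_DAY]) > 0)  # True

# ...but afterwards the taxi driver keeps earning while the employee earns nothing.
print(sum(employee[FIRING_DAY:]))  # 0.0
print(sum(taxi[FIRING_DAY:]) > 0)  # True
```

The lower measured volatility of the smooth stream is exactly what hides its tail risk.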
The turkey problem - Based on previous information (getting fed and looked after each day), the turkey thinks he's safe, until one day a week before Christmas, when he experiences a black swan (is killed) - Don't think that just because things have been good, they will remain that way, particularly if they've been extremely constrained.
Things that don't experience randomness become fragile, through not being exposed to the necessity of corrections and learning, whilst showing no outward risks - stopping small forest fires increases the chance of bigger ones.
Artificially constrained systems become prone to black swans.
Naive interventionism - intervening when doing nothing would have been a better idea.
When considering an intervention, we should not only consider the probabilistic benefits but also the probabilistic downsides.
Iatrogenics - the harm done by a treatment (such as a medical treatment), which is often overlooked because some people see benefits, even though the harm can often be larger. This goes against the Hippocratic oath. Every time there is intervention, there are iatrogenics.
Interventions in the business cycle lead to fragility, in that companies aren't allowed to fail early (and try again), which would minimise long-term damage to the system.
Interventions aren't always bad. Limiting size, speed, and concentration (since these things have disproportionate effects) can reduce Black Swans. e.g. Road speed limits.
Procrastination can be protective, stopping unnecessary action when no action would have been better.
Access to data increases intervention. e.g. Michael Jackson's doctor checked his health often, leading to many interventions for things that might have resolved by themselves.
Receiving information less frequently allows some of the noise to cancel out. e.g. instead of reading the news every day, read it once a month - you'll only get what stood out.
Prediction is not necessarily bad, but you don't want it to expose you to excessive risk. The risks are in the iatrogenics.
You don't have to predict when the black swan will occur, but you should predict that there will be one, and create antifragility (or at least robustness) to it.
We should separate black swans into separate categories: Those that are unpredictable and consequential, and those that aren't of serious concern due to being predictable and/or inconsequential.
You can't predict in general, but you can predict that those who predict will be more likely to experience trouble, as a result of prediction errors.
Having an 'indifference to fate', positive or negative, makes one robust, as it helps avoid the problem of being afraid to lose what you have.
However, if you're able to eliminate the downside - either by insuring against losses or, like Seneca, reframing losses as unimportant - whilst keeping and enjoying the upside (one-way bookkeeping), you become antifragile.
That way, volatility will give you exposure to positives and not negatives.
The Barbell Strategy involves extreme conservatism on one side (removing risk of extreme downside) and extreme risk on the other (exposing yourself to positive black swans). e.g. most of your investments in index funds, and a tiny amount in risky investments. e.g. Work a boring day job, write books in your spare time. e.g. Study the minimum needed to get the required mark on the exam, and base the rest of your reading on your interests, quitting books when you're bored.
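A minimal sketch of the investing version of the barbell (the 90/10 split, the 1% safe return, and the multipliers are invented for illustration): the conservative side caps the worst case, while the speculative side keeps the upside open-ended.

```python
# Hypothetical barbell allocation: 90% near-riskless, 10% highly speculative.
def barbell_outcome(capital, risky_multiplier):
    safe = 0.90 * capital * 1.01               # conservative side, ~1% return
    risky = 0.10 * capital * risky_multiplier  # speculative side: 0x to many-x
    return safe + risky

capital = 100_000
worst = barbell_outcome(capital, 0.0)     # risky stake wiped out entirely
jackpot = barbell_outcome(capital, 20.0)  # a positive black swan hits

print(worst)    # 90900.0 -> total loss capped at roughly 9%
print(jackpot)  # 290900.0 -> upside remains open-ended
```

The asymmetry is the point: volatility can only cost the bounded 10%, but can deliver unbounded gains.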
An example of asymmetry is options/optionality - a contract where one retains the choice to take the upside and disregard the potential downside. This may come at a price, but if there are potential extreme gains, it could be worth it.
Technologies aren't invented and then used. Instead, randomness feeds us discoveries and the intelligence comes in our ability to recognise the useful from the useless - identifying the option.
Tinkering - trial and error - can expose you to this randomness, where even the error can be useful info.
The narrative and the reality of something are not always aligned, and mistaking a side effect for a cause is common (an epiphenomenon). e.g. believing that a country's education level leads to its wealth, when in reality the causation runs the other way.
We should be sceptical of narratives and instead favour what happens in practice and what has come organically rather than what's being imposed top-down.
It is commonly thought that academics come up with theories that are then put into practice. In fact, it is the theories that come from the practice, and an overreliance on the current theories can then lead to mistakes, since these theories can often be less effective than (and behind) what the practitioners are already doing.
Just because you don't understand why something (a tradition, for example) has been done for years doesn't mean it's stupid. In many cases, it is better to understand the outcomes and their magnitudes rather than the true/false nature of a thing.
The average value of something isn't always the best indicator; often the variability around the average matters more. e.g. being in a room at an average temperature of 20 degrees for an hour is fine, but if it were -60 degrees for 30 minutes and 100 degrees for 30 minutes, that would not be ok.
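The temperature example can be sketched numerically with an invented convex harm function (the quadratic and its units are assumptions for illustration, not from the book): when harm grows faster than linearly with deviation, the harm of the average is far smaller than the average of the harms.

```python
# Illustrative convex harm function: discomfort grows with the square of the
# deviation from a comfortable 20 degrees C (invented, purely for illustration).
def harm(temp_c):
    return (temp_c - 20) ** 2

steady = harm(20)                       # one hour at the average temperature
extremes = (harm(-60) + harm(100)) / 2  # half an hour at each extreme

print(steady)    # 0
print(extremes)  # 6400.0 -> same average temperature, vastly more harm
```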
No matter how many white swans there are, you'll never be able to prove that all swans are white, but one black swan can disprove it.
The Lindy effect - things that have already been around for a certain amount of time can be expected to be around for at least that amount of time into the future (presuming they're non-perishable) - time will eliminate the rest.
The burden of proof is on the person who wants to introduce an intervention that goes against nature. We should use such interventions only when the payoff is large, e.g. to save a life.
Via negativa - the idea of depending on the absence of things, or taking things away, rather than adding things in an attempt to improve the situation. Adding things brings iatrogenics.
Skin in the game, or better yet, soul in the game, helps avoid the problem of transferring fragility/antifragility from one party to another without also transferring the potential downsides.
Look at a person's actions to see if they have skin in the game.
Artisans tend to have skin/soul in the game and create beneficial products as opposed to corporations that generally have to use our biases against us, and sell things that are prone to iatrogenics.
If you enjoyed this summary, you’ll probably enjoy the full book. Get it here: LINK TO BOOK (AMAZON)
Or get it for free on audiobook when you sign up for an Audible account: LINK TO AUDIOBOOK (AMAZON)
(This website uses amazon referral links as part of the Amazon Associates program.)