Learning From the Mistakes of Others


I just finished reading Antifragile. I wanted to share some notes that spoke to me and might speak to you as well.

If you haven’t heard of Nassim Taleb, he is a genuine philosopher. Taleb has a rare ability to change the way we view the world through the strength and originality of his ideas. His other books include Fooled by Randomness, The Black Swan, and Skin in the Game.

In going through Antifragile, I pulled out key takeaways across the chapters that I will re-read periodically. I share them with you now and highly recommend that you get the book. But if you only have five minutes, read this instead.


What Is Antifragility?

Author Nassim Taleb defines the term antifragile this way:

Some things benefit from shocks; they thrive and grow when exposed to volatility, randomness, disorder, and stressors and love adventure, risk, and uncertainty. Yet, in spite of the ubiquity of the phenomenon, there is no word for the exact opposite of fragile. Let us call it antifragile. Antifragility is beyond resilience or robustness. The resilient resists shocks and stays the same; the antifragile gets better. This property is behind everything that has changed with time: evolution, culture, ideas, revolutions, political systems, technological innovation, cultural and economic success, corporate survival, good recipes (say, chicken soup or steak tartare with a drop of cognac), the rise of cities, cultures, legal systems, equatorial forests, bacterial resistance … even our own existence as a species on this planet.

Often it’s impossible to be antifragile, but if you fall short of that, you should at least be robust, not fragile. How do you become robust? Make sure you’re not fragile: eliminate the things that make you fragile. For example, avoid debt, because debt makes a system more fragile.


Learning from the mistakes of others

Antifragility by layers is largely about the errors of others — the antifragility of some comes necessarily at the expense of the fragility of others. In a system, the sacrifices of some units — fragile units, or people — are often necessary for the well-being of other units or the whole. 

The fragility of every startup is necessary for the economy to be antifragile, and that’s what makes, among other things, entrepreneurship work: the fragility of individual entrepreneurs and their necessarily high failure rate.

Antifragility gets a bit more intricate — and more interesting — in the presence of layers and hierarchies. Take a business example: restaurants are fragile; they compete with each other, but the collective of local restaurants is antifragile for that very reason. Had restaurants been individually robust, hence immortal, the overall business would be either stagnant or weak and would deliver nothing better than cafeteria food.

It is often the mistakes of others that benefit the rest of us — and, sadly, not them. 

The engineer and historian of engineering Henry Petroski presents a very elegant point. Had the Titanic not had that famous accident, as fatal as it was, we would have kept building larger and larger ocean liners and the next disaster would have been even more tragic. So the people who perished were sacrificed for the greater good; they unarguably saved more lives than were lost. The story of the Titanic illustrates the difference between gains for the system and harm to some of its individual parts.

The same can be said of the debacle of Fukushima: one can safely say that it made us aware of the problem with nuclear reactors (and small probabilities) and prevented larger catastrophes. 

Every plane crash brings us closer to safety, improves the system, and makes the next flight safer — those who perish contribute to the overall safety of others. Swissair Flight 111, TWA Flight 800, and Air France Flight 447 allowed the improvement of the system.

But these systems learn because they are antifragile and set up to exploit small errors; the same cannot be said of economic crashes since the economic system is not antifragile the way it is presently built. Why? There are hundreds of thousands of plane flights every year, and a crash in one plane does not involve others, so errors remain confined and highly epistemic — whereas globalized economic systems operate as one: errors spread and compound.

Good systems such as airlines are set up to have small errors, independent of each other — or, in effect, negatively correlated with each other, since mistakes lower the odds of future mistakes. This is one way to see how one system can be antifragile (aviation) and the other fragile (the economy). If every plane crash makes the next one less likely, every bank crash makes the next one more likely. 

And of course, we learn from the errors of others as well.

You may never know what type of person someone is unless they are given opportunities to violate moral or ethical codes. I remember a classmate, a girl in high school who seemed nice and honest and part of my childhood group of anti-materialistic utopists. I learned that against my expectations (and her innocent looks) she didn’t turn out to be Mother Teresa or Rosa Luxemburg, as she dumped her first (rich) husband for another, richer person, whom she dumped upon his first financial difficulties for yet another richer and more powerful (and generous) lover. 

In a nonvolatile environment, I (and most probably she, too) would have mistaken her for a utopist and a saint. Some members of society — those who did not marry her — got valuable information while others, her victims, paid the price. 

Further, my characterization of a loser is someone who, after making a mistake, doesn’t introspect, doesn’t exploit it, feels embarrassed and defensive rather than enriched with a new piece of information, and tries to explain why he made the mistake rather than moving on. These types often consider themselves the “victims” of some large plot, a bad boss, or bad weather. 

Finally, a thought. He who has never sinned is less reliable than he who has only sinned once. And someone who has made plenty of errors — though never the same error more than once — is more reliable than someone who has never made any.


Other Notable Takeaways

  1. The antifragile benefits from shocks; it thrives and grows when exposed to volatility, randomness, disorder, and stressors, and loves adventure, risk, and uncertainty. Let me be more aggressive: we are largely better at doing than we are at thinking, thanks to antifragility. I’d rather be dumb and antifragile than extremely smart and fragile, any time.
  2. “Never ask the doctor what you should do. Ask him what he would do if he were in your place. You would be surprised at the difference.” — Gerd Gigerenzer
  3. If you have more than one reason to do something (hire an employee, marry a person, go on a trip), just don’t do it. It does not mean that one reason is better than two, just that by invoking more than one reason you are trying to convince yourself to do something. Obvious decisions (robust to error) require no more than a single reason.
  4. Suckers try to win arguments, nonsuckers try to win.
  5. Abundance is harder for us to handle than scarcity.
  6. A thought about investing: never ask anyone for their opinion, forecast, or recommendation. Just ask them what they have — or don’t have — in their portfolio.
  7. “People think focus means saying yes to the thing you’ve got to focus on. But that’s not what it means at all. It means saying no to the hundred other good ideas that there are. You have to pick carefully. I’m actually as proud of the things we haven’t done as the things I have done. Innovation is saying no to 1,000 things.” — Steve Jobs
  8. Success brings an asymmetry: you now have a lot more to lose than to gain. You are hence fragile.
  9. Wind extinguishes a candle and energizes fire. Likewise with randomness, uncertainty, chaos: you want to use them, not hide from them. You want to be the fire and wish for the wind.
  10. More data means more information, but it also means more false information. More data — such as paying attention to the eye colors of the people around you when crossing the street — can make you miss the big truck.

National Entrepreneur Day

Entrepreneurship is a risky and heroic activity, necessary for growth or even the mere survival of the economy.

In order to progress, modern society should be treating ruined entrepreneurs in the same way we honor dead soldiers, perhaps not with as much honor, but using exactly the same logic (the entrepreneur is still alive, though perhaps morally broken and socially stigmatized). For there is no such thing as a failed soldier, dead or alive (unless he acted in a cowardly manner) — likewise, there is no such thing as a failed entrepreneur or failed scientific researcher.

Most of you will fail, disrespected and impoverished, but we are grateful for the risks you are taking and the sacrifices you are making for the sake of the economic growth of the planet and for pulling others out of poverty. You are at the source of our antifragility. Our nation thanks you.

 Nassim Taleb