A person’s talent is the ability to turn signals from the environment into meaningful information. This ability allowed us to create the scientific method, to philosophize about the nature of being, and to invent complex mathematical models.
Our ability to reflect on and manage the world does not mean we are good at it. Our thinking about the world tends to be narrow, and once we arrive at a judgment, we cling to it with a death grip.
Human knowledge is constantly growing, and such a dogmatic approach is ineffective. Two hundred years ago doctors and scientists were absolutely confident in their medical knowledge, yet imagine going to a doctor with a runny nose and walking out with a prescription for leeches!
Confidence in our judgments keeps our conclusions inside the conceptual framework we have accepted as true. How can you understand medicine without knowing that microbes exist? You may come up with a reasonable explanation of a disease, but it will be wrong because important information is missing.
Such thinking can lead to unexpected surprises. Sometimes events surprise us not because they are random, but because our worldview is too narrow. Such surprises are called “black swans,” and they can force us to reconsider our picture of the world.
Before people first saw a black swan, everyone assumed swans could only be white; whiteness was considered an essential trait of the bird. The sight of a black swan radically changed people’s idea of the species. “Black swan” events can be as commonplace as that discovery, or as fateful as bankruptcy in a stock-market crash.
“Black swans” can have fatal consequences for those who are blind to them
The effect of a “black swan” is not the same for everyone. Some suffer seriously from it, while others do not even notice it. Access to relevant information is crucial: the less you know, the greater your risk of becoming a victim of a “black swan.”
Example. Imagine that at the races you bet on your favorite horse, named Rocket. Given the horse’s physique, its record of wins, the jockey’s skill, and the sluggish competition, you bet all your money on its victory. Now imagine your surprise when, at the starting signal, Rocket not only refused to run but preferred to lie down. This is a “black swan.” Given the available information, Rocket should have won, yet you lost all your money. Rocket’s owner, by contrast, got rich betting against his own horse. Unlike you, he knew that Rocket would go on strike to protest cruelty to animals. That knowledge saved him from the “black swan.”
The influence of “black swans” extends beyond individuals to entire societies. In such cases a “black swan” can change the world, affecting, for example, philosophy, theology, and physics.
Example. Copernicus proposed that the Earth is not the center of the universe, and the consequences were colossal: the discovery cast doubt on the authority of the ruling Catholic Church and on the Bible itself.
Subsequently, this “black swan” marked the beginning of a new European society.
We are easily tripped up even by elementary logical errors
People often go wrong by making predictions based on what they know of the past. Assuming that the future will mirror the past, we err, because many unknown factors run counter to our assumptions.
Example. Imagine you are a turkey on a farm. For many years the farmer has fed, pampered, and cherished you. Judging by the past, you have no reason to expect change. Alas, on Thanksgiving Day you are beheaded, roasted, and eaten.
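The turkey’s reasoning is naive induction: estimating tomorrow from yesterday alone. A minimal sketch of it (my own illustration, not from the book), using Laplace’s rule of succession as the turkey’s confidence measure:

```python
# Toy model of the turkey's induction: confidence that tomorrow will
# resemble the past, estimated from the past alone.
days_fed = 1000  # a thousand days of food and care (invented figure)

# Laplace's rule of succession: P(fed tomorrow) = (successes + 1) / (trials + 2)
confidence = (days_fed + 1) / (days_fed + 2)
print(f"confidence on the eve of Thanksgiving: {confidence:.3f}")  # 0.999

# The estimate rises with every good day, yet it encodes nothing about
# the farmer's plans -- the one factor that decides the outcome.
```

The point of the sketch: the model’s confidence peaks on the very day the forecast fails, because the decisive factor never appeared in the data.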
Forecasting on the basis of the past leads us astray, with serious consequences. A related error is confirmation bias: we look for evidence only for beliefs we already hold.
We dismiss information that contradicts what we already believe and are unlikely to investigate further. And if we do decide to dig deeper, we seek out sources that dispute the contradicting information.
Example. If you are firmly convinced that climate change is a secret conspiracy, and then you see a documentary called “Undeniable Evidence of Climate Change,” you will probably be very upset. And if you search the Internet, you will type “climate change is a hoax,” not “evidence for and against climate change.”
That is, we involuntarily draw the wrong conclusions: it is inherent in our nature.
Our brain groups information in a way that prevents accurate predictions
In the course of evolution, the human brain learned to classify information in ways that helped us survive in the wild, where we had to adapt to dangerous situations quickly. But for understanding a complex world and predicting it accurately, this method is useless.
One such faulty classification is the narrative fallacy: we create linear stories to describe the current situation. Out of the huge amount of information we receive daily, our brain keeps only what it considers important.
Example. You probably remember what you had for breakfast, but you could hardly name the shoe color of every passenger on the subway.
To give information meaning, we link it together. Thinking back over your life, you mark certain events as significant and build them into a narrative that explains how you became who you are.
Example. You love music because your mother sang to you at bedtime.
Because of this you cannot fully understand the world. The process works only in hindsight and ignores the almost limitless interpretations of any event. Even tiny events can have unpredictable, important consequences.
Example. A butterfly flapping its wings in India causes a hurricane in New York a month later.
If we trace causes and effects in the order they arise, we see clear causal relationships between events. But since we see only the result – the hurricane – we can only guess which of the many simultaneous events actually produced that outcome.
It is difficult for us to distinguish scalable from non-scalable information
We fail to distinguish between two types of information – “scalable” and “non-scalable” – and the difference between them is fundamental.
Non-scalable information, such as body weight or height, has a physical upper and lower limit.
But nonphysical or fundamentally abstract things, such as the distribution of wealth or album sales, are scalable.
Example. If an album is sold through iTunes, there is no ceiling on sales: they are not limited by a stock of physical copies. And because the transactions are digital, there is no shortage of physical currency, and nothing stops you from selling trillions of albums.
The difference between scalable and non-scalable information is crucial to seeing an accurate picture of the world. Applying rules that work for non-scalable information to scalable information leads to errors.
Example. Suppose you want to measure the wealth of England’s population. The easiest way is to compute wealth per capita: add up incomes and divide by the number of citizens. But wealth is scalable: a tiny percentage of the population may own an enormous percentage of the wealth.
Per capita figures will not reflect the actual distribution of income.
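The distortion is easy to see with made-up numbers (illustrative only, not data from the book): one scalable fortune drags the average far away from the typical citizen, while the median barely notices it.

```python
# 99 citizens earning 30,000 and one earning 100,000,000 (invented figures)
incomes = [30_000] * 99 + [100_000_000]

mean = sum(incomes) / len(incomes)           # "wealth per capita"
median = sorted(incomes)[len(incomes) // 2]  # the typical citizen

print(f"per-capita income: {mean:,.0f}")    # 1,029,700
print(f"median income:     {median:,.0f}")  # 30,000
```

The per-capita figure suggests everyone is a millionaire; the median shows what almost everyone actually earns.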
We are too confident in what we think we know
Everyone wants to protect themselves from danger. One way is to assess and manage risks; that is why we buy insurance and try not to “put all our eggs in one basket.”
Most people do their best to assess risks as accurately as possible, so as not to miss opportunities and not to do anything they will later regret. To do this, you must estimate all the risks and then the likelihood that each will materialize.
Example. Say you want to buy insurance, but without wasting money. Then you must assess the threat of illness or accident and make a measured decision.
Unfortunately, we become convinced that we know all the possible risks we need to defend against. This is the ludic fallacy: we treat risk like a game with a set of rules and probabilities that can be determined in advance.
Viewing risk this way is very dangerous.
Example. Casinos want to earn as much money as possible, so they build security systems and ban players who win too much and too often. But this approach rests on the ludic fallacy. The main threats to a casino are not lucky players or thieves, but a kidnapper taking the casino owner’s child hostage, or an employee who fails to file an income declaration with the tax service. The serious dangers facing a casino are completely unpredictable.
No matter how hard we try, it is impossible to foresee every risk.
Why do you need to be aware of your ignorance?
By realizing how much you do not know, you can assess risks better
Everyone knows the phrase “knowledge is power.” But when knowledge is limited, it pays to admit it.
By focusing only on what you know, you limit your perception of an event’s possible outcomes, creating fertile ground for a “black swan.”
Example. You want to buy shares in a company but know too little about the stock market. You watch a few dips and rises, but on the whole you notice only that the trend is positive. Believing the trend will continue, you spend all your money on the stock. The next day the market crashes, and you lose everything you had.
Had you studied the subject a little more, you would have seen the market’s many rises and falls throughout its history. By focusing only on what we know, we expose ourselves to serious risks.
If you admit that there are things you do not know, you can significantly reduce your risk.
Example. Good poker players know that this principle is crucial to success in the game. They understand that their opponents’ cards may be better, but they also know there is information they do not have – for example, an opponent’s strategy and how determined he is to go all the way.
Aware of these unknown factors, players focus solely on their own cards and assess the possible risks better.
Knowing our limitations helps us make better choices
The best defense against cognitive traps is a good understanding of forecasting tools and their limitations. This will not save you from every miss, but it will reduce the number of bad decisions.
If you know you are prone to confirmation bias, it is much easier to catch yourself looking only for information that confirms what you already believe. And knowing that people like to reduce everything to neat causal narratives, you will be inclined to seek additional information for a better view of the picture as a whole.
You need to know your own shortcomings.
Example. If you understand that unforeseen risks always exist, you will be more careful about investing a lot of money in an opportunity, however promising it looks.
We cannot overcome randomness or our limited grasp of the world’s complexity, but we can at least mitigate the damage ignorance causes.
The most important thing
Although we constantly make predictions, we do it poorly. We are too confident in our knowledge and underestimate our ignorance. Our inability to understand and account for randomness – indeed, our very nature – leads to poor decisions and to “black swans”: events that seem impossible and force us to rethink our understanding of the world.
Distrust “because.” Instead of forcing events into a single clear causal chain, consider a range of possibilities without fixating on one.
Recognize that there are things you do not know. For meaningful forecasts about the future – whether buying insurance, investing, changing jobs, and so on – it is not enough to take into account everything you “know”: that gives only a partial picture of the risks. Instead, admit that there are things you do not know, so as not to needlessly restrict the information you work with.