4 ways to understand our evaluation systems and avoid mistakes

The purpose of this article is to arrive at a deeper understanding of judgments, intuitions and choices.

Systematic errors are called biases: preconceptions that recur in predictable ways in certain circumstances. Understanding biases gives you more reasoning tools and helps you avoid mistakes. Research has consistently documented systematic errors in the thinking of normal, healthy people.

These errors are attributed to the structure of our cognitive machinery, not to emotions corrupting thought. This is no denigration of intelligence or of emotions, then, but a clear-eyed account of our mechanisms and patterns of thought.

System 1 and System 2

We have two mental systems for assessing reality:

–  System 1: operates quickly and automatically, with little or no effort and no sense of voluntary control. It is the set of impressions and sensations that arise spontaneously and are the main sources of System 2’s explicit beliefs and deliberate choices. Some examples of the activities attributed to System 1: judging which of two objects is nearer or farther away, orienting toward the source of a sudden sound, completing the phrase “bread and …”, answering “2 + 2”, detecting hostility in a tone of voice, reading words on large billboards, driving on an empty road, understanding simple sentences.

–  System 2: allocates attention to the demanding mental activities that require it, such as complex calculations. The operations of System 2 are very often associated with the subjective experience of agency, choice and concentration.

Some examples: focusing on the clowns at a circus, focusing on the voice of one particular person in a crowded and noisy room, searching memory for a particular recollection, keeping a walking pace faster than what comes naturally, monitoring the appropriateness of our behavior in a social situation, counting how many times the letter “A” appears on a page of text, calculating 27 × 72, comparing the overall value of two washing machines, checking the validity of a complex logical argument.

These demanding activities interfere with each other: no one could calculate 17 × 25 while turning left in heavy traffic. They depend on working memory, one of the brain’s executive functions, which organizes and plans a task as a series of stages.

System 2 seems to be the main actor, but in reality the protagonist of the book (and of most of our choices, intuitions and decisions) is System 1. System 2 is what we think we are; System 1 is what we are deep down.

The automatic operations of System 1 generate surprisingly complex patterns of ideas, but only System 2, which is slower, can work through thoughts in an ordered series of steps. This division of labor between the two systems is very efficient and allows great savings of energy.

The problem, however, is that System 1 is prone to biases that it tends to commit in specific situations. A further problem is that it cannot be turned off. Optical illusions are an example of System 1 errors. The cognitive illusions that lead to biases are even more powerful than optical illusions, and harder to recognize, at least when we are the ones making the mistake.

It is easier to recognize other people’s mistakes than our own. In summary: System 1 is automatic, System 2 is reflective. Spontaneously, each of us tends to adopt the least demanding route we can find to the same goal. Effort is a cost, and the brain tends to limit it as much as possible. Laziness is deeply rooted in human nature.

In processing events and ideas, a coherent associative activation takes place. Every element is connected; each supports and strengthens the others. A word evokes memories, which arouse emotions, which in turn provoke facial expressions and other reactions.

Together these form associative memory. An example of this mechanism, one that can trigger a bias, is priming. Being exposed to a word, phrase or situation leads to immediate, easily measurable changes in the words or situations that are subsequently evoked.

Priming also changes our actions, even when the evoked situation is never made explicit. It is also used in advertising, to induce thoughts or actions automatically, without our realizing it.

In this whole sea of cognitive activity, what the brain is looking for is fluency, flow. Persuasive messages try to trigger cognitive fluency, which brings a feeling of familiarity, truth, positivity and effortlessness.

To be persuasive, it has been shown, you should use larger fonts, good contrast between text and background, simple language, rhymes where possible, simple names, and words that are evocative yet common. If possible, it also helps to repeat the key words within the text.

Good mood, intuition, creativity, credulity and greater activity of System 1 go hand in hand, while vigilance toward logical errors drops. At the opposite pole, sadness, vigilance, suspicion, an analytical approach, strong effort and activation of System 2 go together.

Cognitive fluency is both a cause and a consequence of the feeling of well-being. Some evidence shows that people are more susceptible to inconsistent persuasive messages, such as commercials, when they are relaxed and unfocused.

Another bias is the halo effect: the tendency to appreciate (or detest) everything about a person on the basis of a few known traits. It has been shown that someone who is good-looking, or has a pleasant voice, is judged more favorably in the way they express themselves.

This happens because System 1 tries to build a coherent story and wants to jump to conclusions as quickly as possible. What matters to System 1 is coherence, not completeness of information. This is also why we keep looking for news or data that confirm our initial idea, producing the famous confirmation bias. The more coherent the story, the more confident we are of being right, sometimes excessively so.

Judgments are formed through a heuristic process, based on attempts that progressively approach an answer. It is a method for finding adequate, if imperfect, answers to difficult questions. In many cases we substitute for the target question – the one we actually have to judge – a heuristic question, a simpler one, which we answer instead.

For example, for the question “how popular will the president be in six months?” we subconsciously substitute the question “how popular is the president today?” At that point the answer becomes much simpler. We don’t even notice that we made the substitution while answering the first question, but this is the heuristic mechanism.

In this way the brain saves energy, and the expensive System 2 is rarely called upon.

Biases and phenomena

There are several biases in the analysis of certain phenomena.

For example:

The tendency to see patterns in randomness – coinciding anniversaries, streaks of extraordinary performance, and so on – when in fact it is simply chance at work.

The anchoring effect. When people have to assign a value to an unknown quantity, they start from whatever value is available. For example, if you are thinking of buying a house, you will be influenced by the listed market value: the negotiation starts from there, and only later can you move away from the anchor. Anchoring is due to priming and to insufficient adjustment, which stems precisely from the insecurity of moving away from the safe anchor.

The availability heuristic and its bias. This is the process by which the likely frequency of an event is estimated from the ease with which examples come to mind. It also works through the question substitution we saw earlier.

If we have to estimate how many people divorce after the age of sixty, we automatically ask ourselves how many of our direct or indirect acquaintances have divorced after sixty, and from that we estimate the number that answers the original question.

The process generates bias when we have to estimate the risk of terrorist attacks in the days immediately following an attack, or the risk of earthquakes, and so on. Or, conversely, if no examples come to mind, we underestimate the incidence or frequency of a phenomenon.

In practice, rare but striking events seem much more frequent, because we can recall them more easily and the sensation of cognitive fluency is stronger. This produces the so-called neglect of probability.

The conjunction fallacy. People tend to judge the conjunction of two events (“Linda is a bank teller and a feminist”) as more probable than one of those events alone (“Linda is a bank teller”). In reality, of course, the simultaneous occurrence of two conditions can never be more probable than the occurrence of either condition on its own.
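To see the conjunction rule at work, here is a minimal sketch in Python; the probabilities are made-up illustrative numbers, not figures from the Linda study:

```python
# Illustrative, assumed probabilities (not data from the original experiment).
p_bank_teller = 0.05            # P(Linda is a bank teller)
p_feminist_given_teller = 0.30  # P(Linda is a feminist, given she is a bank teller)

# Conjunction: P(A and B) = P(A) * P(B | A)
p_teller_and_feminist = p_bank_teller * p_feminist_given_teller

# Since P(B | A) can never exceed 1, the conjunction can never exceed the single event.
assert p_teller_and_feminist <= p_bank_teller

print(f"P(bank teller)              = {p_bank_teller:.3f}")
print(f"P(bank teller and feminist) = {p_teller_and_feminist:.3f}")
```

Whatever numbers you plug in, the conjunction comes out no larger than either of its parts; our intuition says otherwise only because the conjunction makes a more coherent story.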

Correcting prediction errors is a task for System 2. It is tiring and complex. The effort is justified only when the stakes are high and when one is particularly determined not to make mistakes.

Are we so sure about the things we think we know?

Good stories are simple, coherent descriptions of people’s actions and intentions. The halo effect contributes to that coherence.

In building a story, we attribute causal meaning to random events (this is the narrative fallacy that Taleb discusses in The Black Swan). We also ignore non-events: we focus only on what has happened, not on what has not happened.

So we neglect both the role of chance and the role of luck or bad luck. Our whole reconstruction is made after the fact, and this creates hindsight bias, which shows up as the widespread conviction that “I knew it all along”.

All recipes for success rest on these dynamics. For example, the gap between the companies analyzed in Built to Last shrank to practically zero in the period following the study. Another example is our inability to predict what could go wrong in one of our own projects.

We estimate costs, times and effort based on what we believe is likely, not on the unexpected. This is called the “planning fallacy”, and it mainly affects optimists and the most risk-prone, such as CEOs.

To mitigate the planning fallacy, you need an outside view, one that measures and evaluates, for example, the results of companies or organizations that have attempted similar tasks in the past or the recent present. There is another technique, called the pre-mortem: before approving a project, imagine a future scenario (say, one year out) in which the result has been a disaster.

The group of people involved in the pending decision then has 5-10 minutes to write the story of that disaster. This frees the imagination of competent individuals and defuses the group’s a priori enthusiasm. Narrative and judgment give us a comfortable sense of security, which makes it even harder to question our beliefs.

This reinforces confirmation bias. Moreover, data that invalidate fundamental assumptions, and therefore threaten people’s livelihoods and self-esteem, are simply not taken in: the mind does not digest them. To overcome these evaluation problems you need to develop skill, becoming an expert in your field.

Developing a skill requires an environment regular enough to be predictable, plus prolonged practice. With that competence you can then refine your intuition.

Economic choices: prospect theory

In economics it is now well established that we are risk averse when it comes to gains (a 100% chance of winning 100 euros is preferred to a 50% chance of winning 200 euros), while we are risk seeking when it comes to losses (a 50% chance of losing 200 euros is preferred to a 100% chance of losing 100 euros).

This is the essence of prospect theory. Furthermore, a gain or loss should not be understood in absolute terms – as earlier economic theories held – but in relation to a reference point and to proportions, for example between the capital acquired and the capital at stake.

So earning or losing 1,000 euros means something very different to a poor person than to a millionaire: the two will adopt different strategies and thought patterns even though the amount is the same. For the same reason, a change in probability from 95% to 100% matters far more than a change from 50% to 55%. The difference in absolute terms is the same, but the perceptions and consequences are very different.
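A rough sketch of how prospect theory formalizes this, using the functional forms and parameter estimates usually attributed to Tversky and Kahneman (1992); the code and its exact parameter values are illustrative assumptions, not part of the article:

```python
# Commonly cited parameter estimates (assumed here purely for illustration).
ALPHA = BETA = 0.88   # diminishing sensitivity to gains and losses
LAMBDA = 2.25         # loss aversion: losses loom larger than gains
GAMMA = 0.61          # curvature of the probability weighting function

def value(x):
    """Subjective value of a gain or loss relative to the reference point."""
    return x ** ALPHA if x >= 0 else -LAMBDA * ((-x) ** BETA)

def weight(p):
    """Decision weight: overweights small probabilities, underweights large ones."""
    return p ** GAMMA / (p ** GAMMA + (1 - p) ** GAMMA) ** (1 / GAMMA)

def prospect(gamble):
    """Overall subjective value of a gamble given as (probability, outcome) pairs."""
    return sum(weight(p) * value(x) for p, x in gamble)

# Gains: a sure 100 euros beats a 50% shot at 200 (risk aversion).
print(prospect([(1.0, 100)]), ">", prospect([(0.5, 200)]))
# Losses: a 50% shot at -200 hurts less than a sure -100 (risk seeking).
print(prospect([(0.5, -200)]), ">", prospect([(1.0, -100)]))
# Certainty effect: moving from 95% to 100% shifts the decision weight
# far more than moving from 50% to 55%.
print(weight(1.0) - weight(0.95), "vs", weight(0.55) - weight(0.50))
```

With these assumed parameters the sure gain outranks the gamble, the loss gamble outranks the sure loss, and the jump to certainty carries far more weight than an equal jump in the middle of the probability range, matching the patterns described above.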

Perceptions of an object’s value also shift in an almost universal way. For example, the mere fact of owning an object increases its perceived value. Trading behavior, and the values attached to what is offered and demanded, differ greatly between laypeople and professional traders.

Among amateurs there tend to be larger price and emotional swings; among professional traders there are not, because they are accustomed to continuous trading and to using rationality in their decision processes.

The two selves: the experiencing self and the remembering self

In measurements of well-being, it has been observed that it is very difficult for people to be objective.

In particular, an “experiencing self” comes into play in the very moment the experience is being lived (it answers the question “does it hurt now?”), and a “remembering self” comes into play as the experience ends and from the moment it has ended (it answers the question “how was it, overall?”).

For example, it has been measured that intense pain in the final minutes of a medical procedure worsens the overall unpleasantness remembered afterwards, even when the pain during the rest of the procedure was of equal intensity.

In practice, the person is left with an unpleasant memory of the procedure as a whole, and for that person this becomes the truth. It is as if the experiencing self were overridden by the remembering self at the exact moment everything ends, or indeed a little earlier.

This of course shapes future decisions, which will aim at avoiding the remembered pain or discomfort. For the same reason, the ending of a tragic film strikes us strongly regardless of whether the scenes before the decisive one last 50 or 90 minutes.

The overall memory of the film depends much more on the final scene than on all previous events.
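This pattern is often summarized as the peak-end rule: the remembered evaluation tracks the most intense moment and the final moment, largely ignoring duration. A minimal sketch, with made-up pain scores on a 0-10 scale and a deliberately simplified peak-end average (both are illustrative assumptions, not data from the studies mentioned above):

```python
def remembered_pain(moments):
    """Remembering self (peak-end approximation): roughly the average of the
    worst moment and the final moment, regardless of how long the episode lasted."""
    return (max(moments) + moments[-1]) / 2

def experienced_pain(moments):
    """Experiencing self: total discomfort actually lived through, summed over time."""
    return sum(moments)

# A short procedure that ends at its most painful point...
short_episode = [2, 4, 7, 8]
# ...and a longer one with the same peak but a gentler ending.
long_episode = [2, 4, 7, 8, 5, 3, 2]

print(experienced_pain(short_episode), experienced_pain(long_episode))  # 21 vs 31
print(remembered_pain(short_episode), remembered_pain(long_episode))    # 8.0 vs 5.0
```

The longer episode contains more total pain, yet it is remembered as less unpleasant: the memory is dominated by the peak and the ending, just as the memory of the film is dominated by its final scene.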

 
