Myside bias: what it is and how it distorts our perception of things
Myside bias leads us to live in a bubble of self-confirmation.
Have you ever wondered why debates become more and more polarized? Why is it that when two people argue, it is almost impossible for them to reach an agreement? How is it possible that, even when presented with solid evidence to the contrary, people defend their opinions so aggressively?
However rational we consider ourselves to be, it seems that human beings have a natural tendency to seek, interpret, favor and remember information that supports our previous beliefs and values, regardless of whether there are facts that contradict them.
This natural tendency has a name: myside bias. Below we will take a closer look at this widespread and potentially harmful psychological phenomenon and at the research that has shed some light on how it occurs.
What is myside bias?
Not infrequently, when we are talking to someone about a given topic, we explain what we think and what the "facts" are. We lay out all the evidence we have found from all sorts of "reliable" sources. We know that person holds an opinion contrary to ours and we trust that, after being given this evidence, she will change her mind, but that just doesn't happen. No, she is not deaf, nor has she ignored us; it is simply that, because what we told her contradicts what she thinks, she has dismissed our "facts", assuming that we are the ones who are uninformed.
Myside bias is a psychological phenomenon that leads us to seek, interpret, favor and recall information that supports or confirms our pre-existing beliefs and values, while ignoring or downplaying evidence that contradicts what we believe. Essentially, this bias is an inherent flaw in the way our brain processes information, one that leads us to make biased decisions and adopt mistaken views and opinions.
Although all human beings fall victim to this bias, it is considered potentially dangerous because it makes us practically blind to any information that contradicts what we think: no matter how truthful it may be, we will deem it false or insufficiently rigorous. In fact, some theorists who study this pattern of thinking, such as Keith E. Stanovich, consider it to be largely responsible for the idea of post-truth: we only see what we want to see.
Implications of this cognitive bias
Over the last few decades, Stanovich, along with other cognitive researchers such as Richard F. West and Maggie E. Toplak, has studied this bias experimentally. One of its main implications is that human beings tend to look for information that strengthens our opinions, omitting or discarding any data that, however true and demonstrable, we consider less rigorous. People search for information that supports their hypotheses instead of seeking out all the evidence, both confirming and disconfirming.
In fact, this is quite easy to see in how people behave when researching practically any subject. For example, a person who is pro-life, that is, against abortion, will be more likely to look for information that proves her right and, what is more, may well end up even more opposed to abortion. She will rarely look for information explaining why abortion should be a universal right, or whether a fetus of only a few weeks can feel anything, and if she does, she will read such content in a skeptical and superficial way.
Interestingly, seeking out information from both sides of a debate, that is, looking for data both favorable and unfavorable to the opinion one has held from the beginning, seems to be related to personality traits rather than intelligence. In fact, some research suggests that more confident people tend to look for data that supports and challenges both sides of a debate, while more insecure people look for whatever strengthens their existing beliefs.
Another clear implication of this bias is that the same information is interpreted differently depending on our underlying beliefs. If two individuals are given exactly the same information about a subject, they will most likely end up with different, totally or partially opposed points of view: even if the message is identical, their interpretations of it will not be, and each person's reading will be personally biased.
The death penalty experiment
A good example of this can be found in an experiment carried out at Stanford University, in which researchers recruited participants who already held strongly divided opinions on the same issue: being for or against the death penalty. Each participant was given brief descriptions of two studies, one comparing U.S. states with and without capital punishment and the other comparing the murder rate in a single state before and after the introduction of the death penalty.
After these descriptions, participants were given more detailed information on both studies and asked to evaluate how reliable they thought the research methods in each were. Both the pro-death penalty and anti-death penalty groups reported shifting their attitudes somewhat at the start of the study, when they had only the brief descriptions, but once given more details most returned to their previous beliefs, despite having evidence supporting both studies. They were more critical of the sources that contradicted their own opinion.
German cars and American cars
Another study showed that intelligence does not protect us from myside bias. In this case, participants' intelligence was measured before they were given information about an issue on which they had to state their opinion. The issue in question involved cars that could pose safety problems. The participants, all of whom were Americans, were asked whether they would allow German cars with safety problems to drive on U.S. roads. They were also asked the reverse question: whether they thought that American cars with defects should be allowed to drive in Germany.
Participants who were informed about German cars with safety problems said that those cars should be banned in the USA. On the other hand, those who were informed about their American counterparts said that those cars should be allowed to drive in Germany. That is, they were more critical of the safety of German cars because they were German and would be driven at home, and more lenient about American cars because they were American and would be driven abroad. Intelligence did not reduce the likelihood of myside bias.
Memory and myside bias
Although people may try to interpret a piece of information as neutrally as possible, our memory, biased by our own beliefs, works in favor of remembering whatever supports our point of view; in other words, we have selective memory. Psychologists have theorized that information that fits our existing expectations is more easily stored and recalled than information that conflicts with them. Put simply, we memorize and remember better what proves us right and forget more easily what goes against us.
How does this relate to social networks?
In view of all this, it is easier to grasp how serious the implications of myside bias are for how we receive and interpret any piece of information. This bias makes us incapable of evaluating effectively and logically the arguments and evidence presented to us, no matter how solid they may be. We can believe more strongly in something doubtful simply because it is on "our side", and be highly critical of something that, despite being well demonstrated, we dismiss as neither rigorous nor reliable because it is "against us".
But among all these implications, one relates directly to social networks, and especially to their algorithms. These platforms, by means of cookies and our recorded search history, present us with content related to what we have already viewed. For example, if we search for images of kittens on Instagram, we will start to see more pictures of these animals in the Explore section.
What do these algorithms have to do with myside bias? A great deal, since on social networks we do not only search for pictures of animals or food, but also for opinions and "facts" that confirm our pre-established views. So, if we search for a vegetarianism blog, the search section will surface many related results, from politically neutral content such as vegetarian recipes to blog posts, images and other resources that denounce animal cruelty and demonize meat eaters.
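To make that feedback loop concrete, here is a deliberately minimal sketch, in Python, of a history-based recommender. It is not any real platform's algorithm; the post titles, tags and scoring rule are invented for illustration. The idea it shows is simply that ranking candidate content by its overlap with what a user has already viewed keeps reinforcing the same interests.

```python
from collections import Counter

def recommend(history_tags, candidates, top_n=2):
    """Rank candidate posts by how much they overlap with the user's viewing history."""
    interest = Counter(history_tags)  # how often each topic has already been viewed
    def score(post):
        return sum(interest[tag] for tag in post["tags"])
    return sorted(candidates, key=score, reverse=True)[:top_n]

# Hypothetical viewing history and candidate posts (illustrative data only).
history = ["vegetarianism", "recipes", "vegetarianism", "animal-rights"]
candidates = [
    {"title": "Vegan activism explained", "tags": ["animal-rights", "vegetarianism"]},
    {"title": "Easy lentil curry",         "tags": ["recipes", "vegetarianism"]},
    {"title": "The case for eating meat",  "tags": ["meat", "nutrition"]},
]

for post in recommend(history, candidates):
    print(post["title"])
# The dissenting article scores zero overlap and never makes the cut, a crude
# model of the self-reinforcing bubble described above.
```

In this toy model, every click on "our side" of a topic raises the score of similar content and lowers the relative ranking of anything contrary, which is the mechanical counterpart of the bias discussed in the text.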
Considering that we are unlikely to look for information contrary to our point of view, it is only a matter of time before our opinions become more radical. As the networks keep showing us resources that favor our point of view, we progressively go deeper into the subject; taking the example of vegetarianism, it is even likely that we will end up in vegan circles that advocate more forceful action against the meat industry.
Based on this, and especially when applied to political ideologies, many people argue that these algorithms are killing democracy. The reason is that the algorithm does not present us with all the available points of view on a subject, only with those that favor our opinion, making us less likely to compare options. Since we are not confronted with different "truths" and remain trapped in the comfort of our own point of view, social media is, in effect, manipulating us.
For this reason, as an attempt to escape both the trap of our own mind and the way social media locks us ever further into what we already believe, it never hurts to seek out opinions contrary to our own. Yes, it is true that myside bias will lead us to read them more critically and superficially, but the attempt can at least grant us a degree of ideological freedom and freedom of opinion. Or, at the very least, we can clear our search history and not give the social network of the day the chance to trap us in our own beliefs.
Bibliographical references:
- Macpherson, R., & Stanovich, K. E. (2007). Cognitive ability, thinking dispositions, and instructional set as predictors of critical thinking. Learning and Individual Differences, 17, 115–127.
- Stanovich, K. E., & West, R. F. (2007). Natural myside bias is independent of cognitive ability. Thinking & Reasoning, 13(3), 225–247.
- Stanovich, K. E., & West, R. F. (2008). On the failure of cognitive ability to predict myside and one-sided thinking biases. Thinking & Reasoning, 14(2), 129–167.
- Sternberg, R. J. (2001). Why schools should teach for wisdom: The balance theory of wisdom in educational settings. Educational Psychologist, 36, 227–245.
- Stanovich, K. E., West, R. F., & Toplak, M. E. (2013). Myside bias, rational thinking, and intelligence. Current Directions in Psychological Science, 22(4), 259–264. doi:10.1177/0963721413480174
- Toplak, M. E., & Stanovich, K. E. (2003). Associations between myside bias on an informal reasoning task and amount of post-secondary education. Applied Cognitive Psychology, 17, 851–860.
- Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37(11), 2098–2109. doi:10.1037/0022-3514.37.11.2098