Y. Alp Aslandogan
Evaluations and decisions play an important role in our social life. Just as we want our own evaluations to be accurate and our decisions to be fair, we want others' evaluations of us to be accurate and their decisions about us to be fair. But what if hidden psycho-social processes quietly at work in the human mind bias these evaluations? What if they affect well-intentioned people and lead to serious consequences? What are these processes, and what can we do about them? Let us start by illustrating what we mean.
In a study conducted by social psychologists Tversky and Kahneman, people were asked to guess the percentage of African nations that were members of the United Nations. Two groups were first asked whether this number was lower or higher than a threshold: the first group was asked whether it was more or less than 45 percent, and the second group whether it was more or less than 65 percent. Both groups were then asked to estimate the actual percentage. The group that was given the lower threshold produced lower estimates of the percentage of African nations belonging to the United Nations; the group given the higher threshold produced higher estimates. The pattern has held in other experiments across a wide variety of estimation tasks. A bias in estimating the number of African members of the United Nations may not sound like a big deal, but what about decisions that seriously affect people's lives?
Consider sentencing in court trials. Social psychologist Mussweiler and colleagues asked trial judges with more than 15 years of experience to consider sentencing demands made by non-experts in a criminal case before issuing a final sentence. The two sentencing demands were 34 months and 12 months. The judges, despite their experience and despite the fact that the crime was the same, were influenced by the demands. Judges who considered the high demand of 34 months gave final sentences that were almost 8 months longer than those of judges who considered the low demand of 12 months. If prior exposure to a piece of information can make a difference of as many as 8 additional months in prison, then we ought to know what is going on.

The examples above illustrate a psychological heuristic known as "anchoring and adjustment." When faced with a decision or an estimation task, people start with an anchor, or reference point, which serves as a first approximation, and then make adjustments to it to reach their final estimate or decision. Why does the human mind use this heuristic? The answer is simple: we do not always have enough information to reach an accurate estimate or the best decision.
Therefore, the mind sometimes needs shortcuts, especially under pressing circumstances. To understand this process, let us look at the limitations of human cognition.

Models of human cognition

Cognitive psychologists and sociologists have worked to develop models of human social cognition, the way we perceive others. These models emphasize four aspects of social cognition. The first is the role of prior knowledge versus information immediately available. For example, when we see a policeman directing traffic, we use our prior knowledge in our perception: we may assume that he is carrying a gun and has communications equipment to talk with his station. We deduce these features from our prior knowledge about traffic police, even if we are not in a position to observe that particular policeman's gun or communications device. Relying on prior information in our judgments is called "top-down" processing, as opposed to "bottom-up" or data-driven processing. Top-down processing, such as relying on stereotypes, typically requires fewer processing resources.
The second aspect is the limitation of our cognitive processing capacity. The human cognitive system is modeled as consisting of our sensory organs, a sensory register (memory) that temporarily stores our perceptions of external stimuli, a short-term memory, a long-term memory, attention resources, and executive control processes. When we receive information in the form of audio-visual or other sensory stimuli, it is processed by our cognitive system and transferred to short-term memory. Through a process of encoding and categorization, the information is organized and stored in long-term memory. Part of long-term memory is "active," or readily accessible: using information in long-term memory activates it, while lack of use deactivates it, making it less readily accessible. Our behavioral response results from this processing of information. According to this information-processing model of human cognition, the amount of information our cognitive system can handle is restricted in terms of storage, flow, and inference (Huitt, 2003).
The third aspect is that the amount of cognitive processing is determined by capacity (the amount of free resources) and motivation. Factors such as interest, importance, and relevance determine the motivation to allocate more cognitive resources: we are more likely to devote processing resources to subjects we judge interesting, important, or relevant.

The fourth aspect is the interplay between automatic and controlled processing. Automatic processes require fewer resources. Given our limited processing capacity, time and other constraints have consequences for our cognitive processing. Under constraints, most individuals tend to simplify their processing by relying on less information, on automatic cognitive processes rather than conscious ones, or on prior information rather than information available in the circumstances. Anchoring is one such simplification. The nature of the situation determines which of these mechanisms is selected, and they are reused or abandoned depending on whether they provide a sound basis for our responses to the social environment. If a simplification leads to interpretations that harm us, we are likely to abandon it; if there is no harm, or there is a benefit, we are likely to reuse it.
Anchoring effect

Anchoring is defined as the effect of a prior judgment of one object, the anchor, on our later judgments of another object. These judgments may concern a numerical value, a probability, or even a moral or legal question. As an example, consider asking people whether the population of a city is greater or smaller than a given value. Say two groups are asked the same question with two different anchors: Group A is asked whether the population of Houston is more or less than 500,000, while Group B is asked whether it is more or less than 2,000,000. Here the values of 500,000 and 2,000,000 serve as anchors. Both groups are then asked: What is your estimate of the population of Houston? Experiments indicate that, on average, the people given the lower anchor give a lower estimate of the population, and vice versa.

How does anchoring happen? Cognitive psychologists tell us that human judgment is essentially relative, or comparison-based, even when we are not asked to make a comparison explicitly. So, in evaluating the present object or person, our minds search for an anchor. A particular anchor may be selected because it is readily accessible, because it is suggested to us, or because it is "self-generated via an insufficient adjustment process" (Mussweiler et al.). Our prior cognitive processing of the anchor increases the accessibility of anchor-consistent knowledge, which influences our subsequent judgments. For example, when we meet a person from a country, the first person we met from that country may become our anchor: if we had a positive experience with that first person, we are likely to interpret the actions of the new person in a positive light. While cognitive heuristics such as anchoring help us make quick decisions under constraints, they may also lead to errors.
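The insufficient-adjustment mechanism can be sketched in a few lines of code. The following toy Python model is purely illustrative: the function name, the adjustment rate of 0.6, and the 1,400,000 "unanchored best guess" for Houston are hypothetical assumptions, not data from the experiments cited here. It shows how starting from different anchors and adjusting only partway toward one's best guess yields systematically different final estimates.

```python
def estimate_with_anchor(anchor, best_guess, adjustment_rate=0.6):
    """Toy anchoring-and-adjustment model: start at the anchor and
    adjust toward one's best accessible guess, but insufficiently
    (adjustment_rate < 1 means the adjustment stops short)."""
    return anchor + adjustment_rate * (best_guess - anchor)

# Hypothetical respondent whose unanchored best guess is 1,400,000.
best_guess = 1_400_000

low = estimate_with_anchor(500_000, best_guess)    # pulled below the guess
high = estimate_with_anchor(2_000_000, best_guess)  # pulled above the guess

print(low, high)  # both estimates remain biased toward their anchors
```

With full adjustment (rate 1.0) both groups would converge on the same estimate; any rate below 1.0 leaves each estimate biased toward its anchor, which is the pattern the Houston example describes.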
The price we pay for the economy provided by the heuristics is "systematically biased judgments under certain conditions" (Bless et al., p. 24). For instance, a car dealer may offer you a very high price as an anchor and ask you to make a counter-offer. Experiments have demonstrated that under such circumstances the initial offer has a significant effect on the final price people are willing to pay: if the initial offer is very high, the customer is likely to accept a higher negotiated price, and vice versa. The anchoring effect can also be observed when we take a few characteristics of a person and consciously or subconsciously fit them to a stereotype.
In such cases, we may misperceive their motivations or misunderstand their circumstances. Sometimes the anchor is manipulated by someone other than ourselves, as in the case of the car dealer. Sometimes we pick the anchor unintentionally, based on information obtained from the mass media. Because the mass media tend to focus on rare events, which tend to be negative, stereotypes formed solely on the basis of media information can be misleading (Said, 1997).

What are some of the lessons we can derive from this discussion of the anchoring effect? For one, we need to point the mirror at ourselves and ask: Are we forming stereotypes of others that may be inaccurate? Are we influenced by the anchoring effect in our judgments of other people? To see whether we may have anchors for evaluating people of different backgrounds, consider your initial thoughts about Christians, Jews, Muslims, Hindus, Buddhists, Americans, Russians, Asians, Africans, Mexicans, and so on. Are these anchors based on scientific data, on news media coverage of events, or on personal encounters with one or more individuals? It may be infeasible to collect encyclopedic information about every nationality, religion, or culture that we encounter, but in matters that affect our society we ought to do a better job of consulting a diversity of sources.

Perhaps a more important lesson is this: the anchoring effect is here to stay as part of the reality of human cognition. If we would like to provide accurate, reliable information about ourselves, our culture, and our values, we should reach people before they form a negative anchor or stereotype. If we would like our cultural background, religion, or values to be understood without distortion, we need to reach out to as many people as possible around us and interact with them. We need to hold conversations and share meaningful experiences with them, to anchor their future judgments in an accurate reference point.
The anchoring effect has been demonstrated to be pervasive and robust in psychological research, and it is not likely to disappear in the foreseeable future. However, we do have the opportunity to reach out and help form positive anchors for better human relationships.
- Bless, H., Fiedler, K., Strack, F. 2004. Social Cognition, New York: Taylor and Francis.
- Huitt, W. 2003. The Information Processing Approach to Cognition. Educational Psychology Interactive. Valdosta, GA: Valdosta State University.
- Mussweiler, T., & Strack, F. 2001. The Semantics of Anchoring. Organizational Behavior and Human Decision Processes, 86, 234–255.
- Said, E. 1997. Covering Islam: How the Media and the Experts Determine How We See the Rest of the World, New York: Random House.