Nutshell: Cognitive biases that undermine candidate selection

By Future Talent Learning

Status quo bias is just one of the cognitive biases that undermine us in the workplace – and particularly during the interview and selection process.

 

There are more cognitive biases than even the best psychologist could readily name. We all have them. They influence our understanding, actions and decisions, but, because we are often not aware of them, they can manifest themselves in subtle, insidious ways.

 

We might pay too much attention to our first instincts without properly weighing up the pros and cons; we might naturally gravitate towards hiring someone because they have a similar background, interests or experience; we might make unwarranted assumptions about certain individuals, groups or even ourselves.

 

In the workplace, where we have to make daily judgements about tasks and relationships, it’s easy to see how these biases might get in the way, no matter how well intentioned we want to be. We might think we’re entirely rational, objective and logical, but we can’t entirely escape our unconscious or implicit attitudes and assumptions.

 

At best, we can acknowledge them and try to mitigate their effects; that’s why building self-awareness helps, even though it is never a complete solution. Left undetected or untested, these biases can trip us up, contribute to poor decision-making and bad judgements, and even lead to stereotyping and discrimination.

 

Why do we tend towards bias? 

Psychologist Daniel Kahneman’s concept of 'fast and slow thinking' offers insight into why cognitive biases are so common and so difficult to overcome. His fast System 1 thinking is our default thinking mode, automatic and intuitive. In contrast, slow System 2 thinking is more deliberate and effortful, more complex and controlled.

 

Unsurprisingly, given the thousands of decisions we need to make every day, System 1 thinking is in charge most of the time, protecting us from the cognitive strain of having to slow down and think something through using System 2. 

 

But these cognitive shortcuts also come at a cost. System 1 thinking is more impulsive, more emotional, more optimistic and leads us to follow our first impressions, often despite evidence to the contrary.

 

Kahneman gives the example of a person remodelling their kitchen. Despite overwhelming evidence that most people under-budget the likely cost, we still believe that, for us, the outcome will be different. System 1 thinking does not apply previous knowledge with any degree of reliability. 

 

In essence, then, our default System 1 thinking makes us less rational and far less accurate than we’d like to give ourselves credit for. And because of this, it also brings with it a great deal of built-in prejudice.

 

We all have stereotypes about different social groups, and when confronting these stereotypes, we need System 2 in control to help us modify our initial, knee-jerk reactions. It’s certainly possible for us to reduce the impact of bias on our thinking; it takes what Kahneman calls a process of self-critique or quality control. But, because this takes mental effort, it’s hard. Given the choice, it’s easy to see why we so often either never bother or soon give up and default to the status quo.

 

Status quo bias: better the devil you know?

Sticking with what we know or like may be entirely rational, the product of a careful weighing up of options before a decision is made. But, more often than not, it’s entirely irrational, leading us to ignore evidence or reject choices that might actually benefit us, whether that’s a better-tasting version of Coke or a more diverse workforce.

That’s status quo bias and that’s why it’s potentially so damaging. Unsurprisingly, it interacts with a range of psychological principles:

 

  • It’s related to loss aversion, the principle that a potential loss weighs more heavily on us than an equivalent potential gain.



  • It’s been associated with regret avoidance, the reason why we so often choose a lunchtime sandwich we’ve eaten before, lest we’re disappointed when a different sandwich doesn’t quite come up to scratch. 



  • Through the sunk cost fallacy, the more we have already invested in the status quo, the more likely we are to keep investing in it.



  • Because of the mere exposure effect, we tend to prefer something we’ve been exposed to before. 



  • We may want to avoid cognitive dissonance, the discomfort of holding inconsistent thoughts that can come with weighing up competing options.

 

When we consider these factors, it begins to make sense that incumbent candidates are more likely to win political elections than their less well-known challengers (better the devil you know), or that there’s a tendency, if we’re not careful, to recruit colleagues who look and feel just like us.

 

How and why does status quo bias impact on our workplaces? 

In his book, Alchemy: The Surprising Power of Ideas That Don’t Make Sense, adman Rory Sutherland makes a case for seeing status quo bias as the prism through which we might identify and challenge a lack of workplace diversity.

 

He quotes a study reported in Harvard Business Review which suggests that, when it comes to diversity, it’s our bias against the unfamiliar – the square peg who might upset the status quo – that’s at the root of workplaces stuffed with people who look and think the same way.

 

The researchers looked at how the number of women or non-white candidates on a shortlist affected a university’s hiring decisions. When there was only one woman on a shortlist of four, her chance of being hired was zero; when there were two, her chances rose to 50%.

 

They concluded that a lone woman on the shortlist tended to highlight how different she was from the norm – triggering the status quo bias that made recruiting her seem a riskier proposition. But adding another woman to the shortlist created a new status quo, making recruiting a woman less of a threat to what was considered normal. It suggests that the prejudice we apply to a lone woman might be applied to a lone anyone.

 

For Sutherland, if this is true, then we may not be casting the net wide enough when it comes to workplace diversity. The challenge may not be as straightforward as focusing on disadvantaged groups such as women or people from minority backgrounds, important as these are.

 

We may instead need to dig deeper to find out what the real status quo is. In many creative industries, for example, there is undoubtedly a bias towards hiring physically attractive people. In other industries, older people or introverts are at an unfair disadvantage.

 

Class is another obvious factor: we shouldn’t be overly self-congratulatory for hiring a particular quota of women, for example, if the only diversity that runs more than skin deep is that they all went to different Oxbridge colleges.

 

We can all rehearse the ethical and commercial arguments in favour of diversity of thought, but perhaps status quo bias challenges us to ask whether what we mean by diversity is always as clear-cut as we think it is – and whether it is also a lot less simple to fix. It may, for example, be more entrenched in larger organisations than in smaller ones, where one or two new hires might more easily change the dynamics at play.

 

Just as those techies stuck with IBM as the safe option, probably for longer than was wise, so, as Sutherland says, “the less eccentric your hire, the less blame you are exposed to if something goes wrong”. The alternative might just be too difficult to contemplate.

 

Maybe we could all do with a healthy dose of Kahneman’s more considered System 2 thinking when it comes to overcoming bias at work. 

 

While we’re at it, here are five other biases to look out for during hiring:

 

1. Halo effect

 

“Surely not? She's so articulate…”

 

When we admire something about someone – their smarts, their sales nous, their way with words – it’s hard not to let that spill over into other personality areas. That’s the halo effect.

 

The halo effect is often associated with the physical attractiveness stereotype, when we find it hard to believe that such a lovely-looking person could ever do anything wrong or bad. But it’s just as toxic when we allow the way we feel about a person’s ability in one area to blind us to the fact that they might not be the all-rounder we think they are.

 

We shouldn’t overlook red flags because our interviewee looks the part – or used to work for a firm that we revere.

 

Note, too, the reverse: just because we don’t admire one thing about a candidate doesn’t mean they’re all bad.

 

2. Anchoring

 

“The first impression is the last impression”

 

In anchoring, our decisions are influenced by a particular reference point or ‘anchor’. This tends to be previously accepted information or the first piece of information we learn about a topic.

 

For example, we may believe that our new hire must strongly resemble the outgoing employee in all characteristics or use the qualifications of the first candidate as a benchmark against which to measure everybody else (no matter what other skills or experience others have to offer). We may feel that we can employ a marketing manager on a particular salary because that’s the figure we have in mind (after a cursory discussion with a friend).

 

However, we can address anchoring bias by taking pains not to rely on a single source of information, paying attention to conflicting facts and asking for second opinions.

 

3. Confirmation bias

 

“If the facts don’t fit the theory, throw out the facts” (attributed to Albert Einstein)

 

We know what we like, don’t we? Read the newspaper we agree with? Follow like-minded people on Twitter? But we have to guard against that becoming a tendency to look for, interpret, listen to and remember only that information and those beliefs that confirm our own preconceptions.

 

In interviews, we may make snap decisions – based on perceived truths and initial ‘instincts’ about a candidate’s value – and spend the rest of the interaction looking for information that cements our views. Research shows that 60% of interviewers make a decision about a candidate’s suitability within 15 minutes of meeting them. 

 

Instead, we should look at people holistically and try to consider facts as objectively as possible, carefully considering opposing interpretations of a candidate from colleagues.

 

4. Authority bias

 

“But my boss thinks we should hire them…”

 

Might we find ourselves persuaded to hire or reject a particular candidate on the basis of our boss’s opinion – despite the fact that they know less about the role and the candidate than we do?

 

People in positions of authority are probably more experienced than most of us, but that doesn’t mean they know it all. Beware the tendency to believe, and to be more influenced by, the opinions of an authority figure. They might not be as well informed – or as objective – as we think. 

 

5. Bandwagon effect

 

“Well, if HR is on board, that’s good enough for me…”

 

Closely related to groupthink or herd behaviour, the bandwagon effect leads us to do or believe things because a large enough group of other people act or believe the same. But the weight of numbers could still be leading us astray. Just because others are set on a course of action, it doesn’t mean that it’s right. 

 

When making a hiring decision, we should be able to make our reasoning explicit and demonstrate (to ourselves and others) that we are not simply following the crowd.

 

Embracing difference

A final thought. One of the best ways to tackle our biases is to seek out and spend time with people whose opinions, experiences and values might differ from ours. Being open to, and reflecting on, different worldviews is a powerful way to avoid the traps of common cognitive biases.

 

But of course, if you’re prone to falling victim to confirmation bias, that’s exactly what you’d have expected us to say.

 

 

Test your understanding

  • Outline Daniel Kahneman’s concept of 'fast and slow thinking' – and how this can lead to bias.

  • Explain what we mean by 'status quo bias' and give one other example of a cognitive bias we may fall prey to in candidate interviews.

     

What does it mean for you?

  • Consider your recent interactions with existing or potential colleagues and reflect on the biases you might be prone to and how you might overcome these.