Users’ minds take shortcuts to get through the day. Usually they’re harmless; sometimes they’re even helpful. But what happens when they’re not? In this #mtpcon Digital Americas session, David Dylan Thomas, the author of Design for Cognitive Bias, uses real-world examples to identify some particularly harmful biases that frequently lead users to make bad decisions.
Watch the session in full or read on for the highlights from David’s talk:
In brief
- As humans, we have biases that affect our decision-making and behavior
- Cognitive biases are shortcuts our minds take to help us get through the day, but they can harm our decision-making
- Types of bias include: confirmation bias, notational bias, and self-serving bias
- We can use design to fight bias – one approach is creating a Black Mirror episode to consider the ways a user might misinterpret or misuse technology
Decision making – it’s all about pattern recognition
David begins by explaining that many of the decision-making and behavioral biases that people have, including implicit racial and gender biases, result from pattern recognition. He uses the example of a product manager having to hire a web developer. Given two identical resumes, the candidate who fits the more familiar social pattern is more likely to be hired, simply because they match it. As David explains, this is the result of cognitive bias.
What is cognitive bias?
Cognitive biases are shortcuts our minds take that often help us get through the day but can sometimes hurt our decision-making. Our minds run on autopilot so that we don’t have to think everything through, but that autopilot sometimes produces errors; those errors are what we call cognitive biases.
Some cognitive biases present an illusion of control, such as when we roll a die. If you need a high number, you are likely to throw harder to try to attain it, even though how hard you roll a die has no impact on how it lands.
Confirmation bias
David also provides examples of confirmation bias. These biases are extremely difficult to combat: you may not know you have them (this is known as the bias blind spot), since 95% of cognition happens below the conscious level, and even if you do know, you tend to act on them anyway.
However, as product managers, we can make content and design choices that help keep harmful cognitive biases in check (or leverage them for good). For example, when hiring a developer, the city of Philadelphia printed out resumes and redacted the names with a marker, since a name is unnecessary information that makes it harder to stay impartial. To assess GitHub portfolios without bias, they created a Chrome plugin that hides personal information.
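As a rough illustration of that approach (not the City of Philadelphia’s actual plugin), a Chrome content script along these lines could blank out identifying details on a GitHub profile page. The selectors below are assumptions about GitHub’s markup and would need checking against the live site.

```typescript
// content-script.ts – a minimal sketch, assuming GitHub's profile markup
// exposes the display name, username, and avatar via these classes.
const IDENTIFYING_SELECTORS = [
  ".p-name",     // profile owner's display name (assumed selector)
  ".p-nickname", // username (assumed selector)
  ".avatar",     // profile photos (assumed selector)
];

function redactIdentifyingInfo(): void {
  for (const selector of IDENTIFYING_SELECTORS) {
    document.querySelectorAll<HTMLElement>(selector).forEach((el) => {
      // Hide rather than remove, so the page layout stays intact.
      el.style.visibility = "hidden";
    });
  }
}

// Run once on load, then again whenever GitHub swaps in content dynamically.
redactIdentifyingInfo();
new MutationObserver(redactIdentifyingInfo).observe(document.body, {
  childList: true,
  subtree: true,
});
```

The point isn’t the specific selectors but the design choice: strip out the information that triggers the bias before the reviewer ever sees it, rather than asking the reviewer to ignore it.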
Cognitive fluency
“Another term you want to think about is cognitive fluency, and it’s this idea that if something looks like it’s going to be easy to read, it’s probably easy to do,” David explains.
If something is easier to read or rhymes, it automatically feels more certain. This works even when addressing medical problems: if information is presented in a way that is hard to read, it is perceived as complicated. David highlights another example from the “Click It or Ticket” campaign to encourage more people to wear seatbelts. When legislation had previously been passed stating that those who failed to buckle up would be ticketed and fined, there wasn’t much change in behavior. The rhyming slogan, however, led to an increase in people following the law.
How to fight the framing effect
Another dangerous bias is known as the framing effect. The way you frame a question can impact people’s choices. Interestingly enough, bilingual people find it easier to see through the framing. As product managers, you can use the framing effect for good. By changing the way certain questions are asked, it’s possible to uncover what the conversation is really about and what outcome users are looking for.
What about our own cognitive biases?
Some other biases to take note of include:
- Notational bias: This, David explains, is a bias where the way we see the world shapes how we write about it. He cites examples of how music is notated differently in other parts of the world, and notes that notational bias also appears in the way content gets structured and the way editorial guidelines are set.
- The self-serving bias: This is, says David, “If something goes well, that’s my fault; if something goes poorly, that’s your fault.” This bias often occurs in interactions between humans and computers.
Using design to avoid bias
David explains that he initially misunderstood the scientific method: rather than existing to confirm what we already believe, its purpose is to help us avoid confirmation bias. Some ways to fight bias include speculative design, for example creating a Black Mirror episode to consider the ways a user might misinterpret or misuse your technology.
Also, try creating an assumption audit before starting any project by asking five questions:
- What identities does your team represent? Only self-identify as you are comfortable, but consider your intersectionality (gender, race, age, disability, income, location, neurodiversity, education, and more).
- How might those identities influence the design of this experience?
- What identities are not represented in your group?
- How might their absence influence the design of this experience?
- What might you do to include, honor, and give power to those perspectives in your design process?
David concludes his talk with the primary takeaway for product managers: a way of thinking. He asks: “How can we define our jobs in a way that allows us to be more human to each other?”
Explore more #mtpcon video content