
The 5 cognitive psychology biases that are affecting user testing and how to avoid them

Jul 23, 2018

Cognitive psychology is the branch of psychology that studies mental processes. Within it, systematic patterns of thinking known as cognitive biases can distort the way an individual perceives and acts, often leading people to present a more “socially acceptable” version of themselves.

When running tests with users, these cognitive biases can affect the way participants act and respond to tasks, skewing both the data produced and any research results. This isn’t good!

To combat this, I’ve put together a list of the five most common cognitive biases that affect user testing and some simple suggestions of how to overcome them:

 

#1 - The Framing Effect:

The framing effect is an easy trap to fall into when conducting user testing, as it concerns the way we phrase questions. People do not like making decisions independently; instead, they take cues from their environment to judge how they should act and respond.

 

[Image: An app tester makes notes about the product]

Researchers should avoid introducing bias via the wording of their questions.

 

To demonstrate, we might ask a tester: “What are the positives of this product?”. Phrased this way, the question leads the individual to discuss only the favourable aspects of the product. It can also cause a knock-on effect, with the respondent more likely to frame everything positively for the remainder of your interaction.

A more appropriate question would be: “How do you feel about this product?”. This open phrasing allows the individual to be much more candid with their feedback and increases the likelihood of unbiased results.

#2 - Confirmation Bias:

This bias is a fault of the researcher rather than the participant. Humans naturally tend to favour information that supports their existing viewpoint.

 

[Image: A hedgehog lab workshop mid-flow]

hedgehog lab UX Researcher, Laura McKay, hard at work!

 

Confirmation bias can be introduced when we’re designing tests. Researchers may include leading questions (cough *framing effect* cough) in their procedure, or could even neglect to scrutinise information that goes against their hypothesis.

To avoid this, researchers must remember that their overarching aim is to learn from users. Whether results validate the researcher's initial point of view is irrelevant. Unbiased data does not lie, and researchers often find they learn more from what they hadn't thought about than what they had.

#3 - Social Desirability Bias:

Most of us want to fit in, make friends and form bonds with other people. In user testing, however, this can cause issues: research participants may provide the answers they believe are most acceptable to other testers, which prevents researchers from hearing the genuine views of everyone taking part.

 

[Image: Attendees of a hedgehog lab workshop in discussion]

Being honest is being the hedgehog.

 

Social desirability bias is often prevalent in focus groups. If a question is asked that could lead to controversial opinions being aired, some participants may not wish to disclose their true feelings and instead go along with the general feeling of the group. Ultimately, this prevents the researcher from obtaining a true data set.

To overcome this, some choose only to conduct 1-to-1 interviews. However, focus groups can still be used provided that researchers reassure participants before commencing a session. They should also clearly state that there are no right or wrong answers, encouraging as many diverse opinions as possible.

 

#4 - Fundamental Attribution Error

This error relates to the tendency of individuals to blame personal characteristics when something goes wrong. We have a habit of attributing a problem to someone's behaviour or character instead of analysing the whole situation and identifying its true cause.

 

[Image: A user tests an app while making notes]

It's important to pay attention to how users are interacting with the product.

 

In testing, we see this when a user tells a researcher that they made a mistake while working through the research procedure. This may then be recorded as a user error rather than an issue with the product, because it is a common misconception that such problems stem from human error rather than from design.

To compensate for this type of cognitive bias, researchers should stay vigilant to what the user is doing and provide the participant with continued reassurance. Users blame their own "misuse" all too readily, while reports of ill-fitting design remain rare. Accordingly, it is essential that users understand they are not doing anything wrong by pointing out issues they encounter or flaws in the design.

 

#5 - Clustering Illusion Bias

A final frequent cognitive bias is another that the researcher is guilty of. The clustering illusion plays on the human desire to organise and group things: it occurs when researchers spot a pattern in part of the data and ignore everything else that the data set shows.

 

[Image: hedgehog lab's logo drawn on a sticky note]

Knowledge of psychology helps us avoid introducing bias when conducting user research.

 

We see this in user research when a researcher assumes that, because five people liked a feature, everyone will like it. If the sample contained only those five people, that would be trivially true; from a sample of 30, however, the same conclusion cannot be drawn.
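To see why the sample size matters, here is a minimal sketch (the function name and figures are our own illustration, not from the original research): a Wilson score interval shows the range of "true" approval rates that a result is actually consistent with. Five likes out of five testers is consistent with anything from roughly 57% to 100% approval, which is far too wide to conclude that "everyone" likes the feature.

```python
import math

def wilson_interval(successes, n, z=1.96):
    """Approximate 95% Wilson score confidence interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# 5 of 5 testers liked the feature: the plausible range is huge.
print(wilson_interval(5, 5))   # roughly (0.57, 1.0)

# 5 of 30 testers liked it: a very different, much narrower picture.
print(wilson_interval(5, 30))  # roughly (0.07, 0.34)
```

The Wilson interval is used here rather than the simpler normal approximation because it behaves sensibly at small sample sizes and extreme proportions, which is exactly the regime where the clustering illusion bites.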

To avoid this bias, researchers should ensure they weigh all the evidence equally and ask why a pattern appears: are the results an artefact of the sample size, the demographics, the questions, or some other variable? Researchers can also tackle this bias by analysing data collaboratively; working as a pair makes it easier to challenge convenient groupings.

 


 

User research might provide the platform for success, but the hard work doesn't stop at launch. Learn more about the product life cycle and monetisation.


Learn the importance of a monetisation plan
