Insights


What if you could see inside your users' heads?

We have started to measure how well we understand what the user is thinking, and the result is quite surprising: we understand them only up to 50%. This is significant. We tend to believe that when we talk to users and spend time in their environment, we get a good understanding of them. Certainly this is true, but the level of understanding may not be as good as we think.

We have studied three different product development contexts – designing smart solutions for driving safety, designing accessible technology, and designing accessories for musicians. Accuracy is measured by how similar the user's own statements are to the product developer's interpretations of them. Below are examples of interpreted user statements. We leave it to the reader to assess how accurate each interpretation is.

 


User: [Recalling] the uncomfortable feeling when I can't see where I'm reversing with the car.
Product developer: Feeling happy to think about parking solutions.

User (person with visual impairment): During this explanation I'm feeling annoyed about all the inaccessible touch screen / electronic devices I encounter in public.
Product developer: Feels strongly that kiosks and public things should be accessible and is frustrated because she sees how they could easily be accessible but aren't.

User: Thinking that the neck strap prevents me from bowing towards the audience in an elegant way. It could fall while bending forward.
Product developer: Recalling the relief of taking the shoulder strap off and being free of the sax's weight.

 

Across all our data, how accurately the product developers knew what the users were thinking and feeling during the interviews – what we call empathic accuracy – was 30–50%.
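To make the metric concrete: one common way such a percentage can be obtained, in line with standard empathic-accuracy research, is to have independent raters score how similar each developer interpretation is to the user's own statement and express the total as a fraction of the maximum possible score. The sketch below illustrates that idea; the 0–2 rating scale, the helper function and the scores themselves are illustrative assumptions, not our actual coding scheme or data.

```python
# Minimal sketch: empathic accuracy as normalised similarity ratings.
# Assumption: independent raters score each developer interpretation against
# the user's own statement on a 0-2 scale (0 = different, 1 = somewhat
# similar, 2 = essentially the same). All ratings below are hypothetical.

rater_scores = [
    # one list of rater scores per user statement / interpretation pair
    [0, 1, 0],  # e.g. the "reversing the car" pair
    [2, 2, 1],  # e.g. the "inaccessible touch screens" pair
    [0, 0, 1],  # e.g. the "saxophone neck strap" pair
]

MAX_SCORE = 2  # highest possible similarity rating per rater


def empathic_accuracy(scores, max_score=MAX_SCORE):
    """Total similarity rating expressed as a percentage of the maximum."""
    total = sum(sum(pair) for pair in scores)
    possible = max_score * sum(len(pair) for pair in scores)
    return 100.0 * total / possible


print(f"Empathic accuracy: {empathic_accuracy(rater_scores):.0f}%")
```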

 

 

So far, we have used methods from psychology to measure product developers' cognitive empathy, the part of empathy we are more aware of and can control. But how can we measure the more subconscious side, emotional empathy? Here we turn to cognitive neuroscience, where one way of studying emotions has been through physiology. For example, similar physiological reactions indicate that people share emotions while watching a movie together, even without talking to each other (Golland et al., 2015).

Others (Zaki et al., 2009) have taken this a step further and looked into the brain. In one study, people were asked to tell emotional stories in front of a camera and, while sharing their stories, used a device to register how they were feeling at every moment of the video. Observers of those videos then went into an MRI scanner and, while watching the clips, had to guess how the person in the video was feeling at every moment. A clear link between brain activation and understanding the other person's feelings was observed.
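In a design like this, the observer's moment-by-moment guesses can be compared with the storyteller's own ratings, for example by correlating the two time series. The sketch below shows that kind of comparison; the rating scale, the shared time grid and the numbers are illustrative assumptions, not data from the study.

```python
# Minimal sketch: comparing an observer's moment-by-moment guesses with the
# storyteller's own affect ratings, assuming both are sampled on the same
# time grid (e.g. once per second on a 1-9 scale). Data are hypothetical.
from statistics import correlation  # Python 3.10+

actual_ratings = [5, 6, 7, 7, 4, 3, 3, 5, 6, 8]   # storyteller's own feelings
guessed_ratings = [5, 5, 6, 7, 5, 4, 3, 4, 6, 7]  # observer's guesses

# Pearson correlation: 1.0 would mean the observer tracked the storyteller's
# feelings perfectly, 0.0 would mean no relationship at all.
r = correlation(actual_ratings, guessed_ratings)
print(f"Inferred vs. actual ratings: r = {r:.2f}")
```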

We are now exploring whether we can do the same in the context of product development. Maybe we cannot see into the user's brain yet, but what can we learn if we place a designer in an MRI scanner and look inside their brain while they are trying to understand a user? Stay tuned.

 

Golland, Y., Arzouan, Y. and Levit-Binnun, N. (2015) ‘The mere co-presence: Synchronization of autonomic signals and emotional responses across co-present individuals not engaged in direct interaction’, PLoS One, 10(5), e0125804. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC444630

Zaki, J., Weber, J., Bolger, N. and Ochsner, K. N. (2009) ‘The neural basis of empathic accuracy’, Proceedings of the National Academy of Sciences of the United States of America, 106(27), 11382-11387. https://www.pnas.org/content/106/27/11382.full

 

 

 
