HOW TO EVOKE EMPATHY USING VR

I’ve heard a lot of people speak on techniques for enhancing empathy in VR. Here are some of the more memorable tidbits I’ve picked up (I’ll keep adding to this list as I go along):

1. See Through Someone Else’s Eyes

This is the most obvious, popular approach: use a character’s point-of-view (POV) to walk a mile in their shoes.

In Clouds Over Sidra, the audience follows a little Syrian girl through a day in her life at a refugee camp. The 360 film is effective: it was used at UNICEF fundraisers and reportedly increased donations by 100%.

Jeremy Bailenson at Stanford created a similar setup where the viewer is placed in the POV of an African American woman. He went a step further, however, deepening immersion through body mirroring.

When viewers move their hands, the black woman in the mirror moves her hands. When they shake their heads, she shakes hers. As a result, they start to take on the reflection as their own. Neuroscientists call this “body transfer”: the viewer begins to associate the reflection with their own identity.
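The mirroring setup described above can be sketched in a few lines. This is a hypothetical illustration, not Bailenson's actual implementation: each frame, every tracked joint on the viewer is reflected across the mirror plane and applied to the avatar, so the reflection stays in lockstep with the viewer's movements. All names (`Vec3`, `mirror_pose`) and the mirror-at-constant-z assumption are mine.

```python
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

def mirror_point(p: Vec3, mirror_z: float) -> Vec3:
    """Reflect a tracked point across a mirror plane at z = mirror_z."""
    return Vec3(p.x, p.y, 2 * mirror_z - p.z)

def mirror_pose(tracked_joints: dict, mirror_z: float) -> dict:
    """Reflect every tracked joint each frame, so the avatar's motion
    stays in lockstep with the viewer's."""
    return {name: mirror_point(p, mirror_z) for name, p in tracked_joints.items()}

# Example: viewer's right hand 1 m in front of a mirror plane at z = 2 m.
pose = {"right_hand": Vec3(0.3, 1.4, 1.0)}
reflected = mirror_pose(pose, mirror_z=2.0)
# The reflected hand ends up the same distance behind the plane (z = 3.0).
```

The key property for body transfer is zero perceptible latency between the viewer's motion and the avatar's; any lag breaks the illusion.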

Self-interaction deepens the illusion: I heard Shiraz Akmal from Spaces speak at XTech earlier this year, and he said the most compelling part of this experience was touching his arm with his other hand, seeing the woman in the mirror do the same, and feeling the feedback.

External touch feedback is the icing on the cake: I participated in an exhibit recently called “Neurosociety.” Towards the end, we were seated in reclining chairs, given Oculus headsets, and placed inside the POV of a small doll. The doll was positioned just like us, sitting in a reclining chair not unlike a dentist’s. A hand came in offscreen and rapped the doll’s knee with a stick. At the same time, a mechanical arm extended from our chairs and hit all of our knees with a stick as well. The effect was instantaneous. All of a sudden, I felt wary of this disembodied hand. Whatever it could do to the doll, it could do to me. I WAS THE DOLL. (Most people started squirming and taking off their headsets when the hand pulled out a needle and started poking it at the doll’s eyes.)

Mirror feedback is super powerful — think about the mirror box V.S. Ramachandran invented to alleviate phantom limb pain.

(See the Rubber Hand Illusion for another trippy experiment on this topic)

Mirror box for phantom limb therapy (img source)

 

2. Make Eye Contact

Gary the Gull making eye contact (source)

Let’s take the example of Gary the Gull from Limitless. The creators looked at standard gaming experiences and asked: why do they fall into the uncanny valley?

From their Fast Co. Design article:

If you walk by a character in a game, they don’t momentarily meet your eyes–then look away–as a person might on the street. If you venture too close, most will just stand there, not acknowledging their own personal space. These points might sound small, but try imagining those behaviors in the real world. They’re vital to human social dynamics.

The Limitless team realized that Gary has to react to where the viewer is; otherwise, the interaction just feels fake. So they added eye contact when you walk too close to Gary, a simple feature with a huge payoff.
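A proximity trigger like the one described for Gary might look like this. The thresholds and behavior names here are invented for illustration; the real Limitless implementation is not public in this detail.

```python
import math

# Hypothetical proximity-triggered social reactions. The radii are
# invented; tune them against real playtests.
EYE_CONTACT_RADIUS = 1.5   # meters: inside this, the character looks at you
PERSONAL_SPACE = 0.6       # meters: inside this, the character backs off

def social_reaction(character_pos, viewer_pos):
    """Return the character's reaction given 2D (x, z) positions."""
    dx = viewer_pos[0] - character_pos[0]
    dz = viewer_pos[1] - character_pos[1]
    dist = math.hypot(dx, dz)
    if dist < PERSONAL_SPACE:
        return "step_back"          # acknowledge invaded personal space
    if dist < EYE_CONTACT_RADIUS:
        return "make_eye_contact"   # meet the viewer's eyes
    return "idle_glance"            # occasional glance, as on the street

print(social_reaction((0.0, 0.0), (0.0, 1.0)))  # make_eye_contact
```

The point isn't the math; it's that the character has *any* distance-aware social behavior at all, which is exactly what flat-screen NPCs lack.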

Eye contact in Bigscreen VR (source)

I heard Darshan Shankar of Bigscreen VR talk about how he implemented eye contact. To make it truly realistic, the eyes can’t simply lock onto each other the moment two avatars face each other. Our eyes make quick, flitting movements called saccades. We generally lead with our heads when we turn, and our eyes snap to meet whatever or whomever we’re looking at. By mimicking natural head and eye movements, POV experiences in VR can feel MUCH more immersive.
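A minimal sketch of the “head leads, eyes snap” pattern, assuming a single yaw angle for simplicity. All of the constants and names below are invented, not Bigscreen's actual values: the head turns toward the target at a capped angular speed, and the eyes jump the remaining offset in one saccade-like step once the target is within a comfortable head-relative range.

```python
# Invented tuning constants for illustration only.
HEAD_TURN_SPEED = 120.0   # degrees per second, smooth head rotation
SACCADE_RANGE = 35.0      # eyes snap once the target is this close to head-forward

def update_gaze(head_angle, eye_angle, target_angle, dt):
    """Advance head and eye yaw (degrees) one frame toward target_angle."""
    # Head: smooth, rate-limited turn toward the target.
    error = target_angle - head_angle
    step = max(-HEAD_TURN_SPEED * dt, min(HEAD_TURN_SPEED * dt, error))
    head_angle += step
    # Eyes: snap straight to the target once it's within saccade range;
    # until then they ride along with the head.
    if abs(target_angle - head_angle) <= SACCADE_RANGE:
        eye_angle = target_angle
    else:
        eye_angle = head_angle
    return head_angle, eye_angle
```

Running this over successive frames gives the characteristic motion: a slow head sweep, then the eyes landing on the target before the head finishes turning.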

 

3. Use Distance to Evoke Emotions

When talking about empathy and presence, we have to talk about Oculus Story Studio. The team went all out researching novel mechanics for Henry, their Emmy-winning short film.

Key takeaway: spatial position matters.

Slapstick comedy is different: in VR, it’s not necessarily funny to see someone fall right in front of you.

When making Henry, the animators wanted to do a close-up shot of Henry having a vulnerable moment. But they realized that having the character crying right in your face is actually kind of unsettling. So they tried moving him further away and put him in a corner. Turns out: seeing someone cry at a slight distance elicits much more empathy from the viewer.

Oculus Story Studio also implemented eye contact for Henry (source)

Let’s push this further: what if you positioned a character to mirror the viewer’s body language? Instead of in a corner straight ahead, a character could be crying a few feet away at the viewer’s side, as if they were sitting on a bench together, with the character just out of reach. We already know we instinctively feel connected to people in real life who mirror our body language. Would leveraging that in VR evoke more closeness and intimacy with virtual characters?
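If you wanted to A/B test “in a corner ahead” against “on the bench beside you,” a placement helper like this would let you parameterize distance and bearing relative to the viewer. This is my own illustrative sketch; the function name and the 1.2 m “just out of reach” figure are assumptions.

```python
import math

def place_character(viewer_pos, viewer_heading_deg, distance, bearing_deg):
    """Position a character at `distance` meters from the viewer, at
    `bearing_deg` relative to where the viewer faces
    (0 = straight ahead, 90 = to the viewer's right)."""
    angle = math.radians(viewer_heading_deg + bearing_deg)
    x = viewer_pos[0] + distance * math.sin(angle)
    z = viewer_pos[1] + distance * math.cos(angle)
    return (x, z)

# Side-by-side on the "bench," ~1.2 m to the right, just out of arm's reach:
beside = place_character((0.0, 0.0), 0.0, 1.2, 90.0)
# Straight ahead in the corner, Henry-style, 2.5 m out:
ahead = place_character((0.0, 0.0), 0.0, 2.5, 0.0)
```

Sweeping `distance` and `bearing_deg` across sessions is one concrete way to measure the optimal-distance question below.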

And what is the optimal distance to place characters relative to the viewer — is it different if you’re trying to elicit empathy vs rage vs disgust?

 

4. Use Scale

One thing I wish the studio Here Be Dragons had done with “Clouds Over Sidra” is use scale. Force viewers to look up at adults; make them feel like the small, vulnerable 12-year-old girl whose perspective they are experiencing.

The converse also applies: to empathize with a bully — show what it’s like to feel big and scary. Show how it feels to know that people are intimidated by you — does that change how you behave toward others?

A great case study I’d like to see: Put people in the eyes and size of a large black male who is marked as a threat for no reason at all. How differently does a traffic stop feel from one perspective vs the other?

Make sure to extend binaural audio to this context: if the viewer hears a voice from a smaller person, have the sound come from below.
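One simple way to get this right is to tie each voice’s spatial-audio emitter height to the speaking character’s standing height. The ratio and heights below are rough assumptions of mine, purely for illustration:

```python
# Assumed proportion: a person's mouth sits at roughly 93% of their
# standing height. Invented for this sketch; not a spec value.
MOUTH_HEIGHT_RATIO = 0.93

def voice_emitter_y(character_height_m: float) -> float:
    """Vertical position (meters) for a character's voice source."""
    return character_height_m * MOUTH_HEIGHT_RATIO

# From a 12-year-old's perspective (assumed ear height ~1.4 m),
# adult voices should arrive from above and smaller children's from below.
VIEWER_EAR_Y = 1.4
adult_voice_y = voice_emitter_y(1.80)
child_voice_y = voice_emitter_y(1.20)
```

Feeding these heights into whatever binaural renderer the experience uses makes the size difference audible, not just visible.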

 

5. Interacting with Others => Behavior Change

Flying through a city to find and save a diabetic child (source)

Jeremy Bailenson performed another experiment on altruism.

In the experience, you are a superhero flying through a city and have to deliver a life-saving insulin injection to a child. The hypothesis was that handing over the medicine to save the child would be a kinesthetic reminder to help others.

After the experiment, participants observed an actress dropping a bunch of pens on the ground. Those who helped the virtual diabetic child were much more likely and much quicker to help the actress.

Why Is This?

Kinesthetic reminders are incredibly powerful because we remember multisensory experiences much better than we do visual or auditory reminders.

Also, we have a habit of acting in accordance with our past behaviors. If we helped someone in the past, we are much more likely to help again in the future. We behave consistently because acting otherwise would invalidate our past behavior.

 

Additional Presence Tips:

Leverage reflexes, e.g. the flinch response: The brain is easily fooled, but given time, it adapts. So… don’t give it time to adapt. Evoke instinctual reactions, such as the withdrawal reflex (this is the reason we sometimes pull our hands back from a hot stove before we actually feel the pain). In the right VR context, you can use fire or a splash of water to the same effect. It should always be sudden and unexpected, however; our instincts kick in when we are surprised and not given time to think through a reaction. Leveraging these hardwired responses can thoroughly enhance presence in VR.
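The “sudden and unexpected” requirement can be enforced in the simplest possible way: schedule the startle stimulus at a random moment inside a window rather than on a fixed timer, so the brain never gets a rhythm to adapt to. The window bounds here are invented:

```python
import random

def schedule_startle(min_delay_s=4.0, max_delay_s=12.0, rng=random.random):
    """Pick an unpredictable delay (seconds) before firing the stimulus.

    Uniform jitter inside [min_delay_s, max_delay_s] is the assumption
    here; the bounds are illustrative, not tested values.
    """
    return min_delay_s + (max_delay_s - min_delay_s) * rng()

delay = schedule_startle()
```

An injectable `rng` keeps the jitter testable while production code just uses the default.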

 

Some more resources:

  • Machine to Be Another: performance piece where body transfer is heightened with another person’s help
  • Mirror Neurons: the brain mechanism for how watching others evokes an empathetic reaction
 

If you have more stuff you think should be on this list, hit me up! @shmallick or mallick.skm@gmail.com