Why empathy & tech, and why now?

In many ways, this feels like a really weird time to be writing a book about empathy.

Especially empathy & technology, the latter of which is broad but also broadly applies only to people with a certain kind of privilege. Access to broadband, to schools with tech resources, to hardware like laptops or phones or iPads. Access to disposable income that can be spent on something like a VR headset (which I admittedly haven’t been able to justify yet myself).

And with all that’s going on in the world, I also think a lot about the targets of empathy crusades.

Who are you trying to build empathy for? Who are you trying to build it in? Why? What are the potential inherent biases there? Who are you othering, intentionally or not? I’m grateful to Sundance’s Kamal Sinclair and documentarian Michele Stephenson, among others, for helping me think through this with their work.

I want this book to further this conversation, and start new ones. But on a lot of recent days, with the weight of what’s happening in the wider world, and the many small heartbreaks in my own circle, I wonder if it matters.

My recent trip to New York to attend part of the Games for Change summit helped me think through a lot of this and gave me some validation that the stories I’m trying to tell do matter, even if it’s not always clear exactly how and why. The thing I keep coming back to: empathy is not endorsement, and empathy is not enough.

Here I am experiencing 1,000 Cut Journey, an immersive VR experience depicting the life of a young black man in America that moved me almost to tears.


I spend a lot of time (always have) wondering if I should be doing something else. But I’m dedicated to finding a way to productively add to this conversation about the future of empathy. Maybe I’ll fail, but at a time when nothing feels like enough, this is what I have to offer, and I have to try.


The Future of Feeling

Hi all! It’s been kind of quiet here recently because I’ve been working on a pretty big project that I can now finally announce: I’m writing a book!

It’s about empathy, of course. The future of empathy and technology, to be more precise. I get to interview lots of people who are creating technology aimed at building and/or preserving empathy in our tech-obsessed world, and it’s honestly a dream come true.

I’ll still be blogging here a bit. Even 60,000 words isn’t enough to cover everything about empathy ;) And I want to thank all of you for reading – you helped me get here!

Stay tuned for updates, and more nerdy posts in the coming months.

We like to move it

Virtual reality is often referred to as an “empathy machine,” a term coined in 2015 by tech entrepreneur and artist Chris Milk in a TED Talk. The idea is that while reading about something, or even watching a documentary, can be moving, there’s something uniquely intimate about virtual reality. It puts you “in” a situation in a way that other media doesn’t.

I’ve written before about how this idea has taken hold in service of social causes, and how “future tech” that’s really right around the corner could take empathy to a whole new level. Research is ongoing into what really happens when people put on VR headsets. Do they really feel more empathy for the characters they’re “watching,” or for people who experience the things they “experience” in VR? Some evidence shows that the answer is yes, but feedback about overwhelm and empathy fatigue after VR experiences is also common.

A couple of weeks ago Jeremy Bailenson, one of the foremost experts on VR, wrote in WIRED about some new evidence that the most effective way to create empathy through a VR experience is to make the user move around.

Bailenson, a professor of communication at Stanford, conducted a study in 2013 in which participants simulated being colorblind. Half used their imagination, while the other half used VR. They were then asked to lift up and sort objects in VR. The results showed that those who had experienced colorblindness in VR had a much harder time completing the task, and after the study, they spent a lot more time helping others when asked to do so.

The next study Bailenson plans to release will show a correlation between moving around a virtual coral reef and subjects’ desire to know more about ocean conservation.

He goes into a lot more detail in the piece, which you should read! This strategy of making people move around while having a VR experience might be the answer to a lot of criticisms of empathy-focused VR. It makes sense to me just from a muscle-memory standpoint, but it will be interesting to see what the data shows about how VR, movement, and empathy are actually connected in our brains.

Empathy is both given & made

When the subject of empathy comes up, there’s often a debate about whether we’re born with it, or whether it’s something we learn. As with most things, the answer is probably not at either end of the spectrum – it’s most likely in the middle.

In the past few months, I’ve been researching and writing about both ends.

For Woolly, I wrote about the empathy movement in podcasting, where a growing collection of shows aims to get people to listen to (and have) tough conversations. I wrote about my personal retreat into podcasts (and away from cable news and social media) after the 2016 presidential election, and how some of them – especially With Friends Like These – helped me find empathy where I didn’t expect it.

Then I wrote for Vitals, Lifehacker’s health vertical, about the newest development in the search for an empathy gene. Researchers have figured out that at least some of individuals’ differences in empathy can be explained by DNA, so we might inherit our empathy levels, and disorders characterized by low empathy, like schizophrenia, might have a genetic cause. But they’re still trying to find out how. This latest study didn’t come up with any major revelations, but it’s a step forward, and it also validated a lot of previous findings.

That’s all for now. Apologies for being so absent these past few months. I moved from New York back to North Carolina and have been settling in. Now that things are starting to feel normal, I’ll be back to blogging more regularly!

Poking the empathy part of the brain

Hello and Happy New Year! Hopefully by now you’ve settled into 2018 and dug out of the frozen tundra (if you’re in the U.S.). Here in Brooklyn we have reached the “grey slush and surprise ice patches” phase of winter and it’s gross as ever. I could really write a whole post about the disgusting things that happen to the sidewalks here when the snow melts, but I’ll spare you… Instead, let’s talk about something else that seems kind of obvious but is actually super interesting: empathy’s role in moral dilemmas like, you know, whether or not to kill someone.

In a recent UCLA study, researchers found that they could guess whether a person would harm or kill another person based on how their brains reacted when they saw someone else in pain.

The researchers showed 19 volunteers two videos in which a hand was either touched by a cotton swab or stabbed with a needle. This is a pretty common way to measure empathy, and that’s what these scientists did: they used an fMRI machine to measure brain activity in the places where this specific type of empathy – acknowledging someone else’s pain and feeling for them – is thought to reside. They also analyzed the participants’ “mirror neurons,” which are neurons that are said to fire both when a person feels something and when they see someone else appearing to experience that same feeling. (There’s some controversy about mirror neurons.)

In addition to showing them the videos, the researchers asked the participants some common questions aimed at determining whether a person would hurt or kill someone else: there’s the “crying baby during wartime” question, for example, when you’re meant to say whether you would suffocate a baby whose cries might give away your location to the enemy. They also asked whether the volunteers would torture someone in order to prevent a bomb from killing some people, or harm research animals in order to cure AIDS.

The researchers guessed that those with more empathy-related activity happening in their brains during the needle video would be less likely to hurt the crying baby, and they turned out to be correct, at least with this small sample size. They didn’t find a correlation between brain activity and willingness to hurt someone else to help a larger number of people, however. The reason, they argued, may be that those decisions are a lot more complex.

“It would be fascinating to see if we can use brain stimulation to change complex moral decisions through impacting the amount of concern people experience for others’ pain,” said Marco Iacoboni, director of the Neuromodulation Lab at UCLA and one of the leading mirror neuron experts, in a statement. “It could provide a new method for increasing concern for others’ well-being.”

I highlighted just a few of the reasons this research does not suggest causation (and, as you hopefully know, research rarely does). But I’m actually more interested in Iacoboni’s quote. I’ve been researching and writing a lot over the past year about the different ways people are trying to increase empathy in the human brain. Most of the time, these stories involve tech like virtual reality or Fitbit-like gadgets. But Iacoboni’s suggestion of using brain stimulation to potentially make people more empathetic decision-makers doesn’t seem that far-fetched…though it does seem kind of like taking the easy way out. I’m sure he means this mostly (if not completely) for academic purposes, but I wouldn’t put it past tech companies to find a quick way to capitalize on “instant empathy.” We already have brain stimulation gadgets that are meant to help with stress and anxiety and a host of other things.

There are a couple of concerns here. First is regulation to keep people from accidentally harming themselves, or tech companies from doing nefarious things with the new level of personal brain data they might collect. Second is kind of the opposite: Do we want the government to potentially have the ability to stimulate our brains to change how we make complex moral decisions? I don’t mean to sound like a conspiracy theorist! But when so much sci-fi stuff is becoming real life, it seems worth asking these questions.

Something to keep an eye on, for sure.

On empathy, endorsement, and what happens next

I walk a little over a mile to and from work each day, and I usually spend it listening to podcasts, or to books on Audible. After more than a year of this, I really look forward to certain days when I know certain podcasts will have a new episode out. Note to Self is one of them. I love Manoush Zomorodi’s style of reporting on technology and how it affects our lives, and I love how she’s styled herself as a guide to “our accelerating world.” Because wow, yes, is it ever accelerating.

Note to Self is often about technology in a technical sense, but the show also takes occasional detours into the psychology of how we interact with tech. This, of course, is my favorite thing to write and read about. So I was really excited when Zomorodi recently interviewed Dylan Marron. He’s a progressive YouTuber and writer who also has a new podcast, called Conversations With People Who Hate Me. It’s pretty much exactly what it sounds like. Remember when Lindy West called her troll and it became a viral This American Life episode? Marron does a similar thing on each episode of Conversations. He talks to the people who profess to think he’s the scum of the earth, and tries to find out why.

This is something I’m going to write more about soon – podcasts and radical empathy – but for the purposes of this blog post I really want to focus on one thing Marron said during this Note to Self episode: empathy doesn’t mean endorsement. This is a fact that’s become so obvious to me, I think I forget to enunciate it to others when I talk about empathy. I’ve never found such a succinct way of saying it, either. But it’s absolutely the truth: sitting down and listening to someone does not necessarily mean validating them, and it definitely doesn’t mean agreeing with them. It’s just…acknowledging them. Taking their perspective.

That can feel a little scary. I know that I have had experiences in which I read something written by someone with vastly different views from my own and as I prepare to put myself in their shoes I think, what if I can’t get back out? What if they convince me? But things don’t really happen that abruptly, most of the time. We make our decisions and create our ideologies based on a mix of experiences and information, and it all sort of flows together and tries to balance itself out, rarely truly solidifying into one thing. What I mean is, we’re always learning, always changing our minds a little bit, even if we don’t always notice it, or want to.

I thought about this concept a lot as I watched the recent Alabama election unfold. Everyone around me kept asking, “How could these people vote for a pedophile?” I can’t say the answer is the same for everyone who voted for Roy Moore, but I can say pretty confidently that many of them did so because they didn’t believe what they heard about him. Or, they only believed parts of what they heard about him.

Brian Resnick has a great piece about this up at Vox. He interviewed a lot of Moore supporters in Alabama before the election, and reading this piece, I feel like I can really empathize with these people. Trying to put myself briefly in their shoes, I feel afraid, I feel disappointed, I feel betrayed. This is something I tried to do when reading story after story about Trump supporters last year as well. And I think it’s a worthwhile practice. But the part that nobody really seems to talk about is… then what?

What do I do with this information? What do I do with the fact that people of all parties and ideologies cling to confirmation bias and “motivated reasoning”? Well, it’s made me feel a little bit less hopeless about change, for one thing. That might seem counterintuitive, but knowing that we’re all susceptible to this, and witnessing people have conversations about it that don’t end in name-calling or fist fights, is encouraging. It also helps me feel less angry, which is no small thing. Over the past couple of years I’ve found anger to be less and less useful for me, at least on a personal level. Being mad at friends or family or strangers who did something I see as wrong doesn’t actually accomplish anything for me, except raising my blood pressure. When I understand their points of view a bit better, I can take some of the emotion out of my reaction to them. And if we’re both on the same page about that, we can have a conversation, and figure out where we agree. And sometimes… sometimes, one or both of us can bend a little. Without the pressure to immediately admit or agree to anything, this can feel a lot easier.

There’s one major caveat to all of this. And it’s never far from my mind when reading and listening to these conversations. This isn’t just about liberals learning to empathize with conservatives. There’s a lot going on in the other direction as well. And, especially after the election of Doug Jones over Roy Moore in Alabama, it’s way past time to start asking people to empathize with another group who doesn’t get nearly enough attention despite their huge impact and disproportionate burden: black voters. Especially black women. It’s good that we’re talking about empathy so much, but we also need to be real with ourselves about who we reserve it for.

Coding Empathy

I am always looking for cool ways that educators are using technology to teach empathy. This usually involves some kind of perspective-taking exercise, whether it be a simple real-life simulation or a virtual reality experience. Honestly, sometimes even kids’ TV shows are super explicit about empathy, and it seems like that can have just as much of an impact as the flashier stuff, if the kids are comfortable and attentive enough.

Yesterday, though, I came across a cool tech solution for building empathy that one teacher is using in Australia that takes things a step further, into the world of STEM. Getting kids interested in science and math at a young age is probably even more buzzworthy than teaching them empathy at this point, so a lesson that puts those two things together? I’m intrigued.

Journalist Sarah Duggan writes for Australian Teacher Magazine about a teacher in South Australia who has the teenagers in her digital technology class use a feature of the game Minecraft, called Code Builder, to build a base camp for refugees from an imaginary war ravaging their own country. She encourages them to use resources like the United Nations High Commissioner for Refugees website to make sure the camp has the right supplies and structure, and in the process, they get a small taste of what it’s like to be on the refugee side of the equation.

“One of the first things that struck the students was the number of people who had to share amenities,” the teacher, Renee Chen, tells Duggan. “For example, in times of crisis and emergency up to 20 people have to share one toilet … Very quickly, students learn that refugees have very little control over their circumstances.”

The assignment also helped students better understand some of their classmates who were refugees themselves. “The task sparked conversations between these two groups of students,” Chen said. “These discussions led to the sharing of stories, formation of personal connections, and the development of empathy.” She said it also made students feel less helpless about awful things happening around the world.

These anecdotal results are really similar to what students report after playing games or having VR experiences that put them in others’ shoes. In this case, coding was basically a way to create a virtual world in which to do that. But it got me thinking about how empathy might be useful in coding more broadly, and not just among students, but with professionals as well.

This led me to the blog Coding With Empathy, which is run by a developer named Pavneet Singh Saund, whose mission is to get his fellow developers to hone “soft” skills like empathy and collaboration in much the same way they make sure to keep learning their “hard” craft. One of the main selling points for this kind of thinking is that developers aren’t creating software just for the sake of it – they’re creating it for users. And there’s a growing movement focused on teaching developers to put themselves in those users’ shoes in order to create the best product for them.

Saund is far from the only one talking about this. The folks at Simple Programmer call empathy “a developer’s superpower,” and Creative Bloq says implementing more empathy can even make developing more efficient.

I’m not a software developer or engineer of any kind, so I can’t tell you from personal experience whether any of this is true. But I can tell you as a journalist who pays attention to these things that it’s notable how many new ways educators, developers, and professionals of all kinds are finding to work empathy into their missions.