Tech is a tool – turns out empathy is too.

One of the most confounding things about researching empathy is that it’s often talked about as if it’s an obvious Good Thing. I constantly read and hear that empathy will help fix this problem or improve that relationship or better that industry. But the reality I am coming to understand is that empathy is a mechanism, a tool of our brains and hearts, that does not have an inherent value. It comes up most often in a positive context because we see understanding others and putting ourselves in their shoes as a good thing, which it often is. But this kind of perspective-taking is also used for negative, manipulative reasons all the time. We don’t usually call that empathy, but in many cases the mechanism seems the same.

I’ve been thinking about this a lot since my June trip back to New York for the Games for Change summit. I eagerly consumed as many virtual reality experiences as I could in my one afternoon there, surrendering myself as much as possible to the narratives they immersed me in. I was a fly on the wall in the bedroom of a white supremacist teenager; I followed along with nonprofit workers as they helped to free a family in India from servitude; I embodied the virtual experience of a young black man. All of these were trying to teach me about a human experience I was unfamiliar with, and they all had positive intentions. But if the makers behind these VR projects could bring me close to tears for a former white nationalist and a young black man dealing with daily microaggressions in the same afternoon, what else could they do?

The major caveat to this line of thinking is, in my view, the fact that how we come to VR experiences – our own personal backgrounds, our expectations, our hopes for what we will see and feel – plays a big role in determining how they affect us. I don’t think many people are putting on VR headsets totally unencumbered by expectations or opinions about the content of what they’re about to see. As this technology becomes more ubiquitous, though, it might become easier to spring these experiences on people; pulling the same levers that trigger empathy can trigger other things as well: hatred, anger, fear. We already do it with less fanciful technology, don’t we? Digital advertisers work to understand how we view the world so that they can put the best ads in front of us; communications teams for politicians write certain words and phrases into speeches that they guess, by putting themselves briefly in the shoes of their base, will generate a reaction. Perspective-taking in service of manipulation isn’t new, but as technology evolves, the capacity for doing it at a larger and larger scale grows.

This concern really started bubbling up for me a couple of months ago, after I experienced a couple of rough days of trolling on Twitter. I realized that while I had always seen internet trolls as lacking empathy, what they were doing actually required a certain level of it. They just wielded it in a different way than we’re used to thinking about. So of course, I tweeted some thoughts:



This gets at a lot of what’s been swirling around in my head as I write this book about the future of empathy and technology. The people I interview often remind me that technology is a tool, but I’m realizing that empathy is one too.

As I wrote in my last post, it feels like a weird time to be writing a book about empathy. But conversations like the one described above make me feel like it’s worth it, no matter how it turns out. Stay tuned for more.


Why empathy & tech, and why now?

In many ways, this feels like a really weird time to be writing a book about empathy.

Especially empathy & technology, the latter of which is broad but also broadly applies only to people with a certain kind of privilege. Access to broadband, to schools with tech resources, to hardware like laptops or phones or iPads. Access to disposable income that can be spent on something like a VR headset (which I admittedly haven’t been able to justify yet myself).

And with all that’s going on in the world, I also think a lot about the targets of empathy crusades.

Who are you trying to build empathy for? Who are you trying to build it in? Why? What are the potential inherent biases there? Who are you othering, intentionally or not? I’m grateful to Sundance’s Kamal Sinclair and documentarian Michele Stephenson, among others, for helping me think through this with their work.

I want this book to further this conversation, and start new ones. But on a lot of recent days, with the weight of what’s happening in the wider world, and the many small heartbreaks in my own circle, I wonder if it matters.

My recent trip to New York to attend part of the Games for Change summit helped me think through a lot of this and gave me some validation that the stories I’m trying to tell do matter, even if it’s not always clear exactly how and why. The thing I keep coming back to: empathy is not endorsement, and empathy is not enough.

Here I am experiencing 1,000 Cut Journey, an immersive VR experience depicting the life of a young black man in America that moved me almost to tears.


I spend a lot of time (always have) wondering if I should be doing something else. But I’m dedicated to finding a way to productively add to this conversation about the future of empathy. Maybe I’ll fail, but at a time when nothing feels like enough, this is what I have to offer, and I have to try.

The Future of Feeling

Hi all! It’s been kind of quiet here recently because I’ve been working on a pretty big project that I can now finally announce: I’m writing a book!

It’s about empathy, of course. The future of empathy and technology, to be more precise. I get to interview lots of people who are creating technology aimed at building and/or preserving empathy in our tech-obsessed world, and it’s honestly a dream come true.

I’ll still be blogging here a bit. Even 60,000 words isn’t enough to cover everything about empathy ;) And I want to thank all of you for reading – you helped me get here!

Stay tuned for updates, and more nerdy posts in the coming months.

We like to move it

Virtual reality is often referred to as an “empathy machine,” a term coined in 2015 by tech entrepreneur and artist Chris Milk in a TED Talk. The idea is that while reading about something, or even watching a documentary, can be moving, there’s something uniquely intimate about virtual reality. It puts you “in” a situation in a way that other media doesn’t.

I’ve written before about how this idea has taken hold in service of social causes, and how “future tech” that’s really right around the corner could take empathy to a whole new level. Research is ongoing into what really happens when people put on VR headsets. Do they really feel more empathy for the characters they’re “watching,” or for people who experience the things they “experience” in VR? Some evidence shows that the answer is yes, but feedback about overwhelm and empathy fatigue after VR experiences is also common.

A couple of weeks ago Jeremy Bailenson, one of the foremost experts on VR, wrote in WIRED about some new evidence that the most effective way to create empathy through a VR experience is to make the user move around.

Bailenson, a professor of communication at Stanford, conducted a study in 2013 in which participants simulated being colorblind. Half used their imagination, while the other half used VR. They were then asked to lift up and sort objects in VR. The results showed that those who had experienced colorblindness in VR had a much harder time completing the task, and after the study, they spent a lot more time helping others when asked to do so.

The next study Bailenson plans to release will show a correlation between moving around a virtual coral reef and subjects’ desire to know more about ocean conservation.

He goes into a lot more detail in the piece, which you should read! This strategy of making people move around while having a VR experience might be the answer to a lot of criticisms of empathy-focused VR. It makes sense to me just from a muscle-memory standpoint, but it will be interesting to see what the data shows about how VR, movement, and empathy are actually connected in our brains.

Empathy is both given & made

When the subject of empathy comes up, there’s often a debate about whether we’re born with it, or whether it’s something we learn. As with most things, the answer probably isn’t at either end of the spectrum – it’s most likely somewhere in the middle.

In the past few months, I’ve been researching and writing about both ends.

For Woolly, I wrote about the empathy movement in podcasting, where a growing collection of shows aims to get people to listen to (and have) tough conversations. I wrote about my personal retreat into podcasts (and away from cable news and social media) after the 2016 presidential election, and how some of them – especially With Friends Like These – helped me find empathy where I didn’t expect it.

Then I wrote for Vitals, Lifehacker’s health vertical, about the newest development in the search for an empathy gene. Researchers have figured out that at least some of individuals’ differences in empathy can be explained by DNA, so we might inherit our empathy levels, and disorders characterized by low empathy, like schizophrenia, might have a genetic cause. But they’re still trying to find out how. This latest study didn’t come up with any major revelations, but it’s a step forward, and it also validated a lot of previous findings.

That’s all for now. Apologies for being so absent these past few months. I moved from New York back to North Carolina and have been settling in. Now that things are starting to feel normal, I’ll be back to blogging more regularly!

Poking the empathy part of the brain

Hello and Happy New Year! Hopefully by now you’ve settled into 2018 and dug out of the frozen tundra (if you’re in the U.S.). Here in Brooklyn we have reached the “grey slush and surprise ice patches” phase of winter and it’s as gross as ever. I could really write a whole post about the disgusting things that happen to the sidewalks here when the snow melts, but I’ll spare you… Instead, let’s talk about something else that seems kind of obvious but is actually super interesting: empathy’s role in moral dilemmas like, you know, whether or not to kill someone.

In a recent UCLA study, researchers found that they could guess whether a person would harm or kill another person based on how their brains reacted when they saw someone else in pain.

The researchers showed 19 volunteers two videos in which a hand was either touched by a cotton swab or stabbed with a needle. This is a pretty common way to measure empathy, and that’s what these scientists did: they used an fMRI machine to measure brain activity in the places where this specific type of empathy – acknowledging someone else’s pain and feeling for them – is thought to reside. They also apparently analyzed participants’ “mirror neurons,” which are neurons that are said to fire both when a person feels something and when they see someone else appearing to experience that same feeling. (There’s some controversy about mirror neurons.)

In addition to showing them the videos, the researchers asked the participants some common questions aimed at determining whether a person would hurt or kill someone else: there’s the “crying baby during wartime” question, for example, when you’re meant to say whether you would suffocate a baby whose cries might give away your location to the enemy. They also asked whether the volunteers would torture someone in order to prevent a bomb from killing some people, or harm research animals in order to cure AIDS.

The researchers guessed that those with more empathy-related activity happening in their brains during the needle video would be less likely to say they would harm the crying baby, and they turned out to be correct, at least with this small sample size. They didn’t find a correlation between brain activity and willingness to hurt someone else to help a larger number of people, however. The reason, they argued, may be that those decisions are a lot more complex.

“It would be fascinating to see if we can use brain stimulation to change complex moral decisions through impacting the amount of concern people experience for others’ pain,” said Marco Iacoboni, director of the Neuromodulation Lab at UCLA and one of the leading mirror neuron experts, in a statement. “It could provide a new method for increasing concern for others’ well-being.”

I highlighted just a few of the reasons this research does not suggest causation (and, as you hopefully know, research rarely does). But I’m actually more interested in Iacoboni’s quote. I’ve been researching and writing a lot over the past year about the different ways people are trying to increase empathy in the human brain. Most of the time, these stories involve tech like virtual reality or Fitbit-like gadgets. But Iacoboni’s suggestion of using brain stimulation to potentially make people more empathetic decision-makers doesn’t seem that far-fetched…though it does seem kind of like taking the easy way out. I’m sure he means this mostly (if not completely) for academic purposes, but I wouldn’t put it past tech companies to find a quick way to capitalize on “instant empathy.” We already have brain stimulation gadgets that are meant to help with stress and anxiety and a host of other things.

There are a couple of concerns here. First is regulation, to keep people from accidentally harming themselves or tech companies from doing nefarious things with the new level of personal brain data they might collect. Second is kind of the opposite: Do we want the government to potentially have the ability to stimulate our brains to change how we make complex moral decisions? I don’t mean to sound like a conspiracy theorist! But when so much sci-fi stuff is becoming real life, it seems worth asking these questions.

Something to keep an eye on, for sure.

On empathy, endorsement, and what happens next

I walk a little over a mile to and from work each day, and I usually spend it listening to podcasts, or to books on Audible. After more than a year of this, I really look forward to the days when I know certain podcasts will have a new episode out. Note to Self is one of them. I love Manoush Zomorodi’s style of reporting on technology and how it affects our lives, and I love how she’s styled herself as a guide to “our accelerating world.” Because wow, yes, is it ever accelerating.

Note to Self is often about technology in a technical sense, but the show also takes occasional detours into the psychology of how we interact with tech. This, of course, is my favorite thing to write and read about. So I was really excited when Zomorodi recently interviewed Dylan Marron. He’s a progressive YouTuber and writer who also has a new podcast, called Conversations With People Who Hate Me. It’s pretty much exactly what it sounds like. Remember when Lindy West called her troll and it became a viral This American Life episode? Marron does a similar thing on each episode of Conversations. He talks to the people who profess to think he’s the scum of the earth, and tries to find out why.

This is something I’m going to write more about soon – podcasts and radical empathy – but for the purposes of this blog post I really want to focus on one thing Marron said during this Note to Self episode: empathy doesn’t mean endorsement. This is a fact that’s become so obvious to me, I think I forget to enunciate it to others when I talk about empathy. I’ve never found such a succinct way of saying it, either. But it’s absolutely the truth: sitting down and listening to someone does not necessarily mean validating them, and it definitely doesn’t mean agreeing with them. It’s just…acknowledging them. Taking their perspective.

That can feel a little scary. I know that I have had experiences in which I read something written by someone with vastly different views from my own and as I prepare to put myself in their shoes I think, what if I can’t get back out? What if they convince me? But things don’t really happen that abruptly, most of the time. We make our decisions and create our ideologies based on a mix of experiences and information, and it all sort of flows together and tries to balance itself out, rarely truly solidifying into one thing. What I mean is, we’re always learning, always changing our minds a little bit, even if we don’t always notice it, or want to.

I thought about this concept a lot as I watched the recent Alabama election unfold. Everyone around me kept asking, “How could these people vote for a pedophile?” I can’t say the answer is the same for everyone who voted for Roy Moore, but I can say pretty confidently that many of them did so because they didn’t believe what they heard about him. Or, they only believed parts of what they heard about him.

Brian Resnick has a great piece about this up at Vox. He interviewed a lot of Moore supporters in Alabama before the election, and reading this piece, I feel like I can really empathize with these people. Trying to put myself briefly in their shoes, I feel afraid, I feel disappointed, I feel betrayed. This is something I tried to do when reading story after story about Trump supporters last year as well. And I think it’s a worthwhile practice. But the part that nobody really seems to talk about is… then what?

What do I do with this information? What do I do with the fact that people of all parties and ideologies cling to confirmation bias and “motivated reasoning”? Well, it’s made me feel a little bit less hopeless about change, for one thing. That might seem counterintuitive, but knowing that we’re all susceptible to this, and witnessing people have conversations about it that don’t end in name-calling or fist fights, is encouraging. It also helps me feel less angry, which is no small thing. Over the past couple of years I’ve found anger to be less and less useful for me, at least on a personal level. Being mad at friends or family or strangers who did something I see as wrong doesn’t actually accomplish anything for me, except raising my blood pressure. When I understand their points of view a bit better, I can take some of the emotion out of my reaction to them. And if we’re both on the same page about that, we can have a conversation, and figure out where we agree. And sometimes… sometimes, one or both of us can bend a little. Without the pressure to immediately admit or agree to anything, this can feel a lot easier.

There’s one major caveat to all of this. And it’s never far from my mind when reading and listening to these conversations. This isn’t just about liberals learning to empathize with conservatives. There’s a lot going on in the other direction as well. And, especially after the election of Doug Jones over Roy Moore in Alabama, it’s way past time to start asking people to empathize with another group who doesn’t get nearly enough attention despite their huge impact and disproportionate burden: black voters. Especially black women. It’s good that we’re talking about empathy so much, but we also need to be real with ourselves about who we reserve it for.