Hello and Happy New Year! Hopefully by now you’ve settled into 2018 and dug out of the frozen tundra (if you’re in the U.S.). Here in Brooklyn we have reached the “grey slush and surprise ice patches” phase of winter and it’s as gross as ever. I could really write a whole post about the disgusting things that happen to the sidewalks here when the snow melts, but I’ll spare you… Instead, let’s talk about something else that seems kind of obvious but is actually super interesting: empathy’s role in moral dilemmas like, you know, whether or not to kill someone.
In a recent UCLA study, researchers found that they could guess whether a person would harm or kill another person based on how their brains reacted when they saw someone else in pain.
The researchers showed 19 volunteers two videos in which a hand was either touched by a cotton swab or stabbed with a needle. This is a pretty common way to measure empathy, and that’s what these scientists did: they used an fMRI machine to measure brain activity in the places where this specific type of empathy – acknowledging someone else’s pain and feeling for them – is thought to reside. They also apparently analyzed the participants’ “mirror neurons,” which are neurons that are said to fire both when a person feels something and when they see someone else appearing to experience that same feeling. (There’s some controversy about mirror neurons.)
In addition to showing them the videos, the researchers asked the participants some common questions aimed at determining whether a person would hurt or kill someone else: there’s the “crying baby during wartime” question, for example, in which you’re meant to say whether you would suffocate a baby whose cries might give away your location to the enemy. They also asked whether the volunteers would torture someone in order to prevent a bomb from killing some people, or harm research animals in order to cure AIDS.
The researchers guessed that those with more empathy-related activity happening in their brains during the needle video would be less likely to hurt the crying baby, and they turned out to be correct, at least with this small sample size. They didn’t find a correlation between brain activity and willingness to hurt someone else to help a larger number of people, however. The reason, they argued, may be that those decisions are a lot more complex.
“It would be fascinating to see if we can use brain stimulation to change complex moral decisions through impacting the amount of concern people experience for others’ pain,” said Marco Iacoboni, director of the Neuromodulation Lab at UCLA and one of the leading mirror neuron experts, in a statement. “It could provide a new method for increasing concern for others’ well-being.”
I highlighted just a few of the reasons this research does not suggest causation (and, as you hopefully know, research rarely does). But I’m actually more interested in Iacoboni’s quote. I’ve been researching and writing a lot over the past year about the different ways people are trying to increase empathy in the human brain. Most of the time, these stories involve tech like virtual reality or Fitbit-like gadgets. But Iacoboni’s suggestion of using brain stimulation to potentially make people more empathetic decision-makers doesn’t seem that far-fetched…though it does seem kind of like taking the easy way out. I’m sure he means this mostly (if not completely) for academic purposes, but I wouldn’t put it past tech companies to find a quick way to capitalize on “instant empathy.” We already have brain stimulation gadgets that are meant to help with stress and anxiety and a host of other things.
There are a couple of concerns here. The first is regulation to keep people from accidentally harming themselves, or to keep tech companies from doing nefarious things with the new level of personal brain data they might collect. The second is kind of the opposite: Do we want the government to potentially have the ability to stimulate our brains to change how we make complex moral decisions? I don’t mean to sound like a conspiracy theorist! But when so much sci-fi stuff is becoming real life, it seems worth asking these questions.
Something to keep an eye on, for sure.