I'd like to open a discussion on making Effective Altruism more emotionally appealing. I'm especially interested in this topic because of my broader project, Intentional Insights, of spreading rational thinking, including about Effective Altruism, to a wide audience. In doing so, I and other members of Intentional Insights engage with a number of people who are interested in EA when I present it to them, and who accept the premise of doing the most good for the greatest number, but who have trouble engaging with the movement because of the emotional challenges they experience. To be clear, this is in addition to some of the problematic signaling coming from the EA movement that Tom Davidson described well in his recent piece, "EA's Image Problem," and not explicitly due to the things outlined there, although they play a role indirectly.
What I am talking about is people who are interested in effective giving optimized to save the most lives, but who then have trouble buying into the EA approach emotionally. They have trouble accepting that they have inherent cognitive biases that skew their intuitions about optimal giving. They have trouble letting go of cached patterns of giving to previous causes and accepting that their previous giving was suboptimal. They experience guilt over their previous giving, or the lack thereof, which causes them to flinch away from the EA movement. They have difficulty connecting emotionally with the data presented by venues like GiveWell as evidence for optimal giving; it doesn't feel emotionally rewarding to them. Moreover, they have emotional challenges with how much they need to learn to "get" Effective Altruism - data analysis, utilitarian philosophy, rationality, etc. Many accept intellectually that an African life is inherently worth the same as an American life, but then have trouble emotionally acting on that recognition by redirecting their giving toward alleviating African rather than American suffering.
For instance, right now Intentional Insights is focusing on spreading rational thinking and effective altruism to the skeptic/secular movement. People I talk to who accept the premise of optimizing giving to do the most good try to rationalize their current giving to secular communities as optimal for saving lives, saying things like "well, if I give to my secular community, it will create a venue where other secular people will feel safe and respected, and then we can give later to save the lives of Africans." Now, this is an awfully convenient way of justifying current giving, and I suspect it does not actually optimize for saving lives but is just an example of confirmation bias. Sure, I present data from GiveWell on the benefits of giving to developing countries, but they still have an out that lets them preserve their self-image as rational people, since the QALYs of giving to a secular community and then potentially giving together later are hard to quantify. Moreover, they often have trouble engaging with the dry data analysis; it just doesn't ring true to them emotionally.
This example illustrates some of the problems with accepting cognitive biases, letting go of cached patterns of giving, connecting emotionally with data, and acting on the implications of one's intellectual recognition - also known as the drowning child problem. Now, you might think the stances I described above are weird and do not feel intuitive to you. I hear you, and my gut reaction also does not accept these stances. If I learn that something is true - for instance, that my current giving does not actually do the most good - then it is relatively easy for me to let go of cached patterns and update my beliefs.
However, I think I, and the bulk of EAs in general, are much more analytical in our thinking than the baseline. If we want to expand the EA movement, we can't fall into the typical mind fallacy and assume that what worked to convince us will convince others who are less analytical and more emotionally-oriented thinkers. Too often, I have seen effective altruists try to convince others by waving data in their faces, and then calling them intellectually dishonest and inconsistent thinkers when those others did not change their perspective because of their internal emotional resistance. We need to develop new ways of addressing this emotional resistance, in a compassionate and generous way, to grow the EA movement.
Something that I have found works in our outreach efforts is to provide people interested in EA goals with emotional tools to address their internal emotional challenges. For instance, to address the guilt people experience over their previous giving, to address cached patterns, and to help people update their beliefs, it helps to use the CBT tool of reframing by encouraging them to distance their current self from their past self and remember that they did not have this information about EA when they decided on their previous giving, making it ok to choose a new path right now. Another approach I found helpful is to encourage people to think of themselves as being at the ordinary human baseline and then orient toward improving, rather than seeing themselves as never able to achieve perfect rationality in their giving. To address guilt in particular, teaching non-judgment and compassion toward oneself is really helpful. To help people connect emotionally with the hard data, we know what works to pull at people's heartstrings - we should tell stories about the children saved from malaria, about the benefits people gained from GiveDirectly, etc. Indeed, I have found that telling stories, and then supporting them with numbers and metrics, works well. Likewise, it helps to have effective altruists share personal and moving stories of why they got into effective altruism in the first place and why they are passionate about it - stories that illustrate their own strong feelings and go light on the data.
On an institutional level, I would suggest that EA groups focus more on being welcoming toward emotionally-oriented thinkers - perhaps by assigning specific mentors to new members who can guide their intellectual and emotional development alike.
What are your thoughts on these ideas, and more broadly on strategies for overcoming emotional resistance to Effective Altruism? I'm also happy to discuss any collaboration on EA outreach; my email is gleb@intentionalinsights.org.
EDIT: Title edited based on comments
P.S. This article is part of the EA Marketing Resource Bank project led by Intentional Insights and the Local Effective Altruism Network, with support from The Life You Can Save.
I'm one of those people who has trouble connecting with EA emotionally, even though I fully "get" it rationally. My field is cost-benefit analysis for public programs, so I fully understand the moral and statistical basis for giving to the mathematically "correct" charity. But I don't feel any particular personal connection to, say, Deworming the World, so I'm more apt to donate to something I feel connected to.
In EA thinking, emotions and "warm fuzzy" feelings tend to be looked upon disparagingly. However, our emotions and passions are powerful and essential to our humanity, and I think that accomplishing what we want (driving more resources to the needy in the most effective way possible) requires understanding that we are humans, not GiveBots.
To me, one solution is to use the tools of behavioral psychology to encourage people to give more where we want. I'm talking about touching heartstrings, helping us see the actual people we are helping, and telling stories instead of just numbers.
Thanks for the post!
Sounds like we are thinking along the same lines.
I don't particularly object to the content of the post, but could you please consider rewriting the title?
"Overcoming emotional resistance" honestly sounds like something deeply unpleasant pick up artists write about coercing women into unwanted sex (https://en.m.wikipedia.org/wiki/Pickup_artist#Practices)
Thanks, appreciate the suggestion! I edited the title.
I was about to delete my post (thanks Gleb_T for the quick change of name) but noticed a downvote. Could that person come forward and explain why they thought my post was unhelpful?
I'd also like to know that. I think your point was right on and thanks for helping improve the title.
I'm so glad people are becoming more aware of this issue! Right now the only addition I can think of is the “You are already a tribe member” tactic by Tyler Alterman that I found here: https://docs.google.com/document/d/1vsQdWIcL1nWdTTdQtB4uH1f_rIjDo27-CwaZUnfqEG4/edit#
Nice, thanks for that idea!
Too many thoughts all jumbled up, have to try and write more on this but:
Agreed on many of the points, except the weird and creepy - I'd like to understand more about that. More broadly, my point as I stated in the beginning of the piece was to open up a discussion, not give definitive answers. I'd like to hear many other people's thoughts on this.
Reading it again, this was the bit I found weird/creepy: "For instance, to address the guilt people experience over their previous giving, to address cached patterns, and to help people update their beliefs, it helps to use the CBT tool of reframing by encouraging them to distance their current self from their past self and remember that they did not have this information about EA when they decided on their previous giving, making it ok to choose a new path right now. Another approach I found helpful is to encourage people to think of themselves as being at the ordinary human baseline and then orient toward improving, rather than seeing themselves as never able to achieve perfect rationality in their giving."
But I actually don't think these ideas are bad; I just think the phrasing of them is off. The way you've written this makes it seem a bit like "oh, we are the enlightened ones and here are our clever ways of manipulating you to join us," but I appreciate that is not what you meant - it's much more about EA as a movement of people who want to "do good better" than about criticising people for not thinking the same way we do or manipulating guilt.
Thank you for clarifying what I actually meant to convey. I'll work on phrasing it more effectively in the future.
I very much welcome the opening of this discussion.
Many utilitarian EAs simultaneously claim that EA is "compatible" with most other forms of ethical thinking while also continuing to make their arguments very narrowly consequentialist.
I genuinely believe that most EA actions are actually required of people who subscribe to other ethical systems, and I try my best to adapt my language to the person I'm trying to convince based on what they care about.
One example: many left-leaning students talk a lot about "privilege". I tell them that if they are serious about finding it "problematic" that so many of us are overly privileged, the best thing they could do is to give that privilege away!
Alternatively, people who care about justice are very receptive if you tell them that globalization means we now have reciprocal relationships with most of the world, and that we elect governments that are utterly hypocritical on the issue of free trade, causing extreme poverty. Our riches often do, in some sense, come out of their poverty, and anyone who believes the global economy is in need of change should refuse to submit to it by voting with their wallet as well as with their ballot.
Oh, I like that framing! Nice way of getting across to the audience that you are speaking to.
I absolutely agree, this is a crucial and ongoing challenge in EA. I am currently taking a course titled 'Civil Resistance and the Dynamics of Nonviolent Movements' online through the United States Institute of Peace. It offers a lot of takeaways on building movements that are directly applicable to EA - such as how to appeal to different audiences, how to scale up a movement (diversity of members is the key, which requires a diverse range of activities people can participate in, e.g. for EA not just analytical ones), and strategic analysis - and I'm thinking about the best ways to apply them through my chapter in Adelaide.
The link to the course is here, it's free for now. https://www.usipglobalcampus.org/training-overview/civilresistance/
Thank you for the resource, and glad you're doing this, Michael!
I'm new to EA, and my experience talking to people about it has been different than yours, Gleb: they're very pragmatic, and ask an important question I don't have a good answer to.
Here's how my conversations usually go:
ME: I explain EA and Peter Singer's argument.
THEM: "Yeah, but the ECONOMY! It would be bad for everyone, because the world economy is driven by consumption. If we stop that, it won't work."
ME: "When people get richer, they have more money to spend on things, which provides a larger market for the first world, and thus could improve the world economy, or at least dampen the effects of less consumption in the rich world."
THEM: "What about the people now who would lose their jobs? Like the factory worker who makes Ferraris, or the Starbucks barista? If there were less frivolous spending, many of those people would be out of work."
At this point I get stuck. They make a fair point, don't they? If many people gave large amounts of their income to charity, there would undoubtedly be some bad effects - probably on the economy in the first world, at least temporarily, and on many people's livelihoods there. I have no doubt those negative effects would be much smaller than the positive effects the charity would have, but I don't have any proof.
Am I missing some logical counter-argument I could make here? Has an economist taken a stab at estimating the immediate and longer-term effects on the world economy if a segment of, or the entire, first-world population gave large parts of their income to charity?
The phrase you are looking for in the economics literature is 'pecuniary externality'. In general there are good reasons not to care about them.
Not sure about an economist, but you can always make the counter-argument that spending on charity gets the economic wheels moving as well, and in a much better direction. For example, spending on AMF funds the production of malaria nets and their shipping overseas, and then a net prevents a mosquito from biting a productive worker, so that worker does not get sick and lose work time, etc.
Thanks Gleb. Any suggestions for where else I could post/who else I could ask? I'm sure someone's got to have put some numbers together!
Thanks! Ryan
Check out the Effective Altruism Facebook group