
"Do we want accuracy, or do we want them to... make us more aware of extreme events and possibilities?" Thus begins Tyler Cowen's conversation with Philip Tetlock. After they've established that epidemiologists *had* warned that bats+exotic meat consumption could lead to a SARS-like pandemic, Tyler asks, "So those forecasters maybe weren’t entertaining enough?"

As EAs, we tend to be careful about accurately representing the evidence and our interpretations thereof. We try to avoid cherry-picking data, and we sometimes signpost our uncertainty with "epistemic status" disclaimers. I love this aspect of the EA community. It's also very different from the bombastic, (over)confident style of figures who capture the public imagination, e.g. Rutger Bregman, who can confidently state that "UBI works" without feeling the need to advertise his epistemic status or provide disclaimers. Of course, it's not quite that straightforward, and he probably knows it: under what conditions does it work? How are we measuring whether it works? Steven Pinker's Better Angels of Our Nature arguably became famous in part because of its bold, easy-to-summarize thesis and its downplaying of limitations, omissions, and counterexamples. It's also different from the dramatic appeals of UN officials. To my chagrin, such quotes get picked up and repeated, even by people who should know better.

Forecasting is but one (albeit important) aspect of EA. We aren't satisfied with merely predicting what will happen; we want to effect change. Nonetheless, Tyler's underlying question remains: if we ultimately want people to act, how much should we prize accuracy vs. entertainment value? One way of effecting change is to alter the public consciousness, to introduce or popularize ideas that then become actions. This brings me to the philosophical argument that most effectively convinced research participants to donate a surprise bonus payment to charity, at rates statistically higher than those of a control group. Although it was written by Peter Singer and Matthew Lindauer, it conspicuously lacked some of the hallmarks of EA writing: it didn't say *how many* children suffer from trachoma, it appealed to emotion, and it didn't compare the intervention's cost-effectiveness to that of other possible interventions.

In my advocacy work, I regularly come across claims that are widely used despite being somewhat inaccurate or incomplete. And I'm torn. In our attempts to persuade the broader public, how much should we be willing to relax our internal commitment to accuracy, including our epistemic modesty and hedging? In brief, to what extent do the ends justify the means?

-----

Note:

I see parallels between the above and the concept of musicians 'selling out', i.e. changing their sound from what most appeals to their smaller community to one with wider appeal. That said, I see this as different from the debate over how broad the EA community should be. I'm not calling for a change to our internal style of communication; again, I love it and wish it were more common. Rather, I'm asking how far we should be willing to go in sacrificing our internal values when trying to shape the public consciousness, assuming (safely, I think) that a style of writing that works for us doesn't necessarily work in the wider world.

-----

2 Answers

This is a good question with no clear answer. Overall, I tend to be more of a pro-marketing person than what I perceive as the EA average.

However, I think there are a few good reasons to lean in the direction of "keeping more nuance and epistemic humility" that we might underappreciate (given the more obvious benefits of the other approach):

  1. Many of the world's most successful/thoughtful people will be unusually attracted to movements that are more nuanced and humble. For example, I wouldn't be surprised if Dustin Moskovitz were drawn to GiveWell partly because they weren't as "salesy" as other charities (though I don't know the details of this interaction, so maybe I'm wrong). People with a lot of resources or potential are constantly being approached by people who want to sell them on a new idea; a non-salesy EA could be very appealing in this context.
  2. If some groups within EA try to be more salesy, it could spark internal competition. Right now, I think EA does a pretty good job of being a neutral community, where someone who wants to contribute will get questions like "what are you interested in?", rather than lots of pitches on particular organizations/causes. If marketing becomes more prevalent, we might lose some of that collaborative atmosphere.
  3. Nuance and humility are also marketable! One type of pitch for EA that I think is undervalued: We are the social movement that values honesty more than any other movement. Political parties lie. Populist movements lie. We don't lie, even if the truth isn't exciting.
  4. EA doesn't actually need giant walls of statistics to market itself. As you noted, the Singer/Lindauer argument doesn't discuss statistics or prioritization. But it still does a good job of making one of the central arguments of EA, in a way that can be extended to many other causes. Even if this particular empathetic approach wouldn't work as well for longtermist orgs, pretty much every EA org is driven by a simple moral intuition that can be approached in a simple way, as Singer/Lindauer did with "what if it were your child?"

-----

> if we ultimately want people to act, how much should we prize accuracy vs. entertainment value?

Maybe the answer is to be vague enough that what we present is accurate, or can at least be interpreted as accurate, while still inviting elaboration through which a deeper understanding of EA concepts can develop. We can also be entertaining in person, by creating interesting or captivating dynamics. This can motivate people to internalize our broad objectives and to seek out their own evidence as they develop their own perspectives and strategies. On this approach, there is no inconsistency between ends and means.
