
Although EA is trying to do the most good possible, it's likely that we've overlooked or missed some of the best opportunities to do so. I'm trying to think of examples of when this has happened, in order to better understand our blindspots as a community. As a fictitious example, if we're continually overlooking media opportunities that then have a sweeping cultural effect on society, this is worth adjusting for to make better decisions in the future.

A more fleshed-out but not quite perfect example might be EAs not supporting Extinction Rebellion during its incubation phase in 2018. This might not be a great example for a couple of reasons:

  1. Whilst I think there's some evidence that Extinction Rebellion has been quite cost-effective in reducing carbon emissions, it's also not conclusive by any means. [1]
  2. There wasn't much EA grantmaking, if any, on climate-related issues in 2018, which might actually be a larger blindspot of its own, given the increased interest in climate risks now.
  3. Extinction Rebellion still ended up happening and being quite successful, so the counterfactual of EA not supporting this was seemingly okay (although we have no idea about how successful it could have been with greater support early on).

Ignoring these points and assuming this example was much stronger, the case would be something like: Extinction Rebellion had significant positive impacts on reducing carbon emissions, potentially outperforming other grantees of EA funding working on climate issues, yet we failed to identify this opportunity a priori. Not only did we fail to identify it, but other charitable organisations did identify it, which indicates they had some information or connection that the EA movement was lacking. So the question becomes: why did we overlook this opportunity, and how do we stop it happening again?[2]

One consideration to keep in mind is that if we can think of great things that happened without the EA movement, then maybe that's fine, as counterfactually our support wasn't needed for these interventions.[3] So the more interesting question might be: what could have been extremely impactful in doing good but failed to take off at all, where the EA movement could have changed this?

 

I've generally been framing this around specific interventions within a cause area that we may have missed, e.g. interventions within AI risk or Animal Welfare that we overlooked but which could have been extremely impactful. However, this could also be true for cause areas, such as EAs updating towards being more concerned about climate change in the past few years, whilst some organisations were somewhat ahead of the curve relative to us (albeit potentially for different reasons to our interests in climate tail risks). This question might be more like: Were some organisations or institutions ahead of the curve in terms of cause prioritisation relative to EA, and why do we think this was?

  1. ^

    Plus I worked for them for 2.5 years so there's probably a case of motivated reasoning.

  2. ^

    In the specific case of Extinction Rebellion, my somewhat uninformed prior is that as a movement, EA is generally more focused on technocratic theories of change, which focuses on persuading or working with elites, rather than more democratic or people-powered approaches.

  3. ^

    However, there's also the case that we could have scaled them up further or initiated them earlier, which is obviously also good.

7 Answers

I'm continually haunted by this Wired article on the early WHO and CDC guidance on COVID:
https://www.wired.com/story/the-teeny-tiny-scientific-screwup-that-helped-covid-kill/

CDC and WHO downplayed the importance of masks and airborne transmission in the first few months of the pandemic despite evidence of airborne spread and promising data from other countries showing masks reduced transmission.
https://en.wikipedia.org/wiki/Face_masks_during_the_COVID-19_pandemic_in_the_United_States
It took them over a year to officially acknowledge and recommend serious countermeasures to aerosol transmission, by which point we already had the vaccine.

There are a few ways EA could have helped: 1) funding work 5-10 years ago to help public health organizations more quickly and accurately identify high-risk viruses and disease vectors, 2) making sure there were a range of different professionals, including aerosol scientists, engineers, and primary care providers, on the key committees at public health organizations that gave official guidance, and 3) supporting a precautionary public health messaging blitz at the first evidence of aerosol transfer, before the strictest lockdown procedures were lifted.

Open Philanthropy is the primary (only?) major funder of biosecurity preparedness in the EA community, and gave ~$65 million to the cause area pre-pandemic, $20 million of which went to the Johns Hopkins Center for Health Security (CHS) in September 2019, which in hindsight was very apt timing.

It's hard to fix dysfunctional institutions by giving them more money. Even if you give them money with a clear purpose like "add the right experts to deciding bodies", they might screw up hiring people or do something else poorly.

EA missed:

  • EA community building.
    It might seem odd to say that EA missed EA community building – but even until 2016/17 there was minimal or no support or funding for community building. That is about 4-5 years from EA being a thing to EA community building being widely accepted as valuable. When I talked to people, such as senior CEA staff, about it back at EAG Oxford in 2015, the key question seemed to be: should EA risk doing outreach to any significant number of people, or just build a very narrow, small community of super talented people? To get an EA coordinator in London in 2016 we had to fundraise lots of small donations from individuals in the London EA community. It is now fairly broadly accepted that funding local and especially university outreach is valuable.
     
  • Policy careers / longtermist policy work.
    I might be wrong, but it was 2016/17 before 80,000 Hours began recommending policy careers as an option that was neither earning to give nor direct work. And there was generally no funding for anyone trying to have an impact in the longtermist policy space (except for a one-off experiment by Open Phil to help found CSET) until maybe 2020 with the Survival and Flourishing Fund (SFF). Projects such as the APPG for Future Generations, CLTR, the Simon Institute, and various individual EAs in think tanks have all found that funders would not or could not evaluate them, or saw them as too risky. If you talked to most folk at EA orgs about this, they would say, "Oh, I don't know about policy, it sounds risky." I think this attitude is currently changing, with the SFF and FTX Future Fund maybe being more inclusive.

 

A key feature that makes both of these hard to evaluate as "missed" is that the ideas were not in any way unknown to the EA community, but the EA community was averse to taking action because it was risk-averse and saw these projects as too risky. This may or may not have been the right decision. Given the EA community has not collapsed into infighting or been rocked by scandal (as many similar communities have been), I am tempted to say that EA orgs' aversion to risk has been justified.

The lesson I would like EA folk to take away from this is that if we don't want to miss things going forward (and I think we are currently missing many high-impact things), then we should have a much better understanding of how projects can pose risks, what those risks look like, how they can be managed, and so on.

Personal take:

I strongly(?) agree with the high-level texture of both of these points. The first point seems especially egregious ex post. Though I wouldn't frame your timing quite the same way: it feels like 2016(?)-2019(?) was more dead re: community building than either before or after.

For a while, a) many EA orgs didn't believe in scale, and b) entrepreneurship was underemphasized in EA advice (so creating new orgs didn't happen as often as it could), which didn't help.

I feel like most of the years I've been in EA have been in "keep EA small" mode, and "don't do irreve...

I noticed that many prizes / awards / contests have been announced on this Forum in the last six months - possibly more than in the rest of the last 5 years. I wonder why we didn't do this before - the case for this sort of competition has been made a number of times in the last few years, and the EA Forum Prize has been around for many years. So we knew that doing this was generally useful.
One of the possible reasons is that funding was a problem back then... but except for a few contests that actually involve substantial amounts of money (e.g., the blog prize), most awards are relatively cheap in comparison to most grants, and their potential benefits usually surpass their monetary value - they provide incentives and help elicit information. So it wouldn't have been a budgetary problem if, 5 years ago, we had assigned about US$50k for the best ideas / posts / theses, etc.

One of the areas EA has hardly touched is investing.

Whether it is optimally investing capital now to be able to maximise the amount of grants when the time is right (e.g. longtermist grants), or investing capital to create impact now (investments that create a net positive impact at a lower cost per unit of impact than EA grants), EA thinking is not yet applied in these areas.

The philanthropic capital which EA influences is highly impactful, but is still very limited in quantity, being only a small fraction of overall philanthropic capital. And philanthropic capital is only a very small fraction of total capital. 

If EA thinking can start to be applied to investing, the amount of capital used to create impact (although it will generally be less impactful than EA grants) could potentially raise the overall impact of EA by orders of magnitude.


 

EA thinking has been applied to these questions. Founders Pledge has long and, IMO, very good reports on Investing to Give and Impact Investing.

(Disclaimer: I used to work at FP, though I didn't work on either of these reports.)

rboogaard
This is indeed a start, but it only scratches the surface of a very important and potentially very impactful area.

Thanks. I think this can be quite useful.

Were some organisations or institutions ahead of the curve in terms of cause prioritisation relative to EA, and why do we think this was?


It might be useful to distinguish cases where an area / project was overlooked and later proved to be impactful because:

a) someone else worked on it (which suggests it wasn't neglected at all);

b) something changed: there wasn't enough information back then, or it was too far away from a tipping point (like, maybe, climate change and biorisks?);

c) nobody actually considered the project;

d) we now disagree with the former reasoning for disregarding the project - e.g., I (and I believe many others) have changed my mind in the last few years on the effectiveness of working / investing in advocacy, lobbying, politics, etc. I believe I already had enough information to think this way in 2016 (and possibly way earlier).
I'd be more interested in cases of (c) and (d) above - though, of course, these explanations will likely overlap a lot.

An analyst I follow called Peter Zeihan managed to predict the Ukraine war and the impact it would have on global energy, food, and fertiliser markets. He's now predicting global famine and civil unrest considerably worse than the Arab Spring before the end of the year, potentially including the collapse of China.

If this is true, there's a wide range of projects that need to be considered for making agriculture less fertiliser-dependent, and most projects aimed at, e.g., saving lives from tropical disease are small fry by comparison.

Ramiro
I agree - this sounds a bit like a "weak" version of the case for ALLFED, right? The IGM Forum kind of agrees with this prediction, but less intensely. On the other hand, I wonder what the marginal impact of an EA project on that would be. I don't think this is neglected - I have recently read many journalists voicing these concerns, and I see some people in food systems concerned with similar problems in global supply chains... but then perhaps we could start discussing whether neglectedness is still a useful metric when we deal with world-level problems.
Morgan Allen
I personally don't think journalists have been voicing these concerns in anything like a manner proportionate to the risk of megadeath (compared with something like, say, climate change, which has become a perennial font of public hysteria even though its impact on quality of life will likely be minimal for most areas of the globe). Climate change's effects on desertification and rising sea levels could plausibly either directly or indirectly kill tens of millions over the coming century, which isn't exactly good news, but... given our global population size, tens of millions could statistically die from various causes over the course of a century and the average person will probably never notice, especially when most of those deaths will occur in ecologically marginal areas where life was never easy to begin with. If Zeihan's analysis is correct, then hundreds of millions could plausibly perish from war and famine just over the coming decade. If so, "keep global trade in fertiliser inputs cheap and safe" would have to be considered a project of overwhelming importance for rational altruists.

I think Extinction Rebellion is having a positive impact, although it might have been a mistake for EA to fund it because it might have damaged our credibility in other areas. Not everything has to be done as part of EA.

I am baffled as to what 'net positive impact' Extinction Rebellion is supposed to have had, given they tarnish the reputation of the environmental movement as a whole with vastly exaggerated projections of doom, actively block the pursuit of non-renewable energy solutions like natural gas, carbon capture, and nuclear, and promote an anti-natalist hysteria which is only going to make other economic and social problems worse over the coming decades.

Chris Leong
I'm not very sure about the sign of the impact. I'd lean towards positive, but maybe I'd change my mind if I did a deeper dive. My main point was that not everything that is impactful needs to be done inside of EA.

"Not everything has to be done as part of EA" is exactly the right attitude imo

JamesÖz
I agree with this!  I guess my reasoning behind this post was that if EA is a movement that claims to do (impartial) good, and some other group does something great by our own metrics, how come we missed this? It seems like EA has a big mission of trying to do the most good, so surely we should always be looking for opportunities to do so?

My impression is that EA could do more to make existing philanthropy more effective.

There are many existing charities that process billions of dollars[^1] per year. Many of these do not focus on effectiveness or have only recently become interested. I believe that a lot of good could result from making these charities more effective at what they do, or slightly moving their cause area to one that has more proven benefits.

My feeling is that EA did not interact much with existing "classical" charities. Maybe there are differences in worldview that prevented this? For example, many existing charities are faith-based, whereas EA seems explicitly secular. I think it would be desirable to bridge these worldview gaps if it allows EA to leverage the existing resources and networks of classical charities.

Several classical charities that I know of have recently become interested in effectiveness (and efficiency) due to donors caring more about these values. This might be another way for EA to have a large effect: influence donors so that they demand more effectiveness from their charities of choice. Organizations like The Life You Can Save do this to some extent, but focus on a few existing good charities rather than expanding the scope to the big-but-not-necessarily-effective players.

Another way of achieving this goal might be to influence development spending of countries more strongly. I know several cases where countries give part of their development budget to classic charities (e.g., Helvetas in Switzerland, Brot für die Welt in Germany). EA might be able to exert more influence in this area, similar to what EAF did for Zurich.

[^1]: Sorry for the sloppy imprecision here... I hope that this post conveys my idea even without real numbers.

5 Comments

I'm not convinced that climate was a miss by effective altruism. It seemed much less neglected by more mainstream grantmakers, so I think it made sense for EA initially to focus on more neglected and more important issues.

I do think climate change should still be explored by EAs though to see if it can be competitive with global health and development opportunities. I'm currently much less convinced by the longtermist angle on climate change, though I still think that should be explored further in case I am wrong.

In terms of what is neglected, I have some thoughts, though not with enough confidence to make an actual "answer" to this question (instead just a comment):

  • I think traditional animal work (e.g., The Humane League) is still potentially underfunded by EA, especially compared to longtermism and global health/development.

  • I think there could be some smaller-scale opportunities in global health and development that are cost-effective per dollar but not as scalable, so they don't get picked up by scalability-focused grantmakers.

  • Work on invertebrates and more esoteric neglected animal work was really underinvested in prior to Rethink Priorities, but it seems like the issue right now is more finding talented people that want to work on the problem rather than money (that is, it seems really neglected by talent and thus currently has very minimal opportunities for spending).

Disclaimer: Just my personal opinion.

However, this could also be true for cause areas, such as EAs updating towards being more concerned about climate change in the past few years, whilst some organisations were somewhat ahead of the curve relative to us (albeit potentially for different reasons to our interests in climate tail risks)

Is this true? My impression is that the EAs who study climate have gotten less concerned about extreme climate risks in the last few years, while "lay EAs" are less interested than before in working on climate risks. 

I would sort of expect it to look like EA was becoming more concerned about climate change over time just because of movement growth and absorbing more marginal people causing a regression to the mean.

Good point  - my main rationale behind saying this was the increased number of organisations / roles within EA working on climate in the past few years, for example:

  • Founders Pledge starting in climate work around 2020 and now with a team of 3-4 (roughly)
  • Giving Green being incubated in 2020, now with a team of 6
  • Forethought doing work on climate risk (via John Halstead mainly, I think)
  • FHI now has someone working on climate
  • FTX Climate is now a thing
  • Rethink Priorities recently hired someone to work on climate within their global health and development team
  • Open Phil has introduced climate into their regranting challenge

Around 2018, I think there was comparatively much less activity in the EA climate world, so I took this as a sign that people must have updated in some way towards thinking this was a more important problem to work on. A point that I didn't mention, which might be true for Open Phil / Rethink, is that growing concern for how climate change will affect global health and development could be a big factor, rather than the extreme tail-risk scenarios.

It could be that both are true: as EA grows and professionalises, it is able to put more organised resources towards areas that EA as a whole is less concerned about.
