This is a Draft Amnesty Week draft. I wrote this very quickly for internal publication at CEA. It's generally kinda embarrassing: oversimplified narratives from my memory of events I was poorly informed about at the time. I'm hoping to slip this in during the Draft Amnesty Festival without too much embarrassment.
Commenting and feedback guidelines: Corrections, differing memories, and differing structures of narratives are all welcome, though I may not reply.
Talent gaps
Epistemic status: Even more so than the rest, the narrative got away from me here, and I might be overstating.
You probably know "talent gaps" from all the discussion of the ways the concept is maybe misapplied. Fine. But I want to tell you why the original post was a landmark moment and the single most important example of steering in the community's history.
It’s fall 2015. JP is just getting involved in EA. Very few of the people currently at CEA have even heard of it yet. It’s only been two years since the first EA Summit, and EA is still very nascent. If you asked a bright college-aged EA what their career plan was, nine times out of ten the answer would be earning to give. That’s broadly a mistake. Some of those college students will have heard that GiveWell is spinning out its basically-unknown GiveWell Labs initiative into a joint venture with Good Ventures called the Open Philanthropy Project, but most people haven’t yet realized the implications.
Ben Todd, however, is paying attention. He sees tiny EA organizations struggling to hire even as they drive a small but growing movement, and he sees the emergence of new cause areas like GCRs, which have painfully few shovel-ready projects. He also has some data. It will be two years before the first talent gap survey, but 80k is already asking hiring managers how much they’d pay for their most recent hires, and getting numbers that far outstrip average E2G potential.
Ben is in a good position. He’s more engaged with the overall EA landscape than almost anyone has been before or will be since. And he’s a strong writer at the helm of one of the most widely read and respected EA bullhorns. All he has to do is write a post.
Within a short period of time, young JP hears the post discussed at Stanford EA. The meme spreads fast, but humans are creatures of the status quo. It takes a lot of repeating before JP, or most of the people at Stanford EA, change their career plans. The message might seem too strongly stated now, but even a year after the piece was published it was still mostly under-heard.
But over time, the EA community did change. By 2018, EAG applications were filled with ambitious object-level career plans, and many existing E2Gers had shifted their whole careers. It’s hard to know the counterfactual impact: a ~$10B foundation in your space is hard to ignore forever, but 80k’s push seems to me to have been a big influence on how fast the transition happened.
Operations push 2018
In 2017, EA orgs were all feeling a dearth of ops staff. As an example, Tara, while CEO, was spending a significant part of her time doing CEA ops (which then meant ops for the whole legal entity; there was no EV yet) because there was insufficient capacity without her.
80k and CEA both decided to pitch the community on considering operations work. CEA hosted talks on ops at EAG, and 80k published a post in March, followed by two podcasts in the summer. CEA held an Operations Forum in May 2018.
By 2019, the qualifications of operations applicants had gone from “1 year of semi-relevant experience” to actually having finance experience, and CEA’s audits went from “all-hands-on-deck overnighters” to “nobody but the ops team paid attention.”
Wait… Effective Altruism could be a thing??
[Before my time] The Centre for Effective Altruism was supposed to be an umbrella organization, like EV is now; 80k and GWWC were the public-facing brands. There were existing communities, but they were kinda separate strands in different locales that didn’t view themselves as being part of the same “thing.” Slowly, people realized that the name Effective Altruism was a good rallying point in memespace, and that people were actually attracted to that idea, maybe even more so than to careers advice or effectively donating 10%.
The use of EA as a flagpole mostly started outside the Oxford sphere that set so much of it in motion. Kerry Vaughan, CEA USA, .impact, and (yes) Leverage Research get a lot of the initial credit.
This was non-obvious, and those agentic people did a lot to collectively steer the EA community into what it had become by the time I found a THINK-seeded EA group.
The creation of EA spaces
[Stub] This is the sort of thing that CEA now has down pat, but that wasn’t always the case. It was CEA’s vision, and its position as a leader in the community, that allowed us to take on the space-creating projects that we have.
More case studies that I will leave for now
- EA Survey
- OP at some point pushed for GWWC to spin out of CEA and exist on its own
Themes
It’s hard to convey now just how non-obvious these insights were, and how much they needed pushing by someone with context and vision.
Your narrative talks about the movement switching from earning to give to being career-focused. I think that has huge survivorship bias in it. There are now many GWWC pledgers who would not call themselves EAs. As the movement became bigger, the career-focused side began to dominate the discourse, because there’s a lot more to say if you are career-focused and trying to coordinate things than if you are head-down earning money.
Thanks for writing and posting this!
I think it's important to say this because people often over-update on the pushback to things they hear about, since the second-order effects are visible, while not noticing that the counterfactual is the thing in question not happening at all, which far outweighs those real but typically comparatively minor problems.