N.B. You may be interested to read our follow-up post: Many EA orgs say they place a lot of financial value on their previous hire. What does that mean, if anything? And why aren't they hiring faster?
--
For the third year running we've surveyed leadership at EA organisations about a range of issues where their views might be relevant to EAs' career decisions. This survey complements the 2018 EA Survey, which aims to collect information about everyone who "can, however loosely, be described as an effective altruist."
We asked leaders about:
- what skills and experience they most need;
- what skills and experience they think the community as a whole will need in the future;
- how much in donations they'd be willing to forgo for their latest hires;
- their view on the relative cost-effectiveness of the different EA Funds, and which new funds they'd like to see;
- how urgent their need for extra donations and staff is;
- and various other issues.
We also surveyed people who identify as members of the EA community and work directly on problems like animal welfare and poverty, to see how their views on some of these questions would compare.
Here are some of the findings:
- EA organisation leaders said that people with operations or management experience, and generalist researchers, are what their organisations will most need over the next five years.
- They said the community as a whole will most need more government and policy experts, people with operations experience, machine learning/AI technical expertise, and skilled managers.
- Most EA organisations continue to feel more 'talent constrained' than funding constrained, rating themselves as 2.8/4 talent constrained and 1.5/4 funding constrained.
- Leaders thought the key bottleneck for the community is converting moderately engaged members into more dedicated people (e.g. people who work at EA orgs, do research in AI safety/biosecurity/economics, or earn to give over $1m). The second biggest bottleneck is increasing the impact of existing dedicated people through, e.g., better research, coordination, and decision-making.
- We asked leaders their views on the relative cost-effectiveness of donations to four funds operated by the community. The median view was that the Long-Term Future fund was twice as cost-effective as the EA Community fund, which in turn was 10 times as cost-effective as the Animal Welfare fund and 20 times as cost-effective as the Global Health and Development fund. Individual views varied very widely, though 18/28 respondents thought the Long-Term Future fund was the most effective. (The sketch after this list puts these ratios on a common scale.)
- In addition, we asked several community members working directly on animal welfare and global development for their views on the same question. About half of these staff thought the fund in their own cause area was best, and about half thought either the EA Community fund or the Long-Term Future fund was best. The median respondent in this group rated the Long-Term Future and EA Community funds as equally cost-effective, the Animal Welfare fund as about 33% more cost-effective than either, and the Global Health and Development fund as about 33% as cost-effective as either. Views within this group also varied widely.
- The organisations surveyed were usually willing to forgo over a million dollars in additional donations to get the right person in a senior role three years earlier, or several hundred thousand dollars for a junior hire.
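To make these medians easier to compare, here is a minimal sketch in Python that puts each group's answers on a common scale (taking the reported median ratios at face value; nothing here is an endorsed estimate):

```python
# Normalise the median cost-effectiveness ratios reported above to a
# common scale. These are just the survey medians taken at face
# value, not endorsed estimates.

# Leaders' medians: LTF = 2 x Community; Community = 10 x Animal
# Welfare and 20 x Global Health and Development.
leaders = {"Global Health and Development": 1.0}
leaders["EA Community"] = 20 * leaders["Global Health and Development"]
leaders["Animal Welfare"] = leaders["EA Community"] / 10
leaders["Long-Term Future"] = 2 * leaders["EA Community"]

# Direct workers' medians: LTF = Community; Animal ~ 1.33 x LTF;
# Global Health ~ 0.33 x LTF.
direct = {"Long-Term Future": 1.0}
direct["EA Community"] = direct["Long-Term Future"]
direct["Animal Welfare"] = 4 / 3 * direct["Long-Term Future"]
direct["Global Health and Development"] = direct["Long-Term Future"] / 3

for group_name, ratings in [("leaders", leaders), ("direct workers", direct)]:
    for fund, score in sorted(ratings.items(), key=lambda kv: -kv[1]):
        print(f"{group_name:14} {fund}: {score:.2f}")
```

Taken at face value, the leaders' medians imply a 40x spread between the most and least cost-effective funds, while the direct workers' medians imply a spread of only about 4x.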
Continue reading for details of the method and results...
Most answers were similar to what we found in 2017, so next year we expect to either ask different questions or interview a smaller number of people in greater depth and see whether their responses change after further reflection.
Continuing on the EA talent paradox (“EA orgs need talent but many EAs can’t get hired at EA orgs”), I’m confused why 80,000 Hours is continuing to bemoan earning to give. I get that if someone could be an FHI superstar or earn to give at $50K/yr they should go join FHI and I get that there are many awesome career paths outside of EA orgs and outside ETG that should be explored. Maybe in the past ETG was too much of an easy auto-default and we want to pressure people to consider more of their options. But ETG is an easy auto-default for a reason and I wouldn’t be surprised if it turned out that ETG is genuinely the highest impact option for >50% of the population of people who are EA enough to, e.g., fill out the EA Survey!
It seems pretty discouraging to EAs to make them feel bad about what is genuinely a really great option. I think we may have overcorrected against ETG, and it may be time to bring it back as a very valid option among the top career paths, rather than "only for people who can donate $1M/yr or more" or "the auto-default for everyone".
~
Edited to add: it looks like 80K actually promotes ETG in the way I recommend - see https://80000hours.org/articles/high-impact-careers/#5-otherwise-earn-to-give - but I don't think this is communicated very clearly outside that section of that article. In general, I get the sense that ETG has become depressing and low-status in EA when it was once high-status, and I'd like to see that trend reversed at least somewhat.
Hi Peter,
It sounds like you mostly agree with our take on earning to give in the high impact careers article. That article is fairly new but it will become one of the central pages on the site after a forthcoming re-organisation. Let us know if there are other articles on the site you think are inconsistent with that take - we can take a look and potentially bring them into line.
We agree with you that earning to give can be a genuinely great option and don’t intend to demoralize people who choose that path. As we write in that article, we believe that “any graduate in a high income country can have a significant impact” by earning to give.
That said, we do stand by our recommendation that most people who might be a good fit to eventually enter one of our priority paths should initially pursue one of those paths over earning to give (while maintaining a back-up option). Those paths have higher upside, so it's worth testing out your potential, while bearing in mind that they might not work out.
Many of the best options on these paths require substantial career capital, so often this won't mean starting a direct impact job today. Instead, we think many readers should consider acquiring career capital that can open up these paths, including graduate school in relevant disciplines (e.g. AI/ML, policy, or international relations), entry-level policy jobs (e.g. as a Congressional staffer), or working as an early employee at a startup to gain skills and experience in operations. We hope to release an article discussing our updated views on career capital soon.
Of course, these paths aren’t a good fit for everyone, and we continue to believe that earning to give can be a great option for many.
It’s also worth emphasizing that our advice is, of course, influenced by our views on the highest priority problems. We tried to make that clear in “high impact careers” by including a section on how our recommendations would change if someone is focused on global health or factory farming. In that case, we believe “earning to give, for-profit work and advocacy become much more attractive.”
I'd really like to hear more about other EA orgs' experiences with hiring staff. I've certainly had no problem finding junior staff for Rethink Priorities, Rethink Charity, or Charity Science (note: Rethink Priorities is part of Rethink Charity, but both are entirely separate from Charity Science)… and so far we've had enough strong senior staff applications that we're still turning down really strong applicants we would otherwise love to hire.
I personally feel much more funding constrained / management capacity constrained / team culture “don’t grow too quickly” constrained than I feel “I need more talented applicants” constrained. I definitely don’t feel a need to trade away hundreds of thousands or millions of dollars in donations to get a good hire and I’m surprised that 80K/CEA has been flagging this issue for years now. …And experiences like this one suggest to me that I might not be alone in this regard.
So…
1.) Am I just less picky? (possible)
2.) Am I better at attracting the stronger applicants? (doubtful)
3.) Am I mistaken about the quality of our applicants such that they’re actually lower than they appear? (possible but doubtful)
Maybe my differences in cause prioritization (not overwhelmingly prioritizing the long-term future but still giving it a lot of credence) contribute to getting a different and stronger applicant pool? …But how precise a cause alignment do you need from hires, especially in ops, as long as people are broadly on board?
I’m confused.
Something about this phrasing made me feel a bit 'off' when I first read this comment, like I'd just missed something important, but it took me a few days to pin down what it was.
I think this phrasing implicitly handles replaceability significantly differently to how I think the core orgs conventionally handle it. To illustrate with arbitrary numbers, let's say you have two candidates A and B for a position at your org; you think A would generate $500k a year of 'value' after accounting for all costs, while B would generate $400k.
Level 0 thinking suggests that A applying to your org made the world $100k per year better off; if A would otherwise earn to give at $50k/year they should take the job, but if they would otherwise EtG at $150k/year they should do that instead.
Level 0 thinking misses the fact that when A gets the job, B can go and do something else valuable. Right now I think the typical implicit level 1 assumption is that B will go and do something almost as valuable as the $400k, and so A should treat working for you as generating close to $500k value for the world, not $100k, since they free up a valuable individual.
In this world and with level 1 assumptions, your org doesn't want to trade away any more than $100k to get A into the applicant pool, but the world should be willing to trade $500k to get A into the pool. So there can be a large disparity between 'what EA orgs should recommend as a group' and 'what your org is willing to trade to get more talented applicants', without any actual conflict or disagreement over values or pool quality, in the style of your (1) / (2) / (3).
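To make the two framings concrete, here's a minimal sketch using the arbitrary numbers above (the redeploy_fraction knob is purely illustrative, not anything from the survey):

```python
# Toy model of the two replaceability framings, using the arbitrary
# numbers above. Nothing here is a real estimate.

value_A = 500_000  # A's annual 'value' if hired, net of all costs
value_B = 400_000  # B's annual 'value' if hired instead

# Level 0: assume B's alternative is worth nothing, so A taking the
# job only adds the gap between the two candidates.
level0 = value_A - value_B  # $100k/year

# Level 1: assume B redeploys at (nearly) full value elsewhere, so
# A taking the job adds close to A's full value.
redeploy_fraction = 1.0  # lower this to express scepticism that B
                         # finds comparably valuable work
level1 = value_A - (1 - redeploy_fraction) * value_B

print(f"Level 0 value of A taking the job: ${level0:,.0f}/year")
print(f"Level 1 value of A taking the job: ${level1:,.0f}/year")
```

Sliding redeploy_fraction down from 1 towards 0 is exactly the update I describe below: the more saturated the applicant pool, the closer the level 1 number falls back towards the level 0 one.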
That being said, I notice that I'm a lot less sold on the level 1 assumptions than I used to be. I hadn't noticed that I now feel very differently than I did, say, 24 months ago until I focused on the question to write this reply, so I'm not sure exactly what has changed my mind, but I think it's what I perceive as a (much) higher level of EA unemployment or under-employment. Where I used to find the default assumption of B going and doing something almost as directly valuable credible, I now assign high (>50%) probability that B will either end up unemployed for a significant period of time, or end up 'keeping the day job' and basically earning to give for some much lower amount than the numbers EA orgs generally talk about. I feel like there's a large pool of standing applicants for junior posts already out there, and adding to the pool now is only worth the difference in quality between the person added and the existing marginal person, not the full amount as it was when the pool was much smaller.
How much this matters in practice obviously depends on what fraction of someone's total value to an org is 'excess' value relative to the next marginal hire, but my opinion, based on private information about just who is in the 'can't get a junior job at an EA org' pool, is that this pool is pretty high quality right now, so I'm by default sceptical that adding another person to it is hugely valuable. Which is a much more precise disagreement with the prevailing consensus than I previously had, so thanks for pointing me in a direction that helped me refine my thoughts here!
You're welcome!
I agree here and think this is a very important concept that you put well. Unfortunately, it seems like in practice a next-best candidate who just barely doesn't make the cut risks becoming a perennial EA job applicant who just barely misses the cut in a bunch of places.
To me, this seems like a standard issue of EA unemployment or underemployment that could be analyzed like any other labor market, comparing the supply of EA jobs with the supply of potential EA workers. The "level 1" assumption that people who just barely don't make it will find roughly equally valuable employment elsewhere doesn't fully account for the additional costs of a prolonged search, the risk of getting discouraged, etc.
I'd love to try to use the 2019 EA Survey to analyze EA unemployment / underemployment and see if this is amenable to more analysis.
-
Based at least on my recent hiring for Rethink Priorities, I can definitely confirm this is true, at least for us. We ended up completely overwhelmed with high-quality applicants beyond our wildest dreams. As a result we're dramatically scaling up as fast as we can to hire as many great applicants as we can responsibly, taking on a bunch of risk to do so. Even with all of that additional effort, we still had to reject numerous high-quality candidates that we would've otherwise loved to work with, if only we had more funding / management capacity / could grow the team even faster without overwhelming everyone.
Perhaps add a tl;dr?
AGB can correct me if I'm wrong, but I'd summarize it as:
EAs originally thought that replaceability meant your value is only equal to how much better you are than the next best applicant, rather than your total value.
Now EAs think your value is equal to your full value, as the person you "replace" would go on to produce their full value somewhere else. Replaceability thus isn't really the issue that we once thought it was.
However, when considering additional costs, underemployment factors, other market dynamics, etc., it looks like the EA employment market is very saturated and the next best applicant actually doesn't end up producing their full value somewhere else. So perhaps our original naive analysis of replaceability ends up being closer to the truth.
I agree with this summary. Thanks Peter, and sorry for the wordiness Milan; that comment ended up being more of a stream of consciousness than I'd intended.
TLYCS's experience is very consistent with Peter's. Money is overwhelmingly the constraining factor; with more funding, we're confident we can get high-quality candidates.
Personally, I see large differences in the expected impact of potential new hires. I'm surprised you don't, especially at the startup stage, and am not sure what's going on there. I would guess you should be more picky for some of the reasons listed in Rob's post.
I also feel very constrained by management capacity etc. This drives the value of past hires up even further, which is what the survey was about (as also in Rob's post).
I do see large differences in the expected impact of potential new hires, but I see a lot of hires who would be net positive additions (even after accounting for all the various obvious costs enumerated by Rob), and I even had to unfortunately turn away a few people I think would have been enormously net positive.
We're not constrained by management capacity but we will be soon.
We've written a new post in part to address this question: Many EA orgs say they place a lot of financial value on their previous hire. What does that mean, if anything? And why aren't they hiring faster?
I feel like part of the definition of "talented applicant" is that they don't stretch your management capacity, don't mess up your culture, etc. For example, if there was someone who had volunteered at Rethink for a while, you had a lot of trust in them, they knew your projects intimately and could hit the ground running etc., my guess is that you would value that person much more highly than someone who had "general" competency.
And the next level up would be candidates who not only don't stretch your management capacity or culture but actually add to it.
My experience is that there are lots of people who are good at research or programming or whatever but fewer who have those skills and can add value to the organization without subtracting from other limited resources.
One possibility is that because the EA organizations you hire for are focused on causes which also have a lot of representation in the non-profit sector outside of the EA movement, like global health and animal welfare, it's easier to attract talent which is both very skilled and very dedicated. Since a focus on the far future is more limited to EA and adjacent communities, there is just a smaller talent pool of both extremely skilled and dedicated potential employees to draw from.
Far-future-focused EA orgs could be constantly suffering from this problem of a limited talent pool, to the point they'd be willing to pay hundreds of thousands of dollars to find an extremely talented hire. In AI safety/alignment this wouldn't be weird, as AI researchers can easily command salaries of hundreds of thousands of dollars at companies like OpenAI or Google. But this should only apply to orgs like MIRI or maybe FHI, which are far from the only orgs 80k surveyed.
So the data seems to imply that leaders at EA orgs which already have a dozen staff would pay 20%+ of their budget for the next single marginal hire. It still doesn't make sense that, year after year, a lot of EA orgs apparently need talent so badly they'll spend money they don't have to get it.
We have been screening fairly selectively on having an EA mindset, though, so I'm not sure how much larger our pool is compared to other EA orgs. In fact, you could maybe argue the opposite -- given the prevalence of long-termism among the most involved EAs, it may be harder to convince them to work for us.
From my vantage point, though, their actions don't seem consistent with this view.
Yeah, I'm still left with more questions than answers.
Echoing David, I'm somewhat sceptical of the responses to "what skills and experience they think the community as a whole will need in the future". Does the answer refer to high impact opportunities in general in the world, or only to those mostly located at EA organisations?
I'm also not sure about the relevance to individual EAs' career decisions. I think implying it's relevant might be outright dangerous if this answer is built on the needs of jobs that are mostly located at EA organisations. From what I understand, EA organisations have recently seen a sharp increase not only in the number, but also in the quality of applications. That's great! But it's pretty unfortunate for people who took the arguments about 'talent constraints' seriously and focused their efforts on finding a job in the EA Community. They are now finding out that they may have poor prospects, even if they are very talented and competent.
There's no shortage of high impact opportunities outside EA organisations. But the EA Community lacks the knowledge to identify them and the resources to direct its talent there.
There are only a few dozen roles at EA orgs each year, never mind roles that are a good fit for an individual EA's skillset. Even looking only at the most talented people, there are more capable people than the EA Community is able to allocate among its own organizations. And this will only get worse: the EA Community is growing faster than the number of jobs at EA orgs.
If we don't have the knowledge and connections to allocate all our talent right now, that's unfortunate, but not necessarily a big problem as long as it's clearly communicated. What is a big problem is accidentally misleading people into thinking it's best to focus their career efforts mostly on EA orgs, instead of viewing them as a small sliver in a vast option space.
"Does the answer refer to high impact opportunities in general in the world"
That question is intended to look at the highest-impact jobs available in the world as a whole, in contrast with the organisations being surveyed. Given the top response was government and policy experts, I think people interpreted it correctly.
I've volunteered to submit a comment to the EA Forum from a couple of anonymous observers, which I believe deserves engagement.
The model this survey is based on implicitly creates something of an 'ideal EA': somebody young, quantitative, and elite, who has the means and opportunities to go to an elite university and the personality to hack very high-pressure jobs. In other words, it paints a picture of EA that is quite exclusive.
This strikes me as an odd statement to make, given that - so far - the two funds have essentially operated as the same fund and have given donations to the exact same organizations with the exact same stated purposes. That being said, I agree it’s reasonable to expect the grantmaking of the funds to diverge under the forthcoming new management and maybe this expectation is what is being priced in here.
I had written the same comment, but then deleted it once I found out that it wasn't quite as true as I thought it was. In Nick's writeup the grants come from different funds according to their purpose. (I had previously thought the most recent round of grants granted money to the exact same organisations.)
Ah, I see. There's overlap on 80K and CEA, but the long-term future fund goes to CFAR and MIRI, whereas the EA Community fund goes to Founders Pledge.
I don't know how others answered this question, but personally I wasn't comparing how good I thought the last grants were relative to each other (i.e. I wasn't comparing CFAR/MIRI to Founders Pledge), nor pricing in an expected changeover in grantmaker. I was thinking about something more like whether I preferred funding over the next 5 years to go to organisations focused on the far future vs community building, knowing that these might or might not converge. I'd expect a bunch of things to come up over that period that we don't yet know about (in the same way that BERI did a year or so ago).
Looking at this part -
"We did include more people from organisations focused on long-termism. It’s not clear what the right method is here, as organisations that are bigger and/or have more influence over the community ought to have more representation, but we think there’s room for disagreement with this decision."
I think one potential reason there are more people interested in EA working at LTF organisations is that EA and LTF are both relatively new ideas. Not many people are considering careers in these areas, so it is much easier for a community to found and staff the majority of organisations.
If global development had been ignored until 5 years ago, it's very likely most of the organisations in this area would be founded by people interested in EA, and they might be over represented in surveys like this.
There may be talent gaps in other cause areas (beyond development and animals) that are missed because they don't have leaders with EA backgrounds, but that doesn't mean those gaps should be underweighted.
It may be worth having a separate survey to gather opinions on talent gaps in priority areas, whether or not those areas are led by people involved in EA.
Tackling just one part of this:
"It may be worth having a separate survey trying to get opinions considering talent gaps in priority areas whether they are led by people involved in EA or not."
Ultimately our goal going forward is to make sure that we and our readers are highly informed about our priority paths (https://80000hours.org/articles/high-impact-careers/). Six of our ten current priority paths get direct coverage in this survey, while four do not.
I agree that in future we should consider conducting different surveys of other groups - including people who don't identify as part of the EA community - about opportunities they're aware of, in order to make sure we stay abreast of all the areas we recommend, rather than just those we are closest to.
I find the term 'operations' to be chunky and plausibly misleading in a survey such as this, where it might take on quite different meanings for different people and organizations depending on the specific needs it refers to (see, for example, this article: https://80000hours.org/articles/operations-management/).
Insofar as it is feasible, I would love to see it broken up into different parts, as it seems to me that it can refer to a lot of different things.
Interesting that the Long-Term Future Fund is thought of as the most cost-effective fund, even though the cause area is considered one of the least funding constrained. Sounds like there are still some pretty amazing opportunities for donations in that area!
I am curious about the finding that "government and policy experts" are perceived as a priority for the EA community as a whole, but not for individual organizations. The speculation in the report offers some scenarios as to what respondents might have meant by rating this highly, but I haven't seen comments here that address this open-ended question.
I comment as someone with a government and policy background who has been exploring the EA community with curiosity over the last year or so. I'm mid-career, currently looking at effective giving strategies, but trying to read more about policy roles within EA.
I think it's that none of the existing EA organizations would want to hire government and policy experts to their own orgs, but would very much like to see people with an EA approach working in government.
How much weight does 80,000 Hours give to these survey results relative to the other factors which together form 80k's career recommendations?
I ask because I'm not sure managers at EA organizations know what their focus area as a whole will need in the near future, and I think 80k might be able to exercise better independent judgement than the aggregate opinion of EA organization leaders. For example, there was an ops bottleneck in EA that is a lot better now. Orgs like 80k and CEA spotted this problem and drove operations talent to a variety of EA orgs. But I don't recall the other EA orgs which benefited from this push independently helping to solve the coordination problem in the first place.
In general, I'm impressed with 80k's more formal research. I imagine there might be pressure for 80k to give more weight to softer impressions like what different EA org managers think the EA movement needs. But I think 80k's career recommendations will remain better if they're built off a harder research methodology.
Hi Evan,
Responses to the survey do help to inform our advice but it’s only considered as one piece of data alongside all the other research we’ve done over the years. Our writeup of the survey results definitely shouldn’t be read as our all-things-considered view on any issue in particular.
Perhaps we could have made that clearer in the blog post but we hope that our frank discussion of the survey’s weaknesses and our doubts about many of the individual responses gives some sense of the overall weight we put on this particular source.
Oh, no, that all makes sense. I was just raising questions I had about the post as I came across them. But I guess I should've read the whole post first; I haven't finished it yet. Thanks.
What did the leaders of direct work organisations in the "additional survey" say about their discount rate for future donations? Were they lower than the discount rates reported by the leaders of non-direct organisations?