
TL;DR: There are currently more than 100 open EA-aligned tech jobs, including roles in software engineering, ML engineering, data science, product management, and UI/UX design.

Thanks to Vaidehi Agarwalla, Yonatan Cale, Patrick Gruban and David Mears for feedback on drafts of this post.

EA’s tech needs in 2018

Flashback to EAG London 2018: having just come out of a web development bootcamp, I got the impression that the only use for software engineers within the EA community was earning to give. There were very few software engineering jobs at EA organizations, and they didn’t seem particularly exciting. As one conference participant put it to me: “There are only so many EA orgs for which websites must be built”. Sure, there was AI safety research, but your “standard” web developer did not have the required skill set.

EA’s software needs in 2022

Fast forward 3.5 years, and this has changed, probably in part because EA funding has grown significantly: more and more tech jobs are advertised every day. The diversity of jobs has also increased, with roles ranging from app and web development to data science, ML engineering, and product management, and spanning a range of seniority levels.

Some examples:

  • Anthropic is looking for a senior software engineer to build large-scale ML systems for AI alignment work.
  • Momentum is hiring a frontend engineer to help build their donation app.
  • Our World in Data is searching for a Head of Product and Design.
  • IDinsight has open positions for junior data scientists and engineers.

At the most recent EAG London, I was talking about this with someone working at a big tech company, and they asked me: “But how many EA-aligned tech jobs are there? Surely not 100?”, to which I replied without thinking much: “No, 100 sounds about right!”

Are there really more than 100 open EA-aligned tech jobs?

Curious whether I was right, I compiled this list, and, lo and behold, my estimate was pretty much spot on: I found 115 openings for EA-aligned tech jobs, at least if we interpret the relevant terms as follows.

What I counted as a tech job

I counted the following roles as tech jobs:

  • Software engineer/developer
  • Data scientist
  • Data engineer
  • IT admin
  • Product manager
  • UI/UX designer
  • Research engineer
  • Machine Learning engineer

What I didn't count as a tech job

I did not include AI research jobs, like this one at DeepMind. Under some interpretations of the term “tech” you might want to include such jobs, but the target audience for this post is people working in “ordinary” industry jobs that don’t require a PhD in computer science. I have met many EAs who have a data/software background but aren’t drawn to AI research work because they aren’t qualified for it. I did, however, include any tech jobs from the above list at organizations working on AI alignment, such as this one at OpenAI.

I excluded internships, as calling these “jobs” seems somewhat misleading.

What did I count as EA-aligned?

As EA-aligned, I considered jobs that satisfy any of the following conditions:

[EDIT] To avoid confusion, it is worth pointing out that some of the jobs on the 80k job board are listed there not because they are directly impactful, but because they help people build up career capital to be impactful later. As Niel Bowerman from 80k writes in a comment on this post: "I'm keen for people to take these "career capital" roles so that in the future they can contribute more directly to making the development of powerful AI systems go well." So "EA-aligned" shouldn't be read as "directly impactful towards EA goals".

 [EDIT 2] Some have argued in the comments that working on AI capabilities research, e.g., for OpenAI or DeepMind, would be actively harmful. 

There is probably much more to come

I did not reach out to the organizations to check whether all the listed positions are still open. But even if some of them have already closed and the ads just haven’t been taken down, it is reasonable to expect lots more jobs to be posted soon, given that the FTX Foundation is expected to give out at least $100m in grants this year and has made its first funding decisions in recent weeks.

Does EA have an impactful tech job that would fit you?

There are lots of EAs with tech skills. As a proxy: of the roughly 1250 public profiles on the EA Hub that contain more information than just the user’s name, roughly 200 list “software engineering” as a skill. Extrapolating this to an estimated community size of 7400 active EAs in 2021 (based on the EA survey), and subtracting the 31% of students in said survey, we should expect there to have been more than 800 active EAs who are professionals with software engineering skills at that time. (Granted, these skills are quite different from UX or product management skills, so I am not saying that the talent pool for those skills is large within EA.)
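
For transparency, here is that extrapolation spelled out (a minimal sketch; like the estimate above, it assumes the EA Hub profiles are representative of the wider community):

```python
# Back-of-the-envelope extrapolation using the figures cited above.
ea_hub_profiles = 1250        # public EA Hub profiles with more than just a name
profiles_with_swe = 200       # of those, profiles listing "software engineering"
active_eas_2021 = 7400        # estimated active community size (EA survey)
student_share = 0.31          # share of students in the EA survey

swe_share = profiles_with_swe / ea_hub_profiles    # = 0.16
swe_professionals = active_eas_2021 * swe_share * (1 - student_share)
print(round(swe_professionals))                    # 817, i.e. "more than 800"
```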

Despite this, it has been noted that EA organizations don’t find it easy to hire tech talent. So, if you work in industry as a software engineer, UX designer, data scientist, or product manager, consider applying for an EA-aligned tech job. My sense from having run many events in the EA tech community is that the average member would like to use their skills to do direct work, but underestimates how many jobs there are.

Where to find EA-aligned tech jobs

  • For current openings (as of 26 April 2022), see this list of jobs I created. I do not intend to update it regularly, but feel free to make suggestions for any jobs I might have missed by adding a comment.
  • The 80k job board, filter by “Engineering”
  • The Software, Data, and Tech Effective Altruism Facebook group
  • Attend EAG(x) conferences. Some organizations present at career fairs and are looking for people for roles they haven’t posted yet.
Comments

(Not necessarily a criticism of this post, but) I want to note that some (maybe 20%?) of these roles seem probably net-negative to me, and I think there are big differences in effectiveness between the rest.

Maybe I'm wrong, but make sure to think carefully about finding a job that has a big positive impact, not just getting a job at a (more or less) EA-aligned organization!

I agree with this. Please don't work in AI capabilities research, and in particular don't work in labs directly trying to build AGI (e.g. OpenAI or DeepMind). There are few jobs that cause as much harm, and historically the EA community has already caused great harm here. (There are some arguments that people can make the processes at those organizations safer, but I've only heard negative things about people in non-safety-related jobs who tried to do this, and I don't currently think you will have much success changing organizations like that from a ground-level engineering role.)

Do you think this is the case for DeepMind's and OpenAI's safety teams as well, or does this only apply to non-safety roles within these organisations?

I don't think this is true for the safety teams at DeepMind, but I think it was true for some of the safety team at OpenAI, though not all of it (I don't know what the current safety team at OpenAI is like, since most of it left for Anthropic).

Thanks for sharing. It seems like the most informed people in AI safety have strongly changed their views on the impact of OpenAI and DeepMind compared to only a few years ago. Most notably, I was surprised to see ~all of the OpenAI safety team leave for Anthropic. This shift and the reasoning behind it have been fairly opaque to me, although I try to keep up to date. Clearly there are risks in publicly criticizing these important organizations, but I'd be really interested to hear more about this update from anybody who understands it.

Thanks for the comment and for further clarifying OP's point. This is an important perspective, and I have edited the post to refer to your comment.
Would you like to share a link to some discussion of this, for those who would like to read more about it?

Which roles specifically seem net-negative to you?

Not OP, but I'm guessing it's at least unclear for the non-safety positions at OpenAI that are listed, though it depends a lot on what a person would do in those positions. (I think they are not necessarily good "by default", so the people working in these positions would have to be more careful/more proactive to make them positive. I still think they could be great.) The same goes for many similar positions on the sheet, but I'm pointing out OpenAI since a lot of roles there are listed. For some of the roles, I don't know enough about the org to judge.

Thanks. I don't have a personal opinion on this, but I've adapted the list to show which of the OpenAI positions were listed on the 80k job board and which were not. I would point out that 80k lists OpenAI as an org they recommend.

Manifold Markets (my startup) is also EA-aligned and hiring! http://bit.ly/manifold-jobs, or reach out to me over the EA Forum or by email (austin@manifold.markets).

As a snapshot of the landscape 1 year on, post-FTX:

80,000 Hours lists 62 roles under the skill sets 'software engineering' (50) and 'information security' (18) when I use the filter to exclude 'career development' roles. (The per-skill counts add up to more than 62, presumably because some roles are tagged with both skill sets.)

This sounds like a wealth of roles, but note that the great majority (45) are in AI (global health and development is a distant second, at 6), and the great majority are in the Bay Area (35; London is second, with 5).

Of course, this isn't a perfectly fair test, as I just did the quick thing of using filters on the 80k job board rather than checking all the organisations as Sebastian did last year.

Thanks for writing up this post. I'm excited to see more software engineers and other folks with tech backgrounds moving into impactful roles.

Part of my role involves leading the 80,000 Hours Job Board. In case it's helpful, I wanted to mention that I don't think of all of the roles on the job board as being directly impactful. Several tech roles are listed there primarily for career capital reasons, such as roles working on AI capabilities and cybersecurity. I'm keen for people to take these "career capital" roles so that in the future they can contribute more directly to making the development of powerful AI systems go well.

Could 80,000 Hours make it clear on their job board which roles they think are valuable only for career capital and aren't directly impactful? It could just involve adding a quick boilerplate statement to the job details, such as:

Relevant problem area: AI safety & policy

Wondering why we’ve listed this role?

We think this role could be a great way to develop relevant career capital, although other opportunities would be better for directly making an impact.

Perhaps this suggestion is unworkable for various reasons. But I think it's easy for people to assume that, since a job is listed on the 80,000 Hours job board and seems to have some connection to social impact, it must be a great way to make an impact. It's already tempting enough for people to work on AGI capabilities as long as it's "safe". And when the job description says "OpenAI […] is often perceived as one of the leading organisations working on the development of beneficial AGI," the takeaway for readers is likely that any role there is a great way to positively shape the development of AI.

What are your thoughts on Habryka's comment here?

Please don't work in AI capabilities research, and in particular don't work in labs directly trying to build AGI (e.g. OpenAI or DeepMind). There are few jobs that cause as much harm, and historically the EA community has already caused great harm here. (There are some arguments that people can make the processes at those organizations safer, but I've only heard negative things about people in non-safety-related jobs who tried to do this, and I don't currently think you will have much success changing organizations like that from a ground-level engineering role.)

The 80,000 Hours career review "China-related AI safety and governance paths" (80000hours.org) recommends working in regular AI labs and trying to build up the field of AI safety there. But how would one actually try to pivot a given company in a more safety-oriented direction?

Thanks, Niel, I probably should have been more explicit about this. I've added a paragraph to make this clearer.

How much do these roles pay? If they do not pay salaries at the same level that someone with those skills could make elsewhere, that means that top talent might not go to these roles unless they happen to be effective altruists, which seems unlikely. I've read a lot about how EA is no longer money constrained, so if these roles do not pay very well, is that actually true?

It varies, but many of them pay very well, comparable to or better than elsewhere. On the other hand, people reading this post are likely EAs, so the point that they might underpay somewhat seems less relevant. And in some cases, goal alignment actually matters tremendously, and paying less than market rate is strategically a fantastic way to filter for that. 

And in roles where both high pay and alignment are needed, I'd probably prefer to see something like "We will pay 75% of market rate, and will additionally donate a marginal 50% of your salary to an effective charity of your choice," rather than paying market rate.
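
For concreteness, here is one reading of that scheme (a hypothetical sketch; the $100k market rate is made up, and "a marginal 50% of your salary" is interpreted here as a donation of half the paid salary on top of it, which may not be exactly what was intended):

```python
# Hypothetical illustration of the proposed pay-plus-donation scheme.
market_rate = 100_000                  # assumed market-rate salary (made up)

salary = 0.75 * market_rate            # employee is paid 75% of market rate
donation = 0.50 * salary               # org donates an extra 50% of that salary
total_outlay = salary + donation       # what the org spends in total

print(salary, donation, total_outlay)  # 75000.0 37500.0 112500.0
```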

Great post! The "list of EA-aligned organizations" seems very useful. Do you know if it is linked to some more official EA source, like the CEA or 80k websites? And do you know how up to date it is?
At first glance it seems like core EA information that is just hanging on a random Notion page, but I am probably wrong.

Thanks!
As far as I'm aware, the list of orgs is not linked to any official EA source; it's a wiki that anyone can update. If you're looking for something more official, the list of organizations on 80k's job board might be better, but it's missing some orgs that are clearly EA-aligned, e.g., the national regranting orgs.

Yep, the recommended orgs list on the 80,000 Hours Job Board (and the job board itself) is certainly not aiming to be comprehensive.

Thanks for the info!
