This is a Draft Amnesty Week draft.
Last year, someone who was considering projects related to diversity and inclusion in EA noted that one challenge was not knowing what had been tried before. I drafted this summary but never got it out the door.
The atmosphere around DEI interventions, in the US at least, is different than it was when I first drafted this. I'm not intending this as commentary on anything going on currently, but simply as finally publishing an old draft. The main update I've made is to add some demographic data that recently came out from the 2024 EA Survey.
I don’t mean this post as a claim that EAs have done all the right things. I mean it as a historical record so people can better gauge what’s been tried over time, what might be worth trying differently, and what gaps are unfilled.
I’m sure I’ve left out some efforts by mistake; sorry about that. Some efforts listed here aren’t nominally about diversity, but serve the purpose of supporting people who might feel marginalized in EA. I’m not including most projects that were explored / had some initial steps, but were never publicly carried out.
Organizational / program efforts
2015: CEA’s EA Outreach team has a summer intern focused on how to support demographic diversity in EA and particularly at conferences. An early version of the code of conduct for EA Global is written, and the series has a community contact person to address any problems that come up. My impression is that the conference organizers particularly want to avoid the phenomenon of “something upsetting happens at a conference, it’s nobody’s job to do anything about it, and then it’s hashed out on the internet,” like Elevatorgate. Later in 2015 I take on the role, which eventually evolves into the community health team.
2016: EAGx conference series launches, including in Hong Kong and Nairobi.
2016: EA Global makes a more serious effort to platform a more demographically diverse array of speakers than in its initial year. This continues over time.
2017-2022: Encompass, led by Aryenish Birdie, provides DEI services and advising in the farmed animal advocacy space. Several EA organizations get consulting services from Encompass.
Starting around 2018: meetups for identity / affinity groups at EAG and EAGx conferences, eventually including
- Women and nonbinary people
- People of color
- Socioeconomic diversity (for people who grew up lower-income, first in their family to attend university, etc)
- Meetups for different religious groups, or for religious people generally
Various times: People start online groups to connect people from identity groups:
- Women and nonbinary people in EA
- Underrepresented racial and ethnic groups in EA
- Parents in EA
- LGBT+ in EA
- Disabled and chronically ill people in EA
- Diversity and inclusion in EA
- EAs from immigrant backgrounds
- EA for Jews
- Buddhists in EA
- Muslims for Effective Altruism
- EA for Christians
Various times: Some EA groups develop a code of conduct.
Various times: Some EA groups have some kind of community contact person, either volunteer or sometimes paid. (Finland, Sweden, Denmark, Norway, Israel, London, NYC, Germany, France, and others.) This role aims to help address interpersonal or community problems that come up within the group.
2018 onward: CEA's groups team shares a folder of resources for EA groups, including community health guidance about welcomingness and handling problems that arise.
2019: EA Hub creates a website of resources for group organizers, including guidance on community health, which later evolves into this website.
2019: Sky Mayhew, who has a background in research on diversity interventions, joins the community health team at CEA. Her work explores projects around mentorship and ways to connect EAs who would otherwise feel isolated because of demographics.
2019: Magnify Mentoring (originally called WANBAM) is founded by Kathryn Mecrow-Flynn and Catherine Low, sparked by a discussion in a meetup for women and nonbinary people at EA Global. The goal is to provide support and mentorship for women, nonbinary people, and trans people in doing more good with their careers. It later pilots broader programs (see 2023 entry).
2020: EA Anywhere forms to connect people who aren’t geographically near other groups, don’t want to attend their local group, or can’t attend their local group for some reason (including physical mobility).
2020: several EA groups and orgs start explicitly encouraging diversity on applications for fellowships and other programs, e.g.: "We are committed to building a diverse applicant pool. There is some evidence suggesting that individuals from underrepresented groups tend to underestimate their abilities. [Group] does not want the application process to dissuade potential candidates. We strongly encourage interested candidates to apply regardless of gender, race, ethnicity, nationality, physical ability, educational background, socioeconomic status, etc."
Ongoing, ramping up in 2020: Will MacAskill is the main face of EA, but he makes an ongoing effort to recruit and mentor people who aren't white men to be spokespeople or public intellectuals in EA. He's pitched me on this at intervals since 2016. My perception is that it doesn't work as well as hoped; people are typically busy with their own work and/or don't want to be a public face of EA. In 2023 Will and his team start more actively connecting new spokespeople with journalists, op-ed opportunities, and podcasts; my perception is that this works somewhat better, although it still has limited impact.
Over time: Many EA groups put attention toward how to support group members from underrepresented groups, and how to have better numeric balance within groups. One example is EA Sweden's strategy of changing its communications, running workshops, and offering focused career coaching, which resulted in significant increases in women attending their national conference and joining as new members.
2021-2023: Open Philanthropy runs a scholarship for international students admitted to top undergraduate programs in the US or UK.
2022: Before the fall of FTX, CEA offers travel support to many EAG attendees, allowing more attendees to travel from low and middle income countries (and for lower-income people in high income countries). Travel grants drop when funding is scarcer post-FTX.
2023: GPI holds a fully-funded 4-day global priorities workshop for students from underrepresented groups.
By 2023 / 2024, EAGx conferences include Latin America, India, Warsaw, and the Philippines, plus a virtual conference accessible anywhere in the world.
2023: Magnify Mentoring runs a pilot aimed at underrepresented groups generally, including "people from low to middle income countries, people of color, people from low-income households, etc" aiming at high-impact altruism. [edited, see comment from KMF]
2024: Animal Advocacy Careers offers a series of hiring workshops with a focus on inclusivity.
2024: Launch of Athena, a research mentorship program for women in AI alignment research.
Efforts in hiring / staffing at EA orgs
2016 onward: Many EA orgs don’t originally have any written policy about parental leave. As in other industries, EA orgs begin to provide paid maternity leave (generous by US nonprofit standards, not necessarily by European standards) in an effort to be more successful at hiring and retaining women.
Ongoing over time: EA orgs think about how to improve diversity in hiring. Some practices (but not all) I’m aware of that EA-related orgs tried when recruiting for jobs, fellowships, etc:
- Asking staff and others to brainstorm specifically for candidates from underrepresented backgrounds, and doing personal outreach to these people
- Using language in job listings that doesn’t slant male
- Posting job openings on platforms aimed at groups underrepresented in EA, as well as on generic EA platforms
- Emphasis on trial tasks, to avoid over-weighting credentials or superficial characteristics
- Anonymizing the initial stages of applications (but several orgs note that this doesn’t seem to improve rates of people from underrepresented backgrounds making it into the pool).[1]
- Keeping hiring rounds open for longer until there are candidates from underrepresented groups who meet the hiring criteria (one org says they stopped doing this after it meant keeping rounds open for a long time while programs struggled without enough hires)
- Extra encouragement throughout the hiring process for candidates from underrepresented groups
For some organizations, these practices have yielded a more demographically diverse staff (overall, not necessarily in leadership). I’ve heard mixed opinions on the effects these policies have had on organizations and individuals.
As far as I understand, it's not legal in the US or UK to use race or gender as a criterion in a final hiring decision. There may be some exceptions for formal affirmative action plans in the US, or as a tiebreaker between equally-qualified candidates in the UK, but note that EA organizations tend to say there's a clear difference between the person they hired and their second-preference applicant, so the tiebreaker case rarely applies.
2017-2022: I carry out interview projects for 4 organizations with especially homogeneous staff, interviewing women* who were familiar with the org (people who had worked there, interned there, or had been offered jobs and decided not to work there) about what the org should consider changing. In most cases it's not clear to me that these led to useful changes. Some themes:
- It’s good if staff in the majority group make clear they value diversity and want a good experience for people in the minority group
- If ops staff aren’t treated well, and ops staff are disproportionately women, problems compound
- It's discouraging if women who aren’t ops staff get mistaken for visitors or ops staff
- Framing matters: “We invited you because of your demographics” is an insulting frame, vs. “We’re aware that bias might have led to overlooking people who don’t fit the usual mold”
- When there are few people of your demographic in a space, you feel you're sticking your neck out more if you speak up, and feel that your mistakes will reflect on your group
*In theory these would have focused on race too, but sadly I think something like 3 people of color worked at these combined 4 organizations at the time of the interviews. I’ve heard EA staff of color point out that diversity efforts often focus on gender because there are so few staff of color, ironically. In this case, the staff of color I interviewed didn’t have much to add about race-specific inclusivity on top of general inclusivity.
Other research / content / major discussions in the community
More here: EA Forum posts tagged diversity and inclusion
2011-2014: Early CEA / GWWC discussions about the gender skew in EA, and internal presentations by Bernadette Young and Carolina Flores on reducing it
2014: Several online discussions about how women are treated in EA, and how communities should handle concerns about unfair treatment.
2015: AGB's post EA Diversity: Unpacking Pandora's Box
2015 and 2016: some content at EA Globals about diversity from me and others; I don’t think the early content was very notable / useful.
2016: My post on making EA groups more welcoming
2015-2017: ACE is among the first EA orgs to write some pieces about social justice, especially racial equity
2016: Oxford and Cambridge UK EA/GWWC groups have committees focused on diversity (these committees / working groups come and go over time)
2017: Meeting about diversity and inclusion at EAG Boston, organized fairly spontaneously after a speaker raises the topic.
2017: Various workshops at EA Global on making EA groups more welcoming, similar to this post on the topic
2017: Georgia Ray’s EA Global talk and writeup about the research on diversity and team performance
2017: Kelly Witwicki’s Why & How to Make Progress on Diversity & Inclusion in EA and ensuing discussion
2016-2018: EA London collects data on attendance of their group by gender: “Women* are just as likely to attend one EA event as men. However, women are less likely to return to future EA events. Women and men are about equally likely to attend most learning-focused events, like talks and reading groups. Women are much less likely to attend socials and strategy meetings.” [*based on people’s names and gender presentation]
2018 onward: At first all EA groups are run by volunteers, but some groups start to get funding for organizer time. Within CEA, there’s attention to how much we should additionally weight groups that add geographic and demographic diversity. Kaleem's post An elephant in the community building room summarizes narrower and broader approaches. CEA strategy fluctuates between broader and narrower groups support over the years. Funding for groups in different areas rises and falls, based on changing CEA strategy, changing funding, and availability of strong candidates for community-building roles.
2019: The community health team starts to grow with the hire of Sky Mayhew, who previously did academic research on diversity interventions. Sky explores a lot of models for possible DEI work, and views mentorship as one of the most promising. More data collection within CEA begins, to track things like satisfaction with our programs across different demographic groups.
2019 onward: Multiple pieces on the dominance of the English language in EA
2019: EA London carries out focus groups on what women and men find appealing and offputting about the group
2019: Debate about a post in the Diversity and Inclusion in EA FB group, resulting in the writing of Making discussions in EA groups inclusive. Pushback from The importance of truth-oriented discussions in EA.
2019: CEA writes a stance on DEI
2019: Future of Life Institute profiles Women for the Future
2019: Vaidehi Agarwalla carries out a survey on ethnic diversity in EA
April 2020: 80,000 Hours post on Anonymous contributors answer: How should the effective altruism community think about diversity?
May 2020: Statistics on Racial demographics at longtermist organizations
June-August 2020: Racial justice becomes more salient, especially in the US. Debate within a lot of EA orgs and projects about whether and how to make some kind of statement about racial justice, as a lot of US institutions are doing at the time. At least among the Extremely Online parts of the community, there's a lot of worry about how EA is handling current events and debates. Some people have strong worries about EA not taking social justice seriously enough; others worry that cancel culture and poor epistemics will have bad effects in EA. There's an unusual number of online arguments about social justice, with some people leaving groups and platforms over it.
2020: Post and discussion on Geographic diversity in EA by AmAristizabal
2020 onwards: The EA Infrastructure Fund, Open Philanthropy, and Meta Charity Funders (2024-) provide funding for EA groups (in addition to CEA), resulting in a broader range of community building strategies among funders, and sometimes leading to locations getting funded organisers even if they hadn't been prioritized by CEA.
2020 / 2021: Heated discussion in an Effective Animal Advocacy online space about racism. ACE staff who were planning to speak at the CARE conference withdraw from the conference based partly on comments from a staff member at the sponsoring org. It seems possible that ACE will split from EA over this issue, especially after they say they will not engage further with discussion about it on the EA Forum. Criticism here. Later, under new leadership, ACE voices the intention to stay involved with EA.
2023: Examples of content not explicitly based on demographics, but that I expect to be especially relevant to people from underrepresented groups:
- Power dynamics between people in EA by Julia Wise
- My experience with imposter syndrome — and how to (partly) overcome it by Luisa Rodriguez
2022: EA career guide for people from LMICs by Surbhi B, Mo Putera, varun_agr, AmAristizabal
Late 2022 / early 2023: After fall of FTX, a period of internal and external scrutiny on EA. Some responses reflect on diversity or experiences of different groups, e.g. I’m a 22-year-old woman involved in Effective Altruism. I’m sad, disappointed, and scared.
January 2023: Nick Bostrom makes a statement about a racist email he wrote 20 years before. Community responses include:
- A personal response to Nick Bostrom's "Apology for an Old Email" by Habiba Banu
- My Thoughts on Bostrom's "Apology for an Old Email" by Cinera
February 2023: TIME article about cultural problems in EA including sexual misconduct, and Owen Cotton-Barratt identifies himself as one of the people described. Community responses include:
- Why I Spoke to TIME Magazine, and My Experience as a Female AI Researcher in Silicon Valley by Lucretia
- Share the burden by Emma Richter
- If you’d like to do something about sexual misconduct and don’t know what to do… by Habiba Banu
- EA Community Builders’ Commitment to Anti-Racism & Anti-Sexism
- Things that can make EA a better place for women by lilly
2023: CEA's community health team carries out an interview series and research project on gender-related experiences in EA, based on EAG feedback, groups surveys, some data from Rethink Priorities' EA Survey, and approximately 40 interviews that Charlotte conducted with women and non-binary people about their experiences. Not currently public, but shared in 2023 with many EA organisation staff, office managers, and group organisers to inform efforts for their spaces.
2023: EAGxRotterdam holds a brainstorming session to produce advice on what might make EA better for women.
2023: Giving What We Can and One for the World produce a guide for inclusive events. At some point, Giving What We Can publishes a code of conduct for its own events.
2024: Series on experiences of non-Western EAs by Yi-Yang
2024: Alex Rahl-Kaplan and Marieke de Visscher, as community building grantees, carry out preliminary research on evidence-based arguments for and against prioritizing diversity. (Not currently published)
2024: EA Asia retreat session about diversity and EA in low and middle income countries
2024: EA Forum discussion about an event focused on prediction markets, which had significant overlap with the EA community in terms of people attending and speaking
- My experience at the controversial Manifest 2024 by Maniano
- Why so many “racists” at Manifest? by Austin
Demographic trends over time
Location
2020: The EA Survey finds that EAs live mostly in a few countries: “69% of respondents to the EA survey currently live in the same set of five high-income, western countries (the US, the UK, Germany, Australia, and Canada) that were most common in previous years.” But it’s changing a bit over time: “The percentage of respondents outside the top 5 countries has grown in recent years, from 22% in 2018, to 26% in 2019 and 31% in 2020.” “Overall satisfaction with the EA community is lower in the US and UK than in other regions and countries”, i.e. non-hub countries have higher satisfaction.
2019 - 2023: Areas with notable growth in EA include the Philippines and Latin America.
Race
The population filling out the EA Survey has become slightly less white over time.
But the newest cohort (people who got involved in EA during the last year as of the 2024 survey) is markedly more racially diverse, and more diverse than the newest cohort was in the 2022 survey.
Gender
2014-2024: EA Survey participant ratios have gone from about 75% male in early years to about 69% male over the last 6 years.
As with race, the cohort of people who got involved in 2024 is markedly more balanced. This might fade if retention is lower for women, but it's still a different pattern from the one among newcomers in the 2022 Survey. Last week's EA Survey post notes: “This pattern is compatible with an increased recruitment of women (and/or decreased recruitment of men) or disproportionate attrition of women over time, which we will assess in a future post.” I'm interested to see what they think might be happening here!
Gender balance varies by country, but it also varies quite a lot by year within the same country, which might reflect actual changes in EA in those countries but also might be statistical noise.
Political views
Note this post is mostly about demographic diversity, and isn't aiming to cover diversity of ideas or viewpoints. But I'll stray into that for a minute.
EA has been consistently left-leaning since data collection started in 2019. In 2024, 70% of EA Survey respondents were left or center-left, less than 5% were right or center-right, and the rest were libertarian or centrist. More detail here.
Age
EA has become less consistently young. From 2014 to 2024, median age moved from 25 to 31, and the age spread widened.
Experiences at events
2023: The EA Global team publishes info about race and gender demographics at EAG, along with attendee response data like how welcome attendees feel. Women and non-binary attendees report finding EA Global slightly less welcoming (4.46/5, compared to 4.56/5 for men). There is no statistically significant difference in feelings of welcomeness or overall recommendation scores across gender or race/ethnicity groups.
[1] On anonymizing applications:
- The evidence on anonymizing applications is mixed as to whether more or fewer women and minority candidates advance to the next round; evaluators in some settings are markedly more likely to move forward resumes they believe belong to women and racial minorities.
- Personally, I favor anonymizing applications in the early stages of EA hiring processes partly because organization staff are likely to know specific individuals applying. I feel better able to grade a trial task neutrally when I don’t know which applicant is which.
This is great stuff. I often find it hard to remember that a lot of initiatives have happened (despite having read 80% of this list already), so this timeline is a good reference.
As an aside, I think others may benefit from reading about diversity initiatives outside EA, to remember that this is a hard problem. It's totally consistent for EA to be above the curve on this and still not move the needle much (directionally I think both of those things are true, but I'm not confident about magnitudes), so I'm linking some stuff I've been reading lately:
"2023: Magnify Mentoring expands to serve people from underrepresented groups generally. “It includes, but is not limited to, people from low to middle income countries, people of color, people from low-income households, etc.” - The intention here was to pilot a round for people from underrepresented groups not captured by gender. We haven't reached consensus as to whether we will continue. It depends mostly on the impact asssessment of the round (which concludes this month). While it is accurate to say Magnify initially focused on improving diversity and inclusivity in the EA community, the organization's strategy is now focused on supporting the careers and wellbeing broadly of people who are working in evidence-based initiaves with or without an EA link. I mention this mostly because I don't want people to self-select out of applying for mentorship or mentoring with us.
Thanks for the correction! I've adjusted the entries; do let me know if there's anything still not right.
I think the Encompass link is expired.
Thanks, I've changed it to an article about them.