ETA: Evidently, the FTX Future Fund is funding a talent-search project in the developing world to help address the diversity problem. What else can be done?

ETA#2: "Social justice is a matter of life and death. It affects
the way people live, their consequent chance of
illness, and their risk of premature death. We watch in
wonder as life expectancy and good health continue
to increase in parts of the world and in alarm as they
fail to improve in others. A girl born today can expect
to live for more than 80 years if she is born in some
countries – but less than 45 years if she is born in
others. Within countries there are dramatic differences
in health that are closely linked with degrees of social
disadvantage. Differences of this magnitude, within and
between countries, simply should never happen."  WHO Closing the Gap in a Generation

 

Taking the question literally, searching the term ‘social justice’ on the EA Forum reveals only 12 mentions: six within blog posts and six in comments. One full blog post supports it, three items even question its value, and the remainder are neutral or unclear on its value.

Social justice – as defined by Oxford Reference – is the “objective of creating a fair and equal society in which each individual matters, their rights are recognized and protected, and decisions are made in ways that are fair and honest”. The term ‘social justice’ has come to carry adversarial connotations, but for my purposes here, I am referencing the Oxford definition.

Being relatively new to EA (I had not heard of it before June 2021, although since then I have consumed all the books, and many of the podcasts, as well as the websites of each organization, and have benefitted from the career counselling), I believe I can still take an outsider’s perspective. As a newcomer to EA, I have nothing to gain or lose, and am only interested in improving what I see as a fundamentally good social movement.

When I discovered EA, I believed I had found my community. Coming from a generally utilitarian viewpoint, valuing efficiency and evidence, and wanting to make the world a better place, I was excited to finally find a school of thought which aligns with my own. However, the more I learn about the people of EA, the more I worry EA is another exclusive, powerful, elite community, which has somehow neglected diversity. The face of EA appears from the outside to be a collection of privileged, highly educated, primarily young, white men.

I don’t believe EA’s neglect of diversity has been deliberate. I just think we (I include myself because, apart from being female and mid-career, I otherwise identify with the EA stereotype) start in a place of such privilege as to have the bandwidth to concern ourselves with making the world a better place. People who live in impoverished countries, work three jobs to make ends meet, are food insecure, or are struggling against the glass ceiling of structural power do not have the time to consider improving the lives of others. Or if they do, they do not have the means by which to do so.

I hope to appeal to EA’s core identity of welcoming constructive criticism in asking EAs to reflect on social justice in the following spheres:
• The EA community is exclusive based on country of origin. Most leaders and founders seem to hail from the global north, more specifically, Europe and its colonies. As a white settler in Canada, I know the structural privilege of colonialism has afforded me success – my level of education, comfortable income, and respectable career – although I would like to believe it was entirely my talent and hard work that made me successful. As a woman, if I had been born in Afghanistan, I would be neither literate nor educated, let alone employed. We EAs start with the privilege of our country of birth.

• The EA community is exclusive in its decision-making. The EA organizations now manage billions of dollars, but the decisions, as far as I can tell, are made by only a handful of people. Money is power, and although the decisions might be carefully considered to do the most good, it is acutely unfair that this kind of power is held by an elite few. How can it be better distributed? What if every person in low-income countries were cash-transferred one year’s wage? It would give them a buffer to think about doing good better.

• The EA community is exclusive by level of education. I have seen much written about how EA considers itself merit-based; however, to be recognized for epistemic merit, one would need at least a post-secondary education to hold a reputable job. Most EA leaders seem to have at minimum graduate degrees, if not tenured professorships. Do high-school dropouts and blue-collar, working-class people have good ideas about doing the most good for the world? How would we know?

I have four questions:
1) Am I wrong in my stereotype of the average EA? I would be interested to learn if rank-and-file EAs are a more diverse group than those who present its public face. However, even if that is true, I suspect diversity within EA could still be improved.


2) Is my interest in greater diversity within EA misguided? If you think it is, please reply in the comments, rather than simply downvoting the post. 


3) If I am right about the lack of diversity in EA, is it something the EA community is working towards improving? I have seen EA Forum comments dismiss equity, diversity, and inclusion initiatives as not being merit-based, but as I argued above, the merit you are identifying is fostered by privilege: the privilege of education, of country of birth, of the ability to attend an elite university, etc. Therefore, selection is based not on merit but on privilege. Bright Malawian girls would achieve the same merit if provided with the same privilege.

4) If diversity should be increased, how can it be accomplished? Although I dislike the term “giving a voice to the voiceless”, because in practice we should be passing the mic, one idea is for all the EA decision-makers to have advisors from marginalized communities. Another idea, borrowed from the psychological concept of the “contact hypothesis”, is that EA leaders could go to marginalized people to get ideas: for example, at soup kitchens, in immigrant and Indigenous communities, in detention centers, or, in low-income countries, at refugee camps and in impoverished Global South villages.

I do not believe EA has been deliberately exclusionary, but I am concerned that if concerted efforts are not made to include marginalized people within EA and in the distribution of billions of dollars, it may become an echo chamber of the epistemic elite.
 

Comments

We could definitely do well to include more people in the movement. For what it's worth, though, EA's core cause areas could be considered among the most important and neglected social justice issues. The global poor, non-human animals, and future generations are all spectacularly neglected by mainstream society, but we (among others) have opted to help them.

You might be interested in the following essays:

Bryan Caplan makes a related remark in Rob Wiblin's interview with him:

Bryan Caplan: Oh yeah. I remember what I really wanted to say about EA, which is I’ve got a slogan. My slogan is “EA is what SJ ought to be.” So it’s the contrast between two groups: both very idealistic, both want to make the world a better place. But again, the way you make the world a better place is by, step one, calming down, realizing that you don’t know that much about the world, and then trying to figure it out. And along the way, be nice to other people, because maybe they have something to teach you. Even if a lot of what they have to say is wrong, just getting that kind of feedback is very helpful for learning more. You don’t want to alienate critics, because without critics, you’re just stuck in your own echo chamber.

Bryan Caplan: Social justice movements are really weak on all those things. You got the intentions, but in terms of having the right mindset for actually making the world a better place, SJ has the right mindset for fanatically making the world worse.

calming down

 

I thought this paper on ‘The Aptness of Anger’ was good pushback on that point.

I also thought this podcast on how racism can distract from bigger problems (like climate) was insightful on this point (related Op-ed).

Taking the question literally, searching the term ‘social justice’ on the EA Forum reveals only 12 mentions: six within blog posts and six in comments. One full blog post supports it, three items even question its value, and the remainder are neutral or unclear on its value.

That can't be right. I think what may have happened is that when you do a search, the results page initially shows you only 6 each of posts and comments, and you have to click on "next" to see the rest. If I keep clicking next until I get to the last pages of posts and comments, I can count 86 blog posts and 158 comments that mention "social justice", as of now.

BTW I find it interesting that you used the phrase "even question its value", since "even" is "used to emphasize something surprising or extreme". I would consider questioning the values of things to be pretty much the core of the EA philosophy...

You are right, I missed the "next" button. I did wonder why there was so little discussion on the forum about a fair and equal society. I believe you made the comment I found that questions its value.

FWIW, I'm not sure if you found it already, but I think this is the best piece I've seen written so far on the overlaps and differences between EA and SJ worldviews: What Makes Outreach to Progressives Hard

You are right, it is an excellent summary I had not found.

Strong upvote - this is a really great post and helped me understand the source of many disagreements between myself and my more social justice-oriented friends.

Taking the question literally, searching the term ‘social justice’ on the EA Forum reveals only 12 mentions: six within blog posts and six in comments...

I worry EA is another exclusive, powerful, elite community, which has somehow neglected diversity.

 

I think it's worth distinguishing discussions of "social justice" from discussions of "diversity." Diversity in EA has been much discussed, and there is also a whole facebook group dedicated to it. There has been less discussion of "social justice" in those terms, partly, I suspect, because it's not natural for utilitarians to describe things in terms of "justice", and partly because, as mentioned, the phrase "social justice" has acquired specific connotations. However, there has been extensive discussion of broader social justice related critiques of EA, largely under the banner of "systemic change."

Note: we also track demographic diversity in the (roughly) annual EA Survey.

It's worth mentioning that the diversity Facebook group is barely active, and that when EAs talk about systemic change of the social justice sort, they usually don't support pursuing it; rather, they support specific policy changes like criminal justice reform or tobacco taxes.

Summoning a benevolent AI god to remake the world for good is the real systemic change.

No, but seriously, I think a lot of the people who care about making processes that make the future good in important ways are actually focused on AI.

 

I don't see a contradiction between the two. I myself want to improve the future, and I'm studying machine learning and I think it has the potential to bring us to a post-scarcity society. But like every powerful technology, it has a gazillion ways it could go wrong and be disastrous. Even if it doesn't end the world, it won't work if used by those already in power to further their reach. Systemic change of the social justice kind is necessary for systemic change of the AI kind to be worth anything.

I think you raise important questions on a complex topic. Regarding the dearth of the literal phrase 'social justice', I think this is probably an effect of two things: 
1) an actual de-emphasis on the concept, either intentionally or by a larger relative focus on other ethical frameworks. My unfounded guess would be that there'd be a minority of people who would want to whole-heartedly reject social justice-related talk, a larger group who are open to it but consider it not as important as other ethical concepts to EA, and a minority of people who think it should be the dominant ethical concept. 

and 

2) an avoidance of the literal term 'social justice' because of its connotations in the 'culture wars', politics, etc. I think a lot of EA thought deals with the substance of social justice - examples might be the neartermist focus on low and middle-income countries, work on U.S. criminal justice reform,  the international justice angle on climate change, or even the intergenerational justice angle on longtermism. This work is grappling with justice-related issues implicitly, but isn't usually conceptualised or written out in terms of social justice. Whether it can or should be or not I don't know. 

I just linked to that too! I think about it all the time.

This is great! Also, I appreciate learning that you have been working hard on diversity :)

I have seen much written about how EA considers itself merit-based; however, to be recognized for epistemic merit, one would need at least a post-secondary education to hold a reputable job. Most EA leaders seem to have at minimum graduate degrees, if not tenured professorships.

One of the highest status EA leaders of all time is literally a high school dropout.

As a college dropout from the SF Bay Area EA/rationalist community where it's common for people at parties (including non-EA/rationalist parties) to brag about who dropped out of school earliest, I've never really grokked some people's impression that EA is highly credentialist.

If you're privileged in other ways, it's easier to get away with dropping out (or even use it as a countersignal). It's an intersectional issue.

Right. I'm also "literally a high school dropout" and am studying for my master's.

Yes, seems like clear self-selection of people who enjoy schooling.

Interesting! He is an outlier. I would be very interested to learn his story, if possible.

https://intelligence.org/2018/02/28/sam-harris-and-eliezer-yudkowsky/

Flipping the question around, we might also ask "where is the EA in social justice"? What has the social justice movement done to prioritize their efforts, to focus on cost-effectiveness, to ask how they can do the most good?

You could, and you should! e.g. why social justice movements don't give a voice to the Global South. But that's another discussion and does not answer the current one.

I think lack of diversity in EA is largely due to founder effects, and EA is working on this. There's an emerging effort to have EA outreach in more global south countries like India, the Philippines, and Mexico, and local EA community-builders are working hard on that.

For what it's worth, it seems to me that EA university groups have more racial and gender diversity than the broader EA movement, which I think is because they reach a broader base of people, compared to the type of people who randomly stumble across EA on the internet.

The EA community is exclusive by level of education. I have seen much written about how EA considers itself merit-based; however, to be recognized for epistemic merit, one would need at least a post-secondary education to hold a reputable job.

I'm not sure I agree. I think many EA orgs are anti-credentialist enough that you wouldn't need any university education if you have the skills, which could be built through, e.g., doing independent work funded by the Long-Term Future Fund. Actually, I think dropping out of college to do EA work is even more badass. Compared to academia, a good number of AI alignment researchers don't have graduate degrees, though it is true that many EA leaders have graduate degrees.

I think your assessment of the lack of diversity in EA is right, that this is a problem (we're missing out on talented people, coalition allies, specific knowledge, new ideas, wider perspectives, etc.), and that we need to work towards improving this situation. On all three (questions 1-3), see this statement from CEA. Thanks for raising this!

In terms of what we can be doing, being inclusive in hiring and pipeline-building seems very important. Open Philanthropy is among the best examples of good practice on this (see here), and Magnify Mentoring is doing awesome work.

Empowering marginalised/affected communities directly and working closely with them is one of the reasons GiveDirectly is great and is strongly supported by the EA community. This can't work so clearly with farmed animals and future people, of course.

Just to add to other links people have offered, I've always liked this on privilege, and this discussion:

GiveDirectly is great and is strongly supported by the EA community.

Theoretically, but GiveWell seems to prefer to keep money rather than give it out directly. There may or may not be good reasons for that, but it's not a strong message for direct empowerment of marginalised communities.

I agree with you that EA outreach to non-Western cultures is an important and probably neglected area — thank you for pointing that out! 

There are lots of reasons to make EA more geographically (and otherwise) diverse, and also some things to be careful about, given that different cultures tend to have different ethical standards and discussion norms. See this article about translation of EA into Mandarin. Something to observe is that outreach is very language and culture-specific. I generally think that international outreach is best done in a granular manner — not just “outreach to all non-Western cultures” or “outreach to all the underprivileged”. So I think it would be wonderful for someone to post about how to best approach outreach in Malawi, but that the content might be extremely different from writing about outreach in Nigeria. 

So: if you're interested in questions like this, I think it would be great if someone were to choose a more specific question and research it! (And I appreciate that your post points out a real gap.)

On a different note, I think that the discussion around your post would be more productive if you used other terms than “social justice.” Similarly, I think that the dearth of the phrase “social justice” on the EA Forum is not necessarily a sign of a lack of desire for equity and honesty. There are many things about the “social justice” movement that EAs have become wary of. For instance, my sense is that the conventional paradigm of the contemporary Western elite is largely based on false or unfalsifiable premises. I’d guess that this makes EAs suspicious when they hear “social justice” — just like they’re often wary about certain types of sociology research (things like “grit,” etc. which don’t replicate) or psychosexual dynamics and other bits of Freud’s now-debunked research.  

At the same time (just like with Freudianism), a lot of the core observations that the modern social justice paradigm makes are extremely true and extremely useful. It is profoundly obvious, both from statistics and from the anecdotal evidence of any woman, that pretty much every mixed-gender workplace has an unacceptable amount of harassment. There is abundant evidence that e.g. non-white Americans experience some level of racism, or at least are treated differently, in many situations.

Given this, here are some things that I think it would be useful to do:

  1. Make the experience of minorities within EA more comfortable and safe.
  2. Continue seriously investigating translating EA concepts to other cultural paradigms (or conversely, translating useful ideas from other cultural paradigms into EA). (See also this article.)
  3. Take some of the more concrete/actionable pieces of the social justice paradigm and analyze/ harmonize them with the more consequentialist/science-based EA philosophy (with the understanding that an honest analysis sometimes finds cherished ideas to be false).

I think the last item is definitely worth engaging with more, especially with people who understand and value the social justice paradigm. Props if you can make progress on this!

A very nitpicky comment, but maybe it does point towards something more general: "What if every person in low-income countries were cash-transferred one year’s wage?"

There is a lot of money in the EA space, but at most 5 percent of the sort of money that would be required for doing that (a quick Google of 'how many people live in low income countries' tells me there are 700 million people in countries with a per capita income below roughly 1,000 USD a year, so your suggestion would come with a 700 billion dollar bill; no individual, including Elon Musk or Jeff Bezos, has more than a quarter of that amount of money, and while very rich, the big EA funders are nowhere near that rich). Also, of course, GiveDirectly is actually giving people in low-income countries the equivalent of a year's wage to let them figure out what they want to do with the money, though of course they are operating at a small enough scale that it is affordable within the funding constraints of the community.
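To make the arithmetic explicit, here is a minimal back-of-envelope sketch in Python. All of the inputs are the rough figures from this comment (700 million people, roughly 1,000 USD per year, and available EA money being at most ~5% of the bill), not sourced estimates.

```python
# Back-of-envelope check of the figures above (all inputs are this
# comment's rough numbers, not sourced estimates).

population_low_income = 700_000_000  # people in countries below ~1,000 USD/year per capita
annual_wage_usd = 1_000              # rough "one year's wage" used in the proposal

total_bill = population_low_income * annual_wage_usd
print(f"One year's wage for everyone: ${total_bill / 1e9:.0f} billion")  # ~$700 billion

# The comment puts available EA money at no more than ~5% of that bill.
ea_funding_upper_bound = 0.05 * total_bill
print(f"Implied upper bound on EA funding: ${ea_funding_upper_bound / 1e9:.0f} billion")  # ~$35 billion
```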

I don't know; the on-topic thing that I would maybe say is that it is important to have a variety of people working in the community, people with a range of skills and experiences (i.e., we want to have some people who have an intuitive feel for big economic numbers and how they relate to each other, but it is not at all important for everyone, or even most people, to have that awareness). But at the same time, not everyone is in a place to be part of the analytic, research-oriented part of the EA community, and I simply don't think that decision-making will become better at achieving the values I care about if the decision-making process is spread out.

(But of course the counterpoint, which is true, is that decision-makers who ignore the voices of the people they claim to help often do more harm than good, and usually end up maximizing something they themselves care about.)

Also, and I'm not sure how relevant this is, but I think it is likely that part of the reason why X-risk is the area of the community that is closest to being fully funded is that it is the cause area people can care about for purely selfish reasons; i.e., spending enough on X-risk reduction is more of a coordination problem than an altruism problem.

Peter Singer has done the math, and it is possible to halve global hunger and extreme poverty with modest sums (it's an old article, but I just found it):

https://www.nytimes.com/2006/12/17/magazine/17charity.t.html

No individual, including Elon Musk or Jeff Bezos, has more than a quarter of that amount of money

But governments do. Which, while being about a hypothetical, does demonstrate a good reason for EA to try to transition from relying on individuals towards engaging governments.

This post calls out a lack of diversity in EA. Rather than being attributable to EA doing something wrong, I find these patterns mainly underline a basic fact about the type of people EA tends to attract. So, in a very general way, I don't find the post fair to EA and its structure.

I detect in the article an implicit, underlying view of the EA story as something like:

                'Person becoming EA -> World giving that person EA privileges'

But IMHO, this completely turns the real story upside down; I mostly see it as:

                'Privileged person -> becoming EA -> trying to put their resources/privileges to good use, e.g. to help the most underprivileged in the world',

whereby ‘privileged’ refers to the often slightly geeky, intellectual-ish, well-off person we often find particularly attracted to EA.

In light of this story, the over-representation of white dudes in EA organizations, relative to the overall global population, would be difficult to avoid in today's world, a bit like it would be difficult to avoid a concentration of high-testosterone males in a soccer league.

Of course, this does not deny that many biases exist everywhere in the selection process for higher ranks within EA, and these may be a true problem. Call them out specifically, and we have a starting point to work from. In EA, too, people tend to abuse power, and this is not easy to prevent. Again, any insight into how, specifically, to improve on this is welcome. Finally, the fact that skin color is associated with privileges worldwide may be a huge issue in itself, but I would not reproach 'EA' specifically for it. Certainly, EAs should also be interested in this topic if they find cost-effective measures to address it (although, to some degree, these potential measures face tough competition, simply because there is so much poverty and inequality in the world, absorbing a good part of EA's focus for reasons that are not all bad).

Examples of what I mean (emphasis added):

However, the more I learn about the people of EA, the more I worry EA is another exclusive, powerful, elite community, which has somehow neglected diversity. The face of EA appears from the outside to be a collection of privileged, highly educated, primarily young, white men.

Let's talk once you have useful info on whether they focus on the wrong things, rather than on whether they have the wrong skin colors. In my model, and in my observations, there is simply a bias in who feels attracted to EA, and as much as anyone here would love the average human to care about EA, that is sadly not the case (although in my experience, it is more that slightly geeky, young, logical, possibly well-off persons, rather than simply the "white men" you mention, like and join EA, and can and want to put resources towards it).

The EA organizations now manage billions of dollars, but the decisions, as far as I can tell, are made by only a handful of people. Money is power, and although the decisions might be carefully considered to do the most good, it is acutely unfair that this kind of power is held by an elite few. How can it be better distributed? What if every person in low-income countries were cash-transferred one year’s wage?

The link between the last bold part and the preceding bold parts surprises me. I see two possible readings:

a. 'The rich few elite EAs get the money, but instead we should take that money to support the poorest?' That would have to be answered by: These handful work with many many EAs or other careful employees, to try to figure out what causes to prioritize based on decent cost-benefit analysis, and they don't use this money for themselves (and indeed, at times, it seems like cash-transfers to the poorest show up among promising candidates for funding, but these still compete with other ways to try to help the poorest beings or those most at risk in the future).

b. 'Give all poorest some money, so some of these could become some of the "handful of people" with the power (to decide on the EA budget allocation)'. I don't know. Seems a bit a distorted view on the most pressing reason for alleviating the most severe poverty in the world.

While it might be easy to envy some famous persons in our domain, no one has decided 'oh, to whom could we give the big privilege of running the EA show'; instead there is a process, however imperfect, that tries to select some of the people who seem most effective, including for the higher-rank EA positions. And as many of the skills useful for this correlate with privileged education, I would not necessarily want to force more randomization or anything, other than through compelling, specific ways to avoid biases.

1) Am I wrong in my stereotype of the average EA?

Probably not. EA pulls from philosophy, computer science, economics, etc., which are themselves disproportionately white and male.

2) Is my interest in greater diversity within EA misguided?

No, it warrants some attention; the question is how much relative to directly working on EA causes.

3) If I am right about the lack of diversity in EA, is it something the EA community is working towards improving? [...]  Bright Malawian girls would achieve the same merit if provided with the same privilege.

Yes, see the links from the other commenters. As for getting Malawian girls to the forefront of EA, that's a long-term process that will take decades as their society develops (e.g., high-school completion is about 10x lower than in the US). We live in a deeply unfair point in history.

4) If diversity should be increased, how can it be accomplished?

One approach here.

I would love to see us scale up the number of EA hotels. Not only do I think this would be highly impactful, but I also suspect it would make it much easier for people who aren't as financially privileged to break into the movement, and it may also reduce some of the impact of credentialism. A natural result of this would be to benefit minorities, who tend to be of lower SES.
