Cat🔸

Comments

On a practical level, I don't necessarily disagree with anything you're saying in the first two paragraphs. I tried to address some of this in my conclusion, and I don't think anything in the "main" argument (the benefits missions provide that EA is currently missing) is incompatible with keeping the abstract "doing good" as the core EA thing (in which case it just becomes a semantic question of how we define missions).

As for your last paragraph, I argue for a cultural shift because I've personally seen a lot of people who resonate with EA intellectually but not emotionally (like here). This is fine when they have an easy transition into a high-impact role and there are less abstract things they can feel emotional about, but a lot of people don't "survive" that transition. They are aligned on principles, but EA is a really different community and movement that takes time to get used to. The current EA community seems to select not only for people who share values, but also for people who share personality traits. I think that's bad.

(I do like the subculture idea, and it was something I was thinking about as I wrote the post! I think that's 100% a viable path too.)

On a more speculative level, the people I see drifting away strongly tend to be the people who have support networks outside of EA, rather than the people who are more reliant on EA for their social and emotional needs. I'm sure some version of this trend exists for every movement, but I somewhat believe it is larger for EA. This post gets at one of the reasons I have personally hypothesized to be behind it: EA feels cold to a lot of people in a way that's difficult to describe, and humans are emotions-driven at their core. Whatever the reason, selecting for members who are socially and emotionally reliant on the movement seems like a recipe for disaster.

I'm considering writing a post on why it's hard for some people who intellectually agree with EA foundations to be emotionally passionate about EA (and really about "doing good" in general). This is mostly based on my experience as a university group organiser, my tendency to be drawn to EA-lite people who end up leaving the community, and the fact that I am not very passionate about EA myself. The very fuzzy TL;DR is that caring about cause prioritisation requires tolerating high levels of uncertainty, but the average person needs to see concrete steps to take, and how their contribution helps people, in order to feel the fervour that propels them into action. This is doubly true for people who are not surrounded by EAs. To combat this, I argue for one actionable item and one broader, more abstract ideal. The action item is a visual, easily digestible EA roadmap that links broader cause areas with specific things people and orgs are doing. Ideally, the roadmap would read almost like a set of "business pitches" to attract new employees, explaining the pain points, the solutions being tried, and how people can get involved. The broader ideal I want to advocate for is for the EA philosophy to be principles-based, but for day-to-day EA to be missions-based (which I view as different from being cause-area-oriented).

It's all just vibes in my head right now, but I'd be curious to know whether people would want to see interviews, surveys, or any other sort of data to back up what I'm saying.