I've been involved in EA for nearly a year now. At first, it was super exciting. I resonated so much with the core ideas of EA, and I couldn't wait to get started with doing the most good I possibly could. I had no idea there was so much opportunity.

As I got further into it, my hopes started to fade, and I started to feel like I didn't really fit in. EA is pitched to the super intelligent in our society: those who did super hard degrees at Oxford or Harvard and learned to code at age 8. As for me, I'm just average. I never stood out at school, I went to a mid-ranking university and studied sociology (which has a reputation for being an easy degree). I graduated, got an average job and am living an average life. I don't have some high-earning side hustle, and I don't spend my spare time researching how we can make sure AI is aligned with human values.

I do, however, care a lot about doing the most good, so I really want to fit in here; it matters a lot to me. I want to leave the world a better place. But I feel like I don't fit because, frankly, I'm not smart enough. (I'm not trying to be self-deprecating here; I feel like I'm probably pretty average among the general population, and I never really felt 'not smart enough' before getting involved in EA.)

I totally understand why EA aims at Oxford and Harvard graduates: of course we want the most intelligent people working on the world's most pressing problems.

But most people aren't Oxford or Harvard graduates. Most people aren't even university graduates. So do we have a place in EA?

I want to be a part of this community, so I'm trying to make it work. But this leads me to worry about a lot of other people like me who feel the same. They come across EA, get excited, only to find out that there's not really a place for them, and then they lose interest in the community. Even the idea of giving 10% of your salary can be hard to achieve if you're balancing the needs/wants of others in your family (who maybe aren't so EA-minded) and considering the current rise in the cost of living.

I'm guessing here, because I have absolutely no stats to back this up and it's based mostly on my anecdotal experience, but we could potentially be losing a lot of people who want to be a part of this but struggle to be, because EA is so narrowly targeted.

Whenever I come on the EA Forum I literally feel like my brain is going to explode at some of the stuff that is posted on here; I just don't understand it. And I'm not saying this stuff shouldn't be posted just because not everyone can comprehend it. These are really important topics, and of course we need smart people talking about them. But maybe we need to be aware that it can also be quite alienating to the average person who just wants to do good.

I don't have a solution to all this, but it's been on my mind for a while now. I re-watched this Intro to EA by Ajeya Cotra this morning, and it really re-invigorated my excitement about EA, so I thought I'd put this out there.

I'd be really keen to hear if anyone has any thoughts/feelings/ideas on this - I'm honestly not sure if I'm the only one who feels like this.


I know that lukeprog's comment is mostly replying to the insecurity about lack of credentials in the OP.  Still,  the most upvoted answer seems a bit ironic in the broader context of the question:

If you read the comment without knowing Luke, you might be like "Oh yeah, that sounds encouraging." Then you find out that he wrote an excellent 100+ page report on the neuroscience of consciousness, which is possibly the best resource on this on the internet, and you're like "Uff, I'm f***ed."

Luke is (tied with Brian Tomasik) the most genuinely modest person I know, so it makes sense that it seems to him like there's a big gap between him and even smarter people in the community. And there might be, maybe. But that only makes the whole situation even more intimidating.

It's a tough spot to be in and I only have advice that maybe helps make the situation tolerable, at least.

Related to the advice about Stoicism, I recommend viewing EA as a game with varying levels of difficulty. 

Because life isn’t fair, the level of difficulty of the video game will sometimes be “hard” or even “insane”, depending on the situation you’re in. The robot on the other hand would be playing on “e

... (read more)

FWIW, I wouldn't say I'm "dumb," but I dropped out of a University of Minnesota counseling psychology undergrad degree and have spent my entire "EA" career (at MIRI then Open Phil) working with people who are mostly very likely smarter than I am, and definitely better-credentialed. And I see plenty of posts on EA-related forums that require background knowledge or quantitative ability that I don't have, and I mostly just skip those.

Sometimes this makes me insecure, but mostly I've been able to just keep repeating to myself something like "Whatever, I'm excited about this idea of helping others as much as possible, I'm able to contribute in various ways despite not being able to understand half of what Paul Christiano says, and other EAs are generally friendly to me."

A couple things that have been helpful to me: comparative advantage and stoic philosophy.

At some point it would also be cool if there was some kind of regular EA webzine that published only stuff suitable for a general audience, like The Economist or Scientific American but for EA topics.

This is pretty funny because, to me, Luke (who I don't know and have never met) seems like one of the most intimidatingly smart EA people I know of.

Vox’s Future Perfect is pretty good for this!

Thanks for this comment. I really appreciate what you said about just being excited to help others as much as possible, rather than letting insecurities get the better of you.

Interesting that you mentioned the idea of an EA webzine because I have been toying with the idea of creating a blog that shares EA ideas in a way that would be accessible to lots of people. I’m definitely going to put some more thought into that idea.

Let me know if you decide to go ahead with the idea and I'll see how I can help 😀 

Joseph Lemien
That would be great! I'd love to see this. I consider myself fairly smart/well-read, but I don't think that I have the background or the quantitative skills to comprehend advanced topics. I would very much like to see content targeted at a general audience, the way that I can find books about the history of the earth or about astrophysics targeted at a general audience.
TomChivers
Re the webzine, I feel like Works in Progress covers a lot of what you're looking for (it's purportedly progress studies rather than EA, but the mindset is very similar and the topics overlap).

I'm really sorry that you and so many others have this experience in the EA community. I don't have anything particularly helpful or insightful to say -- the way you're feeling is understandable, and it really sucks :(

I just wanted to say I'm flattered and grateful that you found some inspiration in that intro talk I gave. These days I'm working on pretty esoteric things, and can feel unmoored from the simple and powerful motivations which brought me here in the first place -- it's touching and encouraging to get some evidence that I've had a tangible impact on people.

Thank you so much! I so appreciate this comment.

Your talk really is great. This weekend I’m facilitating my first introductory fellowship session and I’ve recommended it to those coming along because I think it’ll be great to inspire and get them interested in EA, like it did for me.

I'm going to be boring/annoying here and say some things that I think are fairly likely to be correct but may be undersaid in the other comments:

  • EAs on average are noticeably smarter than most of the general population.
  • Intelligence is an important component for doing good in the world.
  • The EA community is also set up in a way that amplifies this, relative to much of how the rest of the world operates.
  • Most people on average are reasonably well-calibrated about how smart they are.
    • (To be clear, exceptions certainly exist.) EDIT: This is false, see Max Daniel's comment.
  • If you're less smart than average for EAs (or less driven, or less altruistic, or less hardworking, or have less of a social safety net), then on average I'd expect you to be less good at having a positive impact than others.
  • But this is in relative terms; in absolute terms I think it's certainly possible to have a large impact still.
  • Our community is not (currently) set up well to accommodate the contributions of many people who don't check certain boxes, so I expect there to be more of an uphill battle for many such people.
    • I don't think this should dissuade you from the project of (effectively) doing good, but I understand and empathize if this makes you frustrated.

Most people on average are reasonably well-calibrated about how smart they are.

(I think you probably agree with most of what I say below and didn't intend to claim otherwise, reading your claim just made me notice and write out the following.)

Hmm, I would guess that people on average (with some notable pretty extreme outliers in both directions, e.g. in imposter syndrome on one hand and the grandiose variety of narcissistic personality disorder on the other hand, not to mention more drastic things like psychosis) are pretty calibrated about how their cognitive abilities compare to their peers but tend to be really bad at assessing how they compare to the general population because most high-income countries are quite stratified by intelligence. 

(E.g., if you have or are pursuing a college degree, ask yourself what fraction of people that you know well do not and will never have a college degree. Of course, having a college degree is not the same as being intelligent, and in fact as pointed out in other comments if you're reading this Forum you probably know, or have read content by, at least a couple of people who arguably are extremely intelligent but don't have a degree. But... (read more)

I think you're entirely right here. I basically take back what I said in that line. 

I think the thing I originally wanted to convey there is something like "people systematically overestimate effects like Dunning-Kruger and imposter syndrome," but I basically agree that most of the intuition I have is in pretty strongly range-restricted settings. I do basically think people are pretty poorly calibrated about where they are compared to the world. 

(I also think it's notably more likely that Olivia is above average than below average.)

Relatedly, I think social group stratification might explain some of the other comments to this post that I found surprising/tone-deaf. (e.g. the jump from "did a degree in sociology" to "you can be a sociologist in EA" felt surprising to me, as someone from a non-elite American college who casually tracks which jobs my non-STEM peers end up in). 

I think social group stratification might explain some of the other comments to this post that I found surprising/tone-deaf.

Yes, that's my guess as well.

This feels like it misses an important point. On the margin, maybe less intelligent people will have on average less of an individual impact. But given that there are far more people of average intelligence than people on the right tail of the IQ curve, if EA could tune its pitches more to people of average intelligence, it could reach a far greater audience and thereby have a larger summed impact. Right?

I think there's also a couple other assumptions in here that aren't obviously true. For one, it assumes a very individualistic model of impact; but it seems possible that the most impactful social movements come out of large-scale collective action, which necessarily requires involvement from broader swaths of the population. Also, I think the driving ideas in EA are not that complicated, and could be written in equally-rigorous ways that don't require being very smart to parse.

This comment upset me because I felt that Olivia's post was important and vulnerable, and, if I were Olivia, I would feel pushed away by this comment. But I'm rereading your comment and thinking now that you had better intentions than what I felt? Idk, I'm keeping this in here because the initial gut reaction feels valuable to name.

Linch
Thanks I appreciate this feedback.

Anyway, on a good day, I try to aim my internet comments on this Forum to be true, necessary, and kind. I don't always succeed, but I try my best.

This comment upset me because I felt that Olivia's post was important and vulnerable, and, if I were Olivia, I would feel pushed away by this comment. But I'm rereading your comment and thinking now that you had better intentions than what I felt? Idk, I'm keeping this in here because the initial gut reaction feels valuable to name.

I think realizing that different people have different capacities for impact is importantly true. I also think it's important and true to note that the EA community is less well set up to accommodate many people than other communities. I think what I said is also kinder to say, in the long run, than casual reassurances that make it harder for people to understand what's going on. I think most of the other comments do not come from an accurate model of what's most kind to Olivia (and onlookers) in the long run.

Is my comment necessary? I don't know. In one sense it clearly isn't (people can clearly go about their lives without reading what I said). But in another sense, I feel better about ... (read more)

I think realizing that different people have different capacities for impact is importantly true. I also think it's important and true to note that the EA community is less well set up to accommodate many people than other communities. I think what I said is also kinder to say, in the long run, than casual reassurances that make it harder for people to understand what's going on. I think most of the other comments do not come from an accurate model of what's most kind to Olivia (and onlookers) in the long run.

FWIW I strongly agree with this.

Sophia
Will we permanently have low capacity?  I think it is hard to grow fast and stay nuanced but I personally am optimistic about ending up as a large community in the long-run (not next year, but maybe next decade) and I think we can sow seeds that help with that (eg. by maybe making people feel glad that they interacted with the community even if they do end up deciding that they can, at least for now, find more joy and fulfillment elsewhere).
Max_Daniel
Good question! I'm pretty uncertain about the ideal growth rate and eventual size of "the EA community", in my mind this among the more important unresolved strategic questions (though I suspect it'll only become significantly action-relevant in a few years). In any case, by expressing my agreement with Linch, I didn't mean to rule out the possibility that in the future it may be easier for a wider range of people to have a good time interacting with the EA community. And I agree that in the meantime "making people feel glad that they interacted with the community even if they do end up deciding that they can, at least for now, find more joy and fulfillment elsewhere" is (in some cases) the right goal.
Sophia
Thanks 😊. Yeah, I've noticed that this is a big conversation right now.

My personal take: EA ideas are nuanced, and ideas do/should move quickly as the world changes and our information about it changes too. It is hard to move quickly with a very large group of people. However, the core bit of effective altruism, something like "help others as much as we can and change our minds when we're given a good reason to", does seem like an idea that has room for a much wider ecosystem than we have. I'm personally hopeful we'll get better at striking a balance.

I think it might be possible to both have a small group that is highly connected and dedicated (who maybe can move quickly) whilst also having many more adjacent people and groups that feel part of our wider team. Multiple groups co-existing means we can broadly be more inclusive, with communities that accommodate a very wide range of caring and curious people, where everyone who cares about the effective altruism project can feel they belong and can add value. At the same time, we can maybe still get the advantages of a smaller group, because smaller groups still exist too.

More elaboration (because I overthink everything 🤣): Organisations like GWWC do wonders for creating a version of effective altruism that is more accessible, distinct from the vibe of, say, the academic field of "global priorities research". I think it is probably worth it on the margin to invest a little more effort into the people who are sympathetic to the core effective altruism idea but might, for whatever reason, not find a full sense of meaning and belonging within the smaller group of people who are more intense and more weird.

I also think it might be helpful to put a tonne of thought into what community builders are supposed to be optimizing for. Exactly what that thing is, I'm not sure, but I feel like it hasn't quite been nailed just yet and lots of people are trying to move us closer to this from d
Sophia
There are also limited positions in organisations, as well as limited capacity of senior people to train up junior people, but, again, I'm optimistic that 1) this won't be so permanent and 2) we can work out how to better make sure that people who care deeply about effective altruism but have careers outside effective altruism organisations also feel like valued members of the community.
tn-77
I think it's important to define intelligence. Do we mean logic-based ability or a broader definition (emotional intelligence, spatial, etc.)? EAs are probably high in one category but low in others. Soon I'll need a computer to keep track of the socially awkward interactions I've had with EAs who seem to be mostly aligned with a certain technical domain! Others I talk to seem to have similar experiences.
Vilfredo's Ghost
Awkward is pretty mild as far as ways to be emotionally stupid go. If that's all you're running into, then EAs probably have higher than average emotional intelligence, but perhaps not as high in relative terms as their more classically defined intelligence.
The Parable of the Talents, especially the part starting at:

But I think the situation can also be somewhat rosier than that.

Ozy once told me that the law of comparative advantage was one of the most inspirational things they had ever read. This was sufficiently strange that I demanded an explanation.

Ozy said that it proves everyone can contribute. Even if you are worse than everyone else at everything, you can still participate in global trade and other people will pay you money. It may not be very much money, but it will be some, and it will be a measure of how your actions are making other people better off and they are grateful for your existence.

Might prove reassuring. Yes, EA has lots of very smart people, but those people exist in an ecosystem which almost everyone can contribute to. People do and should give kudos to those who do the object level work required to keep the attention of the geniuses on the parts of the problems which need them.

As some examples of helpful things available to you: 

  • Being an extra pair of hands at events
  • Asking someone who you think is aligned with your values and might have too much on their plate what you can help them with (if you actually have the bandwidth to follow through)
  • Making yourself available to on-board newcomers to the ideas in 1-on-1 conversations

I also want to chime in here and say that it was a bit of a shock for me coming into the EA community also: I was one of the more analytical people in most of my friendship groups, yet it was pretty quickly clear to me that my comparative advantage in this community was actually EQ, communications, and management. I'm glad to work with some incredibly smart analytical people who are kind enough to (a) help me understand things that confuse me when I'm frank about what I don't understand; and (b) remind me what else I bring to the table.

I think Luke needing to be reminded what he brings to the table is evidence that we're missing out on many extremely talented people who aren't 99.9th percentile on the one particular skillset that we overselect for.

As a counter-example, I am below average in many skills that people in my wider peer group have which, I believe, would be incredibly helpful to the effective altruism movement. However, I am good at a very narrow set of things that are easy to signal in conversation, which makes people in the EA community often think way more highly of me than, I believe, is rational.

I have found easy social acceptance in this community because I speak fluent mathematics. I have higher-IQ friends who, in high-trust conversations, are extremely epistemically humble and have a lot to contribute, but who I can't easily integrate into the effective altruism community.

I believe that part of what makes it hard to introduce people who aren't exceptionally analytical to effective altruism is that there seems to be a stronger prior that intelligence and competence are one-dimensional (or that all types of competence and intelligence are correlated) in a way there isn't so much th... (read more)

In short, if my model is correct, being a bit different from other people in the effective altruism community is evidence that you might have a comparative advantage (and maybe even an absolute advantage) within our community, and that you are paving the way for other people, who are less weird in the ways people in EA tend to be weird, to find belonging here.

I strongly believe that if you care deeply about others in an impartial way, carving out space for you is very much in the best interest of this community (and that if the EA community is a place you want to be, finding your place in it is going to help others feel like they, too, have a place). It is also fine for you to just do what's good for you, and if the EA community isn't healthy for you for whatever reason, it's fine to bring what you like and discard the rest elsewhere.

Another relevant Slate Star Codex post is Against Individual IQ Worries.

Sophia
I love this post. It is so hard to communicate that the 2nd moment of a distribution (how much any person or thing tends to differ from the average[1]) is often important enough that what is true on average often doesn't apply very well to any individual (and platitudes that are technically false can therefore often be directionally correct in EA/LessWrong circles).

1. ^ This definition was edited in because I only thought of an okay definition ages later.
Sophia
Some of my personal thoughts on jargon, and why I chose, pretty insensitively given the context of this post, to use some anyway:

I used the "second moment of a distribution" jargon here initially (without the definition that I later edited in) because I feel like sometimes people talk past each other. I wanted to say what I meant in a way that could be understood by people who might not be sure exactly what everyone else precisely meant. Plain English sometimes lacks precision for the sake of being inclusive (inclusivity that I personally think is incredibly valuable, not just in the context of this post), and often precision is totally unnecessary to get across the key idea.

However, when you say something in language that is a little less precise, it naturally has more room for different interpretations, some of which readers might agree with and some they might not. The reason jargon tends to exist is because it is really precise. I was trying to find a really precise way of saying the vibe of what many other people were saying so everyone felt a tiny bit more on the same page (no idea if I succeeded, though, or whether it was actually worth it or even needed, or whether this is all actually just in my head).

For what it's worth, I think the term "variance" is much more accessible than "second moment".

Variance is a relatively common word. I think in many cases we can be more inclusive without losing precision (another example is "how much I'm sure of this" vs "epistemic status")

Sophia
lol, yeah, totally agree (strong upvoted).

I think in hindsight I might literally have been subconsciously indicating in-groupness ("indicating in-groupness" means trying to show I fit in 🤮 -- feels so much worse in plain English for a reason; jargon is more precise but still often less obvious in meaning, so it's often easier to hide behind it) because my dumb brain likes for people to think I'm smarter than I am.

In my defense, it's so easy, in the moment, to use the first way of expressing what I mean that comes to mind. I am sure that I am more likely to think of technical ways of expressing myself because technical language makes a person sound smart and sounding smart gets socially rewarded. I so strongly reflectively disagree with this impulse, but the tribal instinct to fit in really is so strong (in every human being) and really hard to notice in the moment. I think it takes much more brain power to find the precise and accessible way to say something so, ironically, more technical language often means the opposite of the impression it gives.

This whole thing reminds me of the Richard Feynman take that if you can't explain something in language everyone can understand, that's probably because you don't understand it well enough. I think that we, as a community, would be better off if we managed to get good at rewarding more precise and accessible language and better at punishing unnecessary uses of jargon (like here!!!).[1]

I kind of love the irony of me having clearly done something that I think is a pretty perfect example of exactly what I, when I reflect, believe we need to do a whole lot less of as a community 🤣

1. ^ I think it's also good to be nice on the forum and I think Lorenzo nailed this balance perfectly. Their comment was friendly and kind, with a suggested replacement term, but still made me feel like using unnecessary jargon was a bad thing (making using unnecessary jargon feel like something I shouldn't have do
Sophia
It's just my general feeling on the forum recently that a few different groups of people are sometimes talking past each other while all saying valuable, true things (but still, as always, people generally are good at finding common ground, which is something I love about the EA community). Really, I just want everyone reading to understand where everyone else is coming from.

This vaguely makes me want to be more precise when other people are saying the same thing in plain English. It also makes me want to optimise for accessibility when everyone else is saying something in technical jargon that more people could get value from understanding. Ideally I'd be good enough at writing to be precise and accessible at the same time (but both precision and making comments easier to understand for a broader group of readers are so time consuming, so I often try to do one or the other, and sometimes I'm terrible and make a quick comment that is definitely neither 🤣).

You seem to be jumping to the conclusion that if you don't understand something, it must be because you are dumb, and not because you lack familiarity with community jargon or norms. 

For example, take the Yudkowsky doompost that's been much discussed recently. In the first couple of paragraphs, he namedrops people who would be completely unknown outside his specific subfield of work, and expects the reader to know who they are. Then there are a lot of paragraphs like the following:

If nothing else, this kind of harebrained desperation drains off resources from those reality-abiding efforts that might try to do something on the subjectively apparent doomed mainline, and so position themselves better to take advantage of unexpected hope, which is what the surviving possible worlds mostly look like.

It doesn't matter if you have an Oxford degree or not; this will be confusing to anyone who has not been steeped in the jargon and worldview of the rationalist subculture. (My PhD in physics is not helpful at all here.)

This isn't necessarily bad writing, because the piece is deliberately targeted at  people who have been talking with this jargon for years. It would be bad wri... (read more)

I agree, and reading other comments - I think I may have got a bit down on myself (unnecessarily) for not understanding a lot of the stuff on the forum, as that seems to be pretty common. I guess as this is sort of the ‘main place’ (as far as I’m aware) for EA discussion, this contributed to my feelings of not being ‘smart enough’ to fit in.

Guy Raveh
Second everything here.
Luke Freeman 🔸
Strongly agree!

On fancy credentials: most EAs didn't go to fancy universities*. And I guess that 4% of EAs dropped out entirely. Just the publicly known subset includes some of the most accomplished: Yudkowsky, Muehlhauser, Shlegeris?, Kelsey Piper, Nuno Sempere. (I know 5 others I admire greatly.) 

On intelligence: You might be over-indexing to research, and to highly technical research. Inside research / writing the peak difficulty is indeed really high, but the average forum post seems manageable. You don't need to understand stuff like Löb's theorem to do great work. I presume most great EAs don't understand formal results of this sort. I often feel dumb when following alignment research, but I can sure do ordinary science and data analysis and people management, and this counts for a lot.

On the optics of the above two things: seems like we could do more to make people feel welcome, and to appreciate the encouraging demographics and the world's huge need for sympathetic people who know their comparative advantage. (I wanted to solve the education misconception by interviewing great dropouts in EA. But it probably would have landed better with named high-status interviewees.)

 

* Link i... (read more)

I think people are also unaware of how tiny the undergraduate populations of elite US/UK universities are, especially if you (like me) did not grow up or go to school in those countries.

Quoting a 2015 article from Joseph Heath, which I found shocking at the time:

There are few better ways of illustrating the difference than to look at the top U.S. colleges and compare them to a highly-ranked Canadian university, like the University of Toronto where I work. The first thing you’ll notice is that American schools are miniscule. The top 10 U.S. universities combined (Harvard, Princeton, Yale, etc.) have room for fewer than 60,000 undergraduates total. The University of Toronto, by contrast, alone has more capacity, with over 68,000 undergraduate students.

In other words, Canadian universities are in the business of mass education. We take entire generations of Canadians, tens of thousands of them recent immigrants, and give them access to the middle classes. Fancy American schools are in the business of offering boutique education to a very tiny, coddled minority, giving them access to the upper classes. That’s a really fundamental difference.

Oxford (12,510 undergraduates) and Cambri... (read more)

(I'm flattered by the inclusion in the list but would fwiw describe myself as "hoping to accomplish great things eventually after much more hard work", rather than "accomplished".)

FWIW I went to the Australian National University, which is about as good as universities in Australia get. In Australia there's way less stratification of students into different qualities of universities--university admissions are determined almost entirely by high school grades, and if you graduate in the top 10% of high school graduates (which I barely did) you can attend basically any university you want to. So it's pretty different from eg America, where you have to do pretty well in high school to get into top universities. I believe that Europe is more like Australia in this regard.

I can confirm the last point for Germany at least. There's relatively little stratification among universities. It's mostly about which subject you want to study, with popular subjects like medicine requiring straight A's at basically every university. However, you can get into a STEM program at the top universities without being in the top 10% at high school level.

I appreciate you highlighting that most EAs didn't go to top-level universities - I wish this was out there more!

And I think (from reading other comments too) I was definitely getting a bit too wrapped up in not understanding highly complex stuff (when a lot of EAs don't either).

I agree there's a huge need for more sympathetic people, and that's why I think it's a shame that the community feels like it has such a high bar to entry. I hope this changes in future.

isabel
I'm pretty sure Kelsey didn't drop out, though she did post about having a very hard time with finishing. 

This is correct, she graduated but had a hard time doing so, due to health problems. (I hear that Stanford makes it really hard to fail to graduate, because university rankings care about completion rates.)

Note that Kelsey is absurdly smart though, and struggled with school for reasons other than inherently having trouble learning or thinking about things.

Gavin
Interesting, I seem to remember a Tumblr post to the contrary but it was years ago. 
Rebecca
Maybe she had temporarily dropped out at the time, and later was able to finish?

To supplement what others have said, I think the long-term (few years or more) outcomes of the movement depend greatly on the diversity of perspectives we manage to have available. Mathematicians and engineers are great for solving some complex problems (ok, I'm biased because I am one), but the current lack of e.g. sociologists in EA is going to hinder any efforts to solve big problems that exist in a social environment. Not only do you have a place here - it's necessary that people like you be part of EA.

++ having a sociology background is great. Not sure, but I think Vaidehi may have also studied sociology at a non-Ivy+ school, and she seems to have done some cool stuff in the EA community too.

Not sure how relevant this comment is, but as someone who studies more technical stuff, I am honestly impressed with people who study things like sociology. The sheer number of papers and essays you guys pump out and how you have to think about large social systems honestly scares me! English / history classes were some of the hardest for me in high school!

I also think you might find some of Cal Newport's books helpful (So Good They Can't Ignore You, maybe even How To Be A High School Superstar). He shares a lot of encouraging stories about people who become good at what they do without being super impressive beforehand!

Another issue here is that the EA Forum is used sort of as the EA research journal by many EAs and EA orgs, including my employer, Rethink Priorities. We sometimes post write-ups here that aren't optimized for the average EA to read at all, but are more for a technical discipline within EA.

Isn't that a good thing? I hope it stays like this.  Then the forum stays interesting for people who are specialized in certain fields or cause areas.

Rebecca
It’s an issue insofar as people aren’t aware of it
Holly Elmore ⏸️ 🔸
Exactly, it's an issue if people think the posts on here are all aimed at a general EA audience 
Rebecca
Perhaps there could be tags for different ‘levels’ of technicality
Sunny1
I think the centrality of the EA Forum to the overall "EA project" has likely caused a lot of unintended consequences like this. Participating in the Forum is seen as a pretty important "badge" of belonging in EA, but participating in an internet forum is generally not the type of activity that appeals to everyone, much less an internet forum where posts are expected to be lengthy and footnoted.

Participating in the Forum is seen as a pretty important "badge" of belonging in EA,

Why do you believe this is true? I've met - online and offline - many highly involved people who never post or comment on the forum. Maybe that's even the majority of the EA people I know. Some of them, I'd guess, seldom or never read anything here.

Holly Elmore ⏸️ 🔸
I second this-- a lot of prominent EAs don't look at the Forum. I check the Forum something like once a week on average and rarely post despite this being where my research reports are posted. A lot of EA social engagement happens on facebook and Discord and discourse may take place over more specialized fora like the Alignment Forum or specific Slacks.
Holly Elmore ⏸️ 🔸
(I have a lot of karma because I've been on here a long time)
Chriswaterguy
Perhaps these posts could start with a note on "assumed context", similar to the "epistemic status" notes. (A downside might be if it discourages someone from reading a post that they actually would have got value from, even if they didn't understand everything. So the choice of wording would be important.)

Since sociology is probably an underrepresented degree in effective altruism, maybe you can consider it a comparative advantage rather than "the wrong degree". The way I see it, EA could use a lot more sociological inquiry.

Yes, agree 100%! In general, I think EA neglects humanities skills and humanistic ways of solving problems. 

Yeah I would still love to see something like ethnographies of EA: https://forum.effectivealtruism.org/posts/YsH8XJCXdF2ZJ5F6o/i-want-an-ethnography-of-ea

Feel a bit sad reading this. I'm sorry you've felt alienated by EA and are unsure about how you fit in.

Re: your last sentence: you're far from alone in feeling this way. I cannot recommend Luisa Rodriguez's 80,000 Hours article about imposter syndrome highly enough.

I don't think super high intelligence, or Ivy league degrees, are a requirement for engaging with EA. But setting aside that question, I do think there are lots of ways to engage that aren't, like, "do complicated AI alignment math". Organizations need many people and skills other than researchers to run well. And I think there are many ways to express EA values outside of your career, e.g. by donating, voting for political candidates who focus on important issues, and leading by example in your personal life and relationships.

I generally agree with your comment but I want to point out that for a person who does not feel like their achievements are "objectively" exceptionally impressive Luisa's article can also come across as intimidating: "if a person who achieved all of this still thinks they are not good enough, then what about me?"

I think Olivia's post is especially valuable because she dared to post even when she does not have a list of achievements that would immediately convince readers that her insecurity/worry is all in her head. It is very relatable to a lot of folks (for example me) and I think she has been really brave to speak up about this!

I agree. I would actually go further and say that bringing imposter syndrome into it is potentially unhelpful, as it's in some ways the opposite issue - imposter syndrome is about when you are as smart/competent/well-suited to a role as your peers, but have a mistaken belief that you aren't. What Olivia's talking about is actual differences between people that aren't just imagined due to worry. I could see it come off as patronising/out-of-touch to some, although I know it was meant well. 

Olivia Addy
Thank you for this comment and the article recommendation, I will definitely be checking it out. And thank you for highlighting the other ways to get involved, I could definitely do a bit more thinking about the options available to me, as I'm sure I can find my place somewhere!!

This is such a good post + I agree so much! I'm sorry you feel like you don't fit in :( and I'm also worried about the alienating effect EA can have on people. Fwiw, I've also had worries like this in the past - not so much that I wasn't smart enough, but that there wasn't a place for me in EA because I didn't have a research background in any of the major cause areas (happy to DM about this). 

 A couple of points, some echoing what others have said:

-there's a difference between 'smart' and 'has fancy credentials'
-some stuff that's posted on the Forum is written for a niche audience of experts and is incomprehensible to pretty much everyone
-imo a lot of EA stuff is written in an unnecessarily complicated/maths-y/technical way (and the actual ideas are less complicated than they seem)
-maybe you have strengths other than "intellectual" intelligence, e.g. emotional intelligence, people skills, being organized, conscientiousness...

I really do think this is a problem with EA, not with you - EAs should offer more resources to people who are excited to contribute but don't fit into the extremely narrow demographic of nerdy booksmart STEM graduates. 

RogerAckroyd
For people with a math/technical background, the easiest way to express certain ideas may be in a mathy way.
Amber Dawn
Yeah absolutely! And it's not always worth experts' time to optimize for accessibility to all possible readers (if it's most important that other experts read it). But this does mean that sometimes things can seem more "advanced" or complex than they are.
Olivia Addy
Thank you for this comment!! The points you make are really great, and I hadn’t considered the importance of other types of intelligence so that’s something for me to think about a bit more. I agree there needs to be more resources out there, and I hope this is something that changes over time.
Sharmake

I'd change the title of this post to "EA for non-geniuses".

Someone around 100-120 IQ isn't dumb, but actually still above average!

And yet, this is a great contribution to EA discourse, and it's one that a "smart" EA couldn't have made.

You have identified a place where EA is failing a lot of people by being alienating. Smart people often jump over hurdles and arrive at the "right" answer without even noticing them. These hurdles have valuable information. If you can get good at honestly communicating what you're struggling with, then there's a comfy niche in EA for you.

Thank you for writing this up and putting it out there (coming from a non-fancy background I can totally relate!). 

One of the best things for my mental health this year was realising and allowing myself to accept that I am not and never will be a top researcher/EA thinker/ the person with the most accurate AI timelines/.... 

However, I do have an ops background and can see that my work is valuable, and there is no need for me to be the smartest in the room, as long as I get stuff done. 

I'd guess there are small things everyone can contribute (even just being a friendly face at an event and talking to new people can be really valuable). 

Additionally, I wish we had some more sociological studies of the EA community (posts like this one by Julia Wise).

Olivia Addy
That's really great advice, thank you! I can definitely be a bit hard on myself but recognising that I can still contribute is probably better than getting down about not being an Oxford PhD grad!!

Hi Olivia! As a former teacher with a degree in Education from a very low-ranking German university, I understand how you feel. I've been there! I am 100% confident there is an impactful way you can contribute to the EA community. Please reach out to me if you'd like to chat about those options, talk through potential career moves or volunteering opportunities!

P.S.: Your post has inspired me to finally create an account here and start commenting and posting - who knows what impact you have created down the line... ;-)

Olivia Addy
Hi Moritz, thank you for this kind comment, it's really made me smile!! I will definitely reach out.

Aside, which I’m adding after having written the rest of the comment: I think most EAs would agree that intelligence isn’t a moral virtue. Nonetheless, I think there can be a tendency in EA to praise intelligence (and its signals) in a way that borders on or insinuates moralization.

In a more just world, being called “unintelligent” or even “stupid” wouldn’t seem much different than being called “unattractive,” and being called smart or even “brilliant” wouldn’t leave anyone gushing with pride. 

Nice post. I started writing a comment that turned into a couple pages, which I hope will become a post along the lines of:

  • No really, what about 'dumb' EAs? Have there been any attempts to really answer this question?
    • There seems to be a consensus among smart, well-connected EAs that impact is heavy-tailed, so everyone but the top 0.1-10% (of people who call themselves EA!) or something is a rounding error.
      • I think this is largely true from the perspective of a hiring committee who can fill one role
        • But the standard qualitative interpretation might break down when there is a large space of possible roles.
    • I know this isn't at all an original point and I'm sure there are better write ups else
... (read more)
Linch
I don't understand the relevancy of this question. Can you elaborate a bit? :)
Aaron Bergman
Yeah, that was written way too hastily haha. The idea is that currently, CB + hiring seems to think that finding seven (or any small number) of people who can each multiply some idea/project/program's success by 100 is a big win, because this multiplies the whole thing by 10^14! This is the kind of thing an "impact is heavy-tailed" model naively seems to imply we should do. But then I'm asking "how many people who can each add 10% to a thing's value would be as good as finding those seven superstars?" If the answer was something like 410,000, it would seem like maybe finding the seven is the easier thing to do. But since the answer is 338, I think it would be easier to find and put to use those 338 people.
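(The arithmetic behind that 338 figure is easy to check - a sketch assuming the multipliers simply compound, which is the naive model described above:)

```python
import math

# Seven people who each multiply a project's value by 100x:
superstar_total = 100 ** 7  # = 10^14

# How many 1.1x ("adds 10%") contributors compound to the same total?
n = math.log(superstar_total) / math.log(1.1)
print(round(n))  # 338
```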
KaseyShibayama
Hmm, I’m skeptical of this model.  It seems like it would be increasingly difficult to achieve a constant 1.1x multiplier as you add more people. For example, it would be much harder for Apple's 300th employee to increase their share price by 10% compared to their 5th employee.
[comment deleted]
Imma🔸
Maybe edit your original comment? I think it's information that is worth explaining more clearly.

I definitely felt dumb when I first encountered EA. Certain kinds of intelligence are particularly overrepresented and valorized in EA (e.g. quantitative/rational/analytical intelligence), and those are the kinds I've always felt weakest in (e.g. I failed high school physics and stopped taking math as quickly as I could). When I first started out working in EA, I felt a lot of panic about being found out for being secretly dumb because I couldn't keep up with discussions that leaned on those kinds of intelligence. I feel a lot better about this now, though it still haunts me sometimes.

What's changed since then?

  1. I still make dumb math mistakes when I'm required to do math in my current role - but I've found that I have other kinds of intelligence that some of the colleagues I initially felt intimidated by are less strong in (i.e. judgment, emotional intelligence, intuitions about people), and even though these can be easily dismissed/considered 'fluffy', they actually do and have mattered in concrete ways.
  2. I've come to realize that intelligence isn't useful to consider as a monolithic category -- most things can be decomposed into a bunch of specific skills, and ~anyone can get better at really
... (read more)

First of all, thank you for speaking up about this. I know very smart people that are scared to just share their perspective on things and I do think THAT is very dumb.

Secondly, I do think donating some money regularly and cost-effectively is a safe bet, and freaking yourself out about "doing more" or even "the most" can easily be counterproductive. Just e.g. focusing on doing advocacy and explaining why evidence-based and cost-effective donations are good choices is still neglected in basically every country. There are many such relatively easy tasks that are great leverage points and in the end, it is precisely about comparative advantage. By you taking up such tasks you shoulder some burdens that are of relatively lower value to others.

Then for objectively difficult problems to solve it is, of course, reasonable to not try to make it "inclusive", there is a reason why there is a minimum height to become a soldier because the task environment will not change to accommodate certain people. I understand that you understand this. And by understanding this and e.g. not attempting something grandiose that ends up harmful, you are counterfactually already winning.

Then I also do think t... (read more)

Pre-Warning: Please don't read any of this as a criticism of those people who fit into the super-intelligent, hard-degrees-at-Oxford category. If you're that person, you're awesome; this comment isn't critical of you, it's directed at exploring the strengths of other paths :)

Tl;dr at bottom of post for the time-constrained :)

This was a really interesting post to read. I wrote a slightly controversial piece a little while back that highlighted that 'top' Universities like Oxford, Cambridge, Stanford have a lot of class and wealth restrictions so that instead of marketing to the 'most intelligent' in society, EA was frequently marketing to the 'intelligent, but wealthy' instead - and missing out most of the intelligent but economically stranded people who went to universities lower down on the rankings because it made more financial sense. It ended up spreading on Twitter a bit. Most people were interested and engaged, one or two got a bit angry (largely by misinterpreting what I said but that's likely on me as the writer, hence the pre-warning), that's life. But I would like to highlight that to you - I know a ton of really, really intelligent people who went to average... (read more)

Olivia Addy
Thank you so much for this comment! The points you made about the community needing different types of skills are great and I totally agree...your comment (and lots of others) has definitely helped open my mind up and think a bit more about ways in which I could be useful here...even if it's outside the traditional view I had of what an EA is...so thank you for that!!

This resonates a lot with me. I actually studied physics at a pretty good college and did very well in all my physics classes, but I was depressed for a long time (two years?) [ETA: more like two and a half] for not feeling smart enough to be part of the EA community. 

I’m feeling better now, though that’s unfortunately because I stopped trying so hard to fit in. I stopped trying (and failing) to get into EAGs or get hired at EA orgs, and haven’t been reading the EA Forum as much as I used to. I still… have longtermist EA values, but I have no idea of what someone like me can do to help the future go well. Even donating part of my income seems approximately useless, given how longtermism is far from funding-constrained.

Denkenberger🔸
I'm sorry you feel this way. Have you tried volunteering to skill up? I think a physics major could be a good quantitative generalist researcher. Also, it is a common perception that longtermism is not funding constrained with the entry of FTX, but they only funded about 4% of the applications in their open round. And there are still longtermist organizations that are funding constrained, e.g. ALLFED (disclosure: which I direct).

Whenever I come on the EA forum I literally feel like my brain is going to explode with some of the stuff that is posted on here, I just don't understand it.

Dude, I have a degree from Harvard but it's in biology and I feel this way about a lot of the AI stuff! I admire your humility but you might not be that dumb.

I think your critique is totally spot-on, and I think a better EA community would have room for all kinds of engagement. When longtermism became dominant (along with the influx of a lot of cash so that we were more talent-constrained than money constrained) we lost a lot of the activities that had brought the entire community together, like thinking a lot about how to save and donate money or even a lot of emphasis on having local communities. We also stopped evangelizing as much as our message got more complicated and we became focused on issues like AI alignment that require specific people more than a large group of people. 

But even though I frequently say we should shore up the community by bringing back some focus on the original EA bread and butter causes like global health, I don't know if the current community is really making a mistake by focusing our limited... (read more)

Olivia Addy
Hey no I haven't thought about setting up a group like that but I definitely think it could be a good idea! The original ideas I learnt about right when I got involved are the most exciting to me...I'm sure others feel the same
Sophia
Maybe the Giving What We Can brand is good for this? (I'm not at all sure, this is really a question in my mind.) It is obviously focused on donations, but if it was a university group, this could maybe largely be seen as something worth discussing now and doing later, post-graduation?
Sophia
It obviously depends a lot on which ideas seem most compelling to you and the extent that they are captured by GWWC

Wasn't www.probablygood.org set up to address the large pool of talent that might have a hard time working on the cause areas identified by 80k hrs? Not sure if others mentioned this, but ctrl+F did not show me any mentions of this org.

Olivia Addy
I had no idea this existed but will definitely check it out.
[anonymous]

One thing that I think is helpful is to do the best you can to separate "EA the set of ideas" from "EA the set of people." People involved with EA form something akin to a broad social group. Like any social group, they have certain norms and tendencies that are  annoying and off-putting. Being snobbish about intelligence is one of these tendencies. My advice is to take the parts of "EA the set of ideas" that work for you and ignore the parts of the community that you find annoying. Maybe for you that means ignoring certain kinds of forum posts or maybe it means not going on the forum at all. Maybe it means giving 2 percent of your income to an effective charity and not worrying that you don't give more. Maybe it means being on the lookout for a job where you could have a higher impact but targeting organizations that are not EA-branded. The bottom line is that you do not need to be involved in EA the community to take EA the set of ideas seriously. 


This is not at all to concede that you cannot do high-impact things while being engaged in the community. I am happy with the impact I have and I went to a state school with standardized test scores that were nothing to brag about. This is just to say that if you find the community annoying, you don't need it to "Do EA".

Olivia Addy
This is definitely a good point. I have had really great experiences with every EA I've met and actually talked to - I guess it's what I see online that I've struggled with. I could definitely make more of an effort not to get so tied up in the parts which don't work for me.

I don't think it is your capacity for impact, your intelligence or anything else that would stop you from adding a tonne of value to the effective altruism project. Your post is insightful (a sign of intelligence if I were to look for signs) and I also think that intelligence and competence are complicated and are so far from perfectly correlated with impact that worrying about them on an individual level seems counter-productive to me (lots of people have said similar things in the comments, but I felt this was, nonetheless, still worth reiterating). 

 The biggest thing I see is whether the effective altruism community is a place where you feel you can add value and, therefore, whether you can be here and be mentally healthy. Regardless of reality, how you feel matters. I think people struggle in this community not because they can't contribute, but because they feel they can't contribute "enough", or at all. I think "enough" is a low bar to pass in reality (where enough is "the world is better because of your contribution than it would have been otherwise": thanks comparative advantage).

 I think feeling like you're enough is the real challenge here. 

Why I

... (read more)

I couldn't agree more with this post! E.g. I feel like there should be an "80k for averagely smart people in the 100-130 IQ range".

constructive
I honestly don't see why. I think I'm much below 130 and still, 80k advised me. The texts they write about why AI might literally kill all of us and what I could do to prevent it are not only relevant for Oxford graduates but also for me, who just attended an average German university. I think everyone can contribute to the world's most pressing problems. What's needed is not intelligence but ambition and open-mindedness. EA is not just math geniuses solving abstract problems; it's hundreds of people running the everyday work of organizations, coming up with new approaches to community building, becoming politically active to promote animal welfare, or earning money to donate to the most important causes. None of these are only possible with an above-average IQ.
Lumpyproletariat
The 100-130 IQ range contains most of the United States' senators. You don't need a license to be more ambitious than the people around you, and you don't need an IQ of 131 or greater to find the most important thing and do your best. I'm confident in your ability to have a tremendous outsized impact in the world, if you choose to attempt it.

Something I wrote a little while back regarding whether EA should be a "narrow movement of people making significant impact, or a much broader one of shallower impact":

I've sometimes wondered whether it would be good for there to be a distinct brand and movement for less hardcore EA, that is less concerned with prestige, less elitist, more relaxed, and with more mainstream appeal. Perhaps it could be thought of as the Championship to EA's Premier League. I think there are already examples, e.g. Probably Good (alternative to 80,000 Hours), TLYCS and OFTW (alternatives to GWWC), and the different tiers of EA investing groups (rough and ready vs careful and considered). Places where you feel comfortable only spending 5 minutes editing a post, rather than agonising about it for hours; where you feel less pressure to compete with the best in the world; where you are less prone to analysis paralysis or perfect being the enemy of the good; where there is less stress, burnout and alienation; where ultimately the area under the impact curve could be comparable, or even bigger..? Perhaps one of the names mentioned here could be used.

You don't need to have a PhD to give a portion of your income to effective charities and do a lot of good. That's part of what makes effective giving such a powerful idea.

Olivia Addy
Agree, and I do donate a part of my income - my issue was that I wanted to do more than just donate. I wanted my career to be committed to making the world a better place, and that is where I was getting stuck.

I can totally relate to the feeling of wanting to do more than "just donate". I strongly agree with Henry (and others) that donating is an accessible way to have an impact, small donations from individuals are valuable. But "just donate" may not be enough for people with a strong altruistic motivation.

For some people, donating is not only a way to have some impact, but actually the way to have the most impact with their career, given their particular talents. I don't know if that is the case for you, nor for the person who is reading along here, but it might apply to some people. I do believe that it applies to me, and I have been working in normal jobs for 8 years and donating a significant part of my income.

In my experience, being altruistically motivated and "just donating" is a challenging combination. My monkey brain wants connection to the community, and to the organization and the cause I am donating to. If I were less motivated, I would just be satisfied throwing 10 percent of my income at whatever charity GiveWell recommends. If I were less "dumb" (or rather, had a different set of talents), I would do full-time direct work. I experience a lot of excitement and commitment for... (read more)

I don’t relate entirely but I do feel too dumb for AI Safety stuff in particular and don’t understand some posts about it, even though I think it’s very important.

I think community building, EA-related political advocacy, personal assistant jobs in the EA sphere, and content creation related to EA on social media can be extremely high impact and might be fairly accessible?

Very good post! Some potential tips for how people who have similar experiences to what you described can feel more included:

  1. Replacing visits to the EA Forum with visits to more casual online places: various EA Facebook groups (e.g. EA Hangout, groups related to your cause area of interest), the EA Discord server, probablygood.org (thanks to another commenter mentioning the site).   
  2. Attending events hosted by local EA groups (if close by). These events are in my experience less elite and more communal. 
  3. If attending larger EA conferences, understand that many people behave like they are in a job interview situation (because the community is so small,  reputation as a smart person can be beneficial), and will consequently e.g. avoid asking questions about concepts they do not know. 
Olivia Addy
Hey thanks for these really concrete steps! I appreciate you highlighting these other parts of the EA community as I really wasn’t aware of them before I posted this. And that’s a really great point about the conferences - think it’s key to be mindful that people might be trying to portray a certain image.

Yeah, I think a lot of people (myself included) feel a lot of the same things.

You might want to consider pursuing a career in operations or something with a more entrepreneurial vibe. There's generally a lack of such people in EA, so I think there's often a lot of impact to be had.

In my experience things like good judgement and grit matter a lot in these roles, and being super smart matters a lot less.

Really resonate with me. Reading 80,000 hours makes me want to jump out the window. Every article is something like “oh, Jane Street is a better way to spend your time than trying to join Downing Street”. What about us intellectual peasants? What about us in the 3rd world? What can we do? We exist in the billions, yet we are seen as the victims, not the solution.

Olivia Addy, I'm glad you wrote this post. Today my supervisor forwarded it to me after our last conversation, when I told him that I'm too stupid to work for such a big organization as Anima International. It all began when, as a person with a strong interest in insects, I read a discussion on insect genetics and couldn't grasp any of it; and although I'm in the middle of Richard Dawkins' book "The Selfish Gene", not much in my head has cleared up. Then it came to me that maybe I don't deserve this job.

Reading the comments below, I know I'm not st... (read more)

Olivia Addy
Thank you for this comment. It's nice to know that I'm not the only person feeling this way and I totally relate to the feeling of undermining yourself, this is something I am trying to work on too!

Many people have already made comments, but I’ll throw my 2 cents into the ring:

  1. I don’t come from an Ivy League school or related background, nor do I have a fancy STEM degree, but feel decently at home in the EA community.
  2. I often thought that my value was determined by how “well” I could do my job in my intended field (policy research) vs. whomever I’m replacing. However, one of the great insights of EA is that you don’t have to be the 99th percentile in your field in order to be impactful: in some fields (e.g., policy research arguably) the most impor
... (read more)

I don't like how all the comments basically reiterate that smart people have more impact. Of course smart people do. But one avenue for EA to actually make a difference is to appeal to the masses. For policy to change, you have to appeal to the body politic. And for that, we do need a diverse range of skillsets from people who are very different from the average EA (for example, having more super-social salesperson types in EA would be a net positive for EA).

I agree with a lot of what other people have said here. I think the key message I would emphasize is that 1) yes, the EA community should do a better job at including this kind of diversity (among others), but 2) you really just shouldn't think too much about "intelligence" in finding your path within EA, because there is almost certainly SOME way you can make a significant contribution given your unique conditions. To the extent you're concerned about "intelligence" being a limiting factor, I think this should also incline you to choose topics and types... (read more)

I also want all kinds of people in this community. And I believe that no matter your intelligence, you can have a good impact in the world, and many people can even have an EA job. For example, I feel like community building could be a place for people without advanced degrees to do valuable work, and even to solve this particular problem (making EA more accessible). I think that creating more of those jobs would make EA more popular, and that is the way to get the most people doing direct work, GPR, donating, going vegan and voting well, while also making a lot... (read more)

[anonymous]7

Great post! I think this is a failure of EA. Lots of corporations and open-source projects are able to leverage the efforts of many average-intelligence contributors to do impressive things at a large scale through collaboration. It seems to me like there must be something wrong when there are many motivated people willing to contribute their time and effort to EA, but who don't have many avenues to do so other than earning to give and maybe community building (which leaves a lot of people who feel motivated by EA with no concrete ways to easily engage). It... (read more)

2
Olivia Addy
Thanks! I agree with everything in your comment - and I really hope to see EA change in the future so that more 'average' people are able to contribute (I think we could have a lot to give!!)

You don't have to be smart, or a college graduate. What I'm attempting to do, at least for my own edification, is break things down into manageable chunks so that they can easily be explained.

I also enjoy looking at some of the EA numbers and shaking my head.

The table is small, but that's because we need people to help us build more extensions to the table. Come get some tools and help me make an addition.

In my view one of the most defining features of the EA community is that it makes most people who come into contact with it feel excluded and "less than," on several dimensions. So it's not just you!

1
alene
Yes.

Adding to the list of anecdotes, I previously wrote about somewhat similar experiences here (and am coincidentally also a sociology major).

I do agree that EA does sort of take an elitist approach which shuns people who don't come from an academic background -- which is a shame because it definitely stifles creativity and innovation. Even though I am from an elite American institution, finding an EA career has been incredibly difficult because the community is quite closed off. In my experience, if you are not a researcher within one of their pre-selected fields, you are not worth their time. There is a significant drive to have positive impact paired with a significant lack of empathy. Again,... (read more)

To the extent that I'm outside of the general population I think it's because of my giving, but I generally feel squarely inside the box of ordinary people. I can relate to not feeling as smart as many EAs.

I think there are numerous things a typical person could do to take EA ideas and try to concretely make the world a better place:

One action that I think is broadly available is to join some advocacy group for EA-related policies on some local / regional / national level like animal welfare, electoral reform, sane land use policy, or something else. You c... (read more)

2
Olivia Addy
Some great ideas here, thank you! I've talked to my husband a lot about EA but like you do find it a bit challenging to branch out to others. I think this is something worth me working on though.

I was not very aware of this topic until recently, when somebody wanted to discuss outreach towards non-academics with me. We are in contact with an adult education center (Volkshochschule) and might offer an Intro Fellowship there. It might be worth considering starting an EA group for high school graduates (no college), comparable to EA for Christians, but I haven't thought much about it, and this should be founded by people without tertiary education (chicken-and-egg problem).

4
Guy Raveh
I kinda think Christians, or students of X university, have something positive that unites them and makes it sensible to approach them as a group, while "non-college-educated people" do not. I do think it's worth reaching out to these people, but I don't know if that would be the right framing. Maybe indeed someone in this group would be better equipped to think about this, as you said.
7
shinybeetle
In Norway there's a local group for a county/state that doesn't really have a large university. That group has a bunch of farmers and tradespeople in it :)

Seeing a lot of great responses and content in the comments, I love it. You’re clearly not alone! Echoing what others have said, I like to frame the goal as “finding a way to be useful.” There are so many ways to be useful, and intelligence is just one input (keep in mind that smart people are probably especially good at Appearing Useful, especially online). Diligence, pragmatism, humility, sociability, plus a million other abilities are inputs too, and they’re basically all things you can improve at. Getting smarter is obviously helpful to being useful, but we can’t let it be the whole picture. Until we run out of opportunities to do good, there are opportunities for anyone to make a difference.

2
Olivia Addy
I think 'finding a way to be useful' is a great way to think about it, and something I'm going to consider going forward!

I don't know if replying to this thread after a couple of weeks is against the forum rules, as I've not posted on the forum before, but I have followed EA from a distance for a few years and completely agree with the OP. Well done for being brave enough to write about this, because I also felt similarly "dumb", while on paper, I know that I shouldn't be! I'm doing a PhD but it is not in a directly-EA related field/subject and I have not been to the highest ranking universities you mention. A lot of the very philosophical and computational stuff on here goe... (read more)

But I feel like I don't fit, because frankly, I'm not smart enough.

But you are smart enough, because you correctly perceived that any ideology that is primarily by and for intellectual elites is not scalable to the degree necessary to create the required change.

Don't worry about everyone else's fancy sounding intellectual analysis.  Most of that is just career promotion anyway, or in my case, a lot of ego horn honking.     

Don't worry about fitting in.   We (humanity) are mostly insane anyway, so fitting in isn't always such a great ... (read more)

This may seem off topic, but one thing I've "learned" from years of meditation practice is that no one "earns" or freely chooses anything, like their intelligence level, for example. Nor do they freely choose to work on improving their intelligence level. There is no pride or shame in any ability because no one chooses what happens in their life. Consciousness just sort of notices what happens and mistakenly applies agency to certain phenomena. . .or maybe not. . .what the hell am I talking about? :) The moral of the story is do what you can, how you can, to make the world the best place it can be. 

Thank you so much for writing this.

I feel a very similar way. Every so often I get that feeling and excitement again about doing so much good, and after reading some posts and listening to podcasts for a few days, I get incredibly depressed because I don't study at Oxford, I'm not good at mathematics, I even struggle to make an okay-ish cost-benefit analysis for very basic things, and I have no idea how to take all those seemingly complicated things like moral uncertainty into account. It's just exhausting.

But it has also taught me a lot of things that I pr... (read more)

4
shinybeetle
This isn't what you intended by posting this, but I think it's useful to say it anyway. You sound discouraged in the same way that I used to be before I was involved in the EA community. It can be pretty hard to see where you fit in by looking at the forum and 80k. Getting help to plan my career and getting involved in the community really changed all of that for me by giving me direction and peers to relate to. Here are some things I really, really encourage you to do, that I think will be helpful:

* Go to an EA conference. It doesn't matter if it's EAG or EAGx, just go and talk to as many people as you can. If you don't have the money to travel, ask for monetary support. If you're unsure whether you're "EA enough" to go, apply and let the conference organisers decide for you.
* Get some help making a rough career plan, or the steps to gather enough knowledge to make that plan. EA knowledge is a lot to take in. My country's EA group had people who were willing to learn about me and help me figure out what my options were. It really helped me find clarity in what I already could plan/decide, and what I needed to gather more information about.

Sorry that this is a bit of a mess, but I hope that it's at least somewhat helpful.
1
Jeroen De Ryck 🔹
Don't worry, it is a fine answer and probably has more structure than what I wrote, so good job on that :D For the first time, I'm going to a meetup with the few people who are involved in EA in my country, and I'm very much looking forward to what they have to say. Thanks for the reply!
1
Olivia Addy
Thank you for this comment! It's really great to know I'm not alone with this - and I hope you start to find your way, I know how confusing it can feel to be completely lost. I'd love to connect with you. I will send a DM!