It might not have shown up on your radar, but the funding situation in EA is currently insane. Like bananas, jumping-off-the-walls insane. Especially with regard to young people. I personally know of 16-year-olds getting more money than the median American salary, and of 21-year-olds getting six-to-seven-figure grants. And this isn’t to knock either of those things. There are well-thought-out reasons why this makes sense, and generally I’d even advocate for more of this crazy risk-taking. Normal institutions are extremely risk-averse, and it’s nice to see EA buck the trend.
But here’s the thing. The message is out. There’s easy money to be had. And the vultures are coming. In many internet circles, there’s been a worrying shift in tone. “You should apply for [insert EA grant]; all I had to do was pretend to care about x, and I got $$!” Or, “I’m not even an EA, but I can pretend, since getting a 10k grant is a good instrumental goal towards [insert-poor-life-goals-here].” Or, “Did you hear that a 16-year-old got x amount of money? That’s ridiculous! I thought EAs were supposed to be effective!” Or, “All you have to do is mouth the words ‘community building’ and you get thrown bags of money.”

Basically, the sharp increase in rewards has increased the number of people optimizing for the wrong thing. Hello, Goodhart. Instead of the intrinsically motivated EA, we’re beginning to get the resume padders, the career optimizers, and the type of person who cheats on the preschool entry test in the hopes of getting their child into a better college. I’ve already heard of Discord servers springing up centered around gaming the grant admission process. And it’s not without reason: the Atlas Fellowship is offering a $50k, no-strings-attached scholarship. If you want people to throw out any hesitation around cheating the system, dangling a carrot larger than most adults’ yearly income will do that.

TLDR: People are going to start optimizing really hard around showing [EA grantmakers] what they think the grantmakers want to see. This will lead to less impactful grants for helping people, and generally less chance of right-tail successes.
So what to do? I’d like to note that some of the knee-jerk reactions to hearing about this problem are examples of things not to do:
- Tightening up and becoming more stringent about what gets funded. This is a failure mode. The rationale for giving out high-risk grants stands and hasn’t changed, so decreasing the riskiness of the grants just means we backslide into becoming like any other risk-averse institution.
- Increasing purity tests. Are you an EA, or are you an EA? Making people jump through more hoops to prove their alignment with core EA values is a terrible idea. Not only would you get evaporative cooling, you’d get a worse community. For a community that values good epistemics, a purity test on whether a person agrees with the EA consensus on [insert topic here] would be a death blow to the current, very good MO.
- Funding fewer young people. Giving 16-year-olds huge chunks of money with no oversight is a bad PR story waiting to happen, so people will argue that we should stop doing that. First, I'd point back to the first bullet point. Second, Alexander Hamilton was helping to run a trading business in his mid-teens, sailed from the Caribbean to the colonies at 17, and as a very young man helped found the country that most people reading this post live in. Not funding young people means this kind of talent and potential gets wasted. Let's not do that.
Finally, I’d like to note that this problem has yet to become an actual problem. It's just a whisper of what's (maybe) to come. It is still the case that the intrinsically motivated EAs far, far outnumber the resume builders. But this might change if we're not careful, and it will begin to make a difference: no matter how good our interview filters are, the false positive rate will continue to increase. Furthermore, there seem to be plans to massively scale up grant-giving. So it would be nice to solve this now, while it’s a small problem, instead of later. Money saved is lives saved!
This post uses an alarmist tone to trigger emotions ("the vultures are circling"). I'd like to see more light and less heat. How common is this? What's the evidence?
People have strong aversions to cheating and corruption, which is largely a good thing - but it can also lead to conversations on such issues getting overly emotional in a way that's not helpful.
I might be in the minority here, but I liked the style this post was written in, emotive language and all. The language was flowery, but that made it fun to read, and I did not find it alarmist (e.g. it clearly says “this problem has yet to become an actual problem”).
And more importantly, I think the EA Forum is already a daunting place, and it is hard enough for newcomers to post here without facing everyone upvoting criticisms of their tone / writing style / post title. It is not a perfect post (I think there is a very valid critique in what Stefan says, that the post could have benefited from linking to some examples / evidence), but not everything here needs to be in perfect EA-speak. Especially stuff from newcomers.
So welcome CitizenTen. Nice to have you here and to hear your views. I want to say I enjoyed reading the post (don’t fully agree tho) and thank you for it. :-)
I also thought that the post provided no support for its main claim, which is that people think that EAs are giving money away in a reckless fashion.
Even if people are new, we should not encourage poor epistemic norms.
The claim sounds plausible to me and that’s enough to warrant a post to encourage people to think about this.
:-)
My bad. Any good ideas for what the title should change to? Also, I'd just like to note that this is not yet very common at all. My evidence is just hearsay, anecdotes, and people I've talked to. So if it was overly alarmist, I'm sorry; that was not my intention. Once again, I'm noting the change in tone in how some people are treating the grants more than anything else. Instead of being excited about cause area X and then using the grants as a way to achieve their goals, people are instead excited about cause area X because they can get easy funding. Once again, I don't think we should be alarmist about this, as funding fewer great/risky people would be a failure mode. I just wanted it to be common knowledge that this is happening and is (probably?) going to get worse over time.
Fair enough - thanks for your gracious response.
Fwiw, anecdotally my impression is that a more common problem is that people engage in motivated reasoning to justify projects that aren't very good, and that they just haven't thought through their projects very carefully. In my experience, that's more common than outright, deliberate fraud - but the latter may get more attention since it's more emotionally salient (see my other comment). But this is just my impression, and it's possible that it's outdated. And I do of course think that EA should be on its guard against fraud.
Would really appreciate links to Twitter threads or any other publicly available versions of these conversations. Appreciate you reporting what you’ve seen but I haven’t heard any of these conversations myself.
I sent a DM to the author asking if they could share examples. If you know of any, please DM me!
Yes, please share links to where these conversations about gaming the system are happening!
Surely this is something that should be shared directly with all funders as well? Are there any (in)formal systems in place for this?
Like other commenters, to back-up the tone of this piece, I'd want to see further evidence of these kinds of conversations (e.g., which online circles are you hearing this in?).
That said, it's pretty clear that the funding available is very large, and it'd be surprising if that news didn't get out. Even in wealthy countries, becoming a community builder in effective altruism might just be one of the most profitable jobs for students or early-career professionals. I'm not saying it shouldn't be, but I'd be surprised if there weren't (eventually) conversations like the ones you described. And even if I think "the vultures are circling" is a little alarmist right now, I appreciate the post pointing to this issue.
On that issue: I agree with your suggestions of "what not to do" -- I think these knee-jerk reactions could easily cause bigger problems than they solve. But what are we to do? What potential damage could there be if the kind of behaviour you described did become substantially more prevalent?
Here's one of my concerns: we might lose something that makes EA pretty special right now. I'm an early-career employee who just started working at an EA org. And something that's struck me is just how much I can trust (and feel trusted by) people working on completely different things in other organisations.
I'm constantly describing parts of my work environment to friends and family outside of EA, and something I often have to repeat is that "Oh no, I don't work with them -- they're a totally different legal entity -- it's just that we really want to cooperate with each other because we share (or respect the differences in) each other's values". If I had to start second-guessing what people's motives were, I'm pretty sure I wouldn't feel able to trust so easily. And that'd be pretty sad.
Strong upvote for the erosion of trust being one of the things I'm really worried about.
Agree strongly. Eroding the high trust EA community would be really sad. Don't have much to add, except a strong upvote.
How about also adding links to your sources?
This seems overly quick to rule out a large class of potential responses. Assuming there are (or will be) more "vultures," it's not clear to me that the arguments against these "things not to do" are solid. I have these hesitations (among others) [edited for clarity and to add the last two]:
Thanks for this comment, Mauricio. I always appreciate you trying to dive deeper – and I think it's quite important here. I largely agree with you.
I feel like this community was never meant to scale. There is little to no internal structure, and, like others have said, so much of this community relies on trust. I don't think this is just an issue of "vultures"; it will also be an issue of internal politics and nepotism.
To me the issue isn't primarily about grantmaking. If you are a good grantmaker, you should be able to see when people's proposals aren't very logical or aligned with EA reasoning. More people trying to get big grants is mostly a good thing, even if many are trying to trick us into giving free money. I think the much larger issue is status/internal politics, where there is no specific moment at which you can decide how aligned someone is.
But first, to give some evidence of vultures: I have already seen multiple people in the periphery of my life submit applications to EAGs who don't even plan on attending the conferences and are just using them as a chance to get a free vacation. I'm sorry to say they may have heard of EA because of me. More than that, I get the sense that a decent contingent of the people at EAGx Boston came primarily for networking purposes (and I don't mean networking so they can be more effective altruists). At the scale we are at right now, this seems fine, but I seriously think this could blow up quicker than we realize.
Speaking to the internal politics, I believe we should randomly anonymize the names on the Forum every few days and see whether certain things are correlated with getting more upvotes (more followers on Twitter, a job at a prestigious org, etc.). My intuition is that having a job at a top EA org means 100-500% more upvotes on your posts here; hell, even on the meme page. Is this what we want? The more people who join for networking purposes, the worse these effects potentially become. That could entail more bias.
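As a rough sketch of the analysis I have in mind (toy placeholder data and invented numbers; the column names and the difference-in-differences framing are just my own illustration, not an existing Forum feature):

```python
import pandas as pd

# Toy post-level data (placeholder numbers, purely for illustration):
# whether the author works at a prominent EA org, whether their name was
# hidden when the post was shown, and the karma the post ended up with.
posts = pd.DataFrame({
    "prestigious_org": [1, 1, 1, 1, 0, 0, 0, 0],
    "anonymized":      [0, 0, 1, 1, 0, 0, 1, 1],
    "karma":           [120, 95, 60, 70, 35, 40, 38, 42],
})

# Mean karma in each cell of the 2x2 design.
cell_means = posts.groupby(["prestigious_org", "anonymized"])["karma"].mean().unstack()

# Difference-in-differences: how much of the karma gap between prestigious and
# non-prestigious authors disappears once names are hidden?
gap_named = cell_means.loc[1, 0] - cell_means.loc[0, 0]
gap_anon = cell_means.loc[1, 1] - cell_means.loc[0, 1]
print(f"Karma gap with names shown:   {gap_named:.1f}")
print(f"Karma gap with names hidden:  {gap_anon:.1f}")
print(f"Rough 'halo effect' estimate: {gap_named - gap_anon:.1f}")
```

With real data you'd want far more posts and some way to control for topic and post quality, but even a crude comparison like this would tell us something.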
I post (relatively) anonymously on Twitter, and the number of (IMO) valid comments I make that get no response makes me worry we are not as different from normal people as we claim, and are just having intellectual jousts where we want to seem smart among the other high-status people. To be fair, this is an amazing community, and I trust almost everyone here more than almost anyone outside it to try to be fair about these things.
I get the sense (probably because this is often in the back of my mind) that many people are in fact simply optimizing for status in this group, not positive impact as they define it themselves. Of course, status in this community is associated with positive impact, BUT as defined by the TOP people in the community. Could this be why the top causes haven't changed much? I don't feel strongly about this, but it's worth considering.
As a former group organizer, there is a strong tension between doing what you think is best for the community and what is best for yourself. Here is an example: to build resilience for your group, you should try to get the people who might run the group after you leave to run events/retreats and network with other group organizers, so they are more committed, get practice, and build up a network. But you get more clout if you run the retreats and do the networking yourself. It takes an extremely unselfish person not to just default to keeping a ton of that work rather than delegating it, in no small part because of the clout benefits. This tension exists now, so I'm not claiming it would only result from the influx of money, but now that organizers can get jobs after they graduate, expect this to become a bigger issue.
P.S. If the community isn't meant to scale, then individual choices like vegetarianism are not justified within our own worldview.
I’m not a community builder. Also, just to be careful, and because it's relevant to the sentiment of this post and your comment, I want to disclose that I’m willing both to drop and to take on the title/status of being an EA, of being aligned with "improving the long-term future", etc.
In the past, I have been involved in planning retreats and probably understand the work of creating one.
I thought your comment and experiences were important and substantive. In particular, this part of your comment seemed really important.
I wanted to understand more:
For context, this is my basic understanding of how leadership is rewarded in organizations: most successful organizations reward development. Senior people are expected to, and rewarded for, dedicating most of their time away from object-level work to managing people and fostering talent. This leadership performance is assessed, and good leaders are promoted to greater status and influence, so organizations end up with conscientious, effective leaders at the top who further develop or replicate these virtues.
In this ideal model, the more active and strong the junior people are, the more credit and status the leaders get. Leaders don’t lose status no matter how much the junior people accomplish; they get promoted themselves. There is no incentive to squat on duties.
It seems like this isn’t true in this situation. This seems important. I wanted to ask questions to learn more:
I think you are saying there is an incentive to do the work of running a retreat personally, even when there are talented people who can do this, and you already have experience running a retreat.
Again, the right outcome and common belief would look like everyone saying, "Wow, Guthmann is a hero, he scouted out A, B, and C, who are huge future leaders. Imagine what new people and projects Guthmann can foster!".
I’m uncertain how much I will learn, but others might and it seems worth asking.
Please let me know if I’m wrong or muddying the water. I also understand if you don’t respond.
I started Northwestern's EA club with a close friend in my sophomore year at Northwestern (2019). My friend graduated at the end of that year, and our club was still nascent. There was an exec board of 6 or 7, but truly only a couple were trustworthy with both getting stuff done and actually understanding EA.
Running the club during covid, having to respond to all these emails, carrying all this responsibility somewhat alone (alone isn't quite fair, but still), never meeting anyone in person, and having to explain to strangers over and over again what EA was stressed/tired me out a decent bit (I was 19-20). Honestly, I just started to see EA more negatively and didn't want to engage with the community as much, even though I broadly agreed with it about everything.
I'm not sure I really feel externally higher status in any way because of it. I guess I might feel some internal status/confidence from founding the club, because it is a unique story I have, but I would be lying if I said more than 1 or 2 people hit me up over Swapcard during EAGx Boston (had a great time btw, met really cool people), whereas my friend who has never interacted with EA outside of NU friends and the fellowship, but has an interesting career, was DMed like 45 times. And the 2 people who hit me up did not even do so because I founded, much less organized, the club.

The actual success of the club in terms of current size, average commitment, and probabilistic trajectory does not seem to be data anyone in the community would notice if I didn't try to get them to notice. Don't even get me started on whether they would know if I promoted/delegated (to) the right people. At any point in our club's history I could tell you which people were committed and which weren't, but no one ever asked. There are people who work with university groups, but it's not like they truly knew the ins and outs of the club, and even if I told them how things were truly going, what does that really do for me? It may be that they would be more likely to hire or recommend people who are better at delegating, but anecdotally this doesn't even seem true to me. Which is still a far cry from doing impact estimates and funding me based on that. Plus, isn't it possible that people who delegate less just inherently seem like a more important piece of a university's "team"? Maybe there are other people waiting to take over and do an even better job, but they are quite literally competition to their boss in that case. Perhaps it increases my chances of getting jobs, but I'm not sure, and even if it does, it's not like it would be connected to any sort of impact score.
Founding the club has at best a moderate impact on its own. It is the combination of starting the club and giving it a big enough kick to keep going that I believe is where the value is created; otherwise the club may die and you basically did nothing. A large part of this "kick" is of course ensuring the people after you are good. Currently, Northwestern's Effective Altruism club is doing pretty well. We seem to be on pace to graduate 50+ fellows this year, and we have had 10-15 people attend conferences. TO BE CLEAR - I have done almost nothing this year. The organizers that (at risk of bragging) I convinced/told last year to do the organizing this year have done a fire job, much better than I could have. I like to think that if I had put very little effort in last year, or, potentially even worse, not given authority to other positive actors in the club, there would have been a non-tiny chance the club would have just collapsed, though I could be wrong. It does seem as though there is a ton of interest in effective altruism among the young people here, so it's feasible that this wasn't such a path-dependent story.
Still - if I had started the club, put almost no effort into creating any structure or giving anyone else a meaningful role during the covid year other than running events with people I wanted to meet (and coordinating with them myself, which counterintuitively is easier than delegating), and then not stepped down and instead maintained control this year so that I could continue doing so, no one would have criticized me, even though this would probably have cost EA 15-30 committed Northwestern students already, and potentially many more down the line. I mean, no one criticized me when I ghosted them last year (lol). If I had had a better sense of the possibility of actually getting paid, now or after school, for this stuff, I could see it increasing the chance I actually did something like the above. Moreover, if I had had a sense of the potential networking opportunities I might have had access to this year (I did almost all my organizing, except the very beginning, during heavy covid), this probably would have increased my chances of doing something like the above even more than the money.
To be clear, I probably suck at organizing, and even if I hadn't used the club solely as my own status machine, it would have been pretty terrible if I hadn't stepped down and been replaced by the people who currently organize.
To summarize/organize:
I know I didn't precisely answer your questions and more just rambled. Let me know if you have questions, and obviously, if I said stuff that sounds wrong, disagree. I feel like even though this post is long, it's lacking a lot of nuance I would like to include, but I felt it was best to post it like this.
Hi Charles,
I am not writing in an official CEA capacity, but just wanted to respond with a couple of quick personal thoughts that don't cover everything you mentioned.
Hi,
Thanks for the thoughtful reply, appreciate it. Super valid points. Upon re-reading it seems I may have come off insultingly towards the community building contingent of EA. Certainly not my intention! I think y'all are doing a great job and I def don't want to give the impression that I would have a better plan in mind. I am somewhat familiar with the recent initiatives with universities and think they will def be solid also.
Again I just want to clarify that I don't think EA community builders are doing anything specifically wrong per se, and I don't think most of these issues are even super specific to the community building sector of EA. I think the issues I brought up would be present in pretty much any new social movement that is fast scaling and has lots of opportunities.
Just another super quick response that doesn't cover everything and is purely my own thoughts and not necessarily accurate to CEA:
I think you are vastly overestimating the access one gains from organizing events. You don't need to organize anything to get access to people. You just have to have something interesting to talk about. I've had access to VIPs in my field since I was 16 because I was working on interesting projects, and my experience within the EA community has been similar--the VIPs are easy to reach as long as you have a reason. And if you are managing someone else who is organizing an event, this should be easy to do, e.g. you can check up on your subordinates' performance.
Which post is this?
It's already happening. The two people I vetted for FTX Future Fund grant applications didn't pass muster, but, I dare say, they've perfected the worst version of "isomorphic mimicry" (for lack of a better phrase). I'm not sure I can share details of this private conversation, but it's good someone is pointing it out.
Just my impression based on anecdotes, but I've heard about more people from outside of the community trying to get FTX grants than I've noticed in the past (e.g. with Open Phil). The word seems to have 'gotten out' to a greater degree than before.
To bring in some numbers, FTX had a huge number of applications, so most of these must have come from outside the current community. And it seemed like the number was greater than expected.
To bring in some theory, it would make sense this is happening based on SBF's fame and the strategy they're taking (rapid, low overhead grants for a wider range of things).
So overall it seems plausible to me this is now happening to a greater degree than in the past, but I'm very unsure how much more, and how much of a problem it is. It's probably semi-inevitable as you get bigger.
Interesting! This makes me wonder if it could also be related to the crypto connection. Crypto is full of opportunities to exploit systems for money, and some people do that full-time with zero ethics. That would lend plausibility to the claim in the OP about discords dedicated to it.
Agree that seems plausible. I also heard FTX had a lot of crypto projects applying for funding.
I downvoted this post because it doesn't present any evidence to back up its claims. Frankly, I also found the tone off-putting ("vultures"? really?) and the structure confusing.
I also think it underestimates the extent to which the following things are noticeable to grant evaluators. I reckon they'll usually be able to tell when applicants (1) don't really understand or care about x-risks, (2) don't really understand or care about EA, (3) are lying about what they'll spend the money on, or (4) have a theory of change that doesn't make sense. Of course grant applicants tailor their application to what they think the funder cares about. But it's hard to fake it, especially when questioned.
Also, something like the Atlas Fellowship is not "easy money". Applicants will be competing against extremely talented and impressive people from all over the world. I don't think the "bar" for getting funding for EA projects has fallen as much as this post, and some of the comments on this post, seem to assume.
I appreciate this and it's annoying, but I'm supposing OP didn't think they could do this without revealing who they are, which they wanted to avoid.
I agree that grantmakers are probably aware of these things, but I would like them to demonstrate it and say how they plan to mitigate it. I note the Atlas Fellowship doesn't talk about this in its FAQ (admittedly the FAQ seems aimed at applicants, not critics, but still).
I'm not sure how easy it is for grantmakers to tell sincere from insincere people - particularly at high-school level when there hasn't been so much opportunity to engage in costly signalling.
I am genuinely worried about what effect it has on people's epistemics if they even think that they will be rewarded for holding certain beliefs. You can imagine impressionable students not wanting to even raise doubts because they worry this might be held against them later.
I didn't know how to say it originally, but yes, I did not want to reveal/out sources. It does mean the argument packs less punch (and you should rightly be skeptical), but on net I thought it would be enough without them.
If you call them 'purity tests', that has a bad connotation.
Obviously that kind of test would be terrible for the intellectual and epistemic environment of EA. We shouldn't screen on 'whether someone agrees with the outcome'...
But it is reasonable to consider 'epistemic virtues' as inputs ... 'whether someone engages in honest debate, whether their reasoning is transparent' ... something less stringent than the [CEA guiding principles](https://www.centreforeffectivealtruism.org/ceas-guiding-principles), perhaps.
I also think considerations like 'does this person have a track record of engaging with EA and EA-adjacent activities before applying for this' should yield some good signaling/screening value.
(I see Mauricio made a similar point)
Thanks for the post. I share your concerns, and I even enjoy the somewhat alarmist tone. However, I think some possible objections would be:
a) Perhaps job applications are more effective at marketing EA than other strategies. Publish a good job offer, and you can make dozens or hundreds of talented and motivated people dive into EA concepts.
b) Maybe false positive rates are increasing, but what about recall? It's all about trade-offs, right? There are probably many people with EA potential out there; how many vultures are you willing to let in to attract them? (See the toy sketch after this list.)
c) I don't have a problem with "effective" vultures. If they can, e.g., solve the alignment problem or fill the operational needs of an EA organization, does it matter a lot that they are just building career capital?
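To make (b) concrete, here is a toy sketch of the trade-off; every number below is invented purely for illustration:

```python
# Toy scenario: 1000 applicants, 100 of whom are genuinely motivated "EA-potential" people.
def precision_recall(true_accepted: int, false_accepted: int, total_true: int = 100):
    """Precision: what fraction of accepted applicants are genuine.
    Recall: what fraction of the genuine applicants we manage to accept."""
    accepted = true_accepted + false_accepted
    return true_accepted / accepted, true_accepted / total_true

# A lenient filter lets in more vultures but also catches more genuine people.
lenient_p, lenient_r = precision_recall(true_accepted=90, false_accepted=60)
strict_p, strict_r = precision_recall(true_accepted=55, false_accepted=5)

print(f"Lenient filter: precision {lenient_p:.0%}, recall {lenient_r:.0%}")
print(f"Strict filter:  precision {strict_p:.0%}, recall {strict_r:.0%}")
```

Whether the lenient filter is worth it then depends on how much a genuine hit is worth relative to the cost of a wasted grant.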
I can speak of one EA institution, which I will not name, that suffers from this. Math and cognitive science majors can get a little too far in EA circles just by mumbling something about AI Safety, without delivering any actual engagement with the literature or the community.
So, thanks for posting.
Have you told the institution about this? Seems like a pretty important thing for them to know!
Update: I had a chance to talk to quinn irl about this, and speaking in very broad strokes, I consider the problem (at least for the specific example they gave) an order of magnitude less significant than when I read it the first time on this forum.
Atlas Fellowship cofounder here. Just saw this article. Currently running a workshop, so may get back with a response in a few days.
For now, I wanted to point out that the $50,000 scholarship is for educational purposes only. (If it says otherwise anywhere, let me know.)
That's not how I understood the scholarship when I read the information on the website.
The FAQ says
and
From this, I concluded that once the student turns 18, they can use the money for everything that could be defended as plausibly leading to their professional development.
If that's the case, then though the scholarship is not exactly "no strings attached" as the OP claims, that description still seems closer to reality to me than "educational purposes only".
edit: a typo
I remember hearing that the money was just for the person and I felt alarmed, thinking that so many random people in my year at school would've worked their asses off to get $50k — it's more than my household earned in a year.
Sydney told me scholarships like this are much more common in the US; then I updated towards it only being paid against college fees, which is way more reasonable. But I guess this is still kind of ambiguous? It does seem like two radically different products.
If you start from the premise that someone is trying to game the system, then, since there seems to be no oversight on what happens after they choose to take a $50k transfer to their bank account, it's effectively no strings attached.
Don't people have the option to take it as a lump sum? If that is the case, presumably if they are willing to game the system to get the money they will not be particularly persuaded by a clear instruction to "only spend it on education".
I might make it clearer that your bullet points are what you recommend people not do. I was skimming at first and was close to taking away the opposite of what you intended.
Thank you for writing this post. The discussion and critiques brought up are important and valuable, and I just want to say that I'm grateful you put this out there, since I've been very worried about the same things.
I believe that one should generally use extra caution while discussing delicate and emotionally important subjects in order to avoid becoming clickbait.
I'll hop on the "I'd love to see sources" train to a certain extent, but honestly we don't really need them. If this is happening it's super important, and even if it isn't happening right now it'll probably start happening somewhat soon. We should have a plan for this.
Vultures ≈ Death, typically
EA = Not dead, quite the opposite in fact
The "circling vultures" metaphor is generally used to mean "X is in danger". The idea is that vultures can tell X is at an elevated risk of death and are preparing to swoop down once that happens.
Here my interpretation is that the vultures represent grifters, and X is something like "EA money, protected by diligence and community trust". When the protection falls the grifters swoop in on the money.
I think we should be careful about how we communicate. Maybe instead of just saying that there is "lots of funding available" we should clarify that we mean that there's lots of funding available for people who can deliver. That is less likely to draw in vultures.
I haven't seen very much in terms of mechanisms that ensure funding is only distributed to people who can deliver? And it seems like that's in opposition to a "hits-based" approach, where many people doing what we want them to do will still come away without having accomplished their goal?
good epistemics?
Thanks for posting about this; I had no idea this was happening to a significant extent.
I think it is important to keep in mind that we are not very funding-constrained. It may be OK to have some false positives; false negatives may often be worse, so I wouldn't be too careful.
I think grantmaking is probably still too reluctant to fund stuff that has a small chance of high impact, especially when grantmakers are uncertain because the people aren't EAs.
For example, I told a very exceptional student (who has something like 1-in-a-million problem-solving ability) to apply for the Atlas Fellowship, although I don't know him well, because from my limited knowledge it increases the chance that he will work on alignment from 10% to 20-25%, and the $50k is easily worth that.
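Just to spell out the implicit back-of-the-envelope math (the counterfactual value figure below is purely an assumption I'm making for illustration, not something anyone has estimated):

```python
# Back-of-the-envelope expected value of the $50k fellowship in this example.
p_without = 0.10                 # chance they end up working on alignment without it
p_with = 0.225                   # midpoint of the 20-25% estimate with it
value_of_their_work = 2_000_000  # assumed counterfactual value in dollars (illustrative)
cost = 50_000

expected_gain = (p_with - p_without) * value_of_their_work
breakeven_value = cost / (p_with - p_without)
print(f"Expected gain: ${expected_gain:,.0f} vs. cost ${cost:,.0f}")
print(f"The grant breaks even if their alignment work is worth at least ${breakeven_value:,.0f}")
```

With these made-up numbers the grant clears the bar by a wide margin; the conclusion only flips if the counterfactual value of their work is below roughly $400k.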
Though of course, having more false positives causes more people who only pretend to be doing something good to apply, which isn't easy to handle for our current limited number of grantmakers. We definitely need to scale up grantmaking capacity anyway.
I think non-EAs should know that they can get funding if they do something good/useful. You shouldn't need to pretend to be an EA to get funding, and defending against people who pretend to do good projects seems easier in many cases; e.g. you can often just start with a little funding and promise more later if they show progress.
(I also expect that we/AI-risk-reduction will get much more funding as the problem becomes more widely known/acknowledged. I'd guess >$100B in 2030, so I don't think funding ever becomes a bottleneck, but of course I'm not totally sure.)
I think you refer to this post. Note that there was a discussion about the title of that post, and that it was eventually changed.
In general, I think that one should be more careful about being clickbaity regarding sensitive and emotionally charged topics.
This may be counterproductive as many projects we would like to see funded face economic barriers to entry.
E.g., if starting any effective new advocacy org requires at minimum a 0.5 FTE salary of X and initial legal costs of Y, for a total of X+Y=Z, funding people 20% below Z won't lead to a 20% less developed advocacy org, but to no advocacy org at all.
Fixed costs also vary across projects, and only providing initial funding below a certain threshold could lead to certain high-value but high-fixed-cost projects being de-prioritized compared to low-fixed-cost, lower-value ones.
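A tiny sketch of that threshold dynamic (the fixed-cost figure below is an invented placeholder, not from any real project):

```python
def realized_project_scale(funding: float, fixed_cost: float = 100_000) -> float:
    """Toy model: output is zero until fixed costs are covered,
    then scales roughly with whatever funding remains."""
    if funding < fixed_cost:
        return 0.0  # below the threshold, the project never gets off the ground
    return (funding - fixed_cost) / fixed_cost

for amount in (80_000, 100_000, 120_000, 200_000):
    print(f"${amount:,}: relative scale {realized_project_scale(amount):.1f}")
```

Funding 20% below the threshold doesn't buy 80% of a project; it buys nothing.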
It seems like there are tools to deal with this:
Finally, this isn't what will need to happen, but an available, robust strategy is to focus on giving grants to people with high opportunity costs or high outside options. These outside options are observable and also correlated with effectiveness, for reasons most people find acceptable.
I was considering writing something like this up a while back, but didn't have enough direct evidence; I was mostly working off too few examples as a grantmaker, plus general models. Glad this concern is being broadcast.
I did come up with a proposal for addressing parts of the problem over on the "Please pitch ideas to potential EA CTOs" post. If you're a software dev who wants to help build a tool which might make the vultures less able to eat at least parts of EA, please read over the proposal and ping me if interested.
For what it's worth, Atlas has two different steps of online application, and then a final interview. This doesn't make it impossible to Goodhart, but it buys us time.
My prior is that one's degree of EA-alignment is pretty transparent. If there are any grifters, they would probably be found out pretty quickly and we can retract funding/cooperation from that point on.
Also, people who are at a crossroads of either being EA-aligned or non-EA aligned (e.g., people who want to be a productive member of a lively and prestigious community) could be organizationally "captured" and become EA-aligned, if we maintain a high-trust, collaborative group environment.