This is a linkpost for https://www.thephilosopher1923.org/post/a-mirror-for-tech-bros
Abstract as written by the author on r/philosophy:
The problem with effective altruism and longtermism isn’t that they are funded by morally dubious capitalists or that they sanction harmful acts for the greater good; it’s that they are naive about how power can be abused and how knowledge can reflect the interests of the powerful.
Their coziness with arbitrary power, so long as it is effective, makes them vulnerable to ‘the despotism trap’, where the ends justify the means.
Honestly, I like that this essay tries to engage with and understand EA more than other critiques I've seen. Usually when I see an article about EA's association with billionaires/elites, it tends to be a lot less substantive.
I do agree that EA decision-making generally skews towards privilege. EA orgs pretty openly favor recruiting and hiring from prestigious universities, whose graduates are significantly overrepresented in EA demographics. I also recall data from Spencer Greenberg that essentially placed EAs as ideologically centrist, as opposed to the more left-leaning social movements popular with similar demographics.
I think her critique here is actually quite sound:
I think this is a pretty accurate description. EA and EA-adjacent orgs are generally funded by rich philanthropists, and receive comparatively little support from, or collaboration with, governmental bodies. A lot of EA-adjacent or EA-recommended orgs exist in the private sector. At the very least, we have to acknowledge that funding sources affect culture. EA happens to be disproportionately funded by tech billionaires, so EA is more likely to be sympathetic to the views and ideologies of tech billionaires.
I think where outside critics get it wrong is the interpretation that, because EA is funded by social elites, it actively sides/colludes with social elites. I'm not sure how to convey it, but outside coverage seems to imply that EA is essentially a front for social elites to gain more power without actually doing any good, whereas my take is that EA began on elite college campuses, so the most accessible sources of support and funding just happened to be social elites. Most notably, coverage of EA often focuses more on its ties with social elites than on the actual work done by EA orgs in various cause areas. The average article I read contains maybe one or two brief sentences about the fairly complex work and research done by EA orgs, and the rest of the article just speculates about its ties with rich elites.
In the reddit comment section they discuss this a little further. Specifically, in this comment the author says that the issue is less EA's closeness to powerful people than that the philosophy of EA itself suits powerful people. They can still do "dirty hands" stuff but justify it on the grounds that generating value is necessary for doing good. I agree with the author here that the philosophy of EA can provide this cover, and combined with its closeness to the centers of power in the US, it can derail.
SBF is a good example, I think (please correct me if not; I am no EA follower and am only here for the discussion). Cryptocurrency projects currently have very dubious value for humanity. To oversimplify a little: they effectively consume enormous amounts of energy in the middle of an energy crisis, while mostly existing so that people can engage in financial speculation in the hope of getting rich. How does this fit with doing good?