Participation
5

  • Organizer of Tucson Effective Altruism
  • Attended an EAGx conference
  • Received career coaching from 80,000 Hours
  • Attended an EA Global conference
  • Completed the ML Safety Scholars Virtual Program

Posts
2


Comments
61

At Anthropic’s new valuation, each of its seven founders — CEO Dario Amodei, president Daniela Amodei and cofounders Tom Brown, Jack Clark, Jared Kaplan, Sam McCandlish and Christopher Olah — is set to become a billionaire. Forbes estimates that each cofounder will continue to hold more than 2% of Anthropic’s equity, meaning their net worths are at least $1.2 billion.

From: https://www.forbes.com/sites/alexkonrad/2025/01/08/anthropic-60-billion-valuation-will-make-all-seven-cofounders-billionaires/

I don't know if any of the seven co-founders practice effective giving, but if they do, this is welcome news!

(Tangential but related) There is probably a strong case to be made for recruiting EA-sympathetic celebrities to promote effective giving, and maybe even to raise funds. I am a bit hesitant about "cause promotion" by celebrities, but maybe some version of that idea is also defensible. Turns out, someone wrote about this on the Forum a few years ago, but I don't know how much subsequent discussion there has been on the topic since then.

I don't disagree. I was simply airing my suspicion that most group organizers who applied for the OP fellowship did so because they thought something akin to "I will be organizing for 8–20 hours a week and I want to be compensated for doing so" — which is a perfectly valid reason — rather than "I am applying to the fellowship because I will not be able to sustain myself without the funding."

In cases where people need to make trade-offs between taking some random university job vs. organizing part time, assuming that they are genuinely interested in organizing and that the university has potential, I think it would be valuable for them to get funding. 

Random idea: a yearly community retreat or a mini-conference for EtG folks?

I would be interested to see what proportion of group organizers request funding primarily due to difficult financial situations. My guess is that this number is fairly small, but I could be wrong.

I agree with so much here. 

Here are my responses to the question you raised: "So why do I feel inclined to double down on effective altruism rather than move onto other endeavours?"

  • I have doubled down a lot over the last ~1.5 years. I am not at all shy about being an EA; it is even on my LinkedIn!
    • This is partly for reasons of integrity and honesty. Yes, I care about animals and AI and like math and rationality and whatnot. All of this is a part of who I am.
    • Funnily enough, a non-negligible reason why I have doubled down (and am more pro-EA than before) is the sheer quantity of not-so-good critiques. And they keep publishing them.
    • Another reason is that there are bizarre caricatures of EAs out there. No, we are not robotic utility maximizers. In my personal interactions, I hope people come to realize, "okay, this is just another feel-y human with a bunch of interests who happens to be vegan and feels strongly about donations."
  • "I have personally benefited massively in achieving my own goals." — I hope this experience is more common!
    • I feel that EA and EA-adjacent community epistemics have enormously improved my mental health and decision-making; being in the larger EA-sphere has improved my view of life; I have more agency; I am much more open to new ideas, even those I vehemently disagree with; and I am much more sympathetic to value and normative pluralism than before!

I wish more everyday EAs were louder about their EA-ness.

Related Q: is there a list of EA media projects that you would like to see more of but that do not currently exist?

I honestly don't know. When I think of an arms race, I typically think of rapid manufacturing and accumulation of "weapons." 

Do you think export controls between two countries are a sufficient condition for an arms race?

I don't disagree with this at all. But does this mean that blame can be attributed to the entire EA community? I think not. 

Re mentorship/funding: I doubt that his mentors were hoping he would raise the chances of an arms race conflict. As an analogy, I am not sure nukes would have gone undeveloped if the physics community of the 1930s hadn't existed, had mentored different people, or had adopted better ethical norms. Even if it had done the latter, it is unclear whether that would have prevented the creation of the bomb.

(I found your comments under Ben West's posts insightful; if accurate, they highlight a divergence between the beliefs of the broader EA community and those of certain influential EAs in DC and AI policy circles.)

Currently, it is just a report, and I hope it stays that way.

And we contributed to this.

What makes you say this? I agree that it is likely that Aschenbrenner's report was influential here, but did we make Aschenbrenner write chapter IIId of Situational Awareness the way he did? 

But the background work predates Leopold's involvement.

Is there some background EA/aligned work that argues for an arms race? Because the consensus seems to be against starting a great power war.
