Adam_Scholl

"I also think Open Philanthropy would benefit from less ambiguity about my role in its funding decisions (especially given the fact that I'm married to the President of a major AI company)."

This makes sense, but if anything the conflict of interest seems more alarming if you're influencing national policy. For example, I would guess that you are one of the people (maybe literally among the top 10?) who stand to personally lose the most money in the event of an AI pause. Are you worried about this, or taking any actions to mitigate it (e.g., trying to convert equity into cash)?

Yeah, Dario pretty explicitly describes liking RSPs in part because they minimally constrain continued scaling:

"I mean one way to think about it is like the responsible scaling plan doesn't slow you down except where it's absolutely necessary. It only slows you down where it's like there's a critical danger in this specific place, with this specific type of model, therefore you need to slow down." (Logan Bartlett interview, h/t Joe_Collman).

At one point an EA fund manager told me something like, "the infrastructure fund refuses to support anything involving rationality/rationalists as a policy." Did a policy like this exist? Does it still?

Another potential cause of the narrow focus, I think, is that some people in fact expect the vast majority of impact to come from a small group of orgs they mostly already know about. Curious whether you disagree with that expectation (i.e., you think the impact distribution of orgs is flatter than that), or whether you're just claiming that e.g. the distribution of applicants should be flatter regardless?

Currently CFAR is on sabbatical, which we had planned to spend a couple of months on this year anyway. I.e., we're reading, learning, and scheming, and in general trying to improve ourselves in ways that are hard to find time for during our normally dense workshop schedule.

We're considering a range of options for what to do next (e.g., online workshops, Zoom mentoring, helping other orgs in some way), but we haven't yet settled on one.

For what it's worth, I wouldn't describe the social ties thing as incidental—it's one of the main things CFAR is explicitly optimizing for. For example, I'd estimate (my colleagues might quibble with these numbers some) it's 90% of the reason we run alumni reunions, 60% of the reason we run instructor & mentorship trainings, 30% of the reason we run mainlines, and 15% of the reason we co-run AIRCS.