The LTFF recently switched to doing grant rounds; our first round closes on Saturday (deadline: end of day anywhere on Earth, 2025-Feb-15). I think you should consider submitting a quick application in the next 24 hours. We will likely consider applications submitted over the next few days in this round (unless we are overwhelmed with applications).
Apply now

In my personal view, I don't think there has been a better time to work on AI safety projects than right now. There is a clear-ish set of priorities, funders willing to pay for projects, and an increasing sense from the AI safety community that we might be close to the critical window for ensuring AI systems have a profoundly positive effect on society.[1]
I am particularly keen to see applications on:
- publicly communicating AI threat models and other societal implications
- securing AI systems in ways I don't expect to be done by default in labs
- getting useful safety research out of AI systems when the AI is powerful and scheming against you
- analysis of AI safety research agendas that might be especially good candidates for AIs to work on (e.g. because they can be easily decomposed into subquestions that are easily checkable)
- new organisations that could use seed funding
- gatherings of various sizes and stakeholders for navigating the transition to powerful AI systems
- neglected technical AI governance research and fieldbuilding programs
- career transition grants for anyone considering working on the above
- areas that Open Philanthropy recently divested from
Other LTFF fund managers are excited about other areas, and an area's absence from the list above is not a strong indicator that we aren't excited about it.
You can apply to the round here (deadline: end of day anywhere on Earth, 2025-Feb-15).
- ^ We are also interested in funding other longtermist areas, though empirically they meet our bar much less often than AI safety areas.
When will the next round likely be?
Not sure right now, but probably sometime next quarter.
Thanks for the reminder!
I assume that:
1. This restriction is only for the EA Infrastructure Fund, not the LTFF.
2. Given that it takes 8 weeks to get a response, this means that projects can only take roughly 5.5 months or less.
Are those points correct?
Yes, it's only for the EAIF as of the time of writing this comment.
My impression is that the EAIF is currently getting back to people in much less than 8 weeks, but I haven't checked in the last few weeks. In any case, that's a separate, non-round process from the LTFF's.
FWIW, the paragraph beginning "In my personal view, I don't think there has been a better time to work on AI safety projects than right now..." is a really plausible and big-if-true take; I'd love to see a full post on it :)
Interesting. I think I only endorse a weak version of this claim, and I expect replies to the post to be fairly nitpicky, which would make writing the post annoying.
Otoh, the weak version seems pretty obvious to me, which makes me excited to write a longer post making the case for it. Are there any particular points you'd like such a post to cover?
A few things could be useful:
- An overview of funders in the space/ new funders and how things have changed recently.
- How changing priorities have or haven't changed the profile of the people applying for LTFF funding
- Where the biggest opportunities (some combination of importance and availability of funding) are for people who might be reading a forum post.
Also, you can always post during draft amnesty and use the built-in excuse not to respond to comments (i.e. we have a table you can put at the top of your post where you can say "I only endorse a weak version of this claim and I probably won't be responding to comments" or something to that effect).