
I work at the EA Infrastructure Fund, and I'd love to hear people's ideas for what EAIF should fund (and why). 

I'm particularly interested in hearing ideas on this at the moment because

  • I'm planning on publishing a list of 'projects we'd like to fund' on our website.
  • I'm interested in finding ideas for particularly valuable projects, and running specific application rounds to fund people to work on them (akin to a hiring round). 

There's more detail on EAIF's scope here.

Feel free to email me at harri@effectivealtruismfunds.org if you have ideas that you'd like to share privately. 

(If you're interested in receiving funding from EAIF yourself, you can apply here, and you're welcome to reach out if you have any questions.)

11 Answers

Open Phil has seemingly moved away from funding ‘frontier of weirdness’-type projects and cause areas; I therefore think a hole has opened up that EAIF is well-placed to fill. In particular, I think an FHI 2.0 of some sort (perhaps starting small and scaling up if it’s going well) could be hugely valuable, and that finding a leader for this new org could fit in with your ‘running specific application rounds to fund people to work on [particularly valuable projects].’

My sense is that an FHI 2.0 grant would align well with EAIF’s scope. Quoting from your announcement post for your new scope:

Examples of projects that I (Caleb) would be excited for this fund [EAIF] to support

  • A program that puts particularly thoughtful researchers who want to investigate speculative but potentially important considerations (like acausal trade and ethics of digital minds) in the same physical space and gives them stipends - ideally with mentorship and potentially an emphasis on collaboration.
  • Foundational research into big, if true, areas that aren’t currently receiving much attention (e.g. post-AGI governance, ECL, wild animal suffering, suffering of current AI systems).

Having said this, I imagine that you saw Habryka’s ‘FHI of the West’ proposal from six months ago. The fact that that has not already been funded, and that talk around it has died down, makes me wonder if you have already ruled out funding such a project. (If so, I’d be curious as to why, though of course no obligation on you to explain yourself.)

Thanks for the suggestion - I read the proposal a while ago, and hadn't thought about it recently, so it's good to be reminded of it again. 

The fact that that has not already been funded, and that talk around it has died down, makes me wonder if you have already ruled out funding such a project.

We haven't decided against funding projects like this. (EAIF's grantmaking has historically been very passive - e.g. the projects that we end up considering for funding have been determined by the applications we received. And we haven't received any strong applications in the 'FHI of the West' ballpark, at least as far as I'm aware.)

One possible concern with this idea is that the project would probably take a lot of funding to launch. With Open Phil's financial distancing from EA Funds, my guess is that EAIF may often not be in the ideal position to be an early funder of a seven-figure-a-year project - by which I mean a funder that comes on board earlier than individual major funders.

I can envision some cases in which EAIF might be a better fit for seed funding, such as cases where funding would allow further development or preliminary testing of a big-project proposal to the point it could...

I still think that EA Reform is pretty important. I believe that there's been very little work so far on any of the initiatives we discussed here.

My impression is that the vast majority of the money that CEA gets is from OP. I think that in practice, this means that CEA represents OP's interests significantly more than I feel comfortable with. While I generally like OP a lot, I think OP's focus areas are fairly distinct from those of the regular EA community.

Some things I'd be eager to see funded:
- Work with CEA to find specific pockets of work that the EA community might prioritize, but OP wouldn't. Help fund these things.
- Fund other parties to help represent / engage / oversee the EA community.
- Audit/oversee key EA funders (OP, SFF, etc.), as these often aren't reviewed by third parties.
- Make sure that the management and boards of key EA orgs are strong.
- Make sure that many key EA employees and small donors are properly taken care of and provided with support. (I think that OP has reason to neglect this area, as it can be difficult to square with naive cost-effectiveness calculations.)
- Identify voices that want to tackle some of these issues head-on, and give them a space to do so. This could mean bloggers / key journalists / potential community leaders in the future.
- Help encourage or set up new EA organizations that sit apart from CEA but help oversee/manage the movement.
- Help out the Community Health team at CEA. This seems like a very tough job that could arguably use more support, some of which might be best done outside of CEA.

Generally, I feel like there's a very significant vacuum of leadership and managerial visibility in the EA community. I think that this is a difficult area to make progress on, but also consider it much more important than other EA donation targets. 

A fuckton of EAs struggle with procrastination: at least 10%, but probably more like 20-30%. Funders tend to underestimate the prevalence because it's in nobody's best interests to admit it to them.

Interventions work fast and are cheap. There's no overhead because you can just do them via Zoom. We already know what works, so you don't even have to innovate.

Funders I've spoken to have tended to think that it's only the least productive EAs who procrastinate. I've worked with a bunch of people in high-prestige EA jobs, including dozens of charity founders, and can confirm this is categorically untrue. It's not that highly productive people don't procrastinate; it's that they have other strengths that counteract the weakness.

EA co-working spaces are the most impactful EA infrastructure that I'm aware of. And they are mostly underfunded.

A matchmaking service for EA projects and EA volunteers/freelancers, with a reputation system: At present, volunteers are underutilised because there's no good way of knowing who is reliable. Freelancers are often chosen because they are "EA-aligned" when really some are either incompetent or grifters. There ought to be some way of seeing how someone has performed on past work.

Templates already exist on-line for this exact type of marketplace. The only upfront cost would be in finding enough early adopters to give it a go. From there, it could take a small cut of freelance work and fund itself.

Also, just copying unedited a related rough note to self from my own list of potential entrepreneurial projects (which I'm very unlikely to ever actually work on myself):

Impact-focused red-teaming, consultancy, and feedback marketplace

Problem I faced as a founder: Often making decisions where I would have loved external input but felt reluctant because I didn't want to ask people for favours, and didn't necessarily know who I could ask other than my personal network.

Solution: a platform like Fiverr or similar where people willing to give feedback an

...

Interesting! Could you share one or two examples of the "Templates [that] already exist on-line for this exact type of marketplace"?

Since you've thought about it a bit already, I'd be interested if you have any thoughts on how long something like this would take to set up to a high standard on the technical/operational side, excluding time spent "finding enough early adopters".

(Also, I'm guessing this isn't something you're interested in doing yourself?)

John Salter
Example: https://upworkclone.bubbleapps.io/
Person specs: No technical knowledge required; it'd just involve learning how to use "bubble". The hard part is that no volunteer wants to be on a platform with few organisations and no organisation wants to be on a platform with few volunteers. [further reading]
Do I wanna make this? I have no time.
Ideal founder traits:
1. Well connected with EA orgs volunteers find attractive, or volunteers EA orgs would find attractive.
2. Experience volunteering
3. A love of networking
4. Good at convincing EAs to do stuff.

Is https://www.impactcolabs.com/ still active? They don't have the feature with the reputation system, but at least it's a start.

Small-scale (1-10k) epistemic infrastructure or experiments, like adj.news.

There are currently key aspects of EA infrastructure that aren't being run well, and I'd love to see EAIF fund improvements. For example, it could fund things like the operation of effectivealtruism.org or the EA Newsletter. There are several important problems with the way these projects are currently being managed by CEA.

 

  1. Content does not reflect the community’s cause prioritization (a longstanding issue). And there’s no transparency about this. An FAQ on Effectivealtruism.org mentions that “CEA created this website to help explain and spread the ideas of effective altruism.” But there’s no mention of the fact that the site’s cause prioritization is influenced by factors including the cause prioritization of CEA’s (explicitly GCR-focused) main funder (providing ~80% of CEA’s funding).
  2. These projects get lost among CEA’s numerous priorities. For instance, “for several years promoting [effectivealtruism.org], including through search engine optimization, was not a priority for us. Prior to 2022, the website was updated infrequently, giving an inaccurate impression of the community and its ideas as they changed over time.” This lack of attention also led to serious oversights like Global Poverty (the community’s top priority at the time) not being represented on the homepage for an extended period. Similarly, Lizka recently wrote that “the monthly EA Newsletter seems quite valuable, and I had many ideas for how to improve it that I wanted to investigate or test.” But due to competing priorities, “I never prioritized doing a serious Newsletter-improvement project. (And by the time I was actually putting it together every month, I’d have very little time or brain space to experiment.)”
  3. There doesn’t seem to be much, if any, accountability for ensuring these projects are operated well. These projects are a relatively small part of CEA’s portfolio, CEA is just one part of EV, and EV is undergoing huge changes. So it wouldn’t be shocking if nobody was paying close attention. And perhaps because of that, the limited public data we have available on both effectivealtruism.org and the EA newsletter doesn’t look great. Per CEA’s dashboard (which last updated these figures in June), after years of steady growth the newsletter’s subscriber count has been falling modestly since FTX collapsed. And traffic to ea.org’s “introduction page”, which is where the first two links on the homepage are designed to direct people, is the lowest it has been in at least 7 years and continues to drift downward.

 

I think all these problems could be improved if EAIF funded these projects, either by providing earmarked funding (and accountability) to CEA or by finding applicants to take these projects over. 

To be clear, these aren’t the only “infrastructure” projects that I’d like to see EAIF fund. Other examples include the EA Survey (which IMO is already being done well but would likely appreciate EAIF funding) and conducting an ongoing analysis of community growth at various stages of the growth funnel (e.g. by updating and/or expanding this work).

I'd like to see more basic public philosophy arguing for effective altruism and against its critics. (I obviously do this a bunch, and am puzzled that there isn't more of it, particularly from philosophers who - unlike me - are actually employed by EA orgs!)

One way that EAIF could help with this is by reaching out to promising candidates (well-respected philosophers who seem broadly sympathetic to EA principles) to see whether they could productively use a course buyout to provide time for EA-related public philosophy. (This could of course include constructively criticizing EA, or suggesting ways to improve, in addition to - what I tend to see as the higher priority - drawing attention to apt EA criticisms of ordinary moral thought and behavior and ways that everyone else could clearly improve by taking these lessons on board.)

A specific example that springs to mind is Richard Pettigrew. He independently wrote an excellent, measured criticism of Leif Wenar's nonsense, and also reviewed the Crary et al. volume in a top academic journal (Mind, iirc). He's a very highly regarded philosopher, and I'd love to see him engage more with EA ideas. Maybe a course buyout from EAIF could make that happen? Seems worth exploring, in any case.

I'd be worried that -- even assuming the funding did not actually influence the content of the speech -- the author being perceived as on the EA payroll would seriously diminish the effectiveness of this work. Maybe that is less true in the context of a professional journal where the author's reputation is well-known to the reader than it would be somewhere like Wired, though?

  1. Country- and/or domain-specific career advising web content

80,000 Hours and Probably Good are great, but their advice can be off-putting, irrelevant, or not useful enough for many people who are not their main audience. Having content about the many potentially impactful careers in medicine, academia, or engineering in Japan, Germany, Brazil, or India can be much more useful and engaging for people in these categories. This can also be done at a relatively low cost - one or two able and willing writers per country/domain.

2. “Budget hawk” organisation/consultancy that aims to propose budget cuts to EA organisations without compromising cost-effectiveness. 

There is a lot of attention on effective giving, like 10% pledges. Another way of achieving similar outcomes is to make organisations spend less (10% again?). We tend to assume that EA organisations are cost-effective (which is true overall), but this does not mean that every EA organisation spends each penny with 100% cost-effectiveness. It is probable that many EA organisations could make cuts to their ineffective programs or manage their operations/taxes more efficiently. A lot of EA organisations have very large budgets - millions of dollars annually - so even modest improvements can be equivalent to adding many GWWC pledgers. (For a rough sense of the numbers, see the back-of-the-envelope sketch at the end of this list.)

3. Historical case studies about movement or community building 

Open Philanthropy has commissioned some reports, but most of them are about specific policy reforms; only a few are about movement or community building. I think more case studies could provide interesting insights. Sentience Institute's case studies were very useful for animal advocacy, in my opinion.

4. Grand strategy research

This might already be carried out by major EA organisations. But I can imagine that most leadership and key staff members in EA organisations typically focus on specific and urgent problems and never have enough time or focus to take a (big) step back and think about grand strategy. Other people might also have better skills for this. By the way, I am also more in favour of “learning by doing” and “make decisions as you progress” type approaches, but nevertheless, having at least some grand strategy can reveal important insights about what the real bottlenecks are and how to overcome them.

5. Commissioning impact evaluations of major EA organisations and EA funds. 

I think the reasons for this are obvious. There are of course some impact evaluations in EA - GWWC’s evaluating the evaluators project was a good example (but note that this was done only once, last year, and from my perspective it evaluated the structure and framework of the funds, not the impact of the grants themselves). I definitely think there is a lot of room for improvement, especially on publicly accessible impact reports. I think this is all the more important for EA, since “not assuming impact but looking for evidence” is one of its distinguishing features.
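To make point 2 above concrete, here is a back-of-the-envelope sketch of the kind of equivalence I have in mind; every figure in it is a made-up placeholder, not data about any real organisation:

```python
# Back-of-the-envelope comparison of a modest budget cut vs. new 10% pledgers.
# All figures below are hypothetical placeholders, not real data.
org_budget = 2_000_000        # annual budget of one mid-sized EA org, in dollars
efficiency_gain = 0.05        # a 5% cut found without reducing output
avg_pledger_donation = 5_000  # assumed average annual donation per 10% pledger

annual_savings = org_budget * efficiency_gain
equivalent_pledgers = annual_savings / avg_pledger_donation
print(f"${annual_savings:,.0f} saved per year ≈ {equivalent_pledgers:.0f} new pledgers")
# Prints: $100,000 saved per year ≈ 20 new pledgers
```

Whether a consultancy could actually find cuts like this, and at what cost, is of course the real question.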

Unless you do favors for, or otherwise suck up to, well-connected people, it can be difficult to get your idea in front of early-stage funders (i.e. HNWIs / individual donors). Even if you do, unless you are able to get a warm introduction or a meeting, your idea will be a mere piece of paper, overlooked in favor of projects from people the funder already knows. Even if you manage to get connected, you will likely be rejected without feedback and so cannot efficiently improve your application.

Potential improvements

  1. AI: I'm pretty sure an o1-preview custom GPT, with just prompt engineering, could give pretty strong feedback on grant proposals if it were given guidance and examples. The subset of applications that look strong to the bot could be forwarded to an email inbox where a well-connected EA could look at them and decide whether they merit forwarding to funders. (A rough code sketch of this screening step follows the list.)
  2. Forum: There could be a weekly thread dedicated to feedback on ideas. Pay one person to give feedback on applications that get none.
  3. EA YC: If either of the above yields promising results, it could be scaled up by doing an EA version of Y Combinator.
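Idea 1 imagines a custom GPT; here is roughly the same screening step expressed with the OpenAI Python SDK instead. This is a minimal sketch under assumptions: the model name, rubric wording, and score threshold are all placeholders I've invented, not a tested setup.

```python
# Hypothetical sketch: score incoming grant proposals with an LLM and surface
# only the strongest ones for human review. Model, rubric, and threshold are
# placeholder assumptions.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

RUBRIC = (
    "You are reviewing an EA grant proposal. Score it 1-10 on theory of change, "
    "team fit, cost-effectiveness, and downside risk, then give brief feedback. "
    "End your reply with a single line of the form 'SCORE: <integer>'."
)

def review_proposal(proposal_text: str) -> tuple[int, str]:
    """Return (overall score, full feedback text) for one proposal."""
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model choice
        messages=[
            {"role": "system", "content": RUBRIC},
            {"role": "user", "content": proposal_text},
        ],
    )
    feedback = response.choices[0].message.content
    last_line = feedback.strip().splitlines()[-1]
    score = int(last_line.replace("SCORE:", "").strip())
    return score, feedback

def worth_forwarding(proposal_text: str, threshold: int = 7) -> bool:
    # Proposals scoring at or above the threshold get sent on to a human reviewer.
    score, _ = review_proposal(proposal_text)
    return score >= threshold
```

Most of the value would be in the rubric and example feedback it's given rather than the plumbing, and any such filter would need spot-checking against human judgement before funders relied on it.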

A lot of what I have seen regarding "EA Community teams" seems to be about managing conflicts between different individuals.

It would be interesting to see an organization or individual that was explicitly an expert in knowing different individuals and organizations and the projects that they are working on and could potentially connect people who might be able to add value to each other's projects. It strikes me that there are a lot of opportunities for collaboration but not as much organization around mapping out the EA space on a more granular level. 

A lot of what I have seen regarding "EA Community teams" seems to be about managing conflicts between different individuals.

Not sure I understand this part - curious if you could say more. 

It would be interesting to see an organization or individual that was explicitly an expert in knowing different individuals and organizations and the projects that they are working on and could potentially connect people who might be able to add value to each other's projects.

I like this idea. A related idea/framing that comes to mind:

  • There's often a
...
Brad West🔸
Just that when I have seen efforts to improve community relations, it has typically been in the "Community Health" context, relating to when people have had complaints about people in the community or other conflicts. I haven't seen as much concerted effort in connecting people working on different EA projects that might add value to each other.