There's a lot to discuss in this comment so it might be worth unpacking responses into sections. For myself, I'm most interested in your assertion that money is well-correlated with having more accurate views about the world.
I think you're correct that there is some connection between "accurate views in a domain" and "success in that domain" on average. But I think the main driver of that connection is a correlation at the low end (e.g., people with really faulty pictures of reality are not successful), with little to no correlation outside of that range.
In the case of wealth, while we might expect that being well-attuned to reality is helpful, being "well-attuned to reality" is not a real trait (or if it is, it's extremely rare) -- most people are well-attuned to some parts of reality and not others. Furthermore, wealth in most societies is highly driven by the luck of being born into a particular family. So at the end of the day, we shouldn't expect donors with the most money to generally have the best views on what to do with it.
In particular, I think the dynamics of charity make this lack of correlation even more problematic, because the wealthiest folks have disproportionately more control over what happens in charity than the just-relatively-well-off folks, and we particularly shouldn't expect that being wildly wealthy is a good predictor of "being good at figuring out which charities are most impactful." Being insanely wealthy is probably even more luck-driven than being successful in a normal way, and the more insanely wealthy you are, the more likely you are to have charities trying to sell themselves to you, and the worse your access to information about them will be.
Just to reality-test my mental model here against my own experience: you suggest looking at the major donors in EA. By and large, my experience in EA is that there is not really a correlation between wealth and having good ideas about charity. I meet a lot of wealthy people in my job, and they are often shockingly out of touch. Maybe they were better calibrated before they got wealthy, but becoming insanely wealthy reduces how honest people are with you and makes your life look so different from normal that I expect you forget what normal is. Often, the people in EA I think make the best calls are sort of mid-tier employees of EA orgs, who are both thoughtful and have great insider info.
Even beyond that, EA major donors are a small subset of rich people in general, who by and large I think make absolutely terrible decisions about charity (and I expect you think that also, since you're on the EA Forum). So even if I wanted to grant you that these rich people might have accurate views within their domain, I wouldn't grant that that makes them better at choosing charities.
Basically, my overall point is that (1) really wealthy people are probably mostly really wealthy by chance of circumstance; (2) if it's not chance but rather domain expertise in the area of their success, that expertise doesn't largely transfer to success in choosing charities; and (3) based on my experience of EA, wealthy EAs are no more likely to make good decisions than non-wealthy EAs. So I'm comfortable endorsing the idea that having more money is not generally a good predictor of having great ideas about charity.
I don't really want to get into an argument here about whether extreme wealth is largely luck-driven, or how much success in one domain translates to success in others, since I believe people tend to be firmly entrenched in one view or another on those topics and it could distract from the main topic of the Equal Hands experiment. My intention is merely to illustrate why someone might endorse the original statement.
Just chiming in to have more than Habryka's view represented here. I think it's not unreasonable in principle to think that OP and GV could create PR distance between themselves. I think it will just take time, and Habryka is being moderately too pessimistic (or, accurately pessimistic in the short term but not considering reasonable long-term potential). I'd guess many think-tank type organizations have launched on the strength of a single funder and come to advise many funders, having a distinct reputation from them -- OP seems to be pursuing this more strongly now than in the past, and while it will take time to work, it seems like a reasonable theory of change.
I have updated, based on this exchange, that OP is more open to investment in non-GV-supported activities than previously came through. I'm happy to hear that.
I don't want to wake up anymore to somebody I personally loathe getting platformed only to discover I paid for the platform. That fact matters to me.
This is eminently reasonable. Any opposition to these changes I've aired here is more about disappointment in some of the specific cause areas being dropped and the sense that OP couldn't continue on them without GV; I'm definitely not intending to complain about GV's decision, and overall I think OP attempting to diversify funding sources (and EA trying to do so as well) is very, very healthy and needed.
To clarify, I did see the invitations to other funders. However, my perception was that those are invitations to find people to hand things off to, rather than to be a continuing partner like with GV. Perhaps I misunderstood.
I also want to be clear that the status quo you're articulating here does not match what I've heard from former grantees about how able OP staff are to participate in efforts to attract additional funding. Perhaps there has been quite a serious miscommunication.
I feel incapable of being heard correctly at this point, so I guess it was a mistake to speak up at all and I'm going to stop now.
Sorry to hear that. Several people I've spoken to about this offline also feel that you are being open and agreeable, and the discussion reads from the outside as fairly civil. So, except perhaps for the heat of this exchange with Ollie, I'd say most people get it and are happy you participated, particularly given that you didn't need to. For myself, the bulk of my concern is with how I perceive OP to have handled this given their place in the EA community, rather than my personal and irrelevant partial disagreement with your personal funding decisions.
[edited to add "partial" in the last sentence]
Speculating on your point 4: The messaging so far has been framed as "Good Ventures is pulling out of XYZ areas; since OP is primarily funded by GV, they are also pulling out." But perhaps, if the sweet spot here is "Cari and Dustin aren't controlling EA but also are obviously allowed not to fund things they don't buy/want to," one solution would be to leave OP open to funding XYZ areas if a new funder appears who wants to partner with them to do so. This would, to me, seem to allow GV and OP to develop, over time, more breathing room between the two orgs in both PR and bandwidth.
My sense is that this is not what's happening now. As in my other comment on this thread, I don't want to reveal my sources because I like this account being anonymous, but I'm reasonably confident that OP staff have been told "we are not doing XYZ anymore," not "Cari and Dustin don't want to fund XYZ anymore, so if you want to work on it, we need to find more funding."
My suspicion (from some conversations with people who interact directly with OP leadership) is that it isn't only Cari and Dustin who don't want to support the dropped areas, but also at least some leadership at OP. If that's right, it explains why they haven't taken the approach I'm suggesting here, but not why they didn't say so (perhaps connecting this back to your point 2).
Here are the ones I know about: wild animal welfare (including averting human-caused harms), all invertebrate welfare (including farmed shrimp), digital minds, gene editing*.
I think this is close to all of them in the animal space. I believe there are also some things losing funding in other areas (e.g., see Habryka's comments), but I'm less familiar with that community.
*I don't know about gene editing for humans, like for malaria.
[edited to fix typo]
My evidence is all anecdotal, so I could be wrong, but I think the discrepancy between outside impression of EA/EA Forum behavior and the survey comes from two things:
Hey just some notes on how nonprofit fiscal sponsorship stuff works (I have worked in ops for charities for a while now) --
Second point within this comment I'm interested in discussing: If I'm summarizing you correctly, you think standard methods of addressing the problem ("cause allocation in EA is controlled by a few rich people who might not make good decisions") make Equal Hands an unnecessary project.
First: I agree with you that the current donation pooling/voting process is not optimal. Hopefully a more streamlined option will be found during the six months of the trial. A fund seems good; knowing the annoyance of setting up an appropriate 501(c)(3), and considering the international nature of EA, I understand why Abraham didn't go that route before determining whether there was any interest in the project, but if it succeeds I think creating a fund would be good.
If a fund is created, the main difference between the Equal Hands concept and EA Funds is that typical EA Funds don't at all address the issue of larger donors having more influence. Yes, experts decide where the amounts within the buckets go. But if one billionaire likes GCR and no billionaires like animal welfare, there will be no mechanism to democratize the distribution between pools. It may be that you don't care about that, but assuming you did, do you see EA Funds as addressing that issue in some way that I am missing?
Second: I agree that a certain amount of "donor 3 hiring donor 1 as a consultant," or being convinced by a persuasive argument, or similar goes on in EA (at least, much more than outside of EA). But the examples you give involve such small levels of decision-making sharing. If you endorse the general rule that larger groups of decision-makers tend to make better decisions than small groups, even when the small groups are composed of experts (which I think there is quite a bit of evidence for?), then a much more robust democratization seems good.