This is a repost from a Twitter thread I made last night. It reads a little oddly when presented as a Forum post, but I wanted to have the content shared here for those not on Twitter.
This is a thread of my thoughts and feelings about the actions that led to FTX’s bankruptcy, and the enormous harm that was caused as a result, involving the likely loss of many thousands of innocent people’s savings.
Based on publicly available information, it seems to me more likely than not that senior leadership at FTX used customer deposits to bail out Alameda, despite terms of service prohibiting this, and a (later deleted) tweet from Sam claiming customer deposits are never invested.
Some places making the case for this view include this article from the Wall Street Journal, this tweet from jonwu.eth, and this article from Bloomberg (and follow-on articles).
I am not certain that this is what happened. I haven’t been in contact with anyone at FTX (other than those at Future Fund), except a short email to resign from my unpaid advisor role at Future Fund. If new information vindicates FTX, I will change my view and offer an apology.
But if there was deception and misuse of funds, I am outraged, and I don’t know which emotion is stronger: my utter rage at Sam (and others?) for causing such harm to so many people, or my sadness and self-hatred for falling for this deception.
I want to make it utterly clear: if those involved deceived others and engaged in fraud (whether illegal or not) that may cost many thousands of people their savings, they entirely abandoned the principles of the effective altruism community.
If this is what happened, then I cannot in words convey how strongly I condemn what they did. I had put my trust in Sam, and if he lied and misused customer funds he betrayed me, just as he betrayed his customers, his employees, his investors, & the communities he was a part of.
For years, the EA community has emphasised the importance of integrity, honesty, and respect for common-sense moral constraints. If customer funds were misused, then Sam did not listen; he must have thought he was above such considerations.
A clear-thinking EA should strongly oppose “ends justify the means” reasoning. I hope to write more soon about this. In the meantime, here are some links to writings produced over the years.
These are some relevant sections from What We Owe The Future:
Here is Toby Ord in The Precipice:
Here is Holden Karnofsky: https://forum.effectivealtruism.org/posts/T975ydo3mx8onH3iS/ea-is-about-maximization-and-maximization-is-perilous
Here are the Centre for Effective Altruism’s Guiding Principles: https://forum.effectivealtruism.org/posts/Zxuksovf23qWgs37J/introducing-cea-s-guiding-principles
If FTX misused customer funds, then I personally will have much to reflect on. Sam and FTX had a lot of goodwill – and some of that goodwill was the result of association with ideas I have spent my career promoting. If that goodwill laundered fraud, I am ashamed.
As a community, too, we will need to reflect on what has happened, and how we could reduce the chance of anything like this happening again. Yes, we want to make the world better, and yes, we should be ambitious in the pursuit of that.
But that in no way justifies fraud. If you think that you’re the exception, you’re duping yourself.
We must make clear that we do not see ourselves as above common-sense ethical norms, and must engage criticism with humility.
I know that others from inside and outside of the community have worried about the misuse of EA ideas in ways that could cause harm. I used to think these worries, though worth taking seriously, seemed speculative and unlikely.
I was probably wrong. I will be reflecting on this in the days and months to come, and thinking through what should change.
It's fair enough to feel betrayed in this situation, and to say so publicly.
But given your position in the EA community, I think it's much more important to put effort towards giving context on your role in this saga.
Some jumping-off points:
- Did you consider yourself to be in a mentor / mentee relationship with SBF prior to the founding of FTX? What was the depth and cadence of that relationship?
- e.g. from this Sequoia profile (archived as they recently pulled it from their site):
- What diligence did you / your team do on FTX before agreeing to join the Future Fund as an advisor?
... (read more)"The math, MacAskill argued, means that if one’s goal is to optimize one’s life for doing good, often most good can be done by choosing to make the most money possible—in order to give it all away. “Earn to give,” urged MacAskill.
... And MacAskill—Singer’s philosophical heir—had the answer: The best way for him to maximize good in the world would be to maximize his wealth. SBF listened, nodding, as MacAskill made his pitch. The earn-to-give logic was airtight. It was, SBF realized, applied utilitarianism. Knowing what he had to do, SBF simply said, “Yep. That makes sense.”"
[Edit after months: While I still believe these are valid questions, I now think I was too hostile, overconfident, and not genuinely curious enough.] One additional thing I’d be curious about:
You played the role of a messenger between SBF and Elon Musk in a bid for SBF to invest up to 15 billion of (presumably mostly his own) wealth in an acquisition of Twitter. The stated reason for that bid was to make Twitter better for the world. This has worried me a lot over the last weeks. It could have easily been the most consequential thing EAs have ever done, and there has - to my knowledge - never been a thorough EA debate that signalled that this would be a good idea.
What was the reasoning behind the decision to support SBF by connecting him to Musk? How many people from FTXFF or EA at large were consulted to figure out if that was a good idea? Do you think it still made sense, at the point when you helped with the potential acquisition, to regard most of SBF's wealth as EA resources? If not, why did you not inform the EA community?
Source for claim about playing a messenger: https://twitter.com/tier10k/status/1575603591431102464?s=20&t=lYY65-TpZuifcbQ2j2EQ5w
I don't think EAs should necessarily require a community-wide debate before making major decisions, including investment decisions; sometimes decisions should be made fast, and often decisions don't benefit a ton from "the whole community weighs in" over "twenty smart advisors weighed in".
But regardless, seems interesting and useful for EAs to debate this topic so we can form more models of this part of the strategy space -- maybe we should be doing more to positively affect the world's public fora. And I'd personally love to know more about Will's reasoning re Twitter.
See Holden Karnofsky's Some Thoughts on Public Discourse:
I think it's important to note that many experts, traders, and investors did not see this coming, or they could have saved/made billions.
It seems very unfair to ask fund recipients to significantly outperform the market and most experts, while having access to way less information.
See this Twitter thread from Yudkowsky
Edit: I meant to refer to fund advisors, not (just) fund recipients
Lorenzo, I agree the expert traders and investors have more technical skill in investing. But it seems to me that MacAskill and the FTX Future Fund board had more direct information about the personality of SBF, the personal connections among the leaders, and the group dynamics. So, when it comes to your statement “having access to way less information”, I don’t think this is the case.
My understanding is that FTX's business model fairly straightforwardly made sense? It was an exchange, and there are many exchanges in the world that are successful and probably not fraudulent businesses (even in crypto - Binance, Coinbase, etc). As far as I can tell, the fraud came from covering specific failures at Alameda caused by bad decisions, but wasn't inherent to FTX making any money at all?
I'm gonna wait it out on this one.
I'd currently wildly guess that Coinbase is not a fraud.
This seems to be “not even wrong” - FTX’s business model isn’t and never was in question. The issue is Sam committing fraud and misappropriating customer funds, and there being a total lack of internal controls at FTX that made this possible.
Sure, and what is your point?
My current best guess is that WM quite reasonably understood FTX to be a crypto exchange with a legitimate business model earning money from fees - just like the rest of the world also thought. The fact that FTX was making trades with depositor funds was very likely a closely kept secret that no one at FTX would disclose to an outsider. Why the hell would they - it's pretty shady business!
Are you saying WM should have demanded to see proof that FTX's money was being earned legitimately, even if he didn't have any reason to believe it might not be? This seems to me like hindsight bias. To give an analogy - have you ever asked an employer of yours for proof that their activities aren't fraudulent?
Not disagreeing with your overall point, but if my non-EA aligned, low-level crypto trader friend is any indication, then there certainly was reason to believe that SBF was at the very least doing some shady things. In August, I asked this friend for his thoughts on SBF, and this is what he replied:
“He’s obviously super smart but comes across a bit evil while trying to portray the good guy front. His exchange is notorious for liquidating user positions, listing shit coins thats prices trend to zero. He also founded Alameda research (trading / market maker firm) alongside FTX (the exchange). Alameda are one of the biggest crypto trading firms with predatory reputation. There’s also the issue of barely any divide between the exchange and the trading firm so alameda likely sees a lot of exchange data that gives them an edge trading on FTX vs other users.”
The irony is that this friend lost most of his savings because he was an FTX user.
Also from the Sequoia profile: "After SBF quit Jane Street, he moved back home to the Bay Area, where Will MacAskill had offered him a job as director of business development at the Centre for Effective Altruism." It was precisely at this time that SBF launched Alameda Research, with Tara Mac Aulay (then the president of CEA) as a co-founder ( https://www.bloomberg.com/news/articles/2022-07-14/celsius-bankruptcy-filing-shows-long-reach-of-sam-bankman-fried).
To what extent was Will or any other CEA figure involved with launching Alameda and/or advising it?
Tara left CEA to co-found Alameda with Sam. As is discussed elsewhere, she and many others split ways with Sam in early 2018. I'll leave it to them to share more if/when they want to, but I think it's fair to say they left at least in part due to concerns about Sam's business ethics. She's had nothing to do with Sam since early 2018. It would be deeply ironic if, given what actually happened, Sam's actions are used to tarnish Tara.
[Disclosure: Tara is my wife]
The returns shown in the document are not indicative of fraud -- those sorts of returns are very possible when skilled traders deploy short-term trading strategies in inefficient markets, which crypto markets surely were at the time. The default risk when borrowing at 15% might have been very low, but not zero as they suggested. The "no downside" characterization should have been caught by a lawyer, and was misleading.
Nobody with an understanding of trading would have [EDIT: I would not have] concluded they were engaged in Ponzi schemes or were misrepresenting their returns based on the document. There are plenty of sloppy, overoptimistic startup pitch decks out there, but most of the authors of those decks are not future Theranoses.

Good points, Brian . . . I'm sure there are lots of overoptimistic pitch decks, and that a 15% return might be feasible, and maybe I'm just looking at this with the benefit of hindsight.
Even so, an investment firm normally doesn't do anything like this, right? I mean, I assume that even Renaissance Technologies wouldn't want to offer one single investment opportunity packaged as a loan with a legally guaranteed 15% rate of return with "no downside." https://www.bloomberg.com/news/articles/2021-02-10/simons-makes-billions-while-renaissance-investors-fume-at-losses#xj4y7vzkg They might brag about their past returns, but would include lots of verbiage about the risks, and about how past performance is no guarantee of future returns, etc.
One specific question I would want to raise is whether EA leaders involved with FTX were aware of or raised concerns about non-disclosed conflicts of interest between Alameda Research and FTX.
For example, I strongly suspect that EAs tied to FTX knew that SBF and Caroline (CEO of Alameda Research) were romantically involved (I strongly suspect this because I have personally heard Caroline talk about her romantic involvement with SBF in private conversations with several FTX fellows). Given the pre-existing concerns about the conflicts of interest between Alameda Research and FTX (see examples such as these), if this relationship were known to be hidden from investors and other stakeholders, should this not have raised red flags?
I believe that, even in the face of this particular disaster, who EAs are fucking is none of EA's business. There are very limited exceptions to this rule like "maybe don't fuck your direct report" or "if you're recommending somebody for a grant, whom you have fucked, you ought to disclose this fact to the grantor" or "Notice when somebody in a position of power seems to be leaving behind a long trail of unhappy people they've fucked", plus of course everything that shades over into harassment, assault, and exploitation - none of which are being suggested here.
Outside of that, there's a heck of a lot of people in this world fucking a heck of a lot of other people; most people who are fucking don't blow up depository institutions; and controls and diligence on depository institutions should achieve reliability by some avenue other than checking which people are fucking. And I consider it generally harmful for a community to think that it has a right to pass judgment on fucking that is not like really clearly violating deontology. That's not something that community members owe to a community.
As a Bloomberg article put it in September: https://www.pymnts.com/cryptocurrency/2022/bankman-frieds-stake-in-quant-trading-firm-raises-conflict-questions/
Are you trying to suggest that when two firms need to be at arms-length because of the potential for an enormous conflict of interest, it wouldn't matter if the two firms' chief executives were dating each other?
I'm saying that if your clearance process is unable to tell whether or not two firms are arms-length, when they have a great deal to potentially gain from illegally interoperating, without the further piece of info about whether the CEOs are dating, you're screwed. This is like trying to fix the liar loan problem during the mortgage meltdown by asking whether the loan issuer is dating the loan recipient. The problem is not that, besides the profit motive, two people might also be fond of each other and that's terrible; the problem is if your screening process isn't enough to counterbalance the profit motive. A screening process that can make sure two firms aren't colluding to illegally profit should not then break down if the CEOs go on a date.
Or to put it more compactly and specifically: Given the potential energy between Alameda and FTX as firms, not to mention their other visible degrees of prior entanglement, you'd have to be nuts to rely on an assurance process that made a big deal about whether or not the CEOs were dating.
Maybe even more compactly: Any time two firms could gain a lot of financial free energy by colluding, just pretend you've been told their CEOs are dating, okay, and now ask what assurances or tests you want to run past that point.
...I think there must be some basic element of my security mindset that isn't being shared with voters here (if they're not just a voting ring, a possibility that somebody else raised in comments), and I'm at somewhat of a loss for what it could be exactly. We're definitely not operating in the same frame here; the things I'm saying here sure feel like obvious good practices from inside my frame.
Taking prurient interest in other people's sex lives, trying to regulate them as you deem moral, is a classic easy-mode-to-fall-into of pontificating within your tribe, but it seems like an absurd pillar on which to rest the verification that two finance companies are not intermingling their interests. Being like "Oh gosh SBF and Caroline were dating, how improper" seems like finding this one distracting thing to jump on... which would super not be a key element of any correctly designed corporate assurance process about anything? You'd go a...
I work (indirectly) in financial risk management. Paying special attention to particular categories of risk - like romantic relationships - is very fundamental to risk management. It is not that institutions are faced with a binary choice of 'manage risk' or 'don't manage risk', where people in romantic relationships are 'managed' and everyone else is 'not'. Risk management is a spectrum, and there are good reasons to think that people with both romantic and financial entanglements are higher risk than those with financial entanglements only. For example:
Romantic relationships inspire particularly strong feelings, which do not usually characterise financial relationships. People in romantic relationships will take risks on each other's behalf that people in financial relationships will not. We should be equally worried about familial relationships, which also inspire very strong feelings.
Romantic relationships inspire different feelings from financial relationships. Whereas with a business partner you might be tempted to act badly to make money, with a romantic partner you might be tempted to act badly for many other reasons. For example, to make your partner feel good, or to spare your
So if I were writing these rules, I might very well rephrase it as "do you have a very strong friendship with this other person" and "do you occasionally spend time at each other's houses" to avoid both allonormativity and the temptation to prurient sniffing; and I'd work hard to keep any disclosed information of that form private, like "don't store in Internet-connected devices or preferably on computers at all" private, to minimize incentives against honest disclosure. And even then, I might expect that among the consequences of the regulation, would be that CEOs in relationships would occasionally just lie to me about it, now that such incentives had been established against disclosure.
When you optimize against visible correlates of possible malfeasance, you optimize first and above all against visibility; and maybe secondarily against possible malfeasance if the visibility is very reliable and the correlations are strong enough to take causal leaning on them.
But, sure, if you know all that and you understand the consequences, then Sequoia could've asked if SBF and Caroline were in a relationship, understanding that a No answer might be a lie given the incentives they'd established, and that a Yes answer indicated unusual honesty.
I don't really understand why you are describing this as a hypothetical ("If I were writing these rules..."). You are the founder and head of a highly visible EA organisation receiving charitable money from donors, and presumably have some set of policies in place to prevent staff at that organisation from systematically defrauding those donors behind your back. You have written those policies (or delegated someone else to write them for you). You are sufficiently visible in the EA space that your views on financial probity materially affect the state of EA discourse. What you are telling me is that the policies which you wrote don't include a 'no undeclared sexual relationships with people who are supposed to act as a check on you defrauding MIRI' rule, based on your view that it is excessively paternalistic to inquire about people's sex life when assessing risk, and that your view is that this is the position that should be adopted in EA spaces generally.
This is - to put it mildly - not the view of the vast majority of organisations which handle money at any significant scale. No sane risk management approach would equate a romantic relationship with 'a very strong friendship'. R...
Somebody else in that thread was preemptively yelling "vote manipulation!" and "voting ring!", and as much as it sounds recursively strange, this plus some voting patterns (early upvotes, then suddenly huge amounts of sudden downvoting) did lead me to suspect that the poster in question was running a bunch of fake accounts and voting with them.
We would in fact be concerned if it turned out that two people who were supposed to have independent eyes on the books were in a relationship and didn't tell us! And we'd try to predictably conduct ourselves in such a mature, adult, understanding, and non-pearl-clutching fashion that it would be completely safe for those two people to tell the MIRI Board, "Hey, we've fallen in love, you need to take auditing responsibility off one of us and move it to somebody else" and have us respond to that in a completely routine, nonthreatening, and unexcited way that created no financial or reputational penalties for us being told about it.
That's what I think is the healthy, beneficial, and actually useful for minimizing actual fraud in real life culture, of which I do think present EA has some, and which I think is being threatened by performative indignation.
I'm struggling to follow your argument here. What you describe as the situation at MIRI is basically standard risk management approach - if two people create a risk to MIRI's financial security processes by falling in love, you make sure that neither signs off on risk taken by the other.
But in this thread you are responding with strong disagreement to a comment which says "if this relationship [between SBF and Caroline] were known to be hidden from investors and other stakeholders, should this not have raised red flags?". You said "who EAs are fucking is none of EA's business", amongst other comments of a similar tone.
I don't understand what exactly you disagree with if you agree SBF and Caroline should have disclosed their relationship so that proper steps could be taken to de-risk their interactions (as would happen at MIRI). It seems that you do agree it matters who EAs are fucking in contexts like this? And therefore that it is relevant to know whether Will MacAskill knew about the undisclosed relationship?
You could plausibly claim it gets disclosed to Sequoia Capital, if SC has shown themselves worthy of being trusted with information like that and responding to it in a sensible fashion eg with more thorough audits. Disclosing to FTX Future Fund seems like a much weirder case, unless FTX Future Fund is auditing FTX's books well enough that they'd have any hope of detecting fraud - otherwise, what is FTXFF supposed to do with that information?
EA generally thinking that it has a right to know who its celebrity donors are fucking strikes me as incredibly unhealthy.
I think we might be straying from the main point a bit; nobody is proposing a general right to peer into EA sex lives, and I agree that would be unhealthy.
There are some relatively straightforward financial risk management principles which mainstream orgs have been successfully using for decades. You seem to believe one of the pillars of these principles - surfacing risk due to romantic entanglements between parties - shouldn't apply to EA, and that instead some sort of 'commonsense' approach should prevail (inverted commas because I think the standard way is basically common sense too).
But I don't understand where your confidence that you're right here is coming from - EA leadership has just materially failed to protect EA membership from bad actor risk stemming at least in part from a hidden conflict of interest due to a romantic entanglement. EA leadership has been given an opportunity to run risk management their way, and the result is that EA is now associated with the biggest crypto fraud in history. Surely the Bayesian update here is that there are strong reasons to believe mainstream finance had it approximately right?
Rereading the above, I think I might just be unproductively repeating myself at this point, so I'll duck out of the discussion. I appreciated the respectful back-and-forth, especially considering parts of what I was saying were (unavoidably) pretty close to personal attacks on you and the EA leadership more broadly. Hope you had a pleasant evening too!
This statement is incredibly out of touch Eliezer. If CEO #1 and CEO #2 are in a romantic relationship, there is a clear conflict of interest here, especially when not disclosed to the public. In agreement with Anonymous, I also strongly oppose the language you're using. I also agree with their comments regarding romantic relationships in the workplace. My general stance is 0 tolerance for workplace romance because it's messy and there are far too many power dynamics at play.
Conflict of interest is the issue my friend. Unbiased decisions cannot happen when one has an other-than-work-relationship with the persons they are dealing with.
...You think it's important to disclose this conflict of interest when you recommend a grant to someone, but not important when you as a CEO decide on a multi-billion dollar loan to the company where the other person is the CEO?
Because as somebody who could potentially be mistaken for a Leader I want to be pretty derned careful about laying down laws to regulate other people's sexuality; and while something like that would definitely be a red flag at, like, idk, CEA or MIRI, maybe it's different if we're talking about a 3-person startup. Maybe you'll say it's still ill-advised, but I don't know their circumstances and there's also a difference between ill-advised and Forbidden. I feel a lot more comfortable leaving out the 'maybe' when I pontificate my legislation about informing a donor that your recommended grantee is one with whom you've had a relationship - though even there, come to think, I'm relying on all the donors I ever talk to being sensible people who aren't going to go "OH NO, PREMARITAL SEX" about it.
It's not about the sex in and of itself, it's about the conflict of interest and favouritism. Romantic love interest is enough for that too. EA could probably learn a lot from how mainstream orgs deal with this.
Yes - I almost can't believe I am reading a senior EA figure suggesting that every major financial institution has an unreasonably prurient interest in the sex lives of their risk-holding employees. EA has just taken a bath because it was worse at financial risk assessment than it thought it was. The response here seems to be to double-down on the view that a sufficiently intelligent rationalist can derive - from first principles - better risk management than the lessons embedded in professional organisations. We have ample evidence that this approach did not work in the case of FTX funding, and that real people are really suffering because EA leaders made the wrong call here.
Now is the time to eat a big plate of epistemically humble crow, and accept that this approach failed horribly. Conspiracy theorising about 'voting rings' is a pretty terrible look.
I feel like people are mischaracterizing what Eliezer is saying. It sounds to me like he's saying the following:
"Sure, the fact that the two were dating or having sex makes it even more obvious that something was amiss, but the real problem was obviously that Alameda and FTX were entangled from the very start with Sam having had total control of Alameda before he started FTX, and there were no checks and balances and so on, so why are you weirdos focusing on the sex part so much and ignore all the other blatant issues?!"
That seems like a very charitable reading of the comment
"who EAs are fucking is none of EA's business. There are very limited exceptions to this rule like ... none of which are being suggested here."
I'd suggest that given the high stakes of the situation at the moment it is especially important not to inadvertently give the impression that EA leadership think they have privileged insight into financial risk management that they actually don't. If EY has merely mangled his argument (as you suggest) it would be very sensible for him to edit his comment to reflect that, and apologise for implying that vote rigging was the only reason he could have been down voted.
One thing that people in mainstream orgs do, if they want to act with integrity, is resign from roles/go work somewhere else when they want to start a relationship that would create a conflict of interest whilst both are in their current positions (or if they value their job(s) more, give up on the idea of the relationship).
...are you suggesting that nobody ought to dare to defend aspects of our current culture once somebody has expressed concerns about them?
Is the romantic relationship that big a deal? They were known to be friends and colleagues + both were known to be involved with EA and FTX future fund, and I thought it was basically common knowledge that Alameda was deeply connected with FTX as you show with those links - it just seems kind of obvious with FTX being composed of former Alameda employees and them sharing an office space or something like that.
Romantic love is a lot more intense than mere friendship! Makes conflicts of interest way more likely.
My $0.02 - (almost) the entire financial and crypto world, including many prominent VC funds that invested in FTX directly, seem to have been blindsided by the FTX blowup. So I'm less concerned about the ability to foresee that. However, the 2018 dispute at Alameda seems like good reason to be skeptical, and I'm very curious what was known by prominent EA figures, what steps they took to make it right, and whether SBF was confronted about it before prominent people joined the Future Fund, etc.
+1, I think people are applying too much hindsight here.* The main counter consideration: To the degree that EAs had info that VCs didn't have, it should've made us do better.
*It's still important to analyze what went wrong and learn from it.
Hi Milan,
I come from the traditional accounting/internal audit world, where governance teams and internal controls are in place at the very least, and due diligence is best practice, especially when large sums of money are being distributed. I am new here to the EA community and had expected similar protocols to be in place, as large-scale fraud is not some new thing - it brought down the accounting profession in 2001 (Enron) and contributed to the mortgage crisis in 2008 (Lehman).
I guess what is clear to me is that EA lacks expertise in fraud/error detection, and moreover has to make some improvements in the near future.
All the best,
Miguel
My naive moral psychology guess—which may very well be falsified by subsequent revelations, as many of my views have this week—is that we probably won’t ever find an “ends justify the means” smoking gun (e.g., an internal memo from SBF saying that we need to fraudulently move funds from account A to B so we can give more to EA). More likely, systemic weaknesses in FTX’s compliance and risk management practices failed to prevent aggressive risk-taking and unethical profit-seeking and self-preserving business decisions that were motivated by some complicated but unstated mix of misguided pseudo-altruism, self-preservation instincts, hubris, and perceived business/shareholder demands.
I say this because we can and should be denouncing ends-justify-the-means reasoning of this type, but I suspect very rarely in the heat of a perceived crisis will many people actually invoke it. I think we will prevent more catastrophes of this nature in the future by focusing more on integrity as a personal virtue and the need for systemic compliance and risk-management tools within EA broadly and highly impactful/prominent EA orgs, especially those whose altruistic motives will be systematically in tension with perceived business demands.
Relatedly, I think a focus on ends-justify-the-means reasoning is potentially misguided because it seems super clear in this case that, even if we put zero intrinsic value on integrity, honesty, not doing fraud, etc., some of the decisions made here were pretty clearly very negative expected-value. We should expect the upsides from acquiring resources by fraud (again, if that is what happened) to be systematically worth much less than reputational and trustworthiness damage our community will receive by virtue of motivating, endorsing, or benefitting from that behavior.
I thank you for apologizing publicly and loudly. I imagine that you must be in a really tough spot right now.
I think I feel a bit conflicted on the way you presented this.
I treat our trust in FTX and dealings with SBF as bureaucratic failures. Whatever measures we had in place to deal with risks like this weren't enough.
This specific post reads a bit to me like it's saying, "We have some blog posts showing that we said these behaviors are bad, and therefore you could trust both that we follow these things and that we encourage others to, even privately." I'd personally prefer it, in the future, if you wouldn't focus on the blog posts and quotes. I think they just act as very weak evidence, and your use makes it feel a bit like otherwise.
Almost every company has lots of public documents outlining their commitments to moral virtues.
I feel pretty confident that you were ignorant of the fraud. I would like there to be more clarity of what sorts of concrete measures were in place to prevent situations like this, and what measures might change in the future to help make sure this doesn't happen again.
There might also be many other concrete things that could be done...
EA posts are very unlike company virtue statements. They include philosophical arguments (at least some of the screenshots and linked posts do). I agree that there's more that can (and maybe should) be said, but I think linking to extensive discussion of naive/act utilitarianism vs. global consequentialism, ethical injunctions, etc., is a great way to show that EAs have seriously engaged with these topics and come down pretty decisively on one side of it.
[Edit: I have a reply to this in the comments]
I think it's nice, but I also think we should be raising the bar of the evidence we need to trust people.
SBF and the inner FTX crew seemed very EA. SBF had a utilitarian blog[1] that I thought was pretty good (for the time, it was ~2014).
He repeatedly spoke about how important it was for crypto exchanges to not do illegal activity. He even actively worked to regulate the industry.
I'd bet that SBF spent a lot more effort speaking and advocating about the importance of trustworthiness in crypto than perhaps any of us on the importance of trust and regularly-good moral principles.
Sam literally argued for trust and accountability to congress.
From what I understand, he was the poster boy for what trustworthy crypto looks like.
We at very least could really use measures that would have caught a SBF-lite.
> EA posts are very unlike company virtue statements.
Sure, but SBF definitely got through. I'm sure any of his co-conspirators also would have. EA-adjacent people can clearly fool EAs using these sorts of methods.
(I considered raising this issue more in the first post, but am happy to add it now that there's push-back....
I believe this is SBF's blog: http://measuringshadowsblog.blogspot.com/
I think coming back to this, my point isn't straightforwardly fair. My post above uses a lot of evidence in a way that makes it seem like the point is very obvious.
I think that bars like "does the person have public writing showing they deeply understand EA principles" are generally pretty decent and often have worked decently well.
The case with SBF does seem extremely unusual to me. Protecting against it isn't just some "obvious set of regular measures". It might take a fair deal of thought and effort.
I think that we should be thinking about how to put in that thought and effort. I think we should be working to find ways of verification that would have at least caught some lite-SBF.
So, the example of SBF seemed too good to not share, but it is extreme, so can't be taken too much as a typical example to be worried about.
I still think that we should set the bar higher than a few blog posts for situations like this though, and assume that Will would agree. (He meant this much more as a quick public statement, and not real evidence of innocence to EAs, I assume)
I agree with Lukas, though I also suspect this was mostly a failure of bureaucracy / competence / a few individuals' moral character, rather than a failure that has much connection to EA ideology. I expect we'll have more clarity on that later, as facts come to light.
What do you mean? This community is largely composed of people who do really weird things in their day job based on abstract arguments.
(Edit: I think now that I misread this a bit: I think this post is really meant as a hastily-written update, not a well-reasoned apology. I would appreciate just a head's up that a longer doc is coming.)
I think one issue I have is that this post seems to be doing a few things at once, and it's not very clear to me.
1. Publicly apologize for what happened
2. Re-affirm to those viewing this that doing things like fraud are not publicly endorsed by EA leadership
3. Outline Will's involvement in the situation and describe who was responsible for it
4. Make it clear that Will wasn't involved in the bad parts of the scandal, and can be trusted in the future
This post came from a Twitter thread. It seems like it was hastily written.
I don't think this post does a great job at all points. It seems likely to me that it wasn't meant to.
If this post is meant to be the best public statement of all 4 things, I would really like that to be made clear.
I want to say that I have tremendous respect for you, I love your writing and your interviews, and I believe that your intentions are pure.
How concerned were you about crypto generally being unethical, even without knowledge of the possibly illegal, possibly fraudulent behaviour? Encouraging people to invest in "mathematically complex garbage" seemed very unethical, due to the harm to the investor and the economy as a whole.
SBF seemed like a generally dishonest person. He ran ads saying, "don't be like Larry". But in this FT interview, he didn't seem to have a lot of faith that he was helping his customers.
"Does he worry about the clients who lose life-changing sums through speculation, some trading risky derivatives products that are banned in several countries? The subject makes Bankman-Fried visibly uncomfortable. Throughout the meal, he has shifted in his seat, but now he has his crossed arms and legs all crammed into a yogic pose."
It is now clear that he is dishonest. Given he said on Twitter that FTX US was safe when it wasn't (please correct me if I'm wrong here).
I think that even SBF thinks/thought crypto is garbage, yet he spent billions bailing out a scam industry, poss...
Thanks for posting this, Dean. Just commenting because it aligns really well with everything I am feeling too.
Will:
One item that should be a part of your reflections in the days and months to come is whether you are fit to be the public face of the effective altruism movement, given your knowledge of Sam's unethical behavior in the past, ties to him going back to 2013, and your history of vouching for Ben Delo, another disgraced crypto billionaire.
The EA community has many excellent people - including many highly capable women - who are uninvolved in this scandal and could step up to serve in this capacity.
I am glad you felt okay to post this - being able to criticise leadership and think critically about the actions of the people we look up to is extremely important.
I personally would give Will the benefit of the doubt of his involvement in/knowledge about the specific details of the FTX scandal, but as you pointed out the fact remains that he and SBF were friends going back nearly a decade.
I also have questions about Will MacAskill's ties with Elon Musk, his introduction of SBF to Elon Musk, his willingness to help SBF put up to 5 billion dollars towards the acquisition of Twitter alongside Musk, and the lack of engagement with the EA community about these actions. We talk a lot about being effective with our dollars and there are so many debates around how to spend even small amounts of money (e.g. at EA events or on small EA projects), but it appears that helping SBF put up to 5 billion towards Twitter to buy in with a billionaire who recently advocated voting for the Republican party in the midterms didn't require that same level of discussion/evaluation/scrutiny. (I understand that it wasn't Will's money and possibly SBF couldn't have been talked into putting it towards ot...
Of course. This reads as almost bizarre: it would be a baby-eater-type conspiracy theory to think that Will (or anyone else in EA leadership) knew about this. That's just not how things work in the world. The vast majority of people at Alameda/FTX didn't know (inner circle may have been as small as four). I mean, maybe there's a tiny chance that Sam phoned up someone a week ago and wanted a billion in secret, but you can see how ridiculous that sounds. I mean picture the conversation: "Hey [EA leader], turns out I fucked up. Can you wire me a billion? You're down with me secretly trading with customer funds, right?"
In any case, I think that isn't Kerry's point. The point isn't "Did Will know about concrete fraud," but rather, "Was there reason to think fraud is unusually likely? And if so, did people take the types of precautions that you'd want to take if you thought you were dealing with someone capable of doing shady things?"
For comparison, think of Zuckerberg. If the Social Network movie is roughly accurate, then Zuckerberg acted in utterly de...
I don't think leadership needed to know how the sausage was made to be culpable to some degree. Many people are claiming that they warned leadership that SBF was not doing things above board, and if true, that has serious implications, even if they didn't know exactly what SBF was up to.
Note: I am not claiming that anyone, specific or otherwise, knew anything.
Thanks for your response. On reflection, I don't think I said what I was trying to say very well in the paragraph you quoted, and I agree with what you've said.
My intent was not to suggest that Will or other FTX future fund advisors were directly involved (or that it's reasonable to think so), but rather that there may have been things the advisors chose to ignore, such as Kerry's mention of Sam's unethical behaviour in the past. Thus, we might think that either Sam was incredibly charismatic and good at hiding things, or we might think there actually were some warning signs and those involved with him showed poor judgement of his character (or maybe some mix of both).
Is Ben Delo a "disgraced crypto billionaire"? From Jess Riedel's description, it wasn't obvious to me whether the thing BitMEX got fined for was something seriously evil, versus something closer to "overlooked a garden-variety regulation and had to go pay a fine, as large businesses often do".
(Conflict-of-interest notice: I work for MIRI, which received money from Open Phil in 2020 that came in part from Ben Delo.)
I'd prefer that the discussion focus on more concrete, less PR-ish things than questions of who is "fit to be the public face of the effective altruism movement". The latter feels close to saying that Will isn't allowed to voice his personal opinions in public if EAs think he fucked up.
I'd like to see EA do less PR-based distancing itself from people who have good ideas, and also less signal-boosting of people for reasons orthogonal to their idea quality (and ability to crisply, accurately, and honestly express those ideas). Think less "activist movement", more "field of academic discourse".
I’d be interested to know why you thought it relevant to mention “women” specifically?
It seems like there's this expectation of public figures to always have acted in a way that is correct given the information we have now - basically hindsight bias again.
One of the many wildly charismatic people you hang out with later turns out to be No Good? Well of course you shouldn't have associated with them. One of the many rumours you might have heard turns out to be true and a big deal? Of course you should have acted on it.
I don't think this is very fair or useful. I guess we might worry that the rest of the world will think like that but I don't see why we should.
I, in general, share your sentiments, but I wanted to pick up on one thing (which I also said on Twitter originally).
While it might sound good to say people should be honest, have integrity, and reject 'ends justify the means' reasoning, I don't see how you can expect people to do all three simultaneously: many people - including many EAs and almost certainly you, given your views on moral uncertainty - do accept that the ends sometimes justify the means. Hence, to go around saying "the ends don't justify the means" when you think that, sometimes - perhaps often - they do, smacks of dishonesty and a lack of integrity. So, I hope you do write something further to your statements above.
It seems like the better response is to accept that, in theory, the ends can sometimes justify the means - it would be right to harm one person to save *some* number more - but then say that, in practice, defrauding people of their money is really not a time when this is true.
I agree... I was very bothered by the categorical proscriptions against "ends justifying the means" as well as the seeming statements that some kinds of ethical epistemology are outside of the bounds of discourse. That seemed very contrary to the EA norm of open discourse on morality being essential to our project.
This has indeed always been the case, but I'm glad it is so explicitly pointed out now. The overgeneralization from “FTX/SBF did unethical stuff” to “EA people think the end always justifies the means” is very easy to make for people who are less familiar with EA - or perhaps even SBF fell for this kind of reasoning, though his motivations are speculation for now.
It would probably be for the better to make the faulty nature of “ends justify the means” reasoning (or the distinction between naive and prudent utilitarianism) a core EA cultural norm that people can't miss.
Please, someone explain to me how the information publicly available years ago did not clearly indicate this was a fraud risk. Before starting FTX, SBF engaged in the Kimchi Premium, an arbitrage on South Korean and perhaps Japanese exchanges. South Korean authorities may have a word to say about that, considering that such arbitrage was as illegal then as it is now, and SBF used an EA cutout to carry it out.
How did the goodwill MacAskill says existed with SBF originate, and what breaking point and limits were defined? There is much more explaining to do here than pointing at passages of your recent book, I am afraid.
Thank you for writing this.
What do you think you/we could have done better? What should you/we have done differently? My sense is that you were one of the key people whose job it was to ensure what Sam was doing was above board (I could be mistaken here). If that's the case, I'm surprised by the lack of discussion about what should have happened differently.
Likewise, if there were no safeguards here, that seems startlingly naïve on our part.
Will,
Like others, I appreciate your openness about your role in the FTX debacle, no matter how limited that role was, and your readiness to consider seriously how this impacts your philosophy from the ground up. I am a long-time lurker on this forum who has a lot of admiration for the commitment shown to their cause by EA advocates, but who also has a lot of disregard for the politics of EA, which I consider to be frankly naive and inadequate to the actual nature of the problems EA seeks to address.
With regard to the collapse of FTX, it seems that only a handful of people in the EA movement had a critique of wealth that would lead them to question whether the means by which Bankman-Fried acquired his wealth were entirely ethical, and (assuming that one agreed that those means were ethically questionable) whether the willingness of Bankman-Fried and his colleagues to use those means to create their wealth might lead them to make further questionable decisions in future.
I would be interested to hear from you or other EA advocates on this forum whether recent events have caused you to question not just whether Sam Bankman-Fried's behaviour in this instance was ethical, but more broadly whether the same events have caused any of you to reconsider your views about the ethical status of the super-wealthy in general, and what EA's collective view of such individuals should be. I recognise that my politics are probably not shared by most (if any!) contributors to this forum, but now might be a good time to have that conversation.
Hello Will,
I commented this on Nathan's post earlier; it is a best practice that the EA governance team may consider adopting...
All the best,
Miguel
Mentioned in a response to another post, but has MacAskill ever discussed his visits to colleges looking for recruits?
And trying to point them in certain directions?