I’m a little confused about who this is for: I think it’s for anyone who might want thoughts on orienting to the FTX situation in ways they’d most endorse later, especially if they’re in a position of leadership or have people relying on them for guidance. It might not be coherent; it’s just some thoughts, in the spirit of Scattered Takes and Unsolicited Advice.
This is written in my personal capacity, not as an employee of CEA.
Something I’m thinking about a lot right now is how rationality, values, and judgment can be hardest to use when you need them most. My vision of a community is one that makes you most likely to be your best self in those times. I think I'm seeing a lot of this already, and I hope to see even more.
So for anyone thinking about FTX things, or talking about them with others, or planning to write things in that strange dialect known as “comms”, here’s my set of things I don’t want to forget. Please feel encouraged to add your own in the comments.
Integrity
- There is no party line (I say by fiat). I want EA to be ok after this, and it’s certainly true that there are things people could say that would make that less likely, but I just really, really don’t want EA to be a place where people can’t think and say things.
- I want to give explicit okness to questions and wondering about the judgment, decision quality, or integrity of EA, EAs, and EA leaders, and I don’t want to have a missing mood about people’s understandable curiosity and concern.
- That said, obviously people in sensitive situations may not answer all questions you’re curious about, and in fact many of them have legal reasons not to.
- It is not your job, by dint of being an EA, to protect “EA the brand”.[1] You may decide that the brand is valuable in service of its goals; you may also have your own opinions about what version of the brand is worth protecting.
- Sometimes I have opinions about what makes sense to share based on confidentiality or other considerations, but in broad strokes, I tend to be into people saying the truth out loud (or if my system 1 says otherwise, I want to be into it).
- Soldier-iness (here meaning the feeling of wanting to defend “your tribe”) is normal and some of it is tracking real and important things about the value of what we’ve built here. Integrity doesn’t mean highlighting every bad faith criticism. (But also don’t let the desire to protect what is valuable warp your own beliefs about the world)
- There are going to be a lot of incentives to pile on, especially if any particular narrative starts emerging, and I also want EA to be a place where you can say “this thing that looks bad doesn’t actually seem bad at the object level to me, for these reasons”, or “Utilitarianism is good, actually”,[2] or “EA is/isn’t worse than the reference class on this”, or “I think the ways in which EA is different from other movements were a worthwhile bet, even if they added risk”, or “I don’t think I know enough about this to have a take.”
- Updating your views makes sense, but for the moment you probably have most of the same views you had two weeks ago, and overupdating also lands you in the wrong place.
- I would be sad if people jumped too quickly to repudiate their system of ethics, or all the unusual features of it that have let us aim at doing an unusual amount of good. I would also be sad if the vibe of our response felt disingenuous - aiming to appear less consequentialist than is the case (whatever that true case is), less willing to think about tradeoffs, etc.
- You don’t even need to have one take - you can just say a lot of things that seem true to you
- I want to say things here, on Twitter, out loud, etc., that are filtered by “is this helping me and others think better and more clearly?”. I might not always be maximizing “epistemic support”, but I certainly don’t want to be a burden to it.[3]
- I think people underestimate how valuable it can be to others to say what you think and why you think it, how you feel and whether you endorse it, and what you’re still uncertain about. I’d often recommend doing that rather than waiting to talk to people until you’re certain, especially if you’re not in a situation where legal advice is relevant.
- I worry about jumping quickly to good stories or framings
- Like neat distinctions (“Sam is the naive kind of utilitarian, not like us”) becoming a new kind of EA judo
- An adaptation of a question I heard recently: “What’s a way you can especially live out your values in the next weeks and months?”
Interlude - a note of caution
- The New York Times is quoting tweets and Bloomberg is reading the Forum; be aware that anything you say might go viral or get misinterpreted on social media or in the news, with all the attendant stresses. I’m writing this in a personal capacity, but I happen to know that if that happens and you’d like support from the community health team, you can contact the team via this form.
- Some people are in tricky legal situations which inform what they choose to say. Decide for yourself how generous to be to that possibility.
Being in a position of supporting others
7. Worry, confusion, anger, stress, sadness, and betrayal are all normal, as is “I’m confused; I’m going to not think about this for a few weeks and see what shakes out.” People’s emotions don’t need to be changed or managed, though people might need support.
8. I think honoring your own feelings - including all of the above - is good, and can be done while being clear about which feelings you endorse and which you don’t.
9. Be honest with people, including meta-honesty or meta-transparency if that makes sense (i.e., tell people when there are things you won’t be able to talk openly about).
10. A possible failure mode is wanting to throw yourself under the bus quickly - it’s good to reflect on your own thoughts and judgment, but jumping too early to certainty isn’t helpful even when you’re the one taking the hit.
If you’re trying to figure out what’s true
11. I think “What do you think you know, and why do you think you know it?” remains a crucial question, especially since more information is likely to come out in the future and much of the current information can’t yet be shared. It’s hard for me to think there is a person reading this who should reasonably believe they have the full story. Relatedly, making your epistemic status and reasoning transparent helps collaborative sensemaking happen.
12. Split and commit - when you’re confused about what’s true, figure out in advance what your views will be if different things turn out to be true or you see different evidence.
- E.g. how will you update if enough money is raised by FTX.com to make the FTX.com customers whole? Or if it turns out there’s even less money than is currently thought?
- E.g. how will you update if what FTX did was unethical but not illegal? Normal in crypto world but not in normal business world? Or the biggest financial scandal since 2008?
- E.g. how will you update if EAs are more furious about this than you expect? Less?
13. People are going to disagree about what this situation means, how to react, how people should reflect, etc. and you get to have your own thoughts in contention with them, or rejigger your whole thinking if you get a big update. You might want to think about your models of the world, what they would have expected, what evidence is different if your model is false and what evidence is equally likely regardless.
14. It’s normal for opinions to jostle back and forth a lot - notice when you’re not in reflective equilibrium and be honest with yourself and others about it. I, for instance, find my orientations and feelings toward this jumping around a lot, so the tone of my takes is likely to differ from day to day. I want to be really open about that and say that I, like so many others, am muddling through it, and trying to only make decisions I have agreed with for a continuous 48 hours.
15. I worry about hindsight bias being very prevalent, especially in judgments of others
I also think it is very sensible to think ahead to future difficult circumstances and try to extract a lot of learning from this:
16. How do I think the world (of communities, of crypto, of finance, of scandals, in general) works? What am I surprised by?
17. What predictions can I make now to see in days, weeks, months or years whether my model of the world is correct?
Fin
If you’re feeling pressured to take some party line, the community health team would love to hear from you, to support you, to get feedback and to know what’s happening. CEA is giving out media guidance and plans to give out more, but in the end you will decide how you want to act and what seems right to you.
I’m really motivated by a vision of supporting other people to act as they endorse in the next days and weeks and get that support myself, to get and give whatever advice and encouragement makes that possible even while it’s hard. I also expect to fuck up, and for other people to fuck up, and to keep trying.
[1] I like this comment about PR and honesty.

[2] I like Tyler Cowen’s line: “I do anticipate a boring short-run trend, where most of the EA people scurry to signal their personal association with virtue ethics. Fine, I understand the reasons for doing that, but at the same time grandma, in her attachment to common sense morality, is not telling you to fly to Africa to save the starving children (though you should finish everything on your plate). Nor would she sign off on Singer (1972). While I disagree with the sharper forms of EA, I also find them more useful and interesting than the namby-pamby versions.”

[3] Example 1: I’m pretty happy that the post “We must be very clear: fraud in the service of effective altruism is unacceptable” exists, and also that its comment section contains arguments about whether the post itself is too soldier-y, whether that’s an appropriate type of thinking for people trying to understand human rationality and solve important problems, whether sacred values that you’re not allowed to trade against are being pulled in and how bad that is, and what utilitarianism really means and says about the kind of behavior FTX might have engaged in. All of it - and especially the comments section - makes me feel safer to think for myself. Example 2: Ronny Fernandez’s policy for tweeting. I think in times like these we need good epistemics more than ever.
Thanks. There is a lot of good advice in this post, and I appreciate it.
One thing I have tried to keep in mind is that EA is a bunch of different things. It refers to abstract principles, to concrete ideas, and to specific actions; to a broad community, to particular institutions, and to individual people.
What the "FTX situation" means for EA varies across that landscape. I don't think any differently about Famine, Affluence, and Morality or the importance of funding projects focused on the future. I do think differently about the FTX Future Fund and how affluent donors affect moral & epistemic clarity.
That framing has helped me to make sense, at least in part, of many of the contradictory emotions and reactions of the past ~10 days.
Thank you for sharing your thoughts. This whole post is dense with super sensible, helpful, and generally applicable advice. I really enjoyed reading this.
I didn't mention this at the time, but I was grateful you wrote this post!
I also think about the onion test all the time, and generally admire you for modeling high integrity :)
Chana -- thanks for your wisdom and insights in this post.
To expand upon this issue of EA wanting to 'protect EA the brand', and feelings of soldier-iness and EA tribalism:
It's worth remembering that tribalism evolved for good game-theoretic reasons, in the context of group-vs-group competition.
As Darwin put it in The Descent of Man (1871): “A tribe including many members who, from possessing in a high degree the spirit of patriotism, fidelity, obedience, courage, and sympathy, were always ready to aid one another, and to sacrifice themselves for the common good, would be victorious over most other tribes; and this would be natural selection.”
Or, as Jonathan Haidt put it in this passage from The Righteous Mind, humans are sort of 90% chimpanzee and 10% bee (in the sense of having the capacity to act like eusocial insects, for the good of the group, under some conditions). (NB: the concept of 'group selection' was often rejected by evolutionary biologists from c. 1966 through about the mid-90s, but has been revived in the form of 'multi-level selection', and the evolutionary game theory of 'group selection' is now recognized as functionally interchangeable with 'selfish gene' thinking when genes can form individual-level and group-level aggregates.)
So, humans evolved tribalism and soldier-iness, including tribal emotions, motivations, cognitions, and reactions, over millions of years of intensive group-vs-group competition.
Is this a defense of tribalism in modern EA, in response to crises and criticism?
No.
The dynamics of prehistoric group-vs-group competition, warfare, territory disputes, and resource competition don't map perfectly onto the dynamics of 21st century moral/social movements. There are some deep similarities that should not be discounted, but there are also important differences -- especially given the way that social media shapes PR narratives, and the fact that we're not engaged in physical, life-or-death warfare over land or resources, but in psychological wars of influence over beliefs and values.
I'm just trying to remind people not to feel too guilty or self-critical if we feel these 'protect our EA tribe at all costs!' kind of emotions bubbling up. Of course they will bubble up. We're hyper-social primates who evolved in clans and tribes.
Another thing to be cautious about is that, given human tribal psychology, people who are perceived as traitors or defectors to their group, especially in times of crisis, may suffer some heavy reputational costs in the future. This concern for tribal loyalty is partly something to guard against, but partly something to take pragmatically into account, when weighing whether, when, and how to 'speak up' with criticisms of EA culture and organizations.
Note: I'm being descriptive about human tribal psychology here, not prescriptive or normative about what exact lessons we should take away from all this. I am concerned that EAs who have more exposure to moral philosophy, computer science, and cognitive biases research than to evolutionary psychology (my field) might become overly hard on themselves for feeling ordinary human tribalistic feelings in times of crisis.
Thanks for writing this, I think it's a valuable post with actionable suggestions.
Emotions are naturally running very high right now, and this post is good both for reminding people that yes, it is ok to have strong emotions about this, and for affirming that these reactions are understandable and normal.