I am currently pursuing an EA-motivated personal project. Posting the details on the EA Forum seems like a great way to get feedback, accountability, and a stronger sense of community. I’d like to write out a full post at some point, but I’m starting with a shortform to ease into things and make it less scary.
My current mission is to become an excellent coach, specializing in personal growth and talent development. My previous role was in software engineering, where I earned to give and accumulated a financial runway, which I am now using to give myself ~1 year to focus on this mission and evaluate whether it would be feasible in the longer term.
The argument for coaching
The real reason I chose this mission is that it is what gives me the most joy and satisfaction. I’ve always preferred support and multiplier roles, and I find it much easier to get better at things I love doing at a gut level. That said, I also believe at an intellectual level that there is a lot of low-hanging fruit in EA talent development. This feeds my motivation, but it’s also a red flag for motivated reasoning, which is part of why I want to write out my intuitions more explicitly. For now I will just outline some key beliefs and intuitions.
Belief: There is a significant gap between many EAs’ potential for impact and the impact they will actually have by default.
This feels pretty self-explanatory to me so I won’t discuss it much, but I’d love to hear from anyone who disagrees.
Belief: There are cost-effective ways to help narrow that gap.
Intuition-pump 1: Having regular access to high-quality pair-debugging sessions feels personally valuable to me, and to many other EAs I’ve talked to, to an extent that dwarfs the costs involved. A significant proportion of the EAs I know have some desire for regular debugging, yet aren’t getting it currently. If someone became reasonably good at debugging, it feels like doing only that could be higher-impact than most of their other options, and “reasonably good” doesn’t feel like a high bar. Possible variations include getting good at teaching other EAs how to be good pair-debuggers.
Intuition-pump 2: If you got 3 highly-skilled coaches to launch an intensive program in which 12 early-career EAs spent 2 years building long-term skills and personal capital, it seems like it wouldn’t have to multiply their lifetime impact by much before it was net-positive. Naively, that costs 3×2 = 6 coach-years plus 12×2 = 24 participant-years, i.e. 30 person-years total, or 30/12 = 2.5 person-years per participant; spread over a 40-year career, it breaks even at under 7% additional expected impact. It seems like it really ought to be possible to exceed that bar, and that’s with an unusually high-cost proposal!
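To make the arithmetic explicit, here is a minimal back-of-the-envelope sketch in Python (all figures are the illustrative assumptions above, not data):

```python
# Back-of-the-envelope for intuition-pump 2. All numbers are the
# illustrative assumptions from the paragraph above, not real data.
coaches, coach_years = 3, 2          # 3 highly-skilled coaches for 2 years
participants, program_years = 12, 2  # 12 early-career EAs for 2 years
career_years = 40                    # assumed remaining career length

total_person_years = coaches * coach_years + participants * program_years  # 6 + 24 = 30
cost_per_participant = total_person_years / participants                   # 30 / 12 = 2.5

# Fraction of a career each participant must gain back for the program to break even
break_even = cost_per_participant / career_years
print(f"Break-even: {break_even:.1%} additional lifetime impact per participant")
# -> Break-even: 6.2% additional lifetime impact per participant
```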
Intuition-pump 3: I’ve been hearing for years that many EA cause areas are primarily talent-constrained. If so, increasing the level of talent among EAs would be highly impactful. Most of the discussion I’ve seen focuses on either getting highly-talented people into the EA community, or getting EAs to try more ambitious projects to see if they are secretly more talented than they think; looking at EA Infrastructure Fund grants gives a similar impression. As far as I can tell, talent development for existing EAs seems neglected, and the intuitions above point to it being tractable as well.
Counter-intuition: If there were low-hanging fruit here, wouldn’t other EA orgs have already picked it? If EA meta-organizations aren’t prioritizing this very much, doesn’t that indicate they don’t think it’s valuable? Maybe, but Inadequate Equilibria convinced me that this shouldn’t stop me from acting on my own beliefs. At the very least I can write up my thoughts on the EA Forum and see if there are strong counterarguments I’m not considering.
Key Uncertainties
Is the potential-impact gap as big as I think it is?
Is talent development as tractable as I think it is?
Which interventions are most effective?
How can I tell whether I am a good personal fit?
What are the relevant skills and how can I best pursue them?
What I’m doing now
Attempting to have coaching/debugging conversations with a variety of people, especially those outside my usual social circle. I'm trying to get a sense of what the most common bottlenecks are, how I might learn to help people get past them, and how much value I seem to be providing. (If you are interested, feel free to sign up on my Calendly!)
Reading personal-growth and coaching books. I’ve read 11 so far, and I’d like to start distilling my thoughts into book reviews.
Starting to engage more with the EA Forum, particularly commenting and posting.
Trying to learn a variety of skills quickly, with 10-20 hours of investment per skill. This is partly to get better at learning, but also to build confidence in my ability to do new things in general; a lack of that confidence has often limited me.