Here I carry out a quick review of estimates of value drift in the EA community - including several unpublished estimates and a new estimate for attendees of the 2014 CEA weekend away - and based on these, try to come to an overall view.
Overall this review of the evidence updated me towards lower drop out rates among those who are highly engaged, in part because the new samples I found suggested lower drop out rates than previous estimates.
In this post, I don’t want to address the issue of what exactly ‘value drift’ means. Instead, I will try to estimate the more neutral ‘drop out rate’, which is the rate at which people both (i) stop engaging with the effective altruism community and (ii) stop working on paths commonly considered high-impact within the community. I won’t try to assess whether people dropped out for good or bad reasons. Where possible, I look at both the rate at which people fully dropped out, and the rate at which people decreased involvement.
I will also make all the estimates conditional on ‘EA continuing in something like its current form’. If the effective altruism community falls apart, then lots of people will drop out. However, I prefer to let people make a separate estimate of EA collapse risk and then overlay that on drop out estimates.
I focus most on estimating the rate among the most highly engaged ~2,000 community members, though I will also cover some data relevant to the broader group.
What’s coming up:
- What priors should we have when estimating drop out?
- A review of six drop out rate estimates
- Summary tables of all data
- A list of other considerations to take into account
- My overall estimates
- Suggestions for further work
Priors
I think we should expect that drop out rates will be very different among different groups.
My guess is that the most significant factor is someone’s degree of social integration—i.e. I expect that people with friends or colleagues who are into EA are less likely to drop out of the community.
Relatedly, I think the degree to which someone identifies with EA will be important. For instance, someone who has been featured in the media as being into EA seems much less likely to drop out. We could think of both of these as aspects of ‘engagement’.
Some other relevant factors seem likely to be: someone’s level of dedication (i.e. how much they prioritise impact compared to other factors), the length of time they’ve been involved, and whether they support ‘weirder’ or more mainstream issues. Of course, many of these factors overlap.
Another important consideration is that I expect drop out rates will decline with time—i.e. someone who has been involved for a year is much more likely to drop out next year than someone who has been involved for ten years (even holding their level of engagement constant). This means we need to be careful not to simply extrapolate forward a short-term drop out rate.
Empirical data
1. Joey Savoie’s informal survey
This is perhaps the most well known and earliest estimate (original post), so I start with it, though I think the later estimates are more useful (since engagement is more clearly defined). The later estimates also show lower drop out rates among more engaged people.
Joey considered a group of people he knew in the community, and categorised them in terms of dedication, as either (i) ‘50% donors’, defined as someone doing something equivalent to donating 50% of their income or (ii) 10% donors.
The results over 5 years are as follows:
16 people were ~50% donors → 9/16 stayed around 50%
22 people were ~10% donors → 8/22 stayed around 10%
This is a 44% and 64% drop out rate over 5 years.
Joey was also asked if any of the ‘50% donors’ dropped to being ‘10% donors’, and answered:
I did not break down the data that way when I made it, but a quick look would suggest ~75% moved from 50% to 10% and drifting was mildly concentrated at the beginning.
This would imply that among the ‘50% donors’, only ~2 out of 16 (12.5%) fully dropped out, and 31% decreased their level of engagement.
One issue with this data is that it categorises via dedication rather than social integration. My guess is that if these people had been working at EA organisations in a key EA hub, the rate would be lower—an idea supported by the later samples.
2. CEA early employees and volunteers
In the same thread as Joey’s estimate, an anonymous poster posted their estimates for the drop out rates among CEA’s team in 2011.
Overall I don't think this is an especially useful estimate due to the small sample and the issues I note below, but I also post it here for completeness. One advantage of this sample is that the measure of engagement is fairly high, objective, and was recorded at the time.
The commenter estimated that perhaps 4 out of 17 (24%) had dropped out after 6.5 years (evaluated two years ago when the post was released).
My personal estimate as of 2020 is that 3 dropped out (18%) and another 2 decreased involvement (12%) after 9 years.
Note that this estimate is about half of Joey’s, even though the time period is almost twice as long.
I think probably one person dropped out in the last 2-3 years, which would mean the marginal rate is about 3% per year.
My impression is also that drop out is concentrated into earlier years.
The advantage of this sample is that most of these people were highly integrated. On the other hand, the sample was not further filtered by level of dedication – and ranges from people who had volunteered for under a year to people who helped launch the community in the first place.
I also have the intuition that drop out from the very early days (i.e. 2013 and earlier) will be higher, since at that point it was undecided what EA even was, and everyone was new and somewhat flung together. EA has also become more professional and respectable since then, with higher salaries. I’d expect that if you looked at the team in 2013-2015, the rate would be lower. This motivated me to find the data for the CEA weekends away, discussed below.
2a. Early employees at other organisations
We also did a private analysis of the early staff at another key EA organisation to check whether the CEA estimate was unusual, and found that the drop out rates were similar to CEA from 2011 (~14% over ~6 years). (This is actually lower, but given the tiny sample sizes I think the best we can conclude is ‘roughly similar’.)
3. CEA 2014 weekend away
CEA ran two weekend retreats with about 40-80 people in the summer of 2013 and 2014. I have an attendees list for the 2014 weekend away, which I used to make a new estimate.
This sample is useful because it covers people pursuing a wide range of career paths who also have some degree of social integration. Unlike the CEA founding team list, it’s also from 2014, at which point the movement seemed more solidified, and the sample is much larger.
One downside is that attendees spanned a spectrum of engagement, from the founders of the movement to CEA volunteers to people who had only recently heard about EA and were interested in getting more involved. We looked at the drop out rate across the whole sample and then again among a more engaged subset.
Rate across the entire sample
The list contains 69 names. Several of the team went over the names checking who is still involved. We started with our own knowledge, and then checked our own engagement data for ambiguous cases.
We counted someone as ‘still involved’ if they were doing something as engaged as fulfilling a GWWC pledge.
On this basis, we counted about 10 people we think have ‘dropped out’, which is 14% of the total in 6 years.
The number of people who decreased involvement (e.g. from planning a career in EA to only fulfilling the pledge) is somewhat higher, though I haven’t done a formal estimate of this.
More engaged subset
Eyeballing the results, drop out seems very concentrated among the people who were newer and less engaged at the time.
To correct for this, we can use some additional data from 2014. In the attendees list, people were classified as either ‘speakers’ or ‘staff’.
Of the 14 classified as staff, I don’t count any clear cases of drop out. 12 are working at EA organisations, and I think the remaining 2 would still be interested in high-impact roles in the future whether at EA organisations or otherwise.
So, we could summarise this as 0% drop out over 6 years, and 14% becoming less involved, though not clearly in a permanent way.
Of the 24 categorised as ‘speakers’ (who mostly don’t overlap with the staff), I could only count 1 case of drop out (4%).
It's also interesting to note that 20 out of the 24 (83%) are currently working in EA organisations.
I think this supports my prior that people who were highly engaged and socially integrated after 2013 were not likely to drop out.
Less engaged subset
If we subtract the 24 speakers from the sample, then drop out among the remainder was 20% over 6 years, with more decreasing their level of involvement.
4. EA Survey data
The EA Survey in 2018 offered an interesting way to estimate drop out rates across a much wider group, which was carried out by Peter Hurford.
In 2014, about 1,500 people took the EA Survey, suggesting that at least that many people were involved at that date.
In the 2018 EA Survey, 885 people said they joined the EA community in 2014 or earlier.
This suggests that roughly 60% of the people who filled out the survey in 2014 are still filling it out in 2018—four years later—suggesting a drop out rate of 40% over 4 years.
We know from the 2019 survey data that this sample spans the whole spectrum of engagement. In 2019, only about half of the respondents reported a 5/5 or a 4/5 level of engagement with EA (someone working at an EA organisation would be at ‘5’). So, we should also expect it to be an overestimate of the drop out rate among the more engaged.
In 2020 we will be able to apply the same method among a subset of more engaged respondents.
5. 80k’s 'top plan change' data
We looked at people whose plan changes we have at some point classified as ‘top plan changes’ or ‘rated-100 plan changes’ (in our previous system), and considered whether they seemed to us to have dropped out of EA.
There are 16 top plan changes, and another ~11 who were counted as rated-100 plan changes in our old system, covering a wide range of career paths.
We think the level of engagement of this group is very high: we’ve previously estimated that there are only around 300 longtermist EAs who are at the level of capability/dedication/engagement typically needed to make these plan changes – so this is probably a stricter measure of engagement than self-rating as 5/5 on the EA survey, working at an EA organisation, or being a 50% donor equivalent.
Among this group, we didn’t find any cases of drop out from the community.
The average top plan change has been counted for about 2 years, so we could roughly say we measured a 0% drop out rate over 2 years.
One counter might be that people are unlikely to drop out in the couple of years after changing jobs.
6. Giving What We Can’s data
In their 2014 impact evaluation, GWWC very roughly estimated a 5% per year drop out rate. This would be 23% over 5 years.
I would expect this rate to be on the high end, since many GWWC members are less integrated into the community than those in some of the other samples, though it’s also less demanding—which could make it easier to stick with.
There was also a more recent estimate using the 2018 EA Survey data, which aimed to estimate what fraction of those in the survey are fulfilling their pledge. My view of how to interpret this data is in line with AGB, who said:
My best guess thinking over all this would be that 73% of the GWWC members in this EA survey sample are compliant with the pledge, with extremely wide error bars (90% confidence interval 45%-88%).
Note that many of those who don’t fulfil the pledge still donate a significant amount, so this is more an estimate of decreased involvement than of drop out.
On the other hand, the GWWC members who fill out the EA Survey are likely to be more engaged than average, so this could suggest a higher drop out rate across the whole sample.
It’s unclear from the data how this rate is changing with time.
Summary tables
Note that the samples involve people with very different levels of engagement, and it can be misleading to compare them side-by-side. Please see the relevant sections for more explanation of each figure.
Drop out rates

| Sample | Drop out rate | Time period |
| --- | --- | --- |
| Joey’s ‘50% donors’ | ~12.5% (2/16) | 5 years |
| Joey’s ‘10% donors’ | ~64% (14/22) | 5 years |
| CEA early employees and volunteers | ~18% (3/17) | 9 years |
| Early staff at another EA organisation | ~14% | ~6 years |
| CEA 2014 weekend away (whole sample) | ~14% (10/69) | 6 years |
| CEA 2014 weekend away (speakers) | ~4% (1/24) | 6 years |
| CEA 2014 weekend away (staff) | 0% (0/14) | 6 years |
| 80k top plan changes | 0% (0/~27) | ~2 years |
| EA Survey respondents | ~40% | 4 years |
| GWWC members | ~23% (5%/year) | 5 years |
Decreased engagement rates

| Sample | Decreased engagement | Time period |
| --- | --- | --- |
| Joey’s ‘50% donors’ | ~31% (5/16) | 5 years |
| CEA early employees and volunteers | ~12% (2/17) | 9 years |
| CEA 2014 weekend away (staff) | ~14% (2/14) | 6 years |
| GWWC members (pledge non-compliance, AGB’s estimate) | ~27% | — |
Other considerations
In favour of greater drop out rates:
- EA hasn’t been through a major public scandal or disaster and has had decent growth. We might expect drop out to be concentrated around such events, so the past record may underestimate it.
- The community is still young. If you think most drop out happens in people’s 30s and 40s—especially as people transition to wanting to have a family—then this won’t be in the sample yet. This doesn’t seem to be happening among those I know who are having children, so my best guess is there won’t be a big uptick in drop out here (I think the transition from university to work is also a very big one), but I’m pretty uncertain.
In favour of lower drop out rates:
- My impression is that the community is better managed than in the early days. It’s also more respected (e.g. AI safety is a lot less weird than in the past) and more professionalised, which should reduce drop out rates.
- Reactivation – some people who appear to have dropped out might get involved again in the future, and I’ve seen examples in the past.
Outside views:
- What is the drop out rate in similar movements? I don’t have any data here, and that seems like an interesting direction for future research.
- There seems to be a strong trend of people being more involved in social activism when young and decreasing their involvement as they age (though I would guess still supporting the issues).
My overall estimates
Note that:
- I focus on the rate of decreasing involvement.
- My estimates are conditional on EA continuing. I think it’s clearer to separately estimate the chance of EA collapsing.
- These are not especially robust.
Over 5 years
The rate seems very sensitive to how engaged and socially integrated the people in the sample are.
If you roughly add up all the estimates for more engaged people, you end up with about 15% over 5 years. However, I think there remain significant differences in engagement within this group.
I put the most weight on the CEA 2014 weekend away sample for speakers, which found only ~7% becoming less involved or dropping out after 6 years.
Overall:
- My estimate for someone who’s highly engaged, enthusiastic, and socially integrated would be about ~10% over 5 years. I estimate there are ~500 people in the community who are at this level of risk for value drift. This is a little higher than the CEA weekend away sample (despite a shorter time horizon) due to (i) the outside view (ii) putting some weight on other samples (iii) concern that this wider group is less integrated than that sample.
- For someone who's in the top ~2000 most engaged people (which would roughly correspond to the typical level of someone working at an EA org, though would mostly not be people actually in that path), it would be a little higher, perhaps 20%.
- Among those who are fans (e.g. 4/5 or 5/5 ‘engaged’ in the EA Survey) but not necessarily very socially integrated, I estimate 30%.
- Among those who are GWWC members but not very involved otherwise, I estimate 40%.
Year 5-10 drop out rates
My prior is that drop out decreases over time, and in the data we have, drop out seems to be slightly more concentrated in the earlier years, so I would expect the chance that someone who has already been highly engaged for 5 years drops out to be much lower than when they were new.
My guess is that the rate for years 5-10 should be perhaps 50-75% of the rate for years 0-5.
If we go beyond 10 years, we don’t have any existing data. However, my prior is that the drop out rate will be declining (if the community continues in its current form), and that is also what extrapolating our current data would suggest, so my best guess is that the drop out rates over years 10-30 will be even lower, and that perhaps half of drop out will be concentrated in the first 10 years.
One big uncertainty is whether something along the lines of having children/buying a house/saving for retirement/middle age ends up being a big cause of drop out. I haven’t seen major signs of this so far, though most community members are under the age of 35, so there is little data, and the outside view seems to suggest this is a big transition.
Expected lifetime engagement
Previously, I put more weight on higher initial drop out rates for the most engaged, and would often assume that drop out rates wouldn’t decline as much. This made me think that the typical community member would stay involved for less than half of their career, which would mean that most of their expected impact would come from the earlier and middle parts of their career.
This analysis updated me towards the idea that for the most engaged current community members, much of their impact will come from work they do 15+ years in the future (depending on your assumptions about discount rates and how much productivity increases over a career).
For instance, if the drop out rate for the most engaged core is:
- Year 0-5: 10%
- Year 5-10: 7%
- Year 10-30: 15%
Then the chance of staying involved for the rest of their career is about 70%, which would mean that for someone with a 40-year career ahead of them, the expected length of engagement is roughly 30 years (see Larks' calculation in the comments).
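Larks' exact calculation isn't reproduced in the post, but a rough sketch of this kind of survival calculation might look like the following (assuming a constant annual retention rate within each period, and extending the year 10-30 rate out to year 40 — both assumptions of mine, not the post's):

```python
def expected_engagement_years(periods, horizon=40):
    """periods: list of (dropout over period, period length in years).

    Converts each period's dropout into a constant annual retention rate,
    then sums survival probabilities over the horizon (the expected number
    of engaged years equals the sum of per-year survival probabilities).
    """
    rates = []
    for dropout, length in periods:
        r = (1 - dropout) ** (1 / length)  # annual retention within the period
        rates.extend([r] * length)
    while len(rates) < horizon:            # assume the last rate continues
        rates.append(rates[-1])
    survival, expected = 1.0, 0.0
    for r in rates[:horizon]:
        survival *= r
        expected += survival
    return expected, survival

# The post's illustrative rates: 10% over years 0-5, 7% over 5-10, 15% over 10-30
years, p_stay_40 = expected_engagement_years([(0.10, 5), (0.07, 5), (0.15, 20)])
```

With these inputs, the survival probability to year 30 is 0.9 × 0.93 × 0.85 ≈ 71%, matching the "about 70%" in the post, and the expected engagement comes out at roughly 30 years.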
Further work
There is a lot of further work that could be done, including:
- Collect more overall estimates of drop out rates based on the data we already have. An average of estimates should be more accurate than just mine alone.
- A second category of work would be trying to do more surveys.
- The ideal survey would pre-identify a sample of people, collect enough information on them that we can control for and subdivide by the predictors of drop out, and then do heavy follow up in 5 years to make sure nobody is missed.
- We will be able to do something a bit like this with 80k’s top plan change data.
- It’s possible the EA Survey team could also do something like this if they collect more contact information from a subset of respondents.
- There are probably more older samples that people could dig out and follow up on themselves, such as the analysis of CEA’s early team.
- It could be useful to look at more outside view estimates, such as the rates of drop out among other social movements, or how much people’s attitudes and priorities tend to change over their life on average.
- My estimates could be carried out with real statistical analysis. For instance, we could use the samples to estimate confidence intervals on the drop out rates, or carry out a mini meta-analysis with all of the samples. I also wonder if statistical models (like Laplace’s rule of succession) could be used to improve our estimates.
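As a taste of what this kind of simple statistics could look like, here is a hedged sketch (plain Python, no stats library) applying Laplace's rule of succession and a Wilson score interval to the 0/14 staff and 1/24 speaker counts from the weekend away sample:

```python
import math

def laplace(k, n):
    """Rule-of-succession estimate: (k + 1) / (n + 2).

    Pulls small-sample estimates away from the extremes, so an observed
    0% drop out rate doesn't get treated as a literal zero.
    """
    return (k + 1) / (n + 2)

def wilson_interval(k, n, z=1.96):
    """Approximate 95% Wilson score interval for a binomial proportion."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return max(0.0, centre - half), min(1.0, centre + half)

staff_est = laplace(0, 14)        # ~6% rather than a literal 0%
speaker_est = laplace(1, 24)      # ~8%
lo, hi = wilson_interval(0, 14)   # 0/14 is still consistent with a true rate of ~20%
```

The point the interval makes: observing zero drop outs in a sample of 14 only rules out true rates above roughly 20%, so these small samples can't distinguish "very low" from "moderately low".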
Thanks for this post! I found it quite interesting and useful.
I feel like some parts of this post could give the (very likely inaccurate) impression that you/80,000 Hours thinks working at an EA organisation is distinctly better than essentially all other roles. Specifically, I think this might result from the post repeatedly talking about people who work(ed) at EA orgs, and whether they left, without as repeatedly/prominently talking about people working in other high-impact roles (e.g., in non-EA academia, politics, or AI labs).
I'm pretty sure that the real reasons why this post gives disproportionate attention to data from EA orgs are simply that:
And those seem to me to be good reasons for this post to be the way it is.
But I felt like it might be worth making those reasons explicit, to counter potential (very likely mistaken) inferences that Ben Todd/80k considers working at EA orgs to be essentially the most impactful thing to do. That's because, historically, many people seem to have updated hard on what they perceived 80k to be saying, even when 80k didn't mean that, including on this topic in particular. (And there are also other things that seem to bias EAs towards focusing overly much on roles at EA orgs, as discussed e.g. here.)
Hi Michael, I made some quick edits to help reduce this impression.
I also want to clarify that out of the 6 methods given, only 1 is about people working at EA organisations.
Is there even 1 exclusively about people working at EA organisations?
If someone had taken a different job with the goal of having a big social impact, and we didn't think what they were doing was horribly misguided, I don't think we would count them as having 'dropped out of EA' in any of the 6 data sets.
I was referring to things like phrasings used and how often someone working for an EA org vs not was discussed relative to other things; I wasn't referring to the actual criteria used to classify people as having dropped out / reduced involvement or not.
Given that Ben says he's now made some edits, it doesn't seem worth combing through the post again in detail to find examples of the sort of thing I mean. But I just did a quick ctrl+f for "organisations", and found this, as one example:
This is definitely not explicitly saying "not dropping out = working at an EA org". Instead, I think it's meant as something more like "There are many ways one can stay involved in EA, but in this case we had the obvious evidence that most of these people were still working at EA orgs, making it unnecessary to check if they were still involved in other ways."
That said:
Also, to be clear, I didn't mean my original comment as even a mild criticism of this post, really. I just thought it would be useful for this point to be explicitly made, to push against an impression some people might erroneously form after reading this post.
[1] To the extent to which it seems plausible that 80k has contributed to this phenomenon, I don't think it would've been easy for someone else to have done better. I think 80k has an unusual degree of prominence and respect in the EA community that makes it unusually likely that people will be influenced by 80k in ways that 80k didn't intend, even if 80k is doing a well-above-average job of communicating carefully and with nuance. (And I indeed think 80k is doing a well-above-average job of that.)
FWIW, I did a quick meta-analysis in Stan of the adjusted 5-year dropout rates in your first table (for those surveys where the sample size is known). The punchline is an estimated true mean cross-study dropout rate of ~23%, with a 90% CI of roughly [5%, 41%]. For good measure, I also fit the data to a beta distribution and came up with a similar result.
I struggle with how to interpret these numbers. It's not clear to me that the community dropout rate is a good proxy for value drift (however it's defined), as in some sense it is a central hope of the community that the values will become detached from the movement -- I think we want more and more people to feel "EA-like", regardless of whether they're involved with the community. It's easy for me to imagine that people who drift out of the movement (and stop answering the survey) maintain broad alignment with EA's core values. In this sense, the "core EA community" around the Forum, CEA, 80k, etc is less of a static glob and more of a mechanism for producing people who ask certain questions about the world.
Conversely, value drift within members who are persistently engaged in the community seems to be of real import, and presumably the kind of thing that can only be tracked longitudinally, by matching EA Survey respondents across years.
Hi Matt,
It's cool you did that, though I wouldn't recommend simply combining all the samples, since they're for really different groups at very different levels of engagement (which leads to predictably very different drop out rates).
A quick improvement would be to split into a highly engaged and a broader group.
The highly engaged meta-analysis could include: Joey's 50% donors; CEA weekend away highly engaged subset; 80k top plan changes; CEA early employees.
The broader meta-analysis could be based on: GWWC estimate; EA survey estimate; Joey 10% donors; CEA weekend away entire sample.
I'd be keen to see the results of this!
This is the reason for doing a random effects meta-analysis in the first place: the motivating assumption is that the populations across studies are very different and so are the underlying dropout rates (e.g. differing estimates are due not just to within-study variation but also to cross-study variation of the kind you describe).
Still, it was sloppy of me to describe 23% as the true estimate above: in a random effects model, there is no single true estimate. A better takeaway is that, within the scope of the kind of variation we see across these survey populations, we'd almost certainly expect to see dropout of less than 40%, regardless of engagement level. Perhaps straining the possibilities of the sample size, I ran the analysis again with an intercept for engagement – high engagement seems to be worth about 21 percentage points' worth of reduced dropout likelihood on the 5-year frame.
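Matt's Stan model isn't shown, but for readers curious what random-effects pooling involves, here's a minimal DerSimonian-Laird sketch. Note the inputs are the raw (dropouts, n) counts quoted in the post, over varying time periods — not Matt's adjusted 5-year rates — so the pooled figure will differ from his:

```python
# DerSimonian-Laird random-effects pooling of binomial proportions.
# Counts are the raw figures quoted in the post (illustrative only).
samples = {
    "Joey 50% donors": (7, 16),
    "Joey 10% donors": (14, 22),
    "CEA early team": (3, 17),
    "CEA 2014 weekend away": (10, 69),
}

def dersimonian_laird(counts):
    ys = [k / n for k, n in counts]                       # observed proportions
    vs = [max(y * (1 - y) / n, 1e-9) for (k, n), y in zip(counts, ys)]
    ws = [1 / v for v in vs]                              # fixed-effect weights
    fixed = sum(w * y for w, y in zip(ws, ys)) / sum(ws)
    q = sum(w * (y - fixed) ** 2 for w, y in zip(ws, ys))  # heterogeneity stat
    c = sum(ws) - sum(w * w for w in ws) / sum(ws)
    tau2 = max(0.0, (q - (len(ys) - 1)) / c)              # between-study variance
    ws_re = [1 / (v + tau2) for v in vs]                  # random-effects weights
    pooled = sum(w * y for w, y in zip(ws_re, ys)) / sum(ws_re)
    return pooled, tau2

pooled, tau2 = dersimonian_laird(list(samples.values()))
```

Because tau² comes out well above zero here, the random-effects weights are much more even across studies than the fixed-effect weights, which is exactly the "cross-study variation" assumption described above.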
>60% persistence in the community at large seems pretty remarkable to me. I understand that you haven't been able to benchmark against similar communities, but my prior on youth movements (as I think EA qualifies) would be considerably higher. Do you have a reference class for the EA community in mind? If so, what's in it?
Thank you, that's helpful!
Do you mean 21 percentage points, so if the overall mean is 23%, then the most engaged are only 2%? Or does it mean 21% lower, in which case it's 18%?
I'm not aware of a good reference class where we have data - I'd be keen to see more research into that.
It might be worth saying that doing something like taking the GWWC pledge is still a high level of engagement & commitment on the scale of things, and I would guess significantly higher than the typical young person affiliating with a youth movement for a while.
(The mean & median age in EA is also ~28 right now, so while still on the youthful side, it's mostly not students or teenagers.)
The former! This is pretty sensitive to modeling choices-- tried a different way, I get an engagement effect of 31 percentage points (38% vs. 7% dropout).
The modeling assumption made here is that engagement level shifts the whole distribution of dropout rates, which otherwise looks the same; not sure if that's justifiable (seems like not?), but the size of the data is constraining. I'd be curious to hear what someone with more meta-analysis experience has to say about this, but one way to approximate value drift via a diversity of measurements might be to pile more proxy measurements into the model—dropout rates, engagement reductions, and whatever else you can come up with—on the basis that they are all noisy measurements of value drift.
I'd be super curious to know if the mean/median age of EA right now is a function of the people who got into it as undergrads or grad students several years ago and who have continued to be highly engaged over time. Not having been involved for that long, I have no idea whether that idea has anecdotal resonance.
It would be super interesting to work on how to improve "retainment" with social integration. I was thinking that having a regular gather.town "mega meeting" of EAs may be pretty nice in times of confinement to promote social interactions, project collaborations, etc.
Do you have in mind that people who support more mainstream issues - like climate change or global health and development, rather than AI safety - are more likely to leave EA because they have more alternative options of people to talk to? Or the same prediction, but because of something else, like a focus on more distinctly EA issues being evidence of a "more distinctly EA mindset"?
Or do you have in mind that people who support issues that are perceived as "weirder" within EA - like anti-ageing or psychedelics research - are more likely to leave EA?
The comment on AI safety having become less weird prompted the following thought: Perhaps a (weak) argument that drop-out rates will increase in future is that:
But perhaps 1 and/or 2 are quite unlikely. Or perhaps we shouldn't call that "drop-out" exactly, since the people would still be focused on issues we consider important (just not "under an EA banner").
Thanks, I find this very useful!
I guess I would refine the "weird cause area" reason by adding that some EAs may leave because of strong disagreement with mainstream EA views or those of public figures. For example, a few years ago climate change was not taken as an x-risk, and was somewhat regularly dismissed, which would have put off a few longtermists. I know someone who left EA because of strong disagreement with how AI safety is handled – e.g. encouraging working for an organization that works on AGI development. Basically, I think that sometimes there is a "tipping point" for strong disagreement where some people leave. Ideally, EA would be able to strongly focus on "EA is a question, not an ideology" so that people who have informed different opinions still stay in.
I suspect that burnout may also be another reason why people in EA orgs leave.
Cause preference (i.e. prioritising different causes than the EA community or thinking that the EA community focused too much on particular causes and ignored others) was the second most commonly cited reason among people who reported declining interest in EA.
Thank you, this list is a useful complement to this post.
Thanks for this useful summary!
Note that section 4 reiterates Peter Hurford's analysis in a post from last year.
One possibility is to take a look at the top contributors to Felicifia, an early EA/utilitarian forum, and note how many are still around. Louis Francini kindly restored the original site earlier this year, which had been down for a long time, so this can be done very easily.
Ah sorry I meant to link to Peter Hurford's analysis - I'll add it now.
My understanding is that David/Rethink has a reasonably accurate model of this, i.e. they can predict how someone would respond to the engagement questions on the basis of how they answered other questions.
It might be interesting to try doing this to get data from prior years.
These are still the best data on community drop out I'm aware of.
Nice work with this!
One thing that comes to mind (though perhaps a bit strange) is to consider Effective Altruism under a similar lens as you would a SaaS product. In the SaaS (software as a service) industry, there are a fair number of best practices around understanding retention rates, churn, and doing cohort analysis and the like. There's also literature on evaluating the quality of a product via NPS scores and similar measures. It could be neat to have people rate "Effective Altruism" and "The EA Community" on NPS scores.
Likewise, it could be interesting to survey people with things like, "How would you rate the value you are getting from the EA ecosystem", and then work to maximize this value. Consider the costs (donations, career changes) vs. the benefits and see if you can model total value better.
Hey Ozzie, that makes sense. I think the last EA survey did some things pretty similar to this, inc. asking about value adds & issues, and something similar to the NPS score, as well as why people don't recommend it.
Yeh much of this is in our Community Information post where we:
I'm pretty sceptical about the utility of Net Promoter Score in the classical sense for EA. I don't think there's any good evidence for the prescribed way of calculating Net Promoter Score (ignoring respondents who answer in the upper-middle of the scale, and then subtracting the proportion of people who selected one of the bottom 7 response levels from the proportion who selected one of the top two response items). And, as I mentioned in our original post, its validity and predictive power has been questioned. Furthermore, one of the most common uses is comparing the NPS score of an entity to an industry benchmark (e.g. the average scores for other companies in the same industry), but it's very unclear what reference class would be relevant for EA, the community, as a whole, so it's fundamentally not clear whether EA's NPS score is good or bad. In the specific case of EA, I also suspect that the question of how excited one would be to recommend EA to a suitable friend may well be picking up on attitudes other than satisfaction with EA, i.e. literally how people would feel about recommending EA to someone. This might explain why the people with the highest 'NPS' scores (we just treated the measure as a straightforward ordinal variable in our own analyses) were people who had just joined EA, and fairly reliably became lower over time.
Are you assuming quite short careers? Using bucket midpoints I calculate
Which suggests you are using ~24 years for a full career, which seems a little low. If I substitute 40 years I get over 30 years of engagement.
The answer does not change very much when I converted these numbers to annualised risk factors in excel (and assumed 100% dropoff at year 40).
I was doing a very hacky calculation - I'll change to 30 years and mention your comment.