Last month, we[1] ran the 2024 edition of the Meta Coordination Forum (MCF) near Santa Cruz, California. As outlined in our previous post, the event brought together 39 people leading EA community-building organizations and projects to foster a shared understanding of some of the biggest challenges facing our community and align on strategies for addressing these problems. Here's a high-level summary of how it went and how we might improve future iterations.

Event highlights

Overall, we think the Meta Coordination Forum 2024 was successful in facilitating connections and collaborations while improving attendees'[2] understanding of both the EA brand/communications and the current funding landscape:

  • Attendees rated their likelihood to recommend at 8.94/10 on average.
  • 88.5% of attendees found it more valuable than how they would typically spend their time, with 57.1% rating it at least 3 times more valuable.
  • Over 90% of attendees reported an improved understanding of both the EA communications landscape and the funding ecosystem (our two focus areas).
  • Attendees particularly valued dedicated time for 1:1s, practical skills training (e.g., media engagement), and focused discussions on EA's key challenges.

Key outcomes

  1. Improved understanding of focus areas: Over 90% of survey respondents reported an improved understanding of both the EA communications landscape and the funding ecosystem. This was one of our main goals.
  2. Improved relationships: The event provided valuable opportunities for networking and trust-building among people leading EA community-building organizations and projects. Many attendees reported that the event was useful for building new connections and strengthening existing ones.
  3. Improved motivation and morale: Multiple attendees reported feeling reinvigorated and more committed to their work as a result of attending the event.
  4. Initial concrete results:
    1. New funding leads for organizations
    2. Improved coordination between organizations and plans for collaborative projects
    3. People being more willing to engage in public communications

We'll follow up with attendees in 6 months to assess longer-term outcomes.

Future considerations

Based on attendee feedback and our observations, we're considering the following for future events:

  1. Extending the event duration to allow for more 1:1 meetings or adding a one-day event around an EAG.
  2. Incorporating more practical skills training sessions and inviting more experts from relevant areas.
  3. Exploring ways to balance improving understanding with generating actionable next steps, acknowledging the challenges of creating concrete action plans for complex issues in a short timeframe.

Conclusion

We're grateful to all of this year’s attendees for their valuable contributions and feedback, and look forward to applying these insights to future events. 

Please see our previous announcement post for more details about the event's goals and attendees.

  1. ^

     The organizing team was Amy Labenz, Ollie Base, Sophie Thomson, Niko Bjork, and David Solar.

  2. ^

     The following metrics are based on 35 feedback survey responses out of 39 attendees.

Comments

For me, as someone who has been involved in object-level EA work for many years, this event and its main takeaways are quite underwhelming:

  1. It seems like the vast majority of the people who attended the conference do meta EA work and/or work at a large EA org (e.g. OP, GWWC, CEA). This seems like a massive skew, and a lot of the impact that the movement generates comes from people doing object-level work, e.g. working at an impactful GH charity or doing biosecurity policy work. Therefore, it should follow that they would be represented more proportionally at the MCF.
  2. The goals of the meeting seem quite underambitious, and its outcomes underwhelming as a result. Goals of improved understanding of focus areas, relationships, and motivation and morale for a small group of people feel like an extended "pep talk" for EA leaders rather than a more thorough investigation of more fundamental questions. I must profess that I don't have a great list of what those should be, but they would look like strategic questions about where EA sees its marginal value, its strategy with outreach and funding, etc.
  3. It seems like the invitee list and the agenda were largely decided without consulting the rest of the community. I understand that this is hard, but why didn't you ask the forum which pressing questions they think the MCF should try and work together on? This seems like a really obvious thing to do.
  4. Largely a side note, but the self-reported data on how people found the MCF and its NPS seems like a largely useless metric of success. As with the design of any scientific study, you should have set out clear and (where possible) objective outcome measures beforehand, and then measured those afterwards.

It seems like the vast majority of the people who attended the conference do meta EA work and/or work at a large EA org (e.g. OP, GWWC, CEA). 

Isn't that what you'd expect from a Meta Coordination Forum? It's the forum for meta people to coordinate at. There are other forums for people doing object-level work.

I think this point teases out my underlying issue with the forum:

  • If this event was a coordination forum for meta EA individuals, then it would be reasonable for the vast majority of the attendees to be people who do EA meta work
  • If, as I had thought (and think is more useful), this is a coordination forum on meta EA issues, then this is not a good composition of people.

On this:

  1. The original event aim definitely sounds much more like the latter
  2. I think even if the event claims to be the former (which I think would be a retrospective change in the stated aim of the event), the nature of the people and orgs attending means that some aspects of the latter would have been discussed/worked through; because of this, I think my original point largely stands.

(I helped organise this event)

Thanks for your feedback.

Actually, I think this event went well because:

  • The organising team (CEA) was opinionated about which issues to focus on, and we chose issues that we and MCF attendees could make progress on.
  • Our content was centered around just two issues (brand and funding) which allowed for focus and more substantive progress.

Many attendees expressed a similar sentiment, and some people who’ve attended this event many times said this was one of the best iterations. With that context, I’ll respond to each point:

  1. We wanted to focus on issues that were upstream of important object-level work in EA, and selected people working on those issues, rather than object-level work (though we had some attendees who were doing object-level work). I agree with you that a lot of (if not all!) the impact of the community is coming from people working at the object level, but this impact is directly affected by upstream issues such as the EA brand and funding diversity. Note that many other events we run, such as EA Global and the Summit on Existential Security, are more focused on object-level issues.
  2. To the contrary, I think we made valuable progress, though this is fairly subjective and a bit hard to defend until more projects and initiatives play out. I’m not sure what the distinction is you’re pointing to here; you mention we should’ve considered “[EA]’s strategy with outreach and funding”, but these were the two core themes of the event.
  3. This was a deliberate call, though we’re not confident it was the right one. CEA staff and our attendees spend a lot of time engaging with the community and getting input on what we should prioritise. We probably didn’t capture everything, but that context gives us a good grasp of which issues to work on.
  4. I don't think every event, project, and meeting in EA spaces needs to be this stringent about measuring outcomes. We use similar metrics across all of our events, and things like LTR/NPS are used in many other industries, so I think these are useful benchmarks for understanding how valuable attendees found the event.

Thanks for posting this! I appreciate the transparency from the CEA team around organizing this event and posting about the results; putting together this kind of stuff is always effortful for me, so I want to celebrate when others do it.

I do wish this retro had a bit more in the form of concrete reporting about what was discussed, specific anecdotes from attendees, or takeaways for the broader EA community; e.g. last year's MCF reports went into substantial depth on these, which I really enjoyed. But again, these things can be hard to write up, perfect shouldn't be the enemy of good enough, and I'm grateful for the steps that y'all have already taken towards showing your work in public.

Thanks, Austin :)

Results from the survey we conducted at the event (similar to the one you linked to) are still to come. Rethink Priorities led on that this year, and they are still gathering data and putting it together.
