
Large movement organising takes time. It takes listening deeply to many communities' concerns, finding consensus around a campaign, ramping up training of organisers, etc.

But what if the AI crash is about to happen? What if US tariffs[1] triggered a recession that makes consumers and enterprises cut their luxury subscriptions? What if even the sucker VCs stop investing in companies that, after years of billion-dollar losses on compute, now compete with cheap alternatives to their not-much-improving LLMs?

Then there is little time to organise and we must jump to mobilisation. But AI Safety has been playing the inside game, and is poorly positioned to mobilise the resistance.

So we need groups that can:

  1. Scale the outside game, meaning a movement pushing for change from the outside.
  2. Promote robust messages, e.g. affirm concerns about tech oligarchs seizing power.
  3. Bridge-build with other groups to start campaigns around connected concerns.
  4. Legitimately pressure and negotiate with institutions to enforce restrictions.

Each group could mobilise a network of supporters fast. But they need money to cover their hours. We have money. Some safety researchers advise tech billionaires. You might have a high-earning tech job. If you won't push for reforms, you can fund groups that do.

You can donate to organisations already resisting AI, so more staff can go full-time.
Some examples:

Their ideologies vary widely, and some are controversial to other groups. By supporting many groups to stand up for their concerns, you can preempt the ‘left-right’ polarisation we saw around climate change. A broad-based movement needs many different groups.

At the early signs of a crash, groups need funding to ratchet up actions against weakened AI companies. If you wait, they lose their effectiveness. In this scenario, it is better to seed-fund many proactive groups than to hold off.[2]

Plus you can fund coaches for the groups to build capacity. The people I have in mind led one of the largest grassroots movements in the last decade. I'll introduce them in the next post. 

There is also room for large campaigns grounded in citizens' concerns. These can target illegal and dehumanising activities by leading AI companies. That's also for the next post.

Want to discuss more? Join me on Sunday the 20th. Add this session to your calendar.

  1. ^

    The high tariffs seem partly temporary, meant to pressure countries into better trade deals. Still, AI's hardware supply chains span 3+ continents. So remaining tariffs on goods can put a lasting damper on GPU data center construction. 

    Chaotic tit-for-tat tariffs also further erode people’s trust in, and willingness to rely on, the US economy, fuelling civil unrest and weakening its international ties. The relative decline of the US leaves it and its allies vulnerable to land grabs, which may prompt militaries to ramp up contracts for autonomous weapons. State leaders may also react to civil unrest by procuring tools for automated surveillance. So surveillance and autonomous weapons are "growth" opportunities that we can already see AI companies pivot to.

  2. ^

    Supporting other communities unconditionally also builds healthier relations. Leaders working to stop AI's increasing harms are suspicious of us buddying up with, and soliciting outsized funds from, tech leaders. Those connections and funds give us a position of power, and they do not trust us to wield that power to enable their work. If it even looks like we are using our money to selectively influence their communities to do our bidding, that will confirm their suspicions. While longtermist grants are, in my experience, unusually hands-off, it only takes one incident. This already happened: last year, a fund abruptly cancelled an already committed grant, for political reasons it didn't clarify. The recipient runs professional activities and has a stellar network. They could have gone public, but instead decided to have nothing more to do with our community.
