
Smarter-than-human artificial intelligence could be around the corner, with AI companies racing to build these systems as quickly as possible. Meanwhile, leading researchers have warned that superhuman AI could cause global catastrophe. A 2023 statement signed by thousands of AI experts declared that “mitigating the risk of extinction from AI should be a global priority”.

It’s a bad idea to build something smarter than you if you don’t know how to control it. We need guardrails to prevent dangerous, superhuman AI – and fast.

If you're American, contacting your Congressmember is one of the most effective things you can do to support AI guardrails. Research indicates that even a handful of letters to a Congressional office can have an impact.

Join us to learn more about the dangers of AI and policy solutions to keep us safe – and how to pressure our elected officials to act on this crucial problem.

About us: PauseAI US is a nationwide grassroots movement dedicated to achieving a global, indefinite pause on superhuman AI development – until we can be confident that the technology is safe for humanity. We have local groups in 9 US cities and counting.
