Alignment Jams is an enthusiastic community of AI safety researchers that hosts roughly monthly research sprints on AI safety and governance.