Online and/or international
AI Safety Camp connects you with interesting collaborators worldwide to discuss and decide on a concrete research proposal, prepare online as a team, and try your hand at research during a 9-day intensive retreat.
Local AI Safety/Alignment groups
Oxford AI Safety Reading Group
We meet once a week to discuss the Alignment Newsletter and once a fortnight to read a specific, recent AI safety paper. The group is aimed primarily at postgrads and early-career researchers, though undergrads are also welcome. If you are interested, join our Facebook group and/or email Lewis to be added to the Google Calendar event.
CEEALAR (formerly the EA Hotel) is an Effective Altruist community hub in the North West of England. We host people working on promising charitable projects, be they research, remote charity work, self-study, starting new EA-aligned charities, or similar.
Our friends are other people and groups working towards goals similar to those of AI Safety Support, i.e. empowering people who want to do AI safety research.
If you know someone or some group we should be friends with, please let us know.