will soon be rebuilt as an entry point to AI Safety. The main skillset needed right now is design. Open
TBD
plex#1874, Søren Elverlin #9083
A living document listing AI Safety training programs, with dates and applications. Looking for someone to add self-study options. Open
plex
stampy.ai is a single point of access for learning about AGI safety, created by Rob Miles's volunteer team. Accessibility testing volunteer wanted! Open
React/Remix (UI), Python (Discord bot), Coda + Google Docs (edit interface)
plex
A fund to help people in lower and middle income countries travel to connect with core members of EA and AI Safety. Open
HP#2176
FAST - Frugal AI Safety Talent
Tutor Vals #6237
Research for Rob Miles videos
Could someone make a launch page specifically for these entities, plus any I've missed, complete with small descriptions of their cruxes and aims?
Alignment research employers: [links]; Academic alignment research hubs: [links]; Independent research funders: [links]. Also would be sweet to have a feed of the [link] and the [link]. I have a very specific vision:
As professional-looking as the [link] and the [link]; As cruxy and useful as "[link]"; Targeted at people who specifically want to get employed/funded post MATS + Refine + ARENA + REMIX + MLAB + WMLB + PIBBSS (i.e., it doesn't include random Discords or Slacks or programs graduates have exceeded); Ideally, this would happen by the first week of Feb (to capture MATS 3.0 alumni). Open
softr+airtable
hello@plex.ventures
EigenTrust is a mechanism for scaling trust by allowing individuals to leverage their network's combined experience. First, peer vetting of alignment research contributions at scale. Then, the world! Open
Python (server and bot), Remix/TypeScript (website)
TJ
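The trust-scaling idea above follows the published EigenTrust algorithm: each peer's local trust ratings are row-normalized, and a PageRank-style power iteration aggregates them into global trust scores. The sketch below is a minimal illustration of that algorithm, not the project's actual server code; the function name, damping value, and matrix layout are all assumptions.

```python
def eigentrust(local_trust, pre_trusted, alpha=0.15, iters=50):
    """Compute global trust scores from local peer ratings.

    local_trust[i][j]: how much peer i trusts peer j (non-negative).
    pre_trusted: probability distribution over seed peers.
    alpha: weight on the pre-trusted peers (damping, as in PageRank).
    """
    n = len(local_trust)
    # Row-normalize so each peer's outgoing trust sums to 1; peers who
    # rated nobody fall back to the pre-trusted distribution.
    C = []
    for row in local_trust:
        s = sum(row)
        C.append([x / s if s else pre_trusted[j] for j, x in enumerate(row)])
    t = pre_trusted[:]  # start from the seed distribution
    for _ in range(iters):
        # t <- (1 - alpha) * C^T t + alpha * p
        t = [(1 - alpha) * sum(C[i][j] * t[i] for i in range(n))
             + alpha * pre_trusted[j]
             for j in range(n)]
    return t
```

With a small damping term the iteration converges to a unique score vector, so a peer earns high global trust only when peers who are themselves trusted vouch for them.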
Databases of research and project ideas towards alignment, with features for fostering collaboration and connecting teams. Open
EleutherAI is a grassroots collective of researchers working to open source AI research projects. They have many active alignment channels. Open
The open-source codebase that the AI Alignment Forum, Effective Altruism Forum, and LessWrong run on, supporting alignment discussion. Open
A feed for AI Safety content, personalized to optimize for intellectual growth. Should use sources from the alignment dataset. Open
TBD
plex
Some approaches to solving alignment go through teaching ML systems about alignment and getting research assistance from them. Training ML systems needs data, but we might not have enough alignment research to sufficiently fine-tune our models, and we might miss out on many concepts which have not been written up. Furthermore, training on final outputs (AF posts, papers, etc.) might be worse at capturing the thought processes that go into hashing out an idea or poking holes in proposals, which is exactly what a research assistant would most need to be skilled at. It could be significantly beneficial to capture many of the conversations between researchers and use them to expand our dataset of alignment content to train models on. Additionally, some researchers may be fine with having some of their conversations available to the public, in case people want to do a deep dive into their models and research approaches. The two parts of the system I currently imagine addressing this are: an email address where audio files can be sent, automatically run through , and added to the ; and clear instructions for setting up a tool which captures audio from calls automatically (either a general tool or platform-specific advice) and makes it as easy as possible to send the right calls to the dataset platform. Open
plex; Michael Trazzi would be a good person to talk to (he's already hiring people to edit Otter docs), as would Daniel Filan
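The first part of the system (an inbox that receives audio files) can be sketched with Python's standard `email` module: pull the audio attachments out of each incoming message, then hand the bytes to whatever transcription step the project settles on. A minimal sketch; the function name and message handling are assumptions, and the transcription/upload hooks are deliberately left out.

```python
import email
from email import policy


def audio_attachments(raw_message: bytes):
    """Yield (filename, data) for each audio attachment in a raw email.

    The yielded bytes would then be sent on to transcription and the
    resulting transcript added to the dataset platform.
    """
    msg = email.message_from_bytes(raw_message, policy=policy.default)
    for part in msg.iter_attachments():
        if part.get_content_type().startswith("audio/"):
            yield part.get_filename(), part.get_content()
```

Using `policy.default` matters here: it upgrades the parser to the modern `EmailMessage` API, which provides `iter_attachments()` and decodes attachment payloads automatically.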
The idea is to collect and connect everyone interested in AIS who is applying for a PhD each year. E.g.: find all aspiring AIS researchers who want to start a PhD in 2024. Put them in the same Slack or Discord. Organize the effort to look into different programs and share information with each other. Those who want to can coordinate where they apply so they end up at the same university, and thus be sure to have at least one other AI Safety researcher around.
We have and a with logo ready for this. Open
Discord, Carrd
A living document listing AI Safety communities. Open
Conversational agent informed by the Alignment Research Dataset. For more details, see here: Open
Python + Flask + Node.js + OpenAI embeddings/ChatGPT.
BionicD0LPH1N#5326
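The retrieval step of an agent like this is typically cosine-similarity search over precomputed embedding vectors, with the top-scoring dataset chunks passed to the chat model as context. A minimal sketch assuming the embeddings already exist (e.g. fetched once from the OpenAI embeddings endpoint); the function names are illustrative, not the project's API.

```python
import math


def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0


def top_k(query_vec, doc_vecs, k=3):
    """Return indices of the k document vectors most similar to the query."""
    ranked = sorted(range(len(doc_vecs)),
                    key=lambda i: cosine(query_vec, doc_vecs[i]),
                    reverse=True)
    return ranked[:k]
```

In the full pipeline, the indices returned by `top_k` would select the dataset passages to prepend to the user's question before calling ChatGPT.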
Enhancing the AI safety ecosystem via debates
Our epistemics and mutual understanding could be improved with regular debates/adversarial collaborations between alignment researchers who disagree on particular topics.
I am thinking of something similar to , but in audio (+video) format and open to people's suggestions about whom we should pit against whom and which topics we'd like them to discuss.
Spencer Greenberg’s podcast has several episodes that can serve as an example: . Open
bagginsmatthew@gmail.com
Reading What We Can is a web page which has reading challenges designed to upskill people rapidly on alignment topics. Open
AGI Safety Fundamentals Video Playlist
YouTube Playlist
plex, ccstan99
A homepage for AI safety, linking out to the relevant parts of the ecosystem. Open
Alignment Project Factory
List of existing research teams
I imagine a spreadsheet, similar to this one. Each entry is a research team, with columns:
- Start date (when the team was formed)
- End date (e.g. AISC teams have an end date, but for others this will be empty)
- Open to join (options: Yes, No, Maybe)
- How to get in touch (a text box with instructions for how to contact and/or join the team, e.g. "to collaborate, please join our discord [link]")
- Research (what the team is doing)
- Affiliation (e.g. AISC, SERI MATS, PIBBSS, Independent, Mixed, etc.)
- Current number of members (can be just one, but only if they are open for people to join; otherwise it's not a team)
The point is to inspire more people to start teams. Teams are great! Don't add teams without their permission; this should be a list of teams that want the visibility and are happy to be contacted. If this is created it needs to be maintained. I don't think this will be much work, but it will be necessary. Suggestion created by Linda Linsefors (). I will not lead this project. Don't reach out to me to offer to help unless you are also willing to take ownership. However, if you are willing to take ownership, then I can help you. Open
linda.linsefors@gmail.com
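The proposed columns map naturally onto a simple record type, which is how the list might be modeled if someone builds it as more than a spreadsheet. A minimal sketch; the field names are my paraphrase of the proposed columns, not a fixed schema.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ResearchTeam:
    name: str
    start_date: str                  # when the team was formed
    end_date: Optional[str] = None   # e.g. AISC teams; empty for ongoing teams
    open_to_join: str = "Maybe"      # "Yes" / "No" / "Maybe"
    how_to_get_in_touch: str = ""    # e.g. "join our discord [link]"
    research: str = ""               # what the team is doing
    affiliation: str = ""            # AISC, SERI MATS, PIBBSS, Independent, ...
    current_members: int = 1         # one member is fine only if open to join
```

Defaults encode the spec's softest options (membership open "Maybe", no end date) so a row can be created from just a name, start date, and affiliation.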
Needs a community manager. Open
WordPress plugins ( & ), community management