AI Safety Fellowship applications are now open; apply before February 6th!
Managing risks from advanced artificial intelligence is one of the most important problems of our time. We are a community of technical and policy researchers at Georgia Tech aimed at reducing these risks, training the next generation of researchers, engaging the public, and steering the trajectory of AI development for the better.
Read more »
We pursue and publish novel work that seeks to understand, evaluate, and improve the safety of powerful artificial intelligence systems, including through interpretability, goal specification, and governance projects.
View projects »
We host a variety of educational fellowships, upskilling programs, and large events exploring the fields of AI alignment and governance, drawing in talent from around Atlanta and the Georgia Tech community.
View all programs »
We seek to engage a wide variety of students, faculty, and researchers from across fields and departments. Our AI Safety Pipeline is designed to make technical and governance research approachable and well-supported at Georgia Tech, and we are constantly looking for innovative and creative ways to further our mission.
We host a variety of events, including open meetings, speaker events, and reading groups, to foster engagement within the AI safety community and promote education about AI safety issues. To stay up to date on all our events, view the calendar below and join our Discord server.
Join us for engaging presentations by leading researchers and industry professionals who share their insights on AI safety and showcase their latest research. These events, open to both members and the public, offer a unique opportunity to explore emerging topics in the field, engage directly with speakers, and connect with others who share your interest in ensuring AI development benefits humanity. To stay up to date on upcoming speaker events, please join our Discord server.
This calendar will be updated with upcoming events and deadlines. Sync it to your calendar and watch for updates!
Our members have collaborated with researchers at:
Proudly supported by: