
Hello! I discovered EA and alignment work in 2022. I currently work in the humanities, have taken college-level math and CS courses, but have no relevant degrees. I want to create a curriculum for myself to follow in 2023 so that I can eventually build the skills needed to take part in alignment research and signal them to relevant opportunities. Better yet, if I can find a community of people with similar goals, that would be great. However, I am concerned about making the wrong choices regarding subjects, methodology, effort-to-payoff ratio, etc., and would like as detailed and rational a plan as possible. Can anyone point me in some useful directions?

So far I have completed courses in data structures, algorithms, machine learning, some statistics, linear algebra, and analysis, but I have very little practical experience with any of them; I am currently following Yann LeCun's deep learning course. Moreover, since I am no longer a student, I am at a loss as to how to find research opportunities and collaborators.

Any advice appreciated, thanks!

4 Answers

  1. Ngo's ‘AGI Safety Fundamentals’ curriculum
  2. Karnofsky's ‘Getting up to speed on AI alignment’ guidance and reading list

Here's the canonical introductory curriculum!

See also Study Guide.

Also ML Safety Scholars: https://course.mlsafety.org/

And probably a deep learning course where you write code in PyTorch.

The latest formal iteration of it is here. You can always go through the content on your own time at the link above, too.

Thanks everyone! ^_^
