AI ALIGNMENT FORUM

AI Alignment Fieldbuilding · Cognitive Science · World Modeling
Survey for alignment researchers!

by Cameron Berg, Judd Rosenblatt, Trent Hodgeson
2nd Feb 2024
1 min read
3 comments, sorted by top scoring
Linda Linsefors · 2y

I timed how long it took me to fill in the survey: 30 minutes. I could probably have done it in 15 minutes if I had skipped the optional text questions. This is to be expected, however. Every time I've seen someone guess how long it will take to respond to their survey, the estimate has been off by a factor of 2-5.

Vivek Hebbar · 2y

Note: The survey took me 20 minutes (but also note the selection effects on leaving this comment).

Cameron Berg · 2y

Definitely good to know that it might take a bit longer than we had estimated from earlier respondents (with the well-taken selection-effect caveat).

Note that if it takes between 10 and 20 minutes to fill out, this still works out to donating $120-240 per researcher-hour to high-impact alignment orgs (plus whatever value there is in comparing one's individual results to those of the community), which hopefully is worth the time investment :)
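For concreteness, the rate quoted above can be sketched as a quick calculation (the function name is illustrative; the $40-per-survey figure is from the post):

```python
# Back-of-the-envelope check of the donation rate: $40 per completed
# survey, divided by the time a respondent spends on it.
DONATION_PER_SURVEY = 40.0  # dollars donated per completed survey


def dollars_per_researcher_hour(minutes_to_complete: float) -> float:
    """Effective donation rate if the survey takes the given number of minutes."""
    return DONATION_PER_SURVEY / (minutes_to_complete / 60.0)


print(dollars_per_researcher_hour(20))  # 120.0 ($/hour at the slow end)
print(dollars_per_researcher_hour(10))  # 240.0 ($/hour at the fast end)
```

At Linda's measured 30 minutes, the same calculation gives $80/researcher-hour, still a nontrivial donation rate.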


UPDATE 3/9: Thanks to broad participation from the community, this and associated surveys have raised approximately $10,000 for high-impact alignment organizations. Given the reasonable sample size we now have, we are pausing donations for subsequent responses. (However, we will preserve the charity voting question, and if anyone wants to sponsor donations for surveys taken after March 9th, please ping us at alignment@ae.studio.)

AE Studio is launching a short, anonymous survey for alignment researchers, in order to develop a stronger model of various field-level dynamics in alignment. 

This appears to be a neglected research direction that we believe will yield specific, actionable insights into the community's technical views and more general characteristics.

The survey is a straightforward 10-15 minute Google Form with some simple multiple choice questions.

For every alignment researcher who completes the survey, we will donate $40 to a high-impact AI safety organization of your choosing (see specific options on the survey). We will also send each alignment researcher who wants one a customized report that compares their personal results to those of the field. 

Together, we hope not only to raise money for some great AI safety organizations, but also to develop a better field-level model of the ideas and people that comprise alignment research.

We will open-source all data and analyses when we publish the results. Thanks in advance for participating and for sharing this around with other alignment researchers! 

Survey full link: https://forms.gle/d2fJhWfierRYvzam8

Mentioned in: Shallow review of technical AI safety, 2024