Are there any reasons to believe that LLMs are in any way more alignable than other approaches?
So, ideally you would like to assume only
and conclude A and B?