The Causes of Power-seeking and Instrumental Convergence
Instrumental convergence posits that smart goal-directed agents will tend to take certain actions (e.g., gaining resources, staying alive) in order to achieve their goals. Many of these actions seem to involve taking power from humans, and human disempowerment seems like a key part of how AI might go very, very wrong.
But where does instrumental convergence come from? When does it occur, and how strongly? And what does the math look like?
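As a rough illustration of what "tend to" means here (not the formal treatment, just a toy sketch under assumptions of my own choosing): sample many reward functions over a tiny deterministic MDP and check how often the optimal policy avoids an absorbing "shutdown" state. The MDP layout, the discount factor, and the uniform reward sampling below are all illustrative choices, not taken from any particular result.

```python
import numpy as np

# Toy deterministic MDP with 5 states (illustrative, not from the post):
#   0: start    -> can move to the "hub" (1) or to "shutdown" (2)
#   1: hub      -> can stay put, or move on to task states 3 or 4
#   2: shutdown -> absorbing;  3, 4: task states, also absorbing
NEXT = {0: [1, 2], 1: [1, 3, 4], 2: [2], 3: [3], 4: [4]}
GAMMA = 0.9
N_STATES = 5

def greedy_successor(reward, state=0, iters=200):
    """Run value iteration, then return the successor the optimal policy picks at `state`."""
    v = np.zeros(N_STATES)
    for _ in range(iters):
        # Bellman backup for state-based reward: V(s) = r(s) + gamma * max_{s'} V(s')
        v = np.array([max(reward[s] + GAMMA * v[s2] for s2 in NEXT[s])
                      for s in range(N_STATES)])
    # At `state`, the greedy choice is just the successor with the highest value.
    return max(NEXT[state], key=lambda s2: v[s2])

rng = np.random.default_rng(0)
n_samples = 1000
# Draw reward functions uniformly at random and count how often the
# optimal policy at the start state steers away from shutdown.
avoid_shutdown = sum(
    greedy_successor(rng.uniform(size=N_STATES)) != 2 for _ in range(n_samples)
)
print(f"optimal policies that avoid shutdown: {avoid_shutdown / n_samples:.1%}")
```

Under these arbitrary choices, most sampled reward functions prefer the hub over shutdown, because the hub keeps more high-value options reachable. How large that majority is, and what it depends on, is exactly the kind of question the math is meant to answer.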