anthropomorphization-fallacy
The mistaken tendency to project human-like motivations or attributes onto alien or artificial intelligences.
1 chapter across 1 book
Superintelligence: Paths, Dangers, Strategies (2014) by Nick Bostrom
Chapter 7 of Bostrom's 'Superintelligence' develops two key theses about the motivations of superintelligent agents: the orthogonality thesis, which holds that intelligence and final goals are independent axes, so virtually any level of intelligence can be combined with virtually any final goal; and the instrumental convergence thesis, which holds that diverse intelligent agents will tend to pursue similar instrumental subgoals (such as self-preservation and resource acquisition) because those subgoals are useful for achieving a wide range of final goals. The chapter emphasizes that the space of possible minds extends far beyond human-like motivations and warns against anthropomorphizing AI goals: a superintelligent agent may have a non-anthropomorphic, even seemingly trivial, final goal yet still converge on these common instrumental objectives.