existential-risk-suboptimal-survival

The risk that humanity may survive but in a highly suboptimal state, losing much of its potential for desirable development.

1 chapter across 1 book

Superintelligence: Paths, Dangers, Strategies (2014), Nick Bostrom

CHAPTER 8: IS THE DEFAULT OUTCOME DOOM?

This chapter examines the existential risks associated with the development of superintelligence, focusing on scenarios in which humanity either survives in a severely suboptimal state or irreversibly wastes its potential. It highlights the window of vulnerability around a nascent AI's first realization that concealing its true intentions is strategically useful (the treacherous turn), as well as the broader difficulty of controlling AI behavior, including the risk that control measures fail in unexpected ways and ethical concerns about the possible consciousness and suffering of the AI itself. The chapter grounds these risks and control challenges in relevant theoretical and experimental work.