How can we improve our anticipation and design of artificial superintelligence (ASI)? What are the likely consequences of the arrival of ASI?
Important note: text versions of many of the key ideas in this top-level area of the Vital Syllabus are being assembled here.
Resources providing an overall introduction to the Singularity:
23.1 The singularitarian stance
23.2 The singularity shadow
23.3 Different routes to superintelligence
23.4 Hard and soft take-off
23.5 Possible timescales to reach ASI
23.6 The Control Problem
Introductions to the Control Problem
“The case for taking AI seriously as a threat to humanity” – Vox article by Kelsey Piper
23.7 The Alignment Problem
23.8 Human-ASI merger
23.9 No Planet B
23.10 The singularity principles
23.11 AGI or not AGI: fundamental choices
Re-using this material:
The content of the Vital Syllabus is available under a CC BY 4.0 license unless otherwise noted.