Existential risk from artificial general intelligence is the hypothesis that substantial progress in artificial general intelligence (AGI) could someday result in human extinction or some other unrecoverable global catastrophe. It is argued that the
human species currently dominates other species because the human brain has distinctive capabilities that other animals lack. If AI were to surpass humanity in general intelligence and become "superintelligent", it could prove powerful and difficult for humans to control. Just as the fate of the mountain gorilla depends on human goodwill, so might the fate of humanity depend on the actions of a future machine superintelligence.