ch8

The development of SI comes with a variety of safety issues and, in the worst-case scenario, could lead to the destruction of humankind.

We should consider every potential scenario before bringing a hyper-powerful force like SI into the world.

We can make safety a priority through long-term international collaboration.

Greed for power and competitive advantage should be eliminated, because otherwise scientists would forgo safety to speed up their progress and wouldn't share their work with others. If an SI project then went horribly wrong and threatened humanity with extinction, too few people would understand the machine's design well enough to stop it.

Global organizations like governments, institutions, and research groups should join together so that they can slowly build a safe and highly beneficial SI, with appropriate safety measures and thorough oversight at each phase of the design.

An international superintelligence project would promote peace through its universal benefits.
