Decades or Centuries (keynote version)
“Decades” scenario
X-risk: 2030-2050
The probability of x-risk is rising exponentially
Chaotic and complex processes near the event horizon (the Technological Singularity)
AI is the main factor
Decades: 10-30 years
Timing of creation of superhuman AI and other super-technologies: 2030 – Vinge, 2045 – Kurzweil
Superhuman AI will either destroy humanity or prevent x-risks
The period of vulnerability to x-risks will end after superhuman AI is created
Arguments for decades scenario
Exponential growth of technologies
Exponential growth of x-risks
Deadly viruses – cheaper to create
AI – simpler to create
Arguments for decades scenario (continued)
Possible triggers in the near future:
World war
New arms race
Peak oil
Runaway global warming
A smaller catastrophe could trigger a bigger one
Centuries scenario
50-500 years from now
Rare events
Accidental
Mutually independent
Linear (roughly uniform over time) distribution of probability
Prevention by space settlement
Arguments for centuries scenario:
Most predictions about AI: false
Most predictions about near-future global catastrophe: false
Arguments for centuries scenario (continued):
Exponential growth will level off
Moore’s law will stop
Future growth will be linear
Public bias toward the centuries scenario:
Long-term predictions:
More scientific
Less chance of being proven false
Improve the predictor’s reputation
Help to prevent x-risks.
John Leslie – 500 years (1996)
Nick Bostrom – 200 years (2001)
Martin Rees – 100 years (2003)
The decades scenario is worse:
Sooner
Less time to prepare
More complex
Military AI – Unfriendly
In our lifetime
Conclusion
Open question – timeframe
It depends on whether future technologies develop exponentially or linearly (see the sketch below)
Different risks interact in complex and unpredictable ways near the Technological Singularity
It could happen as soon as within the next 15 years
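A minimal sketch of the exponential-versus-linear contrast, in an illustrative survival-analysis framing that is not part of the original slides (h(t) is the annual hazard rate of catastrophe; h_0 and k are hypothetical parameters chosen only to show the shape of each scenario):

P_{survive}(T) = \exp\left( -\int_0^T h(t)\, dt \right)

Decades (exponentially growing hazard): h(t) = h_0 e^{kt}, so P_{survive}(T) = \exp\left( -\frac{h_0}{k}\left(e^{kT} - 1\right) \right)

Centuries (roughly constant hazard): h(t) = h_0, so P_{survive}(T) = e^{-h_0 T}

Under the exponential assumption most of the cumulative risk is concentrated in the next few decades; under the constant assumption it is spread roughly evenly across centuries.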