Decades or centuries - timeframe of the risks of human extinction
Slide 1
Decades or centuries? Timeframe of the risks of human extinction
Alexey Turchin, Longevity party
Slide 2
Timeframe
Open question: when?
The timeframe ends once an x-risk has either happened or been prevented.
Slide 3
Two theories about the x-risk timeframe:
Decades (15-30 years)
Centuries (now to 500 years)
Slide 4
“Decades” scenario
X-risk: 2030-2050
The probability is rising exponentially
Chaotic and complex processes near the event horizon (the Technological Singularity)
AI is the main factor
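The exponentially rising probability can be made concrete with a toy hazard model (an illustrative sketch; the slides give no formula, and every parameter value below is hypothetical): if the annual extinction hazard doubles every few years, the cumulative risk over a few decades becomes substantial even when today's hazard is tiny.

```python
def cumulative_risk(h0, doubling_years, years):
    """Cumulative extinction probability when the annual hazard
    grows exponentially: h(t) = h0 * 2**(t / doubling_years).
    Toy model; h0 and doubling_years are hypothetical parameters."""
    survival = 1.0
    for t in range(years):
        h = min(1.0, h0 * 2 ** (t / doubling_years))  # cap hazard at certainty
        survival *= 1.0 - h
    return 1.0 - survival

# A hypothetical 0.1% annual hazard today, doubling every 5 years,
# accumulates to roughly a one-in-three risk over 30 years:
print(cumulative_risk(0.001, 5, 30))
```

The point of the sketch is only that a small present-day hazard with a short doubling time front-loads the risk into the coming decades, which is the intuition behind the "decades" scenario.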
Slide 5
Decades: 10-30 years
Timing of the creation of superhuman AI and other super-technologies: 2030 (Vinge), 2045 (Kurzweil)
Slide 6
Superhuman AI will either destroy humanity or prevent x-risks
The period of vulnerability to x-risks will end after the creation of superhuman AI
Slide 7
Arguments for the decades scenario
NBIC (Nano-Bio-Info-Cogno) convergence: everything appears simultaneously
Slide 8
Arguments for the decades scenario
Exponential growth of technologies drives exponential growth of x-risks:
Deadly viruses become cheaper
AI becomes simpler to build
Slide 9
Arguments for the decades scenario
Possible triggers in the near future:
World war
New arms race
Peak oil
Runaway global warming
A smaller catastrophe starts a bigger one
Slide 10
“Centuries” scenario
50-500 years from now
Rare events
Accidental
Mutually independent
Linear distribution of probability
Prevention by space settlement
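By contrast, the "linear distribution of probability" view corresponds to a constant, small annual hazard from independent accidental events (again a toy sketch, not from the slides, with hypothetical numbers): the cumulative risk 1 - (1 - h)^t grows almost linearly in t while the product h*t stays small.

```python
def constant_hazard_risk(h, years):
    """Cumulative extinction probability under a constant annual hazard h
    (independent rare events). Approximately h * years while that product
    is small. Toy model; h is a hypothetical parameter."""
    return 1.0 - (1.0 - h) ** years

# A hypothetical 0.05% annual hazard: near-linear growth at first,
# compare the exact value with the linear approximation h * t.
for t in (100, 300, 500):
    print(t, constant_hazard_risk(0.0005, t), 0.0005 * t)
```

Under this model there is no special urgency about the next few decades: the risk accumulates slowly and evenly, which is the intuition behind the "centuries" scenario.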
Slide 11
Arguments for the centuries scenario:
Most predictions about AI have been false
Most predictions about a near-future global catastrophe have been false
Slide 12
Arguments for the centuries scenario:
Exponential growth levels off
Moore’s law comes to a stop
Future growth is linear
Slide 13
Arguments for the centuries scenario:
X-risks:
Independent
Accidental
Unknown origin
No chain reaction
Slide 14
Public bias toward the centuries scenario:
Long-term predictions:
Seem more scientific
Have a lower chance of being proven false
Improve the predictor’s reputation
Help to prevent x-risks
John Leslie – 500 years (1996)
Nick Bostrom – 200 (2001)
Martin Rees – 100 (2003)
Slide 15
The decades scenario is worse:
Sooner
Less time to prepare
More complex
Military AI would be Unfriendly
In our lifetime
Slide 16
Conclusion
The timeframe remains an open question
It depends on whether future technologies develop exponentially or linearly
Different risks interact in complex and unpredictable ways near the Technological Singularity
It could happen as soon as within the next 15 years
Slide 17
We need to search for effective modes of action to prevent x-risks
Create social demand for preventing existential risks
Example: the fight against nuclear war in the 1980s
A political movement for x-risk prevention and life extension
Near-term risk is more motivating for action