| Tag | Ind1 | Ind2 | Content |
|---|---|---|---|
| 000 | | | 03368nam a2200349 i 4500 |
| 001 | | | 18843950 |
| 003 | | | OSt |
| 005 | | | 20260201115032.0 |
| 008 | | | 151104r20172014enka 001 0 eng d |
| 020 | | | _a9780198739838 _qpaperback |
| 040 | | | _aCDX _beng _cCDX _erda _dOCLCO _dBDX _dYDXCP _dEQO _dOCLCO _dOCLCF _dMIQ _dOCLCO _dBKL _dDLC _duoc |
| 082 | 0 | 4 | _a006.301 _223 _bNIC |
| 100 | 1 | | _aBostrom, Nick, _d1973- _eauthor. _94564 |
| 245 | 1 | 0 | _aSuperintelligence : _bpaths, dangers, strategies / _cNick Bostrom, Director, Future of Humanity Institute, Director, Strategic Artificial Intelligence Research Centre, Professor, Faculty of Philosophy & Oxford Martin School, University of Oxford. |
| 264 | | 1 | _aUnited Kingdom : _bOxford University Press, _c2017. |
| 264 | | 4 | _c©2014 |
| 300 | | | _axvi, 415 pages : _billustrations ; _c20 cm |
| 336 | | | _atext _btxt _2rdacontent |
| 337 | | | _aunmediated _bn _2rdamedia |
| 338 | | | _avolume _bnc _2rdacarrier |
| 504 | | | _aIncludes bibliographical references (pages 383-406) and index. |
| 505 | 0 | 0 | _g1. _tPast developments and present capabilities -- _g2. _tPaths to superintelligence -- _g3. _tForms of superintelligence -- _g4. _tThe kinetics of an intelligence explosion -- _g5. _tDecisive strategic advantage -- _g6. _tCognitive superpowers -- _g7. _tThe superintelligent will -- _g8. _tIs the default outcome doom? -- _g9. _tThe control problem -- _g10. _tOracles, genies, sovereigns, tools -- _g11. _tMultipolar scenarios -- _g12. _tAcquiring values -- _g13. _tChoosing the criteria for choosing -- _g14. _tThe strategic picture -- _g15. _tCrunch time. |
| 520 | | | _aThe human brain has some capabilities that the brains of other animals lack. It is to these distinctive capabilities that our species owes its dominant position. Other animals have stronger muscles or sharper claws, but we have cleverer brains. If machine brains one day come to surpass human brains in general intelligence, then this new superintelligence could become very powerful. As the fate of the gorillas now depends more on us humans than on the gorillas themselves, so the fate of our species then would come to depend on the actions of the machine superintelligence. But we have one advantage: we get to make the first move. Will it be possible to construct a seed AI or otherwise to engineer initial conditions so as to make an intelligence explosion survivable? How could one achieve a controlled detonation? To get closer to an answer to this question, we must make our way through a fascinating landscape of topics and considerations. Read the book and learn about oracles, genies, singletons; about boxing methods, tripwires, and mind crime; about humanity's cosmic endowment and differential technological development; indirect normativity, instrumental convergence, whole brain emulation and technology couplings; Malthusian economics and dystopian evolution; artificial intelligence, and biological cognitive enhancement, and collective intelligence. -- _cSource other than Library of Congress. |
| 650 | | 0 | _aArtificial intelligence _xPhilosophy. |
| 650 | | 0 | _aArtificial intelligence _xSocial aspects. |
| 650 | | 0 | _aArtificial intelligence _xMoral and ethical aspects. |
| 650 | | 0 | _aComputers and civilization. |
| 650 | | 0 | _aCognitive science. |
| 650 | | 7 | _aArtificial intelligence. _2fast |
| 650 | | 7 | _aComputers and civilization. _2fast _94565 |
| 942 | | | _2ddc _cBK |
| 999 | | | _c1802 _d1802 |
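The content column above uses flattened subfield notation: each `_x` introduces a one-character subfield code followed by its value (e.g. `_a9780198739838 _qpaperback` in field 020). As a minimal sketch, the helper below splits one of these strings into (code, value) pairs; it is an illustrative parser for the notation shown in this table only, not a full MARC 21 reader, and it assumes values themselves contain no underscores (true of every field in this record).

```python
def parse_subfields(field: str) -> list[tuple[str, str]]:
    """Split flattened MARC subfield text such as
    '_a9780198739838 _qpaperback' into (code, value) pairs.
    The first character after each '_' is the subfield code;
    the rest, up to the next '_', is the value."""
    pairs = []
    for chunk in field.split("_"):
        chunk = chunk.strip()
        if chunk:  # skip the empty piece before the leading '_'
            pairs.append((chunk[0], chunk[1:].strip()))
    return pairs

print(parse_subfields("_a9780198739838 _qpaperback"))
# [('a', '9780198739838'), ('q', 'paperback')]
```

For production use, a dedicated MARC library (such as pymarc) is the appropriate tool; this sketch only mirrors the table's textual layout.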