Download The Technological Singularity AudioBook Free
The idea that history is approaching a "singularity" - that ordinary humans will someday be overtaken by artificially intelligent machines or cognitively enhanced biological intelligence, or both - has moved from the realm of science fiction to serious debate. Some singularity theorists predict that if the field of artificial intelligence (AI) continues to develop at its current dizzying rate, the singularity could come about by the middle of the present century. Murray Shanahan offers an introduction to the idea of the singularity and considers the ramifications of such a potentially seismic event. Shanahan's aim is not to make predictions but rather to investigate a range of scenarios. Whether we believe that the singularity is near or far, likely or impossible, apocalypse or utopia, the idea raises crucial philosophical and pragmatic questions, forcing us to think seriously about what we want as a species. Shanahan describes technological advances in AI, both biologically inspired and engineered from scratch. Once human-level AI - theoretically possible, but difficult to accomplish - has been achieved, he explains, the transition to superintelligent AI could be very rapid. Shanahan considers what the existence of superintelligent machines could mean for such matters as personhood, responsibility, rights, and identity. Some superhuman AI agents might be created to benefit humankind; some might go rogue. (Is Siri the template, or HAL?) The singularity presents both an existential threat to humanity and an existential opportunity for humanity to transcend its limitations. Shanahan makes it clear that we need to imagine both possibilities if we want to bring about the better outcome.