The Golden Age


Though often seen as a dark possibility for our future, the singularity event does not necessarily mean the instant and complete extinction of the human race.  Movies such as The Matrix or Terminator show a dark future ruled by human-killing intelligent machines, but such visions of the future rest on several key assumptions (in addition to the feasibility of the singularity itself).


Visions of the future in Terminator

First of all, in such representations of the future, machines and humans are always put at odds with one another.  Competition over resources, territory, or power causes humans and machines to engage in an all-out "race" war.  However, this is based on the assumption that one side or the other has something to gain from such a war.  Such a war is often portrayed as humans trying to survive while the machines have some innate desire to destroy all of humanity.  However, I would argue that such a desire to destroy mankind would not be a priority for the machines.  Necessary resources for humans--such as food, water, and shelter--are not resources machines would require.  Machines really would have nothing to gain from destroying humanity other than pride at being the ruling power on the planet.  Though should we start building selfish pride into machine intelligence, we may be asking for it.

Secondly, a dark future ruled by machines as portrayed in such movies would require the machines to prioritize their own existence over that of a human.  An intelligent, learning machine would likely be integrated with some sort of value system, much like the one science fiction writer Isaac Asimov describes in the clip below:



Isaac Asimov's Three Laws of Robotics

Such a value system would prevent an intelligent machine from prioritizing its own existence over the existence of a human.  The only way, therefore, for an intelligent machine to start killing humans would be if the creators of this intelligence failed to properly implement this value system.

So, should the singularity not spawn a race of intelligent machines bent on the destruction of mankind, what could occur?  There are a variety of theories, ranging from humanity's "merging with [the machines] to become super-intelligent cyborgs, using computers to extend our intellectual abilities the same way that cars and planes extend our physical abilities" to using the intelligence to "treat the effects of old age and prolong our life spans indefinitely" to even "scan[ning] our consciousnesses into computers... [to] live inside them as software, forever, virtually" (Grossman).  The Singularity 2045 organization predicts that by the year 2045, the singularity will have occurred and will have made the world into a perfect utopia.

Because of the very nature of the idea of the singularity, however, it is impossible to predict what would occur should the singularity event come to pass.  Perhaps, though, rather than being the end of humanity, it will become the new Golden Age for humanity, pushing us forward toward greater achievement and prosperity.
