What is the technological singularity?
The technological singularity is the point in time when technology and artificial intelligence will outperform human intelligence. This could have far-reaching effects on human society and all aspects of civilization, and it could change the course of human history, because machines and computers can learn at a faster rate than humans. There are several ways by which science might achieve this breakthrough:
- Computers may be developed that are "awake" and superhumanly intelligent
- Large computer networks may "wake up" as a superhumanly intelligent entity
- Computer and human interfaces may become so intimate that users may reasonably be considered superhumanly intelligent
- Biological science may provide means to improve natural human intellect
The first three ways depend strictly on the progress of computer hardware, and over the last few decades that progress has followed a remarkably steady curve.
ASI: Artificial Superintelligence
History and Future of the technological singularity:
Predictions for when the singularity will arrive range from 2017 to 2112. This is the point at which computer-based intelligence will exceed what the human brain is capable of. Technology continues to grow exponentially, and it is because of this rapid growth that it is very difficult to predict how future human lives will be affected.
The term "singularity" was first used in this context by John von Neumann in the mid-1950s. He spoke of the ever-accelerating progress of technology approaching some essential singularity in the history of the race, beyond which human affairs as we know them could not continue. Later, the science fiction writer Vernor Vinge popularized the phrase "technological singularity." Vinge thought that artificial intelligence, human biological enhancement, or brain-computer interfaces could all be causes of the singularity, and he predicted that it would come some time before 2030. Another person credited with this concept is Ray Kurzweil, who predicts that the singularity will occur around 2045. Stuart Armstrong, drawing on a survey of expert predictions, placed it somewhere between 2017 and 2112. With the increasing power of computers and other technologies, it might become possible to build a computer that can out-think humanity. Alan Turing once said, "Once the machine thinking method has started, it would not take long to outstrip our feeble powers." At some point, then, we should expect the machines to take control. The arrival of the singularity will probably happen faster than any other technological revolution we have seen so far, and when it occurs, people will be in the post-human era.
Moore's Law helps to explain this growth rate, that is, how quickly computers improve. It holds that the number of transistors that fit on a chip doubles roughly every two years, and processing speed has historically grown at a similar pace. The law strictly describes transistor counts, but it is often extended to speed and other measures of computing power.
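As a rough illustration (a sketch, not something taken from the sources above), the doubling rule can be written as N(t) = N0 x 2^(t/2), where t is the number of years elapsed. The short Python snippet below projects a hypothetical starting transistor count forward under that assumption; the starting value and the years printed are illustrative only.

```python
# Minimal sketch of Moore's Law-style doubling.
# Assumes a two-year doubling period; the starting count is illustrative.

def projected_transistors(initial_count: float, years: float, doubling_period: float = 2.0) -> float:
    """Project a transistor count forward, doubling every `doubling_period` years."""
    return initial_count * 2 ** (years / doubling_period)

if __name__ == "__main__":
    start = 2_300  # on the order of an early-1970s microprocessor
    for elapsed in (10, 20, 30, 40):
        print(f"After {elapsed} years: about {projected_transistors(start, elapsed):,.0f} transistors")
```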
Consequences of this event:
Progress will become much more rapid when greater-than-human intelligence drives it. Vinge offers an analogy: "Animals can adapt to problems and make inventions, but often no faster than natural selection can do its work." Humans, by contrast, "have the ability to internalize the world and conduct 'what ifs' in our heads; we can solve many problems thousands of times faster than natural selection. Now, by creating the means to execute those simulations at much higher speeds, we are entering a regime as radically different from our human past as we humans are from the lower animals."
Several main consequences of the technological singularity have been suggested:
- Extinction
- Out of all these possible consequences, extinction is definitely one of the most feared.
- Mitochondria
- Some scientists fear that humans are gradually becoming mitochondria. Mitochondria were once independent organisms, but they let cells take over all of their functions until the only thing they could do was produce energy. Joan Slonczewski said, "We're becoming like mitochondria. We provide the energy."
- Slavery
- Once we have AIs, humans will stop being the smartest things on this planet. Some fear that if the AIs decide not to exterminate humans, they will enslave us instead.
- World War III
- A possible outcome of a clash between the human race and AIs is a third world war. Such a war could bring death and destruction on an unprecedented scale and with unprecedented sophistication and efficiency.
- Economic Collapse
- If our society becomes completely robotized, it will most likely lead to the overproduction of goods and services, and many fear that people will lose their jobs to robots.
- Big Brother AI
- This option is a milder version of the slavery option: humans are still under the control of the AI. The difference is that the AI is doing what is best for the people rather than enslaving them and doing what is best for itself.
- Alienation and Loss of Humanity
- In this scenario, people might survive the singularity by merging with the machines, an idea referred to as transhumanism. Merging humans with artificial intelligence would increase our abilities tremendously, but the major fear is that in doing so humans might lose the very essence of being human, which could lead to a loss of community.
- Environmental Catastrophe
- If humans live in a world where everything is mass-produced by robots, we would lose the last bit of respect for Mother Nature.
- Fear of Change
- Humans want to be comfortable, and the fear of change and of the unknown is anything but comfortable.
Bibliography:
http://mindstalk.net/vinge/vinge-sing.html
http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html
http://waitbutwhy.com/2015/01/artificial-intelligence-revolution-2.html
Vernor Vinge, "The Coming Technological Singularity," 1993