Is it fair to define intelligence as the ability to affect emotion?
In discussions of reaching and exceeding the Singularity, there is a fear that computers and machines will one day irreversibly match, if not exceed, the intelligence of humans. Some argue that the Singularity has already been reached, while others argue that it has not yet been reached. And still others argue that the Singularity will never be reached at all.
In the past, we have glimpsed the Singularity's potential to exist. For example, Eliza is often said to have passed the Turing test, conversing with people in such a way that some on the other end quickly forgot they were talking to a machine. However, it is evident that Eliza would not have passed the Turing test with every human who interacted with it. It fooled only those who felt an emotional connection or attachment to it, people who sought the sympathy and attention Eliza offered. Because Eliza influenced human emotions, it had the power to instill fear, and the power to be deemed intelligent.
Nevertheless, Eliza belongs to the past. Although intelligent to a certain extent, Eliza was not intelligent enough to outsmart humans overall. To do so, it would have needed to affect the entire human race, and to a far greater degree. If a machine or being like that is ever created, that is when the Singularity will be reached; this is the power of artificial intelligence.
As dangerous as the Singularity may be, it is also important to remember the value of advancing technology and of creating programs far more powerful than Eliza, much like biotechnological advances such as pedestrian cruise control and Internet-connected communication through brain waves.
The advance of technology is perpetual, and it is therefore important that humans try to reconcile themselves with the notion of the Singularity rather than push it away completely. There are always risks, dangers, and imperfections in society. However, there is also always a way to balance pros and cons: minimizing risks to potentially nothing while amplifying benefits into even greater advantages. Is it safe enough to trust this natural phenomenon?
If not, is it better to block technology's advance toward the Singularity? And if so, is it actually possible to avoid it? Perhaps. Indeed, if humans were desensitized, void of feeling and emotion, reaching the Singularity would be an impossibility.