Day 6 Reflections

In seminar yesterday we talked about the Singularity, which to me is…FREAKY! The Singularity is the hypothesis that at some point artificial intelligence will surpass human intelligence, and computers will be smarter than humans. To me, this is a horribly terrifying concept. Personally, I do not believe that the Singularity is near. Actually, I don’t believe that machine intelligence will ever surpass human intelligence, but more on that later. In his 2011 article “The Singularity Isn’t Near,” Paul Allen explains that the Singularity isn’t near because we will need not only the hardware to create such intelligence but also software that is smarter and more capable than the human brain. He argues that “Creating this kind of advanced software requires a prior scientific understanding of the foundations of human cognition, and we are just scraping the surface of this.” If we barely understand the human brain, how can we expect to write software that is smarter than it?

Back to my earlier statement that the Singularity will not happen. I know this is an unpopular opinion, and deep down it could just be wishful thinking, but for now I truly believe that the Singularity will never happen. EVENTUALLY (according to Moore’s Law, the exponential growth of our computing power, and many other hypotheses), I believe we will have the software and hardware necessary to create a machine that can surpass all human intelligence. But will humans allow that to happen? Personally, I do not think so. Humans thrive on being at the top, on being the “smartest” creatures on the planet. Why would we ever create something that would overpower us, that would sit above us on the intelligence chain? As inherently selfish as we are, and as much as we thrive on being on top, human beings will never create a machine that will outsmart them. They might create a machine that is very, very smart (perhaps even smarter than they are), but humans will still be able to control this machine somehow. We LOVE being in control, and I can’t see humanity losing that control to machines.

Yet, what if the Singularity happens?…

Just take a moment and think about it. Imagine computers dating, marrying, and reproducing new, smarter machines when they get old. Imagine humans being like pets to these machines. Can anything stop these super-intelligent machines from making smarter and smarter machines? Will there be a limit to the intelligence these machines have?

Personally, I get lost just thinking about it.

I have a pessimistic view on the Singularity. If humans lose control, it will be chaos. We will not know what these machines are capable of. If humans are “inherently bad”, what will machines be? To me, the Singularity is too absurd of a concept for it to be beneficial to us.

Overall, the Singularity was definitely the most interesting topic of discussion so far. We could talk about it for weeks on end, coming up with new arguments for why it will happen or why it won’t, why it will be beneficial or why it will ruin humanity. There is so much to talk about with the Singularity. I really hope I can study this topic further in my college career.

I hope you enjoyed Hollenberg’s Thoughts. More to come next week.

Brady is back and better than ever. Watch out!

1 Comment

  1. jlhenry

    October 23, 2016 @ 10:17 pm


    I totally agree about the Singularity being a freaky concept; however, I personally disagree about how near it may be. I think it is 100% feasible that the Singularity has already begun in some ways. While we clearly don’t have fully functioning robots that act as humans yet, there are many complex machines that pass the Turing test we talked about during our discussion. If we already have machines that can convince humans they are human, hasn’t part of the Singularity already been achieved? A scary thought, in my opinion.
