
Blog Post 6: The Singularity

This week, we spoke about the coming of the Singularity: a time in the future when artificial intelligence will become stronger than human intelligence, allowing machines to improve the power of technology at exponentially increasing rates and thereby rendering human intelligence obsolete. The Singularity requires machines to become more and more like humans, and such mechanical behavior is hard to fathom. But of even greater interest to me (at least for now) is what it is about human nature and intelligence that differs from artificial intelligence and behavior, and whether machines will ever be able to imitate and enjoy the things that make us human.

Personally, I think that machines will forever be a different kind of entity, unable to enjoy the meaning or experience of life in the same way that humans do. Humans, at the end of the day, are animals with an inner wildness that is never fully tamed, but must be sacrificed in part in order to enjoy the benefits of living in society. As Adam Gopnik writes in his essay about the children's book Babar (a connection to my Expos class), "the pleasures of civilization come with discontent at its constraints…there is allure in escaping from the constraints that button you up and hold you; there is also allure in the constraints and the buttons." This simultaneity, the allure of wildness and autonomy coupled with the allure of civilization, which is fueled by the power of loving relationships and happiness, is something artificial intelligence simply cannot understand. Indeed, AI is geared completely toward order and efficiency; it has no urge for wildness, no fascination with escaping from the order it both encounters and creates. Without such wildness, the sacrifice of living in society, in the midst of order, is meaningless. Nothing is truly sacrificed by the machine to exist in civilization, and having lost nothing, it cannot appreciate the meaning of the things that hold civilization together: loving relationships and happiness.

Moreover, artificial intelligence also lacks another essential human experience that is central to experiencing meaning in the world: struggle. Bruno Bettelheim writes: "Only by struggling courageously against what seem like overwhelming odds can man succeed in wringing meaning out of his existence." It is through struggle, not success, that humans are able to experience meaning in life. Computers, on the other hand, are completely focused on results. Artificial intelligence is interested only in outcomes. Can you imagine a computer finding delight in the struggle to complete a task? Of course not; computers simply load…and load…and load. The beauty of a machine is its ability to concentrate on a task and to achieve a success that a human could not. (Think of Alan Turing's computer in The Imitation Game.) We might take pleasure in the machine's process or in our own struggle to design and correct the machine, but the machine itself is interested merely in the result, a phenomenon that humans have found time and again to be quite useless as a source of meaning in life.

The Singularity, with all this in mind, might be frightening in an unexpected sense. We think of the Singularity as having arrived when the machines become like us; in reality, the Singularity may be signaled by a change in humanity, by a time when we become like the machines. Of course, we won’t ever achieve the power of computers once their intelligence has surpassed ours. But we may become robotic as our concerns grow ever closer to the concerns of machines, as we forget about the beauty of the process and of struggle when everything we might need is available at the press (or non-press) of a button.

3 Comments

  1. Nate Hollenberg

    October 18, 2016 @ 5:30 PM


    Noah,

    Your idea that humans embrace the beauty of the process and of struggle is a very interesting one. I have never really thought of it that way. But personally, I believe that computers will actually be able to act like humans and embrace this struggle. Given how quickly computer technology is advancing, I think it is inevitable that humans will eventually find a way to recreate the exact mind of a human in a machine. Do you believe that it is impossible for a machine to have that human quality? Another question I have is: will humans ever allow such a machine to be created? We value control so much, so will we ever create something that would make us relinquish that control?

    Overall, really interesting thoughts on such a complex topic. Definitely got me thinking differently.

  2. Mike Smith

    October 25, 2016 @ 8:34 PM


    I like your focus on an action that is more than just the action. We can construct a computer that adds two numbers, and it adds without effort. We can even construct a robot that walks, and while it may "struggle" to walk, this concept is materialized only in our own minds. It is our interpretation of what we see the robot doing. Watch the following to experience what I mean: https://www.youtube.com/watch?v=g0TaYhjpOfo

    When I say “materialized” I mean that nothing in these robots’ circuitry is reacting to its failure to walk. The researchers simply set the robot back on its feet and restart the program — maybe tweaking a few lines of code or hardwired initial values along the way.

    I suppose you could program a humanoid robot to try different combinations of movements until it mimicked a person walking (i.e., if it failed, then try again). This is more like a child learning to walk. But I imagine your point would be that the robot still wouldn't feel the struggle, even though its actions would look, to us, more like a struggle to walk.

    Why do we feel the struggle when we learn to walk? I assume we’ve all experienced a child’s frustration when he or she doesn’t do what he or she wants to do the first time it’s tried. Is it the idea that each of us should be able to do what others are able to do? Could you program that into the robot? Or is it because a struggle makes us tired? Could you program a sense of what energy something should take versus the energy actually expended to accomplish that action? Would we be getting closer to something that not only resembled a struggle, but was internally processed by the robot as something we might recognize as a feeling of a struggle?
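
    Purely as a thought experiment on those last questions, here is a minimal Python sketch of what such an internal signal might look like. Everything in it is a hypothetical assumption of mine (the gait parameters, the energy numbers, the names like attempt_step and learn_to_walk); it is not a description of any real robot:

    import random

    # A purely illustrative sketch (no real robot API): a trial-and-error "walker"
    # that retries with perturbed parameters when it fails, and keeps an internal
    # tally of how much harder each attempt was than expected. The tally is only
    # a number we might choose to read as "struggle".

    EXPECTED_ENERGY = 10.0  # assumed cost the task "should" take

    def attempt_step(gait):
        """Simulate one walking attempt; return (succeeded, energy_spent)."""
        quality = sum(gait) / len(gait)                   # crude score in [0, 1]
        succeeded = random.random() < quality             # better gaits succeed more often
        energy_spent = EXPECTED_ENERGY * (2.0 - quality)  # worse gaits waste more energy
        return succeeded, energy_spent

    def learn_to_walk(max_attempts=1000):
        gait = [random.random() for _ in range(4)]        # arbitrary 4-parameter gait
        frustration = 0.0                                 # accumulated "struggle" signal
        for attempt in range(1, max_attempts + 1):
            succeeded, energy = attempt_step(gait)
            # How much harder was this attempt than it "should" have been?
            frustration += max(0.0, energy - EXPECTED_ENERGY)
            if succeeded:
                return attempt, frustration
            # If it failed, try again with slightly different movements,
            # like the "try different combinations" idea above.
            gait = [min(1.0, max(0.0, g + random.uniform(-0.1, 0.1))) for g in gait]
        return None, frustration

    attempts, frustration = learn_to_walk()
    if attempts is None:
        print(f"never walked; accumulated 'struggle' signal: {frustration:.1f}")
    else:
        print(f"walked after {attempts} attempts; 'struggle' signal: {frustration:.1f}")

    Of course, this only sharpens the question above: the frustration variable is just a number the programmer decided to accumulate, and whether the robot thereby "feels" anything is exactly what remains in doubt.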

    I don’t know, but I appreciate the prompt to think along these lines!

  3. school of applied science

    November 10, 2016 @ 8:31 PM


    I am very happy that you cover the Singularity, a time in the future when artificial intelligence will become stronger than human intelligence; this is actually a very scary thing. When I watch movies in which artificial intelligence becomes more powerful than the human mind, the humans end up rendered useless and manipulated by the very intelligence that scientists created. Hopefully this will never happen.
