Soul of the Machine: Awakening the moral conscience of impersonal systems

Ever since Ultima IV showed us how computer games might embrace virtue, I’ve longed for similar titles with moral depth. Over a year ago, Kent Quirk awoke me to the power that computer games offer and why they are so important right now. At a local Games for Change meetup, Kent showed off Melting Point, a game about climate change. What impressed me about Melting Point was that Kent wasn’t proselytizing for a particular policy or worldview but rather hoping players would understand the interplay of complex systems (climate and economy) and make up their own minds about what, if anything, we should do about it.

This made me realize that computer games can merge two important features — player choice and systems-modeling — to achieve something even more powerful: nurturing morally aware systems-thinking. In other words, I began to see games as a tool to enable people to see that the complex systems around us — whether global trade or ocean ecosystems — have moral consequences, and that we aren’t just idle observers but actors both within and over those systems.

And it’s at this very moment in human history that we, as a species, must learn to see ourselves as moral agents within systems.

Never before has humanity had the power to destroy itself and the world as we know it, whether in clouds of radiation or of carbon dioxide. Never before has so much of humanity been at the mercy not of human tyrants and local lords but of machine code and faraway tribunals. The world, as Max Weber predicted, is becoming an iron cage of systems and bureaucracies beyond human ken.

It’s beyond our common understanding because Homo sapiens evolved to grasp not large, complex systems but small networks of people. As psychologists are steadily learning, scruples aren’t merely nice but actually hard-wired into our brains. Ask someone whether it’s right to push a big man in front of a runaway train to save the lives of five bystanders, and parts of our brains begin firing to tell us, “no.” But ask whether it’s OK to throw a switch that decides between the fate of a man on one track versus that of five on the other, and those same neurons stay quiet.

So our genetic code instructs us to treat our face-to-face relationships as potentially moral, but our innate moral sense may not extend into our systemic or mediated relationships. Bringing chicken soup to our sick neighbor strikes us as self-evidently virtuous, but shaping our nation’s health care policy — not so much, at least not until it begins affecting us personally. Viewing policy as a structure that embodies collective morality is learned, not instinctual.

Computer games offer at least two possible responses to our collective human predicament. First, they can open players’ eyes to the moral implications of systems by letting players experiment with those systems and witness the results. Games might offer moments of reflection and of epiphany, connecting personal morality with systemic awareness. A player might see how tweaking health care policies affects a family’s lives, or how environmental regulation could shape the destiny of a polar bear. Games might lead people to begin to see a soul within the machine.

And perhaps systems might begin to learn lessons from game design. Why must the computer systems that exercise more and more control over our daily lives be morally inert? If computer games — mere software — can lead players to weep, perhaps the mechanization of our world needn’t be soulless. If a global society demands that our interpersonal relations become abstracted into an iron cage of systems, can’t we re-envision such systems as a purposeful tool for realizing our collective moral vision?

Computer games won’t solve the problems that face humanity and our planet. But media, from cuneiform to newspapers to film, have always helped humanity reach new levels of moral self-realization and galvanize moral action. How fortuitous it may prove that computer games, with their unique capacity for choice and systems-modeling, should arise at this critical juncture of our evolution.

– Gene Koo