
Think About New Media Algorithmically

March 20th, 2014 by Christian

(or: How to Explain Yourself to a General Audience of Sophomores)

I recently gave a guest lecture to the University of Michigan sophomore special topics course “22 Ways to Think About New Media.”  This is a course intended for students who have not yet declared a major, where each week a faculty member from a different discipline describes a “way” that they think about “New Media.”  One goal of this is “a richer appreciation of the liberal arts and sciences,” and so I was asked to consider my remarks in the context of questions like: “What is the place of your work in society? What kinds of questions do you ask? How, in short, do you think?”

Wow, that’s a tall order. Explain and defend your field — communication and information studies — to people who have never encountered it before. Tell (for example) an undergraduate interested in chemistry why they should care about your work. And say something interesting about new media. Well, I’ll give it a shot. Here’s a summary of my attempt.

I decided that the way I want people to think about New Media is “algorithmically.” I meant that as a one-word shorthand for “I am interested in algorithms,” or “I think about new media algorithms and try to understand their implications,” and not “I am an algorithm.” (*)

A central question in the study of communication is this one: How do communication and information systems and institutions organize and shape what we know and think? That is, there is a great amount of material that could be watched, read, and heard but of course we each only have time to experience a small fraction of the whole. While we have some freedom to choose what we experience, there are also processes in media systems that shape what music, movies, news, and even conversations we pay attention to. This shaping ultimately helps to determine our shared culture, and new media are now transforming these processes — and therefore our shared culture.

(For instance, Twitter’s algorithms currently think I should pay attention to #NCAAMarchMadness2014 [which is trending]. They tell me this is a recommendation “just for me” [see below]. In fact I hate sports, so perhaps Twitter hates me.)

I used the example of trashy pop bands – a student suggested One Direction – to illustrate this. There may be a large number of musicians with enough skill to form a trashy pop band, but only a few trashy pop bands are successful at any given time. Musical talent is far more widely distributed than attention to specific bands. Even a casual music listener will agree that talent does not necessarily determine popularity. So what does?

The same is true of more serious topics—consider news. There is enough serious news to fill many newspapers but somehow it comes to be that we hear about certain topics over and over again, while other topics are ignored. How is it that the same events might get more coverage at one moment but less at another moment? It does not seem to be about the “quality” of the news story or the importance of the events, taken in isolation. At this point I employed Ethan Zuckerman’s comparison of attention to Kim Kardashian vs. famine.

[Figure: Google Trends comparison of search interest in “Kardashian” vs. “famine”]

Ultimately, this shaping and organization of communication and information determines who we are as a collective, as a public, as a society. That is the central problem I posed at the outset: how do communication and information systems and institutions shape our knowledge and attention?

This is a particularly interesting moment to consider the topic: while it is a perennial research problem in the study of communication (cf. Gatekeeping Theory, Agenda-Setting Theory, Framing, Priming, Cultivation Theory, Theories of the Public Sphere, etc.), the new prevalence of attention-sorting algorithms on the Internet is transforming the way that attention and knowledge are shaped. A useful phrase naming the overall phenomenon is “Algorithmic Culture,” coined by Alex Galloway.

Decades ago, decisions made by a few behind-the-scenes industry professionals like legendary music producer John Hammond would be instrumental in selecting and promoting specific media content (like the musical acts of Count Basie, Bob Dylan, and Aretha Franklin), and newspaper owners like Joseph Pulitzer decided what should be spread as news (such as color comic strips or crusading investigative reporting exposing government corruption).

They may or may not have done a good job, but it is interesting that today they do not wield power in the same way. Today on the Internet many decisions about media content and advertising are made by algorithms. An algorithm, or step-by-step procedure for accomplishing something, is typically a piece of computer software that uses some data about you to determine what you will watch, hear, or read. A simple algorithm might be “show the most recent thing any friend of mine has posted,” but most algorithms in use are much more complex.
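
To make that simple rule concrete for readers who have never seen an algorithm written down, here is a minimal sketch in Python. Everything in it (the posts, names, and timestamps) is invented for illustration, and a real news feed algorithm weighs far more signals than friendship and recency.

```python
from datetime import datetime

# Toy data: each post records who wrote it and when (all names and times invented).
posts = [
    {"author": "alice", "text": "New photo album", "time": datetime(2014, 3, 19, 9, 30)},
    {"author": "bob",   "text": "Great game last night!", "time": datetime(2014, 3, 20, 8, 15)},
    {"author": "carol", "text": "Reading about algorithms", "time": datetime(2014, 3, 20, 7, 50)},
]

my_friends = {"alice", "carol"}  # whose posts I am allowed to see

def simple_feed(posts, friends):
    """The simple rule from the text: show friends' posts, most recent first."""
    visible = [p for p in posts if p["author"] in friends]
    return sorted(visible, key=lambda p: p["time"], reverse=True)

for post in simple_feed(posts, my_friends):
    print(post["time"], post["author"], "-", post["text"])
```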

Algorithms sort both content and advertising. Older media industries often promoted content quite broadly, but now the resulting decisions may be individualized to you, meaning that no two people might see the same Web page. Although algorithms are written by people, they often have effects that are hard for any single person to anticipate.

To introduce this topic, I suggested two online readings that are intended to be accessible to a general audience. They both consider how new media are now re-shaping the selection of content online by focusing on the idea of the algorithm. I decided to forward these two from The Atlantic:

(1.) “The Algorithm Economy: Inside the Formulas of Facebook and Amazon,” by Derek Thompson, 12 March 2014, The Atlantic

This very short blog post introduces the idea that algorithms (meaning, repeatable step-by-step procedures for accomplishing something) now drive much of our experience with new media. It contrasts two major algorithms that most people are familiar with: (1) Amazon.com product recommendations (technically called item-to-item collaborative filtering) and (2) the Facebook news feed (called EdgeRank). A key point is that not all algorithms are equal: these two implementations of algorithmic sorting are quite different in their implications and effects, as the toy sketch below suggests.
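
To give a feel for how different these two approaches are, here is a toy sketch in Python. The purchase data, the function names, and the simplified EdgeRank-style score are all invented for illustration; Amazon’s and Facebook’s actual systems are proprietary and far more elaborate. The first function captures the item-to-item idea (“customers who bought X also bought Y”), while the second captures only the publicly described spirit of EdgeRank: a score built from affinity, edge weight, and time decay, which makes the ranking personal and perishable rather than shared and stable.

```python
from math import sqrt

# Toy purchase history: which customers bought which products (all names invented).
purchases = {
    "book_a": {"u1", "u2", "u3"},
    "book_b": {"u2", "u3", "u4"},
    "cd_c":   {"u1"},
}

def item_similarity(buyers_x, buyers_y):
    """Cosine similarity between two items, based on the customers they share."""
    shared = len(buyers_x & buyers_y)
    return shared / (sqrt(len(buyers_x)) * sqrt(len(buyers_y))) if shared else 0.0

def also_bought(item, purchases, top_n=2):
    """Item-to-item collaborative filtering: rank the other items by similarity to this one."""
    scores = {other: item_similarity(purchases[item], buyers)
              for other, buyers in purchases.items() if other != item}
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

def toy_edgerank_score(affinity, edge_weight, hours_old):
    """A toy stand-in for the EdgeRank idea (affinity x weight x time decay).
    The real formula is not public; this only shows that the score depends on
    who you are close to and how fresh the post is."""
    return affinity * edge_weight / (1.0 + hours_old)

print(also_bought("book_a", purchases))            # -> ['book_b', 'cd_c'] with this toy data
print(toy_edgerank_score(0.9, 1.5, hours_old=2.0)) # a fresh post from a close friend scores high
```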

(2.) “A Guide to the Digital Advertising Industry That’s Watching Your Every Click,” by Joseph Turow, 7 February 2012, The Atlantic

Most content on the Internet is available for free and supported by online advertising. This longer article is an excerpt from the introduction to Turow’s book The Daily You. It explains how the online advertising industry now operates and describes the way firms match customer data to online content and advertising. The article focuses on the data about audiences that must be gathered and analyzed in order to provide personalized advertising, then asks whether people know about this large-scale collection of data about them and how they feel about it. A toy sketch of this kind of matching follows below.
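
To illustrate the kind of matching Turow describes at the smallest possible scale, here is a toy sketch in Python. The profile, the interest segments, the ads, and the bids are all invented; real ad exchanges run auctions over far richer behavioral data, across thousands of advertisers, in milliseconds.

```python
# Toy profile and ad inventory (all segments, ads, and bids invented for illustration).
user_profile = {"age_group": "18-24", "interests": {"music", "travel"}, "location": "Ann Arbor"}

ads = [
    {"name": "concert_tickets", "target_interests": {"music"}, "bid": 0.80},
    {"name": "hiking_boots", "target_interests": {"travel", "outdoors"}, "bid": 0.65},
    {"name": "tax_software", "target_interests": {"finance"}, "bid": 1.20},
]

def select_ad(profile, ads):
    """Show the ad whose targeting best overlaps this profile, breaking ties by the bid."""
    eligible = [a for a in ads if a["target_interests"] & profile["interests"]]
    if not eligible:
        return None
    return max(eligible, key=lambda a: (len(a["target_interests"] & profile["interests"]), a["bid"]))

print(select_ad(user_profile, ads))  # -> the concert ad, given this invented profile
```

Even at this tiny scale you can see why no two visitors need to see the same page: change one entry in the profile and a different ad wins the slot.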

Optional extra: For a more in-depth treatment of the topic, see Tarleton Gillespie’s “The Relevance of Algorithms,” recently released in Media Technologies.

Okay, I’ll stop here for now. But in my next post, I’ll consider how to demonstrate the effects of algorithmic sorting in a simple and easy-to-understand way. Then I’ll tell you how the students reacted to all this.

(*) – Although some days I do feel like an algorithm.

