A Plea for the Obscure Parts of Obvious Systems

February 4th, 2010 by Christian

(or: Doing Better User Studies by Looking Away from the User)

This is a brief summary of my presentation at the iConference 2010.  I was invited to address the topic of “Ethnographies of Large-Scale Technical Systems.”  I did this along with a delightful panel of people: David Ribes, Steven Jackson, and Susan Leigh Star.  Here goes:

A Plea for the Obscure Parts of Obvious Systems

As an ethnographer, my plan of attack has often been to choose a somewhat obscure object, technical system, or site (recently, I’ve studied several wireless Internet systems).  It might be a large technical system but it’s usually obscure, and that used to make me feel safe.  Most people haven’t heard of the wireless systems I’ve written about.  I get to have my site all to myself.  If another social researcher suddenly started to talk about my object that would be horrible news–they might contradict me.  (Yikes!)

When writing up my work on obscure systems, I feel an obligation to thoroughly explain and contextualize them because they are almost certainly unfamiliar to my readers.  To do a good job, my writing probably needs to cover:

  • What is it?
  • Who uses it?
  • What do they do with it? Why?
  • Who made it?
  • Who maintains it?
  • Who pays for it?
  • How is it made? How does it work?
  • How does its design have consequences?
  • Why does all this matter?

When I read other ethnographies of obscure large technical systems, I find they have the same objectives.  For instance, on this very panel I sit with Steven Jackson who has written useful stuff about DWRSIM (which I had never heard of before I read his article), and Leigh Star who has written useful stuff about the Worm Community System (never heard of it before I read her article).  It’s the general theoretical insight I’m after and so it doesn’t matter so much what obscure system it is.

Lately I’ve started to think that it might also be useful to have ethnographies of large-scale technical systems that have millions of users and are far from obscure.  This is my sentiment both as a reader of ethnography and as an ethnographer.


I mean systems like Facebook, World of Warcraft, YouTube, GMail, AIM, Modern Warfare 2, and so on.  These systems are the everyday informational infrastructure for most people. They are most definitely studied in Information Schools, but when the system being studied is not obscure, for some reason we rarely find it necessary to do all of the contextualization work I listed in the bullets above.  If someone comes up to you in the hallway of an Information School and says, “I’m studying ____,” and the object in the blank is not obscure (say it’s Twitter), you can assume that they are doing a user study (who uses Twitter? what do people say on Twitter? …).

Some of my Best Friends are Users

Some of my best friends are users, as the saying goes. You are a user.  I love users. I am a user.  I love user studies.  People in Information Schools should do user studies.  But I worry that there is a big trade-off in framing research on what we might call “obvious” systems as being only (or usually) about users.  When someone says “I’m studying YouTube” and I read what they are writing, I often find that they focus mostly on two of my bullet points:

  • Who uses it?
  • What do they do with it? Why?

Hopefully they also spend some time on:

  • Why does all this matter?

The turn toward user studies (and somewhat away from design/producer studies) is generally OK with me, but I find it interesting that in a USER study of an obscure system a lot of contextualization work is still required, whereas with an obvious system you pretty much get a pass on that part.  (We all know what YouTube is, so why bother with all that?)

People who frame studies of large systems as obviously about users are making a trade-off between access to the site and the impact of the findings they’ll get.  Of course when we propose a study of “Facebook” we probably want to study users because these are the most accessible parts of the system.  I imagine it would be quite hard to talk your way into Facebook HQ to do a producer study, so researchers interested in the topic don’t even pause to consider it as an option.  They also usually don’t learn very much about how Facebook works or who owns it, who designed it, etc.

But I think this trade-off is being mismanaged.  Sure, access is much harder when you want to study powerful corporations who don’t particularly want your critical eyes on them.  But focusing only on the accessible part means that you have the tough job of analyzing the familiar and then selling your insights about data that everyone has.  This can be really hard.

I am no Erving Goffman

To pick a familiar name, Erving Goffman had a great talent for generating amazing insights out of the kind of everyday interactions that everyone experiences.  But I am no Erving Goffman. I’d rather unearth something you didn’t know and ought to know.  If you focus on the accessible parts of a system you’ll have access to the users, but then as an ethnographer you are forced into coming up with something new and insightful to say about an experience where your reader already feels expert.  These studies hail the reader by saying: Here is an article where I, a Facebook user, will tell you, a Facebook user, amazing new things about Facebook.   That’s hard!  So maybe they have no access problem, but they have a big findings problem. A user ethnography of Facebook can be done and done very well (remember — I like users), but we don’t want the literature to be focused only there.

If instead of a “pure” user study a researcher spends a little more time on context — digging up information that was not widely known — the resulting work has a much better chance of gaining a wide readership.  It could still be a user study but with a little more contextual research, or it could be immersion in another site of the system along with or instead of users. There is a small but interesting literature in anthropology about the ethnography of the powerful (sometimes called “studying up”).  People in Information Schools are usually very comfortable doing ethnography of the powerful, but only certain kinds of power are OK to study.  It is common to see social researchers studying computer scientists or natural scientists.  They’re powerful interlocutors, and this changes your ethnography (no one in those groups will be too impressed by your fancy sociology or information studies Ph.D.).

However, we are ready to ignore other kinds of power.  Popular commercial systems are designed, deployed, and managed by the rich, and this is a kind of power we don’t usually want to grapple with.  Access to these sites sounds difficult, and in fact it is difficult, but if you can get in there, oh the news you can bring back!

This approach is already out there.  It just isn’t common.  Some of my favorite ethnographies of technological systems are those where the researcher picked an obvious system and then went to the trouble to fully contextualize it (including at least a little bit about many things like users, producers, design, manufacture, maintenance, finances, public policy, and more). Two quick touchstones:  Grint and Woolgar’s The Machine at Work takes care to cover production and use of the Personal Computer.  Boczkowski’s writing about online journalism required him to talk his way into the newsroom of The New York Times.

Let me end by paraphrasing Eugene Webb, Donald Campbell, Richard Schwartz, and Lee Sechrest (from Unobtrusive Measures):  Science should opportunistically exploit all available points of observation.

So here’s to daring.   Here’s to accessing hard-to-reach and unfamiliar sites and populations.  Here’s to putting the hardest part of a research project at the beginning (access) and not at the end (findings).  Here’s to learning more about users by looking briefly away and studying the many other parts of a sociotechnical system that profoundly affect them.


In the live, in-person version of these remarks I referenced a pretty obscure book chapter by Lucy Suchman, Daniel Miller, and Don Slater.  It is titled “Anthropology” and it appeared in The Academy and the Internet edited by Monroe Price and Helen Nissenbaum (Peter Lang, 2004).  It says in part that focusing on the user/producer division in ethnographies is silly.


A truly fantastic audience member (who was he? FOUND: Ira Monarch)  pointed out in the Q&A that for some systems the producers are easier to access than the users, and this skews the ethnographic research in the opposite direction from the one I am talking about.  He mentioned military procurement, where access to developers and businesses working in the area was relatively easy compared to doing a user study on the people who use weapons in wars.  (I guess the term of art is “warfighters.”)


Looks like I could have been more upbeat when presenting this in person?  http://twitter.com/bblodgett/status/8637139868

4 Responses to “A Plea for the Obscure Parts of Obvious Systems”

  1. Dawn Nafus Says:

    Great piece Christian! A possible lead: anthropologists already working in those big corporations and designing these systems occasionally write about them. Though sometimes the exact systems can’t be named, you do get a sense of how ‘the user’ figures in the building of them. Melissa Cefkin just recently came out with a book along these lines called “Ethnography and the Corporate Encounter” which was very tech-heavy. It would, though, take a very particular kind of reading of this literature to get at what you are seeking.

  2. Kevin Hamilton Says:

    I say, if a presentation such as this constitutes being “grouchy,” I’ll take some more grouchy please.

  3. Alison Powell » The Social Media Echo Chamber Says:

    […] what I’m thinking about is along the lines of what Christian Sandvig is working on: these applications are now becoming infrastructures for participation. To understand […]


