Social shell games

If you listen to the latest Reality 2.0 podcast, Episode 49: Parler, Ownership, and Open Source, you’ll learn that I was blindsided at first by the topic of Parler, which has lately become a thing. But I caught up fast, even getting a Parler account not long after the show ended. Because I wanted to see what’s going on.

Though self-described as “the world’s town square,” Parler is actually a centralized social platform built for two purposes: 1) enabling completely free speech; and 2) creating and expanding echo chambers.

The second may not be what Parler’s founders intended (see here), but that’s how social media algorithms work. They group people around engagements, especially likes. (I think, for our purposes here, that algorithmically nudged engagement is a defining feature of social media platforms as we understand them today. That would exclude, for example, Wikipedia or a popular blog or newsletter with lots of commenters. It would include, say, Reddit and LinkedIn, because algorithms.)
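To make the “grouping around engagement” point concrete, here is a minimal sketch, in Python, of the kind of like-driven ranking such platforms are widely understood to use. Every name in it (record_like, score_post, build_feed, the affinity multiplier) is hypothetical; this illustrates the general mechanism, not any platform’s actual code.

```python
# Toy illustration of engagement-based ranking. Posts from authors a viewer has
# already liked get boosted, so the feed keeps serving more of what the viewer's
# past engagement resembles. All names and numbers here are hypothetical.

from collections import defaultdict

# user_likes[user] = set of authors whose posts this user has liked
user_likes = defaultdict(set)

def record_like(user, author):
    """Remember that `user` liked something by `author`."""
    user_likes[user].add(author)

def score_post(user, post):
    """Score a post higher when the viewer has engaged with its author before."""
    base = post["likes"]                                   # global popularity
    affinity = 2.0 if post["author"] in user_likes[user] else 1.0
    return base * affinity

def build_feed(user, posts, limit=10):
    """Return the top posts for `user`, sorted by the engagement score."""
    return sorted(posts, key=lambda p: score_post(user, p), reverse=True)[:limit]

# Tiny demo: after one like, Alice's feed tilts toward the author she already
# engaged with -- the seed of an echo chamber.
posts = [
    {"author": "carol", "text": "post A", "likes": 30},
    {"author": "dave",  "text": "post B", "likes": 50},
]
record_like("alice", "carol")
print([p["author"] for p in build_feed("alice", posts)])   # ['carol', 'dave']
```

The point of the sketch is just the feedback loop: every like feeds the affinity signal, and the affinity signal decides what gets liked next.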

Let’s start by recognizing that the smallest echo chamber in these virtual places is our own, made up of the people we follow and who follow us. Then note that our visibility into other virtual spaces is limited by what’s shown to us by algorithmic nudging, such as Twitter’s trending topics.

The main problem with this is not knowing what’s going on, especially inside other echo chambers. There are also lots of reasons for not finding out. For example, my Parler account sits idle because I don’t want Parler to associate me with any of the people it suggests I follow as soon as I show up:

I also don’t know what to make of this, which is the only other set of clues on the index page:

Especially since clicking on any of them brings up the same or similar top results, which seem to have nothing to do with the trending hashtag. Example:

Thus endeth my research.

But serious researchers should be able to see what’s going on inside the systems that produce these echo chambers, especially Facebook’s.

The problem is that Facebook and other social networks are shell games, designed to make sure nobody knows exactly what’s going on, yet everybody feels okay with it, because they’re hanging with others who agree on the basics.

The design principle at work here is obscurantism: “the practice of deliberately presenting information in an imprecise, abstruse manner designed to limit further inquiry and understanding.”

To put the matter in relief, consider a nuclear power plant:

(Photo of Kraftwerk Grafenrheinfeld, 2013, by Avda. Licensed CC BY-SA 3.0.)

Nothing here is a mystery. Or, if there is one, professional inspectors will be dispatched to solve it. In fact, the whole thing is designed from the start to be understandable, and its workings accountable to a dependent public.

Now look at a Facebook data center:

What it actually does is pure mystery, by design, to those outside the company. (And hell, to most, maybe all, of the people inside the company.) No inspector arriving to look at a rack of blinking lights in that place is going to know either. What Facebook looks like to you, to me, to anybody, is determined by a pile of discoveries, both on and off of Facebook’s site and app, about who you are and what, to its machines, you seem interested in, plus an algorithmic process that is not accountable to you and impossible for anyone, perhaps including Facebook itself, to fully explain.

All societies, and groups within societies, are echo chambers. And, because they cohere in isolated (and isolating) ways, it is sometimes hard for societies to understand each other, especially when they already hold prejudicial beliefs about each other. Still, without the further influence of social media, researchers can look at and understand what’s going on.

Over in the digital world, which overlaps with the physical one, we at least know that social media amplifies prejudices. But, obvious as that is by now, doing something to reduce or eliminate the production and amplification of prejudices is damn near impossible when the mechanisms behind it are obscured by design.

This is why I think these systems need to be turned inside out, so researchers can study them. I don’t know how to make that happen; but I do know there is nothing else so large and consequential in the world that is also closed to academic inquiry. And that ain’t right.

BTW, if Facebook, Twitter, Parler or other social networks actually are opening their algorithmic systems to academic researchers, let me know and I’ll edit this piece accordingly.


