Yochai Benkler


A couple days ago I responded to a posting on an email list. What I wrote struck a few chords, so I thought I’d repeat it here, with just a few edits, and then add a few additional thoughts as well. Here goes.

Reading _____’s references to ancient electrical power science brings to mind my own technical background, most of which is now also antique. Yet that background still informs my understanding of the world, and my curiosities about What’s Going On Now, and What We Can Do Next. In fact I suspect that it is because I know so much about old technology that I am bullish about framing What We Can Do Next on both solid modern science and maximal liberation from technically obsolete legal and technical frameworks — even though I struggle as hard as the next geek to escape those.

(Autobiographical digression begins here. If you’re not into geeky stuff, skip.)

As a kid growing up in the 1950s and early ’60s I was obsessed with electricity and radio. I studied electronics and RF transmission and reception, was a ham radio operator, and put an inordinate amount of time into studying how antennas worked and electromagnetic waves propagated. From my home in New Jersey’s blue collar suburbs, I would ride my bike down to visit the transmitters of New York AM stations in the stinky tidewaters flanking the Turnpike, Routes 46 and 17, Paterson Plank Road and the Belleville Pike. (Nobody called them “Meadowlands” until many acres of them were paved in the ’70s to support a sports complex by that name.) I loved hanging with the old guys who manned those transmitters, and who were glad to take me out on the gangways to show how readings were made, how phasing worked (sinusoidal synchronization again), how a night transmitter had to address a dummy load before somebody manually switched from day to night power levels and directional arrays. After I learned to drive, my idea of a fun trip was to visit FM and TV transmitters on the tops of buildings and mountains. (Hell, I still do that.) Thus I came to understand skywaves and groundwaves, soil and salt water conductivity, ground systems, directional arrays and the inverse square law, all in the context of practical applications that required no shortage of engineering vernacular and black art.

I also obsessed on the reception end. In spite of living within sight of nearly every New York AM transmitter (WABC’s tower was so close that we could hear its audio in our kitchen toaster), I logged more than 800 AM stations on my 40s-vintage Hammarlund HQ-129x receiver, which is still in storage at my sister’s place. That’s about 8 stations per channel. I came to understand how two-hop skywave reflection off the E layer of the ionosphere favored flat land or open water midway between transmission and reception points. This, I figured, is why I got KSL from Salt Lake City so well, but WOAI from San Antonio hardly at all. (Both were “clear channel” stations in the literal sense — nothing else in North America was on their channels at night, when the ionosphere becomes reflective of signals on the AM band.) The midpoint for the latter lay within the topographical corrugations of the southern Appalachians. Many years later I found this theory supported by listening in Hawaii to AM stations from Western North America, on an ordinary car radio. I’m still not sure why I found those skywave signals fading and distorting (from multiple reflections in the very uneven ionosphere) far less than those over land. I am sure, however, that most of this hardly matters at all to current RF and digital communication science. After I moved to North Carolina, I used Sporadic E reflections to log more than 1200 FM stations, mostly from 800 to 1200 miles away, plus nearly every channel 3 and 6 (locally, 2, 4 and 5 were occupied) in that same range. All those TV signals are now off the air. (Low-band VHF TV — channels 2 to 6 — is not used for digital signals in the U.S.) My knowledge of this old stuff is now mostly of nostalgia value; but seeking it has left me with a continuing curiosity about the physical world and our infrastructural additions to it. This is why much of what looks like photography is actually research. For example, this and this. What you’re looking at there are pictures taken in service to geology and archaeology.
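For those still reading the geeky part: the midpoint reasoning is easy to check. On a two-hop path the ground bounce falls near the great-circle midpoint between transmitter and receiver, and computing that point takes only a few lines. Here is a rough sketch in Python, with approximate coordinates and deliberately simplified geometry.

```python
# Rough sketch: where the ground bounce of a two-hop skywave path lands.
# Coordinates are approximate and the geometry is simplified; this is
# illustration, not propagation engineering.
from math import radians, degrees, sin, cos, atan2, sqrt

def midpoint(lat1, lon1, lat2, lon2):
    """Great-circle midpoint of two points given in decimal degrees."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    bx = cos(lat2) * cos(lon2 - lon1)
    by = cos(lat2) * sin(lon2 - lon1)
    lat_m = atan2(sin(lat1) + sin(lat2),
                  sqrt((cos(lat1) + bx) ** 2 + by ** 2))
    lon_m = lon1 + atan2(by, cos(lat1) + bx)
    return degrees(lat_m), degrees(lon_m)

nj = (40.9, -74.1)     # listening post in northern New Jersey
ksl = (40.8, -111.9)   # KSL, Salt Lake City
woai = (29.4, -98.5)   # WOAI, San Antonio

print(midpoint(*nj, *ksl))    # roughly central Iowa: flat farmland
print(midpoint(*nj, *woai))   # roughly the southern Appalachian highlands
```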

(End of autobiographical digression.)

Speaking of which, I am also busy lately studying the history of copyright, royalties and the music business — mostly so ProjectVRM can avoid banging into any of those. This research amounts to legal and regulatory archaeology. Three preliminary findings stand out, and I would like to share them.

First, regulatory capture is real, and nearly impossible to escape. The best you can do is keep it from spreading. Most regulations protect last week from yesterday, and are driven by the last century’s leading industries. Little if any regulatory lawmaking by established industries, especially if they feel their revenue bases threatened, clears room for future development. Rather, it prevents future development, even for the threatened parties who might need it most. Thus the bulk of conversation and debate, even among the most progressive and original participants, takes place within the bounds of still-captive markets. This is why it is nearly impossible to talk about Net-supportive infrastructure development without employing the conceptual scaffolding of telecom and cablecom. We can rationalize this, for example, by saying that demand for telephone and cable (or satellite TV) services is real and persists, but the deeper and more important fact is that it is very difficult for any of us to exit the framing of those businesses and still make sense.

Second, infrastructure is plastic. The term “infrastructure” suggests physicality of the sturdiest kind, but in fact all of it is doomed to alteration, obsolescence and replacement. Some of it (Roman roads, for example) may last for centuries, but most of it is obsolete in a matter of decades, if not sooner. Consider over-the-air (OTA) TV. It is already a fossil. Numbered channels persist as station brands; but today very few of those stations transmit on their branded analog channels, and most of them are viewed over cable or satellite connections anyway. There are no reasons other than legacy regulatory ones to maintain the fiction that TV station locality is a matter of transmitter siting and signal range. Viewing of OTA TV signals is headed fast toward zero. It doesn’t help that digital signals play hard-to-get, and that the gear required for getting them sucks rocks. Nor does it help that cable and satellite providers have gone out of their way to exclude OTA receiving circuitry from their latest gear, and mostly force subscribing to channels that used to be free. As a result, ABC, NBC, CBS, Fox and PBS are now a premium pay-TV package. (For an example of how screwed this is, see here.) Among the biggest fossils are thousands of TV towers, some more than 2000 feet high, maintained to continue reifying the concept of “coverage,” and to legitimize “must carry” rules for cable. After live audio streaming on mobile devices becomes cheap and easy, watch AM and FM radio transmission fossilize in exactly the same ways. (By the way, if you want to do something green and good for the environment, lobby for taking down some of these towers, which are expensive to maintain and hazards to anything that flies. Start with this list here. Note the “UHF/VHF transmission” column. Nearly all these towers were built for analog transmission and many are already abandoned. This one, for example.)

Third, “infrastructure” is a relatively new term, vaguely understood outside arcane uses within various industries. It drifted from military to everyday use in the 1970s, and is still not a field in itself. Try looking for an authoritative reference book on the general subject of infrastructure. There isn’t one. Yet digital technology requires that we challenge the physical anchoring of infrastructure as a concept. Are bits infrastructural? How about the means for arranging and moving them? The Internet (the most widespread means for moving bits) is defined fundamentally by its suite of protocols, not by the physical media over which data travels, even though there are capacity and performance dependencies on the latter. Again, we are in captured territory here. Only in conceptual jails can we sensibly debate whether something is an “information service” or a “telecommunications service”. And yet most of us who care about the Internet and infrastructure do exactly that.

That last one is big. Maybe too big. I’ve written often about how hard it is to frame our understanding of the Net. Now I’m beginning to think we should admit that the Internet itself, as a concept, is too limiting, and not much less antique than telecom or “power grid”.

“The Internet” is not a thing. It’s a finger pointing in the direction of a thing that isn’t. It is the name we give to the sense of place we get when we go “on” a mesh of unseen connections to interact with other entities. Even the term “cloud”, labeling a utility data service, betrays the vagueness of our regard toward The Net.

I’ve been on the phone a lot lately with Erik Cecil, a veteran telecom attorney who has been thinking out loud about how networks are something other than the physical paths we reduce them to. He regards network mostly in its verb form: as what we do with our freedom — to enhance our intelligence, our wealth, our productivity, and the rest of what we do as contributors to civilization. To network we need technologies that enable what we do in maximal ways.  This, he says, requires that we re-think all our public utilities — energy, water, communications, transportation, military/security and law, to name a few — within the context of networking as something we do rather than something we have. (Think also of Jonathan Zittrain’s elevation of generativity as a supportive quality of open technology and standards. As verbs here, network and generate might not be too far apart.)

The social production side of this is well covered in Yochai Benkler‘s The Wealth of Networks, but the full challenge of what Erik talks about is to re-think all infrastructure outside all old boxes, including the one we call The Internet.

As we do that, it is essential that we look to employ the innovative capacities of businesses old and new. This is a hat tip in the general direction of ISPs, and to the concerns often expressed by Richard Bennett and Brett Glass: that new Internet regulation may already be antique and unnecessary, and that small ISPs (a WISP in Brett’s case) should be the best connections of high-minded thinkers like yours truly (and others named above) to the real world where rubber meets road.

There is a bigger picture here. We can’t have only some of us painting it.


I dunno why the New York Times appeared on my doorstep this morning, along with our usual Boston Globe (Sox lost, plus other news) — while our Wall Street Journal did not. (Was it a promo? There was no response envelope or anything. And none of the neighbors gets a paper at all, so it wasn’t a stray, I’m pretty sure.) Anyway, while I was paging through the Times over breakfast, I was thinking, “It’s good, but I’m not missing much here–” when I hit “Hot Story to Has-Been: Tracking News via Cyberspace,” by Patricia Cohen, on the front page of the Arts section. It’s about MediaCloud, a Berkman Center project, and features quotage from Ethan Zuckerman and Yochai Benkler.

[Photo: Ethan Zuckerman and Yochai Benkler]

(pictured above at last year’s Berkman@10).

The home page of MediaCloud explains,

The Internet is fundamentally altering the way that news is produced and distributed, but there are few comprehensive approaches to understanding the nature of these changes. Media Cloud automatically builds an archive of news stories and blog posts from the web, applies language processing, and gives you ways to analyze and visualize the data.

This is a cool thing. It also raises the same question that is asked far too often in other contexts: Why doesn’t Google do that? Here’s the short answer: Because the money’s not there. For Google, the money is in advertising.

Plain enough, but let’s go deeper.

It’s an interesting fact that Google’s index covers the present, but not the past. When somebody updates their home page, Google doesn’t remember the old one, except in its cache, which gets wiped out after a while. It doesn’t remember the one before that, or the one before that. If it did, it might look, at least conceptually, like Apple’s Time Machine:

[Image: Apple’s Time Machine]

If Google were a time machine, you could not only see what happened in the past, but do research against it. You could search for what’s changed. Not on Google’s terms, as you can, say, with Google Trends, but on your own, with an infinite variety of queries.
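To make that concrete, here is a toy sketch of what research against archived pages could look like: keep dated snapshots, then ask what changed between any two dates. Nothing here reflects how Google actually works; the class, URL and sample text are all made up for illustration.

```python
# Toy page-history store: save dated snapshots, then "search for what's
# changed" by diffing any two dates. Purely illustrative; not how Google
# (or anyone else) actually does it.
import difflib
from datetime import date

class PageHistory:
    def __init__(self):
        self.snapshots = {}    # {(url, date): page text}

    def save(self, url, day, text):
        self.snapshots[(url, day)] = text

    def diff(self, url, then, now):
        old = self.snapshots[(url, then)].splitlines()
        new = self.snapshots[(url, now)].splitlines()
        return difflib.unified_diff(old, new, fromfile=str(then),
                                    tofile=str(now), lineterm="")

history = PageHistory()
history.save("example.com", date(2009, 5, 1),
             "Top story: budget vote\nWeather: rain")
history.save("example.com", date(2009, 6, 1),
             "Top story: school election\nWeather: sun")

for line in history.diff("example.com", date(2009, 5, 1), date(2009, 6, 1)):
    print(line)
```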

I don’t know if Google archives everything. I suspect not. I think they archive search and traffic histories (or they wouldn’t be able to do stuff like this), and other metadata. (Maybe a Googler can fill us in here.)

I do know that Technorati keeps (or used to keep) an archive of all blogs (or everything with an RSS feed). This was made possible by the nature of blogging, which is part of the Live Web. It comes time-stamped, and with the assumption that past posts will accumulate in a self-archiving way. Every blog has a virtual directory path that goes domainname/year/month/day/post. Stuff on the Static Web of sites (a real estate term) was self-replacing and didn’t keep archives on the Web. Not by design, anyway.
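That directory convention is what makes the Live Web self-archiving: the date rides along in the permalink itself. A small sketch, using a made-up URL, shows how trivially a crawler can recover the timestamp.

```python
# Pull the publication date out of a /year/month/day/post permalink.
# The URL below is hypothetical; the pattern is the common blog convention.
import re
from datetime import date

PERMALINK = re.compile(r"https?://[^/]+/(\d{4})/(\d{2})/(\d{2})/[^/]+")

def post_date(url):
    """Return the date encoded in a blog permalink, or None."""
    m = PERMALINK.match(url)
    if not m:
        return None
    year, month, day = (int(g) for g in m.groups())
    return date(year, month, day)

print(post_date("http://example.com/2009/06/15/some-post"))  # 2009-06-15
```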

I used to be on the Technorati advisory board and talked with the company quite a bit about what to do with those archives. I thought there should be money to be found through making them searchable in some way, but I never got anywhere with that.

If there isn’t an advertising play, or a traffic-attraction play (same thing in most cases), what’s the point? So goes the common thinking about site monetization. And Google is in the middle of that.

So this got me to thinking about research vs. advertising.

If research wants to look back through time (and usually it does), it needs data from the past. That means the past has to be kept as a source. This is what MediaCloud does. For research on news topics, it does one of the many things I had hoped Technorati would do.

Advertising cares only about the future. It wants you to buy something, or to know about something so you can act on it at some future time.

So, while research’s time scope tends to start in the present and look back, advertising’s time scope tends to start in the present and look forward.

To be fair, I commend Google for all the stuff it does that is not advertising-related or -supported, and it’s plenty. And I commend Technorati for keeping archives, just in case some business model does finally show up.

But in the meantime I’m also wondering if advertising doesn’t have some influence on our sense of how much the past matters. And my preliminary response is, Yes, it does. It’s an accessory to forgetfulness. (Except, of course, to the degree it drives us to remember — through “branding” and other techniques — the name of a company or product.)

Just something to think about. And maybe research as well. If you can find the data.
