Business


Wow: Regis McKenna‘s Wikipedia entry is one short paragraph. Geoffrey Moore‘s is barely more than a stub. We’re talking here about two of the greatest marketing minds in human history. I’m not joking. Amazing.

Neither has a picture, either. I just checked my own 31,000-shot gallery, and didn’t find either one. I did find the great Phil Moore, however. Like I said at that link, one of my heroes.

Blog search is mighty thin in Wikipedia. Technorati’s entry is stale. IceRocket and BlogPulse are stubs. BlogScope is minimal.

It’s really weird. While “real time” is heating up as a topic, real time search seems to have fallen off the radar of everybody other than itself.

Take this piece by Marshall Kirkpatrick in ReadWriteWeb. It begins, “Web search, real-time search and social search. That’s a pretty compelling combination and it’s what both Google and Facebook put on the table today in a head-to-head competition.” Then it compares Google, Facebook and Bing at all three, in a chart.

Hey, why not the search engines that have been looking at real time for the duration? Here’s IceRocket on real time search as a string. You get blogs, Twitter, video, news and images. Fast, simple, uncomplicated, straightforward. Like a search engine ought to be.

Here’s the IceRocket trend line for “real time search”. And here’s the BlogScope trend line for “blogging”.

Earth to buzz: You’re obsessing on the wrong thing. “Real time search” isn’t just Twitter and Facebook. It’s blog search too. Always was.

Syndication and real time will matter long after “social” goes passé. (And “social” will matter long after the next buzzthing goes passé.)

For whatever reasons, Google and Bing don’t get it. There are better tools out there for Live Web search. Check ’em out.

Bonus graph.

The older I get, the earlier it seems.

So many gone things once looked like final stages: AM radio, nuclear bombs, FM, stereo, FM stereo, TV, color TV, quadraphonic sound, answering machines, PCs, online services, bulletin boards, home PBXes, newsgroups, instant messaging, cell phones, HD, browsing, pirate radio, free wi-fi, friending, tweeting.

Yeah, some of those aren’t gone yet, but don’t count on their staying around. Not in their current forms.

Three conditions have been profoundly increased by technology during my brief (62.2 year) lifetime: connectivity, autonomy and abundance. Those have been provided respectively by the Net, personal computing, and data processing and storage. I can now connect with anybody or anything pretty much anywhere I go, as an autonomous actor rather than a captive dependent on some company’s silo or walled garden. I can also access, accumulate and put to use many kinds of information of relevance to myself and my world.

Some creepy dependencies are still involved, such as the ones I have with ISPs and phone companies. But I believe even those will become substitutable services in the long run, much as the best “cloud” services are also becoming substitutable utilities.

I haven’t said that all this is a Good Thing. In fact I’m not sure it is. Meaning I’m not sure it has been good for us, or our world, that we have drifted so far from the hunting and gathering animals we were when we diasporized out of Africa during the last Ice Age. Perhaps we have adapted well without evolving at all. Think about it.

We are, if nothing else (and yes, we are much else) a pestilence on the planet. Few creatures other than rats and microbes are more widespread, or have done more to eat and alter the Earth’s contents and its living dependents. Sure, I’m enjoying it too. But at some point the party ends. When it does, what do we go home to?

Anyway, this all comes to mind while reading Nick Carr‘s The eternal conference call. His bottom lines are killer:

  The flaw of synchronous communication has been repackaged as the boon of realtime communication. Asynchrony, once our friend, is now our enemy. The transaction costs of interpersonal communication have fallen below zero: It costs more to leave the stream than to stay in it. The approaching Wave promises us the best of both worlds: the realtime immediacy of the phone call with the easy broadcasting capacity of email. Which is also, as we’ll no doubt come to discover, the worst of both worlds. Welcome to the conference call that never ends. Welcome to Wave hell.

It’s the latest among Nick’s Realtime Chronicles. As always, strong stuff.

The original was born during a writing project David Sifry and I were doing. Late at night David pinged me and said “Look at this,” and I was amazed. It was the first search engine for what we then called The Live Web (and now call Real Time). Basically, it was a search engine that just paid attention to RSS, which back then consisted mostly of blogs. (I welcome corrections from David, or anybody, on that. It’s been a while.) When David made Technorati a company, he put me on its advisory board, and for a while I had some influence on where it went and what it did. It was also, for many subjects, my primary search engine. If I wanted to follow conversation about a subject, Technorati was where I went first. I also liked the way it allowed me to look at a topic’s trending over the last few weeks or months. Technorati was also a technical pioneer, introducing tag search, along with new standards and practices around tagging in general. After Google Blogsearch came along, I used both, but Technorati was usually my first choice. I especially liked s.technorati.com, which gave the same results through a plain no-bullshit search UI.

Over the years, however, Technorati came to value popularity and buzz more than the kind of stuff I was looking for. Some of the same functionality was there, but it was buried deeper and deeper. For example, feeds of searches. If I wanted to subscribe to feeds of, say, a search for Nokia N900, I could click on something that said (or meant) “get a feed for this search.” Google Blogsearch had the same feature, and made it easy. Still does, giving me a choice of Blog Alerts, Atom and RSS, under a heading that says “Subscribe”. Twitter search, similarly, has “feed for this query”.

Without being able to find that feed easily, I lost interest in Technorati, only going there when I couldn’t find the results I wanted elsewhere. By that time David and most of the other people I knew at Technorati had moved on, so I didn’t have much interest in volunteering advice.

But I learned this morning (via Twitter, naturally) that Technorati had gone through an overhaul. It’s certainly faster and less cluttered. But I still can’t find feeds for searches. Trending seems to be gone, or hidden where I can’t find it. And I have no idea how to do tag searches with it. Maybe that’s because, as CEO Richard Jalichandra explains here, “We’re eliminating many of Technorati.com‘s annoyances and some features, especially ones people didn’t use enough to justify the cost. Instead, we’re focusing on delivering the value people really want from us: instead of boiling the ocean to make coffee, we’re aiming to deliver the non-fat soy latte you asked for.”

Well, that “you” isn’t me. Which is cool. Technorati has become less a search company and more a media company. They launched Technorati Media at the same time. It’s a way to buy and sell ads. I wish them well with it. (Hey, TechCrunch likes it.)

Meanwhile I’ll stick with Google Blogsearch for my live Web searching.

Wonder what the rest of y’all think.

Dependence begets subservience and venality, suffocates the germ of virtue, and prepares fit tools for the designs of ambition. — Thomas Jefferson


Near the start of his Institutional Corruption talk the other day, Larry Lessig sourced the quote above, from Thomas Jefferson. Larry was making a point: that the Framers were interested in personal independence, and not just that of a former colony. The Framers operated, however, in advance of the Industrial Revolution, which was won by Industry and lost by the rest of us — or at least by some of the roles we play in the marketplace.

Such as our roles as customers. While being customers gives us choices among products and services, many of the companies behind those products and services make us dependent on them, in ways we would not prefer if we had a choice. For a measure of how little choice we have, ask yourself how many times you’ve clicked “accept” to “Terms of Service” that typically give all advantages to the seller. Or look at the number of cookies stored in your browser.

Well, the tide is turning. We’re finally starting to see a few tools that give users control over how data is collected and used. We’re working on some of those in the VRM community. And they’re a subject of discussion at VRooM Boston 2009, at 9:30am on Tuesday at Harvard Law School, starting with the panel shown in the title graphic. You can register here. Even if you show up only for the panel, it’ll help us know how many will be there.

There’s lots more about it at Civilizing the Personal Data Frontier, over at the ProjectVRM blog. Hope to see you there.


Craig Burton in Open Letter to Steve Ballmer:

  Well F*&% me. Dude, after all of these years, you are still micro managing the Windows release!
  Now I know why Microsoft has been relegated to insignificance in the identity market.
  The reason is simple. Internal policy, managed by you, prohibits product managers from keeping up with trends and innovation.
  Let me repeat, if the Federated Identity Group made the required changes to the CardSpace selector today, it will be two years–maybe longer–before it makes it to the market.
  The bottleneck to this problem–and I suspect a slew of others–is you.
  As your friend and long-time competitor/advisor on these issues, I urge you to rethink how this works. Because it isn’t working.

Craig has such a gentle way of being blunt. My fave line from Craig, addressed to a lame consulting client we shared many years ago: Put down the customer. Step away from the marketplace. I believe that’s what Craig is saying Microsoft is doing here, even if they don’t mean to.

What are we to make of Sidewiki? Is it, as Phil Windley says, a way to build the purpose-centric Web? Or is it, as Mike Arrington suggests, the latest way to “deface” websites?

The arguments here were foreshadowed in the architecture of the Web itself, the essence of which has been lost to history — or at least to search engines.

Look up Wikipedia+Web on Google and you won’t find Wikipedia’s World Wide Web entry on the first page of search results. Nor in the first ten pages. The top current result is for Web browser. Next is Web 2.0. Except for Wikipedia itself, none of the other results on the first page point to a Wikipedia page or one about the Web itself.

This illustrates how far we’ve grown away from the Web’s roots as a “hypertext project”. In WorldWideWeb: Proposal for a HyperText Project, dated 12 November 1990, Tim Berners-Lee and Robert Cailliau wrote,

Hypertext is a way to link and access information of various kinds as a web of nodes in which the user can browse at will. Potentially, Hypertext provides a single user-interface to many large classes of stored information such as reports, notes, data-bases, computer documentation and on-line systems help…

…There is a potential large benefit from the integration of a variety of systems in a way which allows a user to follow links pointing from one piece of information to another one. This forming of a web of information nodes rather than a hierarchical tree or an ordered list is the basic concept behind Hypertext…

Here we give a short presentation of hypertext.

A program which provides access to the hypertext world we call a browser. When starting a hypertext browser on your workstation, you will first be presented with a hypertext page which is personal to you: your personal notes, if you like. A hypertext page has pieces of text which refer to other texts. Such references are highlighted and can be selected with a mouse (on dumb terminals, they would appear in a numbered list and selection would be done by entering a number)…

The texts are linked together in a way that one can go from one concept to another to find the information one wants. The network of links is called a web. The web need not be hierarchical, and therefore it is not necessary to “climb up a tree” all the way again before you can go down to a different but related subject. The web is also not complete, since it is hard to imagine that all the possible links would be put in by authors. Yet a small number of links is usually sufficient for getting from anywhere to anywhere else in a small number of hops.

The texts are known as nodes. The process of proceeding from node to node is called navigation. Nodes do not need to be on the same machine: links may point across machine boundaries. Having a world wide web implies some solutions must be found for problems such as different access protocols and different node content formats. These issues are addressed by our proposal.

Nodes can in principle also contain non-text information such as diagrams, pictures, sound, animation etc. The term hypermedia is simply the expansion of the hypertext idea to these other media. Where facilities already exist, we aim to allow graphics interchange, but in this project, we concentrate on the universal readership for text, rather than on graphics.

Thus was outlined, right at the start, a conflict of interests and perspectives. On one side, the writer of texts and other creators of media goods. On the other side, readers and viewers, browsing. Linking the two is hypertext.

Note that, for Tim and Robert, both hypertext and the browser are user interfaces. Both authors and readers are users. As a writer I include hypertext links. As a reader with a browser I can follow them — but do much more. And it’s in that “more” category that Sidewiki lives.

As a writer, Sidewiki kinda creeps me out. As Dave Winer tweeted to @Windley, What if I don’t want it on my site? Phil tweeted back, but it’s not “on” your site. It’s “about” your site & “on” the browser. No?

Yes, but the browser is a lot bigger than it used to be. It’s turning into something of an OS. The lines between the territories of writer and reader, between creator and user, are also getting blurry. Tools for users are growing in power and abundance. So are those for creators, but I’m not sure the latter are keeping up with the former — at least not in respect to what can be done with the creators’ work. All due respect for Lessig, Free Culture and remixing, I want the first sources of my words and images to remain as I created them. Remix all you want. Just don’t do it inside my pants.

I’ll grant to Phil and Google that a Google sidebar is outside the scope of my control, and is not in fact inside my pants. But I do feel encroached upon. Maybe when I see Sidewiki in action I won’t; but for now as a writer I feel a need to make clear where my stuff ends and the rest of the world’s begins. When you’re at my site, my domain, my location on the Web, you’re in my house. My guest, as it were. I have a place here where we can talk, and where you can talk amongst yourselves as well. It’s the comments section below. If you want to talk about me, or the stuff that I write, do it somewhere else.

This is where I would like to add “Not in my sidebar.” Except, as Phil points out, it’s not my sidebar. It’s Google’s. That means it’s not yours, either. You’re in Google-ville in that sidebar. The sidewiki is theirs, not yours.

In Claiming My Right to a Purpose-Centric Web: SideWiki, Phil writes,

I’m an advocate of the techniques Google is using and more. I believe that people will get more from the Web when client-side tools that manipulate Web sites to the individual’s purpose are widely and freely available. A purpose-centric Web requires client-side management of Web sites. SideWiki is a mild example of this.

He adds,

The reaction that “I own this site and you’re defacing it” is rooted in the location metaphor of the Web. Purpose-centric activities don’t do away with the idea that Web sites are things that people and organizations own and control. But it’s silly to think of Web sites the same way we do land. I’m not trespassing when I use HTTP to GET the content of a Web page and I’m not defacing that content when I modify it—in my own browser—to more closely fit my purpose.

Plus a kind of credo:

I claim the right to mash-up, remix, annotate, augment, and otherwise modify Web content for my purposes in my browser using any tool I choose and I extend to everyone else that same privilege.

All of which I agree with—provided there are conventions on the creators’ side that give them means for clarifying their original authorship, and maintaining control over that which is undeniably theirs, whether or not it be called a “domain”.

For example, early in the history of the Web, in the place where publishing, browsing and searching began to meet, a convention was developed by which authors of sites could exclude their pages from search results. The convention is now generally known as the Robots Exclusion Standard, and began with robots.txt. In simple terms, it was (and remains) a way to opt out of appearance in search results.
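For anyone who has never peeked at one, a robots.txt file is about as plain as a convention gets: a little text file at the root of a site, read by crawlers that choose to honor it. A minimal sketch (the paths and the bot name here are just placeholders):

  # robots.txt: sits at the root of a site, e.g. example.com/robots.txt
  # Ask all crawlers to stay out of one directory...
  User-agent: *
  Disallow: /drafts/

  # ...and ask one particular crawler (a made-up name) to stay out entirely.
  User-agent: ExampleBot
  Disallow: /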

Is there something robots.txt-like that we could create that would reduce the sense of encroachment that writers feel as Google’s toolbar presses down from the top, and Sidewiki presses in from the left? (And who-knows-what from Google — or anybody — presses in from the right?)

I don’t know.
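Still, it’s easy to imagine what such a convention might look like. Purely as a thought experiment (no such standard exists, and every name below is made up), something robots.txt-like for annotation layers could be as simple as:

  # annotations.txt: hypothetical, and honored by nobody today
  # Asks annotation layers such as Sidewiki to keep their commentary off some pages
  Annotation-agent: *
  No-annotate: /essays/
  Annotate-ok: /

Whether Google or anyone else would choose to honor that kind of request is, of course, a separate question.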

I do know that we need more and better tools in the hands of users — tools that give them independence both from authors like me and intermediaries like Google. That independence can take the form of open protocols (such as SMTP and IMAP, which allow users to do email with or without help from anybody), and it can take the form of substitutable tools and services such as browsers and browser enhancements. Nobody’s forcing anybody to use Google, Mozilla, any of their products or services, or any of the stuff anybody adds to either. This is a Good Thing.

But we’re not at the End of Time here, either. There is much left to be built out, especially on the user’s side. This is the territory where VRM (Vendor Relationship Management) lives. It’s about “equipping customers to be independent leaders and not just captive followers in their relationships with vendors and other parties on the supply side of the marketplace”.

I know Phil and friends are building VRM tools at his new company, Kynetx. I’ll be keynoting Kynetx’ first conference as well, which is on 18-19 November. (Register here.) Meanwhile there is much more to talk about in the whole area of individual autonomy and control — and work already underway in many areas, from music to public media to health care — which is why we’ll have VRooM Boston 2009 on 12-13 October at Harvard Law School. (Register here.)

Lots to talk about. Now, more places to do that as well.


[Later…] Lots of excellent comments below. I especially like Chris Berendes’. Pull quote: I better take the lead in remixing “in my pants”, lest Google do it for me. Not fair, but then the advent of the talkies was horribly unfair to Rudolph Valentino, among other silent film stars.


I like sports, and I enjoy sports talk radio. That’s one reason I have five car radio buttons set on stations carrying games or sports talk: four on AM (WRKO/680, WEEI/850, WAMG/890, WZZN/1510) and one on FM (WBZ-FM/98.5). The other is that sports talk is about 50% advertising, so I like to punch around.

But I wasn’t surprised to read ESPN Radio’s Boston affiliate set to sign off, by Chad Finn in the Boston Globe. It begins, “ESPN Radio’s Boston affiliate, WAMG-AM 890, will go off the air Monday after four years plagued by a weak signal and limited local programming.” In fact, “weak” doesn’t cover it. By day WAMG’s 25,000-watt signal covers the Boston metro pretty well. But at night the station drops to 6,000 watts and a pattern that excludes the whole north side of the metro. The map at that last link doesn’t show how much like a headlight that pattern really is.

Yet that’s not the worst of it. WAMG was able to “drop in” to the market from nowhere in 2005, thanks to a change in the FCC rules that had protected what were once called (literally) “clear channel” stations. Because signals on the AM band bounce off the ionosphere at night, powerful ones can be heard up to thousands of miles away. Since there were then only 107 channels (every 10 kHz from 540 to 1600 kHz), a handful were granted “clear channel” status, making them the only stations on those channels at night. Thanks to this rule, I could hear KFI/640 from Los Angeles in New Jersey and WBZ/1030 from Boston in Palo Alto. Here’s the whole list of “clears” as they stood when their status still held.

Since long-distance listening had mostly gone away by the late 1970s, the FCC in 1980 reduced protection for the old “clears” to 750 miles from their transmitters. WLS/890 in Chicago was one of those clears. So you might say that WAMG appeared through a new loophole. Problem was, WLS had not gone away. It often still reached Boston quite well at night, pounding WAMG’s already-weak signal.

This last week I was down in the southern part of Cape Cod, where WAMG puts no signal at all. As a result I could hear WLS quite well on a portable radio, along with other Chicago giants.

The Globe story suggests that WAMG will probably go dark. Given the coverage realities, that might not be the worst thing.

A thought. WAMG is licensed to Dedham, not Boston. It might not be the worst thing for Clear Channel (the name of the company that owns WAMG and a zillion other stations) to sell the license to somebody in the Dedham community, who could cut the power back (to save electricity) and just try to serve the local community itself. Provided, of course, that local radio of the AM sort (which has changed little since the 1920s) still makes sense.

[Later…] Following up on 10 October 2009, WAMG has been off the air for several weeks.

The airline for which I am a 1K (>100,000 miles per year) flyer, and which I fly so close to exclusively that I’m almost too familiar with their methods, has in the last year added a number of opt-out inconveniences to its booking and check-in systems. Here is one for bonus miles that shows up both online and on-screen when going through the “Easy Check-In” process at the airport. Now the passenger has to look carefully at the small print before saying no to something he or she doesn’t want.

Worse, one can’t opt out of this stuff just once. One has to do it every time.

When I ask people behind the counter how they feel about it, they always say they hate it. It’s one more thing to straighten out with customers who meant to say “no,” but hit “accept” by mistake. Which is, at least partly, the idea.

A couple days ago I responded to a posting on an email list. What I wrote struck a few chords, so I thought I’d repeat it here, with just a few edits, and then add a few additional thoughts as well. Here goes.

Reading _____’s references to ancient electrical power science brings to mind my own technical background, most of which is now also antique. Yet that background still informs my understanding of the world, and my curiosities about What’s Going On Now, and What We Can Do Next. In fact I suspect that it is because I know so much about old technology that I am bullish about framing What We Can Do Next on both solid modern science and maximal liberation from technically obsolete legal and technical frameworks — even though I struggle as hard as the next geek to escape those.

(Autobiographical digression begins here. If you’re not into geeky stuff, skip.)

As a kid growing up in the 1950s and early ’60s I was obsessed with electricity and radio. I studied electronics and RF transmission and reception, was a ham radio operator, and put an inordinate amount of time into studying how antennas worked and electromagnetic waves propagated. From my home in New Jersey’s blue collar suburbs, I would ride my bike down to visit the transmitters of New York AM stations in the stinky tidewaters flanking the Turnpike, Routes 46 and 17, Paterson Plank Road and the Belleville Pike. (Nobody called them “Meadowlands” until many acres of them were paved in the ’70s to support a sports complex by that name.) I loved hanging with the old guys who manned those transmitters, and who were glad to take me out on the gangways to show how readings were made, how phasing worked (sinusoidal synchronization again), how a night transmitter had to address a dummy load before somebody manually switched from day to night power levels and directional arrays. After I learned to drive, my idea of a fun trip was to visit FM and TV transmitters on the tops of buildings and mountains. (Hell, I still do that.) Thus I came to understand skywaves and groundwaves, soil and salt water conductivity, ground systems, directional arrays and the inverse square law, all in the context of practical applications that required no shortage of engineering vernacular and black art.

I also obsessed on the reception end. In spite of living within sight of nearly every New York AM transmitter (WABC’s tower was so close that we could hear its audio in our kitchen toaster), I logged more than 800 AM stations on my 40s-vintage Hammarlund HQ-129x receiver, which is still in storage at my sister’s place. That’s about 8 stations per channel. I came to understand how two-hop skywave reflection off the E layer of the ionosphere favored flat land or open water midway between transmission and reception points. This, I figured, is why I got KSL from Salt Lake City so well, but WOAI from San Antonio hardly at all. (Both were “clear channel” stations in the literal sense — nothing else in North America was on their channels at night, when the ionosphere becomes reflective of signals on the AM band.) Midpoint for the latter lay within the topographical corrugations of the southern Appalachians. Many years later I found this theory supported by listening in Hawaii to AM stations from Western North America, on an ordinary car radio. I’m still not sure why I found those skywave signals fading and distorting (from multiple reflections in the very uneven ionosphere) far less than those over land. I am sure, however, that most of this hardly matters at all to current RF and digital communication science. After I moved to North Carolina, I used Sporadic E reflections to log more than 1200 FM stations, mostly from 800 to 1200 miles away, plus nearly every Channel 3 and 6 (locally, 2, 4 and 5 were occupied) in that same range. All those TV signals are now off the air. (Low-band VHF TV — channels 2 to 6 — is not used for digital signals in the U.S.) My knowledge of this old stuff is now mostly of nostalgia value; but seeking it has left me with a continuing curiosity about the physical world and our infrastructural additions to it. This is why much of what looks like photography is actually research. For example, this and this. What you’re looking at there are pictures taken in service to geology and archaeology.

(End of autobiographical digression.)

Speaking of which, I am also busy lately studying the history of copyright, royalties and the music business — mostly so ProjectVRM can avoid banging into any of those. This research amounts to legal and regulatory archaeology. Three preliminary findings stand out, and I would like to share them.

First, regulatory capture is real, and nearly impossible to escape. The best you can do is keep it from spreading. Most regulations protect last week from yesterday, and are driven by the last century’s leading industries. Little if any regulatory lawmaking by established industries — especially ones that feel their revenue bases threatened — clears room for future development. Rather, it prevents future development, even for the threatened parties who might need it most. Thus the bulk of conversation and debate, even among the most progressive and original participants, takes place within the bounds of still-captive markets. This is why it is nearly impossible to talk about Net-supportive infrastructure development without employing the conceptual scaffolding of telecom and cablecom. We can rationalize this, for example, by saying that demand for telephone and cable (or satellite TV) services is real and persists, but the deeper and more important fact is that it is very difficult for any of us to exit the framing of those businesses and still make sense.

Second, infrastructure is plastic. The term “infrastructure” suggests physicality of the sturdiest kind, but in fact all of it is doomed to alteration, obsolescence and replacement. Some of it (Roman roads, for example) may last for centuries, but most of it is obsolete in a matter of decades, if not sooner. Consider over-the-air (OTA) TV. It is already a fossil. Numbered channels persist as station brands; but today very few of those stations transmit on their branded analog channels, and most of them are viewed over cable or satellite connections anyway. There are no reasons other than legacy regulatory ones to maintain the fiction that TV station locality is a matter of transmitter siting and signal range. Viewing of OTA TV signals is headed fast toward zero. It doesn’t help that digital signals play hard-to-get, and that the gear required for getting them sucks rocks. Nor does it help that cable and satellite providers have gone out of their way to exclude OTA receiving circuitry from their latest gear, mostly forcing subscription to channels that used to be free. As a result ABC, NBC, CBS, Fox and PBS are now a premium pay TV package. (For an example of how screwed this is, see here.) Among the biggest fossils are thousands of TV towers, some more than 2000 feet high, maintained to continue reifying the concept of “coverage,” and to legitimize “must carry” rules for cable. After playing live audio streams on mobile devices becomes cheap and easy, watch AM and FM radio transmission fossilize in exactly the same ways. (By the way, if you want to do something green and good for the environment, lobby for taking down some of these towers, which are expensive to maintain and hazards to anything that flies. Start with this list here. Note the “UHF/VHF transmission” column. Nearly all these towers were built for analog transmission and many are already abandoned. This one, for example.)

Third, “infrastructure” is a relatively new term and vaguely understood outside arcane uses within various industries. It drifted from military to everyday use in the 1970s, and is still not a field in itself. Try looking for an authoritative reference book on the general subject of infrastructure. There isn’t one. Yet digital technology requires that we challenge the physical anchoring of infrastructure as a concept. Are bits infrastructural? How about the means for arranging and moving them? The Internet (the most widespread means for moving bits) is defined fundamentally by its suite of protocols, not by the physical media over which data travels, even though there are capacity and performance dependencies on the latter. Again, we are in captured territory here. Only in conceptual jails can we sensibly debate whether something is an “information service” or a “telecommunication service”. And yet most of us who care about the Internet and infrastructure do exactly that.

That last one is big. Maybe too big. I’ve written often about how hard it is to frame our understanding of the Net. Now I’m beginning to think we should admit that the Internet itself, as concept, is too limiting, and not much less antique than telecom or “power grid”.

“The Internet” is not a thing. It’s a finger pointing in the direction of a thing that isn’t. It is the name we give to the sense of place we get when we go “on” a mesh of unseen connections to interact with other entities. Even the term “cloud“, labeling a utility data service, betrays the vagueness of our regard toward The Net.

I’ve been on the phone a lot lately with Erik Cecil, a veteran telecom attorney who has been thinking out loud about how networks are something other than the physical paths we reduce them to. He regards network mostly in its verb form: as what we do with our freedom — to enhance our intelligence, our wealth, our productivity, and the rest of what we do as contributors to civilization. To network we need technologies that enable what we do in maximal ways.  This, he says, requires that we re-think all our public utilities — energy, water, communications, transportation, military/security and law, to name a few — within the context of networking as something we do rather than something we have. (Think also of Jonathan Zittrain’s elevation of generativity as a supportive quality of open technology and standards. As verbs here, network and generate might not be too far apart.)

The social production side of this is well covered in Yochai Benkler‘s The Wealth of Networks, but the full challenge of what Erik talks about is to re-think all infrastructure outside all old boxes, including the one we call The Internet.

As we do that, it is essential that we look to employ the innovative capacities of businesses old and new. This is a hat tip in the general direction of ISPs, and to the concerns often expressed by Richard Bennett and Brett Glass: that new Internet regulation may already be antique and unnecessary, and that small ISPs (a WISP in Brett’s case) should be the best connections of high-minded thinkers like yours truly (and others named above) to the real world where rubber meets road.

There is a bigger picture here. We can’t have only some of us painting it.

