infrastructure


Rochester, Vermont

My favorite town in Vermont is Rochester. I like to stop there going both ways while driving my kid to summer camp, which means I do that up to four times per summer. It’s one of those postcard-perfect places, rich in history, gracing a lush valley along the White River, deep in the Green Mountains, with a park and a bandstand, pretty white churches and charm to the brim.

My last stop there was on August 20, when I shot the picture above in the front yard of Sandy’s Books & Bakery, after having lunch in the Rochester Cafe across the street. Not shown are the 200+ cyclists (motor and pedal) who had just come through town on the Last Mile Ride to raise funds for the Gifford Medical Center’s end-of-life care.

After Hurricane Irene came through, one might have wondered if Rochester itself might need the Center’s services. Rochester was one of more than a dozen Vermont towns that were isolated when all their main roads were washed out. This series of photos from The Republican tells just part of the story. The town’s website is devoted entirely to The Situation. Here’s a copy-and-paste of its main text:

Relief For Rochester

Among the town’s losses was a large section of Woodlawn Cemetery, much of which was carved away when a gentle brook turned into a hydraulic mine. Reports Mark Davis of Valley News,

Rochester also suffered a different kind of nightmare. A gentle downtown brook swelled into a torrent and ripped through Woodlawn Cemetery, unearthing about 25 caskets and strewing their remains throughout downtown.

Many of the graves were about 30 years old, and none of the burials was recent. Yesterday, those remains were still outside, covered by blue tarps.

Scattered bones on both sides of Route 100 were marked by small red flags.

“We can’t do anything for these poor people except pick it up,” said Randolph resident Tom Harty, a former state trooper and funeral home director who is leading the effort to recover the remains.

It was more than 48 hours before officials in Rochester — which was cut off from surrounding towns until Tuesday — could turn their attention to the problem: For a time, an open casket lay in the middle of Route 100, the town’s main thoroughfare, the remains plainly visible.

I found that article, like so much else about Vermont, on VPR News, one of Vermont Public Radio’s many services. When the going gets tough, the tough use radio. During and after natural disasters, radio is the go-to medium. And no radio service covers or serves Vermont better than VPR. The network has five full-size stations covering most of the state, with gaps filled in by five more low-power translators. (VPR also has six classical stations, with their own six translators.) When I drive around the state it’s the single radio source I can get pretty much everywhere. I doubt any other station or network comes close. Ground conductivity in Vermont is extremely low, so AM waves don’t go far, and there aren’t any big stations in Vermont on AM anyway. And no FM station is bigger, or has as many signals, as VPR.

One big reason VPR does so much, so well, is that it serves its customers, which are its listeners. That’s Marketing 101, but it’s also unique to noncommercial radio in the U.S. Commercial radio’s customers are its advertisers.

VPR’s services only begin with what it does on the air. Reporting is boffo too. Here’s VPR’s report on Rochester last Thursday, in several audio forms, as well as by transcription on that Web page. They use the Web exceptionally well, including a thick stream of tweets at @vprnet.

I don’t doubt there are many other media doing great jobs in Vermont. And at the local level I’m sure some stations, papers and online media do as good a job as VPR does state-wide.

But VPR is the one I follow elsewhere as well as in Vermont, and what I want to do is make sure it gets the high five it deserves. If you have others (or corrections to the above), tell me in the comments below.

Some additional links:

@ChunkaMui just put up a great post in Forbes: Motorola + Sprint = Google’s AT&T, Verizon and Comcast Killer.

Easy to imagine. Now that Google has “gone hardware” and “gone vertical” with the Motorola deal, why not do the same in the mobile operator space? It makes sense.

According to Chunka, this new deal, and the apps on it,

…would destroy the fiction that internet, cellular and cable TV are separate, overlapping industries. In reality, they are now all just applications riding on top of the same platform. It is just that innovation has been slowed because two slices of those applications, phone and TV, are controlled by aging oligopolies.

AT&T and Verizon survive on the fiction that mobile text and voice are not just another form of data, and customers are charged separately (and exorbitantly) for them. They are also constraining mobile data bandwidth and usage, both to charge more and to manage the demand that their aging networks cannot handle.

Comcast, Time Warner Cable and other cable operators still profit from the fact that consumers have to purchase an entire programming package in order to get a few particular slices of content. This stems from the time when cable companies had a distribution oligopoly, and used that advantageous position to require expensive programming bundles. Computers, phones and tablets, of course, are now just alternative TV screens, and the Internet is an alternative distribution mechanism. It is just a matter of time before competitors unbundle content, and offer movies, sports, news and other forms of video entertainment to consumers.

The limiting factor to change has not been the technology but obsolete business models and the lack of competition.

Before Apple and Google came in, the mobile phone business was evolving at a geological pace. I remember sitting in a room, many years back, with Nokia honchos and a bunch of Internet entrepreneurs who had just vetted a bunch of out-there ideas. One of the top Nokia guys threw a wet blanket over the whole meeting when he explained that he knew exactly what new features would be rolled out on new phones going forward two and three years out, and that these had been worked out carefully between Nokia and its “partners” in the mobile operator business. It was like getting briefed on agreements between the Medici Bank and the Vatican in 1450.

Apple blasted through that old market like a volcano, building a big, vertical, open (just enough to invite half a billion apps) market silo that (together with app developers) completely re-defined what a smartphone — and any other handheld device — can do.

But Apple’s space was still a silo, and that was a problem Google wanted to solve as well. So Google went horizontal with Android, making it possible for any hardware maker to build anything on a whole new (mostly) open mobile operating system. As Cory Doctorow put it in this Guardian piece, Android could fail better, and in more ways, than Apple’s iOS.

But the result for Google was the same problem that Linux had with mobile before Android came along: the market plethorized. There were too many different Android hardware targets. While Android still attracted many developers, it also made them address many differences between phones by Samsung, Motorola, HTC and so on. As Henry Blodget put it here,

Android’s biggest weakness thus far has been its fragmentation: The combination of many different versions, plus many different customizations by different hardware providers, has rendered it a common platform in name only. To gain the full power of “ubiquity”–the strategy that Microsoft used to clobber Apple and everyone else in the PC era–Google needs to unify Android. And perhaps owning a hardware company is the only way to do that.

That’s in response to the question, “Is this an acknowledgment that, in smartphones, Apple’s integrated hardware-software solution is superior to the PC model of a common software platform crossing all hardware providers?” Even if it’s not (and I don’t think it is), Google is now in the integrated hardware-software mobile device business. And we can be sure that de-plethorizing Android is what Larry Page means when he talks about “supercharging” the Android ecosystem.

So let’s say the scenario that Chunka describes actually plays out — and then some. For example, what if Google buys, builds or rents fat pipes out to Sprint cell sites, and either buys or builds its way into the content delivery network (CDN) business, competing with Akamai, Limelight and Level3 while also supplying them? Suddenly what used to be TV finishes moving “over the top” of cable and onto the Net. And that’s just one of many possible huge effects.

What room will be left for WISPs, which may be the last fully independent players out there?

I don’t know the answers. I do know that just the thought of Google buying Sprint will fire up the lawyers and lobbyists for AT&T, Comcast and Verizon.

 

The official statement from Google says,

Google Inc. (NASDAQ:GOOG – News) and Motorola Mobility Holdings, Inc. (NYSE:MMI – News) today announced that they have entered into a definitive agreement under which Google will acquire Motorola Mobility for $40.00 per share in cash, or a total of about $12.5 billion, a premium of 63% to the closing price of Motorola Mobility shares on Friday, August 12, 2011. The transaction was unanimously approved by the boards of directors of both companies.

The acquisition of Motorola Mobility, a dedicated Android partner, will enable Google to supercharge the Android ecosystem and will enhance competition in mobile computing. Motorola Mobility will remain a licensee of Android and Android will remain open. Google will run Motorola Mobility as a separate business.

Meanwhile, over in the Google Blog, Larry Page explains,

Since its launch in November 2007, Android has not only dramatically increased consumer choice but also improved the entire mobile experience for users. Today, more than 150 million Android devices have been activated worldwide—with over 550,000 devices now lit up every day—through a network of about 39 manufacturers and 231 carriers in 123 countries. Given Android’s phenomenal success, we are always looking for new ways to supercharge the Android ecosystem. That is why I am so excited today to announce that we have agreed to acquire Motorola.

Motorola has a history of over 80 years of innovation in communications technology and products, and in the development of intellectual property, which have helped drive the remarkable revolution in mobile computing we are all enjoying today. Its many industry milestones include the introduction of the world’s first portable cell phone nearly 30 years ago, and the StarTAC—the smallest and lightest phone on earth at time of launch. In 2007, Motorola was a founding member of the Open Handset Alliance that worked to make Android the first truly open and comprehensive platform for mobile devices. I have loved my Motorola phones from the StarTAC era up to the current DROIDs.

The bold-faces are mine.

First, note how Larry says Google is acquiring Motorola, rather than Motorola Mobility. That’s because mobility is the heart and soul of Motorola, Inc., which has been synonymous with mobile radio since the company was founded by Paul Galvin in 1928. Motorola, Inc.’s other division, Motorola Solutions, is big and blah, selling gear and services to business and government. Now that Motorola Solutions will be 100% of Motorola, Inc., it’s an open question where the Motorola name will go. Since Larry says Google bought Motorola, I’m guessing that the acquisition included the name. Nothing was said about it in either the release or the blog post, but it’s bound to be an issue. I hope somebody’s bringing it up in the shareholder webcast going on right now (starting 8:30 Eastern). If Google got the Motorola name, Motorola Solutions will probably go the way of Accenture, which used to be Andersen Consulting.

At the very least, this is a patent play. That’s why Larry talked about intellectual property. In mobile, Motorola (I’m guessing, but I’m sure I’m right) has a bigger patent portfolio than anybody else, going back to the dawn of the whole category. Oracle started a patent war a year ago by suing Google, and Google looked a bit weak in that first battle. So now, in buying Motorola, Google is building the biggest patent fort that it can. In that area alone, Google now holds more cards than anybody, especially its arch-rival, Apple.

Until now, Apple actually wasn’t a direct enemy of Google’s, since Google wasn’t in the hardware business. In fact, Android itself was hardly a business at all — just a way to open up the mostly-closed mobile phone business. But now Google is one of the biggest players in mobile hardware. The game changes.

For Google’s Android partners other than Motorola, this has to hurt. (Henry Blodget calls it a “stab in the back.”)

For Windows Mobile, it’s a huge win, because Microsoft is now the only major mobile operating systems supplier that doesn’t also own a hardware company.

Unless, of course, Microsoft buys Nokia.

[Later…]

The conference call with shareholders is now over, and the strategy is now clear. From Business Insider’s notes:

David Drummond, Google’s legal chief: Android under threat from some companies, while I’m not prepped to talk strategies, combining with Motorola and having that portfolio to protect the ecosystem is a good thing.

Sanjay Jha: Over 17,000 issued, over 7,500 applications out there. Much better support to the businesses.

8:47: Android partners, a risk to them?

Andy Rubin: I spoke yesterday to top 5 licensees, all showed enthusiastic support. Android was born as an open system, doesn’t make sense to be a single OEM.

8:48: What convinced you this was optimal solution? Competencies that aren’t core to Google?

Larry Page: I’m excited about this deal, while competencies that aren’t core to us, we plan to operate as a separate business, excited about protecting the Android ecosystem.

Always watch the verbs. “Protecting” is the operative one here.

Eric S. Raymond weighs in, optimistic as ever about Google/Android’s position here:

We’ll see a lot of silly talk about Google getting direct into the handset business while the dust settles, but make no mistake: this purchase is all about Motorola’s patent portfolio. This is Google telling Apple and Microsoft and Oracle “You want to play silly-buggers with junk patents? Bring it on; we’ll countersue you into oblivion.”

Yes, $12 billion is a lot to pay for that privilege. But, unlike the $4.5 billion an Apple/Microsoft-led consortium paid for the Nortel patents not too long ago, that $12 billion buys a lot of other tangible assets that Google can sell off. It wouldn’t surprise me if Google’s expenditure on the deal actually nets out to less – and Motorola’s patents will be much heavier artillery than Nortel’s. Motorola, after all, was making smartphone precursors like the StarTac well before the Danger hiptop or the iPhone; it will have blocking patents.

I don’t think Google is going to get into the handset business in any serious way. It’s not a kind of business they know how to run, and why piss off all their partners in the Android army? Much more likely is that the hardware end of the company will be flogged to the Chinese or Germans and Google will absorb the software engineers. Likely Google’s partners have already been briefed in on this plan, which is why Google is publishing happy-face quotes about the deal from the CEOs of HTC, LG, and Sony Ericsson.

The biggest loser, of course, is Apple; it’s going to have to settle for an armed truce in the IP wars now. This is also a bad hit for Microsoft, which is going to have to fold up the extortion racket that’s been collecting more fees on HTC Android phones than the company makes on WP7. This deal actually drops a nuke on the whole tangle of smartphone-patent lawsuits; expect to see a lot of them softly and silently vanish away before the acquisition even closes.

I don’t think anybody has paid more attention to this whole thing than Eric has, and he brings the perspective of a veteran developer and open source operative as well. (Without Eric, we wouldn’t be talking about open source today.)

On August 17, Holman Jenkins in the Wall Street Journal added this bit of important analysis:

Android has been hugely advantageous for everyone who is a successful phone maker not named Apple. Remember, Apple’s premium smartphone holds up the pricing structure for the whole industry. Samsung, HTC and the rest have been selling phones into this market and pocketing huge margins because they pay nothing for Android.

Google wouldn’t be human if it didn’t want some of this loot, which buying Motorola would enable it to grab. But that doesn’t mean, in the long term or the short term, that other hardware makers will walk away from a relationship that has lined their pockets and propelled them to the top of the rapidly growing and giant new business of making smartphones. Let’s just say that while having Google as a competitor is not ideal, handset makers will learn to live with it.

Jenkins’ columns often rub me the wrong way, but this bit seems spot-on.


I just learned from Dan Kelly that Bruce Elving passed away last month. Details are thin, but here’s a short list of links:

Bruce Elving, Ph.D.

Bruce and I were frequent correspondents for many years, starting in the early ’70s, when Bruce began publishing his FM Atlas, an authoritative compilation of technical details for every FM station in the U.S. — and an essential handbook for everyone who loved to listen to far-away FM radio stations. Those people are called DXers, and I was one of them.

If you’ve ever been surprised to hear on your FM radio a station from halfway across the country, you were DXing. From my homes in New Jersey and North Carolina, I logged many hundreds of FM and TV stations whose signals skipped off the ionosphere’s sporadic E layer.

For DXers, catching far-away stations is kind of like fishing. You don’t want to catch just the easy ones. For that you go to the AM (aka MW) or shortwave (SW) bands, where the big signals are meant to go hundreds or thousands of miles.

WSM from Nashville and KSL from Salt Lake City occupy what used to be called “clear channels”: ones with no other signals at night. That’s why WSM’s Grand Ole Opry, heard for decades (and even today) every night on radios in rural areas throughout The South, literally made country music. (I listened in New Jersey, carefully turning my radio to “null out” interference from New York’s WNBC, now WFAN, which was right next to WSM on the dial.)

But FM and TV are on bands where signals don’t go far beyond the transmitter’s visible horizon, unless the conditions are right, which isn’t often. That’s one reason DXing FM and TV was more fun for the likes of Bruce Elving and me.

In its heyday (or heydecade), DXing on FM was about hooking relatively rare and slightly exotic fish. The best months to fish were in late spring and summer, when warm calm summer mornings would bring tropospheric (or “tropo”) conditions, in which FM and TV signals would bend along the Earth’s curve, and coast to distances far beyond the horizon. Thus my home in Chapel Hill, NC was often treated to signals from hundreds of miles away. I recall days when I’d pick up WDUQ from Pittsburgh on 90.5 with the antenna pointed north, then spin the antenna west to get WETS from Johnson City, Tennessee on 89.5, then spin just north of east to get WTGM (now WHRV) from Hampton Roads, Virginia, on the same channel.

Tropo is cool, but the best FM fishing is in times of sporadic-E propagation, when the E-layer of the ionosphere becomes slightly refractive of VHF frequencies, bending them down at an angle of just a few degrees, so that the signals “skip” to distances of 800-1200 miles. This also tends to happen most often in late spring and early summer, typically in the late afternoon and evening.
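For the curious, here is a rough back-of-the-envelope sketch of that geometry. The numbers are my own illustrative assumptions (an E layer roughly 100 km up, treated as a flat mirror for a single hop), not anything Bruce published:

```latex
% Flat-earth, single-hop approximation. Assumptions are mine:
% E-layer height h of about 100 km, refraction angle theta of a few degrees.
\[
  d \;\approx\; \frac{2h}{\tan\theta}
\]
% h = 100 km, theta = 5 degrees:  d is about 2,300 km (roughly 1,400 miles)
% h = 100 km, theta = 10 degrees: d is about 1,100 km (roughly 700 miles)
```

Angles in the five-to-ten-degree range land you in roughly the 700 to 1,400 mile neighborhood, which brackets the 800-1200 miles DXers actually log.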

Thanks to sporadic-E, we would watch Channel 3 TV stations from Louisiana, Texas, Nebraska, Minnesota, Cuba and various places in Canada. But, more often, I would also carefully log FM stations I identified in Bruce Elving’s FM Atlas. From 1974 to 1985 (after which I lived in California, where FM and TV DXing conditions were very rare), I logged more than 800 FM stations, most of which came from more than 800 miles away. Bruce said he’d logged more than 2000 from his home in Duluth, Minnesota. I’m sure that’s a record that will stand forever. (Bear in mind that there were only about 10,000 FM signals in the U.S. at the time.)

For Bruce, FM was also a cause: an underdog he fought for, even after it became an overdog with his help. See, up until the early ’60s, FM was the secondary radio band in the U.S. The sound was better, but most cars didn’t have FM radios, and most cheap home and portable radios didn’t either. Transistor radios were the iPods of the ’50s and ’60s, and most of those were AM-only. Bruce championed FM, and his newsletter, FMedia, was a tireless advocate of FM, long after FM won the fight with AM, and then the Internet had begun to win the fight with both.

I remember telling Bruce that he needed to go digital with PCs, and then take advantage of the Net; and he eventually did, to some degree. But he was still pasting up FM Atlas the old-fashioned way (far as I know) well into the ’90s.

I pretty much quit DXing when I came to Silicon Valley in ’85, though I kept up with Bruce for another decade or so after that. Learning about his passing, I regret that we didn’t stay in closer touch. Though we never met in person, I considered him a good friend, and I enjoyed supporting his work.

With Bruce gone, an era passes. TV DXing was effectively killed when the U.S. digital transition moved nearly every signal off VHF and onto UHF (which skips off the sky too rarely to matter). The FM band is now as crowded as the AM band became, making DXing harder than ever. Programming is also dull and homogenous, compared to the Olde Days. And the Internet obsolesces a key motivation for DXing, which is being able to receive and learn interesting things from distant signals.

A core virtue of the Internet is its virtual erasure of distance. Anybody can hear or watch streams from pretty much anywhere, any time, over any connection faster than dial-up. The stream also tends to stay where it is, and sound pretty good. (For a fun treat, play around with radio.garden, which lets you “tune” between stations by rotating a globe.)

What remains, at least for me, is an understanding of geography and regional qualities that is deep and abiding. This began when I was a kid, sitting up late at night, listening to far-away stations on the headphones of my Hammarlund HQ-129X, hooked up to a 40-meter ham radio antenna in my back yard, with a map spread out on my desk, and encyclopedia volumes opened to whatever city or state a station happened to come from. It grew when I was a young adult, curious about what was happening in Newfoundland, Bermuda, Texas, Winnipeg, or other sources of FM and TV signals I happened to be getting on my KLH Model 18 tuner or whatever old black-and-white TV set I was using at the time.

When it was over, and other technical matters fascinated me more, I’d gained a great education. And no professor had more influence on that education than Bruce Elving, Ph.D.


So I signed up for Google+. I added some friends from the roster already there (my Gmail contacts, I guess). Created a small circle to discuss VRM. Nothing happened there that I know of right now, but I haven’t checked yet. I’m about to (see below), but first I’ll go through my other impressions.

First, the noise level in my email already rivals that of Facebook‘s and LinkedIn’s, both of which are thick with notices of interest in friending (or whatever) from people I don’t know or barely know. On Facebook, which I hardly visit, I see that I have 145 messages from (I guess) among my 857 friends. I also have 709 friend requests. Just said okay to a couple, ignored the rest.

Second, when I look at https://plus.google.com, the look is mighty similar to Facebook’s. Expected, I guess.

Third, I see now that “circles” means streams. Kind of like lists in Twitter. I had thought that circles would be a discussion thing, and I guess it is. But I prefer the threading in a good email client. Or just in email. I’m so tired of doing this kind of thing in silos. Email is mine. Google+ is Google’s. In terms of location, I feel like I’m in a corporate setting in Google+, and I feel like I’m at home when I’m in email. The reason, aside from design differences, is that email is free-as-in-freedom. Its protocols are NEA: Nobody owns them, Everybody can use them, and Anybody can improve them. Not the case with these commercial Web dairy farms.

I don’t mean ‘dairy farms’ as an insult, but as a working metaphor. We are not free there. We are the equivalent of cattle on a ranch.

The problem remains client-server, which is cow-calf, and was a euphemism in the first place (I’ve been told) for slave-master.

We’ve gone about as far as we can go with that. We need freedom now, and none of these dairies can give it to us. Yet another site/service can’t work, by the nature of its server-based design. Asking Google, or Yahoo, or Microsoft, or Apple, or a typical new start-up, with yet another site-based service, to make us free, is like asking a railroad to make us a car.

Email is one kind of primitive car. Or maybe just a primitive way of getting along on the road. (It is, after all, a collection of protocols, like the Net and the Web themselves.) We need more vehicles. More tools. Instruments of independence and sovereignty, as Moxy Tongue suggests here and I riff on here.

I’m thinking more about infrastructure these days. Facebook, LinkedIn, Google+ and Twitter are all good at what they do, but they are neither necessary nor sufficient as infrastructural elements supporting personal independence and real social interaction, like the kind we’ve always had offline, and in marketplaces since the days of Ur. Right now nearly all the sites and services we call “social” are platforms for advertising. That’s their business model. Follow the money and that’s where you end up. Then start there to see where they’ll all go. (LinkedIn, to its credit, is an exception here. They have a serious set of professional personal services.) Yes, a lot of good in the world gets done with ad-supported social sites and services. But they are still built on the dairy model. And everything new we do on that model will have the same problem.

There are alternatives.

Kynetx’ execution model, for example, transcends the calf-cow model, even as it works alongside it. RSS always has supported personal independence, because it’s something that gives me (or anybody) the power to syndicate — without locking anybody into some company’s dairy. There are other tools, protocols and technologies as well, but I’ll stop naming my own votes here. Add your own in the comments below.
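To make the RSS point concrete, here is a minimal sketch, mine rather than anything from Kynetx or the VRM work, of how little it takes to pull syndicated items from any feed with nothing but a stock Python install: no account on anybody’s dairy farm required. The feed URL is a placeholder; point it at any RSS 2.0 feed you like.

```python
# A minimal RSS reader sketch using only the Python standard library.
# The feed URL is a hypothetical placeholder, not a real endpoint.
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://example.com/feed.xml"  # any RSS 2.0 feed works here

def fetch_items(url):
    """Return (title, link) pairs from an RSS 2.0 feed at the given URL."""
    with urllib.request.urlopen(url) as resp:
        root = ET.fromstring(resp.read())
    # RSS 2.0 nests items as <rss><channel><item>...</item></channel></rss>
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

if __name__ == "__main__":
    for title, link in fetch_items(FEED_URL):
        print(title)
        print("  " + str(link))
```

That independence is the whole point: the feed lives wherever its author puts it, and any reader, written by anybody, can fetch it.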

I wrote A World of Producers in December 2008. At the time I was talking about camcorders and increased bandwidth demand in both directions:

And as camcorder quality goes up, more of us will be producing rather than consuming our video. More importantly, we will be co-producing that video with other people. We will be producers as well as consumers. This is already the case, but the results that appear on YouTube are purposely compressed to a low quality compared to HDTV. In time the demand for better will prevail. When that happens we’ll need upstream as well as downstream capacity.

Since then phones have largely replaced camcorders as first-option video recording devices — not only because they’re more handy and good enough quality-wise, but because iOS and Android serve well as platforms for collaborative video production, and even of distribution. One proof of this pudding is CollabraCam, described as “The world’s first multicam video production iPhone app with live editing and director-to-camera communication.”

The bandwidth problem here is no longer just with fixed-connection ISPs, but with mobile data service providers: AT&T, Verizon, Vodafone, T-Mobile, Orange, O2 and the rest of them.

For all ISPs, there are now two big problems that should rather be seen as opportunities. One is the movement of pure-consumption video watching — television, basically — from TVs to everything else, especially mobile devices. The other is increased production from users who are now producers and not just consumers. This is the most important message to the market from CollabraCam and other developments like it.

The Cloud has a similar message. As more of our digital interactivity and data traffic move between our devices and various clouds of storage and services (especially through APIs), we’re going to need more symmetrical data traffic capacities than old-fashioned ADSL and cable systems provide. (More on this from Gigaom.)
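A bit of illustrative arithmetic shows why symmetry matters here; the bitrate and link speeds below are my assumptions, not anybody’s spec sheet. Say a collaborative shoot produces ten minutes of 1080p video encoded at 10 Mbps:

```latex
% Illustrative numbers only.
\[
  10\ \text{min} \times 60\ \tfrac{\text{s}}{\text{min}} \times 10\ \tfrac{\text{Mb}}{\text{s}}
  = 6000\ \text{Mb} \approx 750\ \text{MB}
\]
% Upstream at 1 Mbps (typical ADSL):       6000 Mb / 1 Mbps  = 6000 s, about 100 minutes
% Upstream at 25 Mbps (symmetric service): 6000 Mb / 25 Mbps =  240 s, about 4 minutes
```

Downstream, the same file is a non-event on either link. Upstream is where the old asymmetric designs pinch.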

Personally, I don’t have a problem with usage-based pricing of those capacities, so long as it —

  • isn’t biased toward consumption alone (the TV model)
  • doesn’t make whole markets go “bonk!” when the most enterprising individuals and companies run into ceilings in the form of usage caps or “bill shocks” from hockey-stick price increases at usage thresholds,
  • doesn’t bury actual pricing in “plans” that are so complicated that nobody other than the phone companies can fully understand them (and in practice are a kind of shell game, and a bet that customers just aren’t going to bother challenging the bills), and
  • doesn’t foreclose innovations and services from independent (non-phone and non-cable) ISPs, especially wireless ones.

What matters is that the video production horse has long since left Hollywood’s barn. The choice for Hollywood and its allies in the old distribution system (the same one from which we still buy Internet access and traffic capacities) is a simple one:

  1. Serve those wild horses, and let them take the lead in all the directions the market might go, or
  2. Keep trying to capture them, limiting market sizes and activities to what can be controlled in top-down ways.

My bet is that there’s more money in free markets than in captive ones. And that we — the wild horses, and the companies that understand us — will prove that in the long run.

The first time I heard the term “Sepulveda pass,” I thought it was a medical procedure. I mean, I was still new to The Coast, and sepulveda sounded like one of those oddball body parts, like uvula or something. (Not speaking of which, I no longer have a uvula. No idea why. It used to be there, but now it’s gone. Strange.)

Anyway, Carmageddon is going on right now, and the Sepulveda Pass, a section of the 405 Freeway in Los Angeles, is shut down. My fave links on the matter so far are here, here and here. One of them is the one Tony Pierce points to.

It’ll all be over on Monday. When it comes to fixing freeways, L.A. doesn’t fuck around. No ‘fence, but the Bay Area does.

We had a controlled study of the difference with a pair of earthquakes. In 1989 the Loma Prieta quake dropped a hunk of freeway (called the Cypress Structure) in Oakland, plus a piece of the Bay Bridge. It also damaged several freeways in San Francisco, including the Embarcadero Freeway and the 101-280 interchange. So, what did they do? They got rid of the Embarcadero and the Cypress Structure, took more than a few days to fix the Bay Bridge… and then took years to fix the 101-280 interchange. Years. Lots of them. Meanwhile, when the Northridge quake dropped a hunk of the Santa Monica Freeway in Los Angeles, they got the thing back up in a month or something. (If I have time later I’ll add the links. Right now I’m in Florence, where traffic is a Cuisinart of pedestrians, motorcycles, taxis, bicycles and stubby busses. Kind of like the rest of urban Italy, only with a higher ratio of tourists to everything else.)

By the way, the best video you’ll ever see about The 405 is called 405, and was done in 2000 by Bruce Branit and Jeremy Hunt, who also stars in it. The whole thing is just three minutes long, and it’s perfect. Especially right now. Dig.

Last week we spent a lot of time here, in Venice:

Bancogiro, Rialto Mercado, Venice

The triangular marble plaza on the edge of the Grand Canal of Venice is known informally as Bancogiro, once one of Italy’s landmark banks, and now the name of an osteria there. The plaza is part of Rialto Mercado, the marketplace where Marco Polo was based and prospered when he wasn’t out opening trade routes to the east. It’s also where Shakespeare set The Merchant of Venice, and where Luca Pacioli studied double entry bookkeeping, which he described in Summa de arithmetica, geometria, proportioni et proportionalità (Venice 1494), one of the first textbooks written in the vernacular (rather than Latin), and an early success story of the printing press.

Here’s a photo set of the place.

Here’s a 360° view. (While it’s called “Fondamenta de la Preson,” that’s just the cockeyed white building in the map above — a former women’s prison — in the corner of the plaza.)

Note that Google Maps tells us little about the location, but plenty about the commercial establishments there. When I go for a less fancy view, the problem gets worse:

Bancogiro, Rialto Mercado, Venice

In that pull-down menu (where it says “Traffic”) I can turn on webcams, photos and other stuff from the Long Tail; but there’s no way to turn on labels for the Grand Canal, the Bancogiro plaza, the Rialto Mercado vaporetto (water bus) stop, the Rialto Mercado itself, the Fondamenta de la Preson (women’s prison, labeled, sort of, in the upper view but not the lower), or even the @#$% street names. The only non-commercial item on the map is the Arciconfraternita Di San Cristoforo E Della Misericordia, which is an organization more than a place.

(My wife just said “You know those hotel maps they give away, that only show hotels? It’s like that, only worse. The hotel maps at least give you some street names.”)

For example, try to find information about the Bancogiro: that is, about the original historic bank, rather than the osteria or the other commercial places with that name. (Here’s one lookup.) For a while I thought the best information I could find on the Web was text from the restaurant menu, which I posted here. That says the bank was founded in 1157. But this scholarly document says 1617. Another seems to agree. But both are buried under commercial links.

The problem here is that the Web has become commercialized at the cost of other kinds of use. And Google itself is leading the way — to the point where it is beginning to fail in its mission to “organize the world’s information and make it universally accessible and useful.”

This is understandable, and easily rationalized. Google is a commercial enterprise. It makes money by selling advertising, and placing commercial information in settings like the ones above. This has been good in many ways, and funds many free services. But it has subordinated purely useful purposes, such as finding the name of a street, a canal, or a bus stop.

There are (at least) two central problems here for Google and other giants like it. One is that we’re not always buying something, or looking only for commercial information. The other is that advertising should not be the only business model for the likes of Google, and all who depend on it are at risk while it remains so.

One missing piece is a direct market for useful information. Toward that end I’ll put this out there: I am willing to pay for at least some of the information I want. I don’t expect all information to be free. I don’t think the fact that information is easily copied and re-used means information “wants” to be free. In other words, I think there is a market here. And I don’t think the lack of one is proof that one can’t be built.

What we need first isn’t better offerings from Google, but better signaling from the demand side of the marketplace. That’s what I’m trying to do right now, by signaling my willingness to pay something for information that nobody is currently selling at any price. We need to work on systems that make both signaling and paying possible — on the buyer’s terms, and not just the seller’s.

This is a big part of what VRM, or Vendor Relationship Management, is about. Development is going on here. EmanciPay, for example, should be of interest to anybody who would like to see less money left on the market’s table.

Bonus link.

 

While arguments over network neutrality have steadily misdirected attention toward Washington, phone and cable companies have quietly lobbied one state after another to throttle back or forbid cities, towns and small commercial and non-commercial entities from building out broadband facilities. This Community Broadband Preemption Map, from Community Broadband Networks, tells you how successful they’ve been so far. Now they’re on the verge of succeeding in North Carolina too.

This issue isn’t just close to home for me. I lived in North Carolina for nearly two decades, and I have more blood relatives there than in any other state. (Not to mention countless friends.) Not one of them tells me how great their broadband is. More than a few complain about it. And I can guarantee that the complaints won’t stop once the Governor signs the misleadingly-named “Level Playing Field/Local Gov’t Competition act” (H129), which the cable industry has already lobbied through the assembly.

The “free market” the phone and cable companies claim to operate in, and which they mostly occupy as a duopoly, is in fact a regulatory zoo where the biggest animals run the place. Neither half of the phone/cable duopoly has ever experienced anything close to a truly free market; but they sure know how to thrive in the highly regulated one they have — at the federal, state and local levels. Here’s Ars on the matter:

Let’s be even clearer about what is at stake in this fight. Muni networks are providing locally based broadband infrastructures that leave cable and telco ISPs in the dust. Nearby Chattanooga, Tennessee’s city-owned EPB Fiber Optics service now advertises 1,000Mbps. Wilson, North Carolina is home to the Greenlight Community Network, which offers pay TV, phone service, and as much as 100Mbps Internet to subscribers (the more typical package goes at 20Mbps). Several other North Carolina cities have followed suit, launching their own networks. In comparison, Time Warner’s Road Runner plan advertises “blazing speeds” of 15Mbps max to Wilson area consumers. When asked why the cable company didn’t offer more competitive throughput rates, its spokesperson told a technology newsletter back in 2009 that TWC didn’t think anyone around there wanted faster service. When it comes to price per megabyte, GigaOm recently crunched some numbers and found out that North Carolina cities hold an amazing 7 of 10 spots on the “most expensive broadband in the US” list.

And here’s what Wally Bowen and Tim Karr say in the News & Observer:

North Carolina has a long tradition of self-help and self-reliance, from founding the nation’s first public university to building Research Triangle Park. Befitting the state’s rural heritage, North Carolinians routinely take self-help measures to foster economic growth and provide essential local services such as drinking water and electric power. Statesville built the state’s first municipal power system in 1889, and over the years 50 North Carolina cities and towns followed suit. In 1936, the state’s first rural electric cooperative was launched in Tarboro to serve Edgecombe and Martin counties. Today, 26 nonprofit electric networks serve more than 2.5 million North Carolinians in 93 counties.

Strangely, this self-help tradition is under attack. The General Assembly just passed a bill to restrict municipalities from building and operating broadband Internet systems to attract industry and create local jobs. Although pushed by the cable and telephone lobby, similar bills were defeated in previous legislative sessions. But the influx of freshmen legislators and new leadership in both houses created an opening for the dubiously titled “Level Playing Field” bill (HB 129).

No one disputes the importance of broadband access for economic growth and job creation. That’s why five cities – Wilson, Salisbury, Morganton, Davidson and Mooresville – invoked their self-help traditions to build and operate broadband systems after years of neglect from for-profit providers, which focus their investments in more affluent and densely populated areas. Not coincidentally, all five cities own and operate their own power systems or have ties to nonprofit electric cooperatives. (While the bill does not outlaw these five municipal networks, it restricts their expansion and requires them to make annual tax payments to the state as if they were for-profit companies.)

How does a state that values independence, self-reliance and economic prosperity allow absentee-owned corporations to pass a law essentially granting two industries – cable and telephone – the power to dictate North Carolina’s broadband future? This question will be moot if Gov. Beverly Perdue exercises her veto power and sends this bill where it belongs: to the dustbin of history.

We don’t need more laws restricting anything around Internet infrastructure build-outs in the U.S. That’s the simple argument here.

We need the phone and cable companies to improve what they can, and we need to encourage and thank them for their good work. (As I sometimes do with Verizon FiOS, over which I am connected here in Massachusetts.)

We also need to recognize that the Internet is a utility and not just the third act (after phone and TV) in the “triple play” that phone and cable companies sell. The Net is more like roads, water, electricity and gas than like TV or telephony (both of which it subsumes). It’s not just about “content” delivered from Hollywood to “consumers,” or about a better way to do metered calls on the old Ma Bell model. It’s about everything you can possibly do with a connection to the rest of the world. The fatter that connection, the more you can do, and the more business can do.

Cities and regions blessed with fat pipes to the Internet are ports on the ocean of bits that now comprise the networked world. If citizens can’t get phone and cable companies to build out those ports, it’s perfectly legitimate for those citizens to do it themselves. That’s what municipal broadband build out is about, pure and simple. Would it be better to privatize those utilities eventually? Maybe. But in the meantime let’s not hamstring the only outlet for enterprise these citizens have found.

Here’s a simple fact for Governor Perdue to ponder: In the U.S. today, the leading innovators in Internet build-out are cities, not phone and cable companies. Look at Chattanooga and Lafayette — two red state cities that are doing an outstanding job of building infrastructure that attracts and supports new businesses of all kinds. Both are doing what no phone or cable company seems able or willing to do. And both are succeeding in spite of massive opposition by those same incumbent duopolists.

The Internet is a rising tide that lifts all economic boats. At this stage in U.S. history, this fact seems to be fully motivating to enterprises mostly at the local level, and mostly in small cities. (Hi, Brett.) Their customers here are citizens who have direct and personal relationships with their cities and with actual or potential providers there, including the cities themselves. They want and need a level of Internet capacity that phone and cable companies (for whatever reason) are not yet giving them. These small cities provide good examples of The Market at work.

It isn’t government that’s competing with cable and phone companies here. It’s people. Citizens.

No, these new build-outs are not perfect. None are, or can be. Often they’re messy. But nothing about them requires intervention by the state. Especially so early in whatever game this will end up being.

I urge friends, relatives and readers in North Carolina to call Governor Perdue at (800) 662-7952, and to send her emails at governor.office at nc.gov. Tell her to veto this bill, and to keep North Carolina from turning pink or red on the map above. Tell her to keep the market for broadband as free as it’s been from the beginning.

Bonus link.

[Later, as the last hour approaches…]

Larry Lessig has published an open letter to Governor Perdue. Here is most of it:

Dear Governor Perdue:

On your desk is a bill passed by the overwhelmingly Republican North Carolina legislature to ban local communities from building or supporting community broadband networks. (H.129). By midnight tonight, you must decide whether to veto that bill, and force the legislature to take a second look.

North Carolina is an overwhelmingly rural state. Relative to the communities it competes with around the globe, it has among the slowest and most expensive Internet service. No economy will thrive in the 21st century without fast, cheap broadband, linking citizens, and enabling businesses to compete. And thus many communities throughout your state have contracted with private businesses to build their own community broadband networks.

These networks have been extraordinarily effective. The prices they offer North Carolinians are a fraction of the comparable cost of commercial network providers. The speed they offer is also much much faster.

This single picture, prepared by the Institute for Local Self Reliance, says it all: The yellow and green dots represent the download (x-axis) and upload (y-axis) speeds provided by two community networks in North Carolina. Their size represents their price. As you can see, community networks provide faster, cheaper service than their commercial competitors. And they provide much faster service overall.


 

Local competition in broadband service benefits the citizens who have demanded it. For that reason, community after community in North Carolina have passed resolutions asking you to give them the chance to provide the Internet service that the national quasi-monopolies have not. It is why businesses from across the nation have opposed the bill, and business leaders from your state, including Red Hat VP Michael Tiemann, have called upon you to veto the bill.

Commercial broadband providers are not happy with this new competition, however. After spending millions in lobbying and campaign contributions in North Carolina, they convinced your legislature to override the will of local North Carolina communities, and ban these faster, cheaper broadband networks. Rather than compete with better service, and better prices, they secured a government-granted protection against competition. And now, unless you veto H. 129, that protection against competition will become law.

Opponents of community broadband argue that it is “unfair” for broadband companies to have to compete against community-supported networks. But the same might be said of companies that would like to provide private roads. Or private fire protection. Or private police protection. Or private street lights. These companies too would face real competition from communities that choose to provide these services themselves. But no one would say that we should close down public fire departments just to be “fair” to potential private first-responders.

The reason is obvious to economists and scholars of telecommunications policy. As, for example, Professor Brett Frischmann argues, the Internet is essential infrastructure for the 21st century. And communities that rely solely upon private companies to provide public infrastructure will always have second-rate, or inferior, service.

In other nations around the world, strong rules forcing networks to compete guarantee faster, cheaper Internet than the private market alone would. Yet our FCC has abdicated its responsibility to create the conditions under which true private broadband competition might flourish in the United States. Instead, the United States has become a broadband backwater, out-competed not only by nations such as Japan and Korea, but also Britain, Germany and even France. According to a study by the Harvard Berkman Center completed last year, we rank 19th among OECD countries in combined prices for next generation Internet, and 19th for average advertised speeds. Overall, we rank below every major democratic competitor — including Spain — and just above Italy.

In a world in which FCC commissioners retire from the commission and take jobs with the companies they regulate (as Commissioner Baker has announced that she will do, by joining Comcast as a lobbyist, and as former FCC Chairman Powell has done, becoming a cable industry lobbyist), it is perhaps not surprising that these networks are protected from real competition.

But whether surprising or not, the real heroes in this story are the local communities that have chosen not to wait for federal regulators to wake up, and who have decided to create competition of their own. No community bans private networks. No community is unfairly subsidizing public service. Instead, local North Carolina communities are simply contracting to build 21st-century technology, so that citizens throughout the state can have 21st-century broadband at a price they can afford.

As an academic who has studied this question for more than a decade, I join many in believing that H.129 is terrible public policy…

Be a different kind of Democrat, Governor Perdue. I know you’ve received thousands of comments from citizens of North Carolina asking you to veto H.129. I know that given the size of the Republican majority in the legislature, it would be hard for your veto to be sustained.

But if you took this position of principle, regardless of whether or not you will ultimately prevail, you would inspire hundreds of thousands to join with you in a fight that is critical to the economic future of not just North Carolina, but the nation. And you would have shown Republicans and Democrats alike that it is possible for a leader to stand up against endless corporate campaign cash.

There is no defeat in standing for what you believe in. So stand with the majority of North Carolina’s citizens, and affirm the right of communities to provide not just the infrastructure of yesterday — schools, roads, public lighting, public police forces, and fire departments — but also the infrastructure of tomorrow — by driving competition to provide the 21st century’s information superhighway.

With respect,

Lawrence Lessig

To contact the governor, you can email her. If you’re from North Carolina, this link will take you to a tool to call the governor’s office. You can follow this fight on Twitter at @communitynets.
You can follow similar fights on Twitter by searching #rootstrikers.

Well put, as usual. Hope it works.

Just about everybody I know who has heard about the sale of Skype to Microsoft has groaned about it. Myself included.

No doubt it makes sense for the entities involved. eBay, various investors and the founders all make money on the deal. Microsoft/Nokia now gets to be Microsoft/Nokia/Skype. Those not involved, including Google, Apple, and all carriers other than those partnering with SkypeNoSoft, get nothing.

What the world will get is a set of services that work best only on Nokia’s Windows Mobile devices. Also count on fees for new and old Skype services, with complicated and confusing plans from the carriers.

Add involvements by the ITU (a Microsoft site, Silverlight and all) and governments that like tariffs on calls and data services, and we’ll see the Internet further subordinated to the same telecom business we’ve had since telegraphy. Same meatloaf, new gravy.

Also count on appealing alternatives coming out of Apple and Google, sooner rather than later.

As for Facebook, I have no idea. They’re well-placed to become some kind of player in the telecom business, whatever it becomes, but I don’t see them doing much more than continuing to be AOL 2.x.

I’d say more, but I have a book to finish. If you’re wondering why blogging has been slow lately, that’s why.

[Later…] I love Don Marti’s take:

Really, this is good news. While users are trying to figure out whether to download “Skype Live Small Business Edition” or “Skype For Windows Professional Platinum 7.0”, some startup will eat their lunch.
