infrastructure


Ford River Rouge plant

Got my first good clear look at Detroit and Windsor from altitude on a recent trip back from somewhere. Here’s a series of shots. What impressed me most, amidst all that flat snow-dusted spread of city streets, a patch of grids on the flatland of Michigan and Ontario, flanking the Detroit River and its islands, was what looked like a dark smudge. Looking at it more closely, and matching it up with Reality, I discovered that this was Ford’s famous River Rouge Complex in the city of Dearborn.

Says Wikipedia,

The Rouge measures 1.5 miles (2.4 km) wide by 1 mile (1.6 km) long, including 93 buildings with nearly 16 million square feet (1.5 km²) of factory floor space. With its own docks in the dredged Rouge River, 100 miles (160 km) of interior railroad track, its own electricity plant, and ore processing, the titanic Rouge was able to turn raw materials into running vehicles within this single complex, a prime example of vertical-integration production. Over 100,000 workers were employed there in the 1930s.

As an inveterate infrastructure freak, I would love to see this thing sometime.

Blogging, emailing and messaging aren’t owned by anybody.  Tweeting is owned by Twitter. That’s a problem.

In all fairness, this probably wasn’t the plan when Twitter’s founders started the service. But that’s where they (and we) are now. Twitter has become de facto infrastructure, and that’s bad, because Twitter is failing.

Getting 20,500,000 Google Image search results for “twitter fail” paints a picture that should be convincing enough. (See Danny Sullivan‘s comment below for a correct caveat about this metric.) Twitter’s own search results for “hourly usage limit”+wtf wrap up the case. I posted my own frustrations with this the other day. After Eric Leone recommended that I debug things by going to https://twitter.com/settings/connections and turning off anything suspicious, I found the only sure way to troubleshoot was to turn everything off (there were about twenty other sites/services listed with dependencies on Twitter), and then turn each one back on again, one at a time, to see which one (or ones) caused the problem. So I turned them all off; and then Twitter made the whole list disappear, so I couldn’t go back and turn any of them on again.
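That turn-everything-off, turn-each-one-back-on procedure is a classic isolation strategy, and it's worth sketching. This is a minimal, hypothetical sketch of the idea, not anything Twitter offers; the connection names and the failure check are invented for illustration:

```python
# A sketch of the isolation strategy described above: disable every
# connected service, then re-enable them one at a time, re-testing
# after each. All names here (the services list, the is_failing check)
# are hypothetical; the real debugging happened by hand in Twitter's
# settings page.

def isolate_culprits(connections, is_failing):
    """Return the connections whose presence triggers the failure."""
    enabled = set()           # start with everything turned off
    culprits = []
    for conn in connections:  # turn each one back on, one at a time
        enabled.add(conn)
        if is_failing(enabled):
            culprits.append(conn)
            enabled.remove(conn)  # keep the known-bad one off, continue
    return culprits

# Simulated example: two of twenty connected services are the troublemakers.
services = [f"service-{i}" for i in range(20)]
bad = {"service-3", "service-17"}
print(isolate_culprits(services, lambda on: bool(on & bad)))
# → ['service-3', 'service-17']
```

The catch, of course, is step zero: the procedure assumes you can still see the list of connections, which is exactly what Twitter took away.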

Meanwhile I still get the “hourly usage limit” message, and/or worse:

twitter fail

So Twitter has become borderline-useless for me. Same goes for all the stuff that depended on Twitter that I turned off.

In that same thread Evan Prodromou graciously offered to help set up my own Status.Net server. I’m going for it, soon as I get back from my week here in Santa Barbara.

Meanwhile, I’m also raising a cheer for whatever Dave is doing toward “building a microblog platform without a company in the middle”.

Tweeting without Twitter. I like the sound of that.


That’s my Idea For a Better Internet. Here’s what I entered in the form at http://bit.ly/i4bicfp:

Define the Internet.

There is not yet an agreed-upon definition. Bell-heads think it’s a “network of networks,” all owned by private or public entities that each need to protect their investments and interests. Net-heads (that’s us) think it’s a collection of protocols and general characteristics that transcend physical infrastructure and parochial interests. If you disagree with either of the last two sentences, you demonstrate the problem, and why so many arguments about, say, “net neutrality,” go nowhere.

The idea is to assign defining the Internet to students in different disciplines: linguistics, urban planning, computer science, law, business, engineering, etc. Then bring them together to discuss and reconcile their results, with the purpose of informing arguments about policy, business, and infrastructure development. The result will be better policy, better business and better deployments. Or, as per instructions, “a better place for everyone.”

There should be fun research possibilities in the midst of that as well.

It’s a Berkman project, but I applied in my capacity as a CITS fellow at UCSB. I’ll be back in Santa Barbara for the next week, and the focus of my work there for the duration has been Internet and Infrastructure. (And, if all goes as planned, the subject of the book after the one I’m writing now.)

So we’ll see where it goes. Even if it’s nowhere, it’s still a good idea, because there are huge disagreements about what the Internet is, and that’s holding us back.

I gave Why Internet & Infrastructure Need to be Fields of Study as my background link. It’s in sore need of copy editing, but it gets the points across.

Today’s the deadline. Midnight Pacific. If you’ve got a good idea, submit it soon.

After your taxes, of course. (Richard, below, points out that Monday is the actual Tax Day.)

I don’t envy anybody in the airline business. There is so much to do right, and the costs of doing things wrong can be incalculably high. Required capital investments are immense, and the regulatory framework is both complex and costly. Yet the people I’ve met in the business tend to be dedicated professionals who care about serving people, and not just about making a buck or putting in time. And the few bad experiences I’ve had are so anomalous that I’m inclined to disregard them. So, on the whole, I cut them all some slack.

By now I have close to a million miles with United, which is now the largest airline in the world, thanks to its merger with Continental. As it happens I’m sitting in a Continental lounge right now, though I’ll be flying in a couple hours to Salt Lake City on Delta. My original flights with United (from Boston through Chicago) were delayed by snow (yes, it’s snowing here, on the first day of Spring). The Continental club lounge is available so here I sit. For what it’s worth, the Continental lounge is nicer than United’s. In fact, pretty much everything about Continental is nicer, by a small margin. That’s a pat on Continental’s back, rather than a knock on United, which I’ve come to regard with some affection over many years of flying with them. One reason for all that flying is that they made lifetime membership in their club lounge available for a good price two decades ago, and that’s been a tie-breaker for us — in United’s favor — ever since. (Sadly, the offer was discontinued.)

The merger is moving slowly. Most of both airlines’ planes now say United on the side and keep the Continental globe symbol on the tail. (Minimal paint jobs for both, basically.) But the operations are still separate, which in some ways they have to be, since in many locations they occupy separate airport terminals. Their computer systems are also surely different and hard to merge. But, while there is some time left before the merger completes, I thought I’d put out a few public suggestions for both airlines as they gradually become one. Here goes:

  1. Keep Channel 9. That’s the United audio channel that carries cockpit air traffic audio. Like a lot of frequent fliers, aviation is a passion of mine, and listening in on that chatter is a familiar, comforting and engaging experience. Sharing it with passengers is up to the pilots, and I always go out of my way to thank the pilots who choose to share the channel with passengers. I’ve met many other passengers over the years who also love the service. In many cases these passengers are either current or former pilots themselves. Of course it’s not necessary to keep it on that same audio channel; but at least make it available.
  2. Make seat choices easier online. Say what kind of airplane the flight takes, and whether or not there are actually windows by the window seat (on some planes there are some window seats with blank walls). Consider providing links to SeatExpert or SeatGuru.
  3. Allow more conditional choices for upgrades. I like window seats on the shaded side of the plane, and usually choose those seats with great care. On a United 777, for example, all the premium coach seating with extra legroom is over the wing; I’m willing to sit in the back with less legroom just to have an unobstructed view out the window. But often I’ll get an automatic upgrade (as a frequent flyer) to a business class seat that is either an aisle seat or a window seat on the sunny side of the plane, where the view is never as good. In those cases I’ll usually prefer to stay in coach.
  4. Provide Internet connectivity by wi-fi. Put it on all but the small short-haul planes.
  5. Power outlets are nice too. Some airlines have them for all seats. United should be one of them.
  6. The DirecTV system on some Continental planes is nice. So is the completely different system on some other Continental planes (one I flew from Houston to Frankfurt had a zillion movies, but no easy way to navigate all the choices). Whatever you standardize on, make it relatively open to future improvements. And make the headset plugs standard 1/8″ ones, so passengers can use their own headsets.
  7. Get apps going on Android, iPhone and other handheld devices. Continental has some now. United doesn’t yet, though it does now have the paperless boarding pass.
  8. Get Jeff Smisek to cut a new merger progress announcement to run for passengers. The old one has been talking about “changes in the coming months” for about a year now.
  9. In the lounges, upgrade the food, or provide better food you charge for (like you do for drinks at the bar). Right now in the Continental President’s club, there are apples, three kinds of chips in bags, bottom-quality shrink-wrapped cheeses and tiny plastic-wrapped sesame crackers. The United clubs will have the same apples, plus maybe the same crackers and chips, and some nut/candy mixes in dispensers. This Continental club doesn’t have an espresso/cappuccino machine, while the United club at the same airport does. (And it’s a much better model than the awful one they had for a decade or more.) Meanwhile at Star Alliance lounges, and in lounges of international airlines such as Scandinavian, there will be a spread of sandwich makings, pastries, fresh baked breads and other good stuff. United and Continental charge a lot for the lounges, yet don’t allow food to be brought in. So at least offer something more than the minimal, food-wise. Free wi-fi in the lounges is also cool. Both United and Continental offer it, but Continental makes it simple: it’s just there, a free open access point. United’s is a complicated sign-on to T-Mobile.
  10. Go back to Continental’s simple and straightforward rules for device use on planes. United’s old rules were ambiguous, all-text and hard to read. Continental had little graphics that showed the allowed devices. What persists in the current (March) Hemispheres magazine is the United text. You almost need to be a lawyer to make sense of this line: “Any voice, audio, video or other photography (motion or still), recording while on any United Airlines aircraft is strictly prohibited, except to the extent specifically permitted by United Airlines.” Only twice in my many flights on United have I been told not to shoot pictures out the window from altitude, and in the second case the head flight attendant apologized later and offered me a bottle of wine for my trouble. From what I understand, photography is specifically permitted, provided it is not of other people or equipment inside the plane. I’ve also been told “It’s at the pilot’s discretion.” Whatever the rules are, the old Continental ones were much better, and unambiguous.
  11. Email receipts for onboard charges. This especially goes for ones where promos are involved and one can’t tell otherwise if the promo discount went through. For example, Chase bank customers were supposed to get $2 off on the $6 charge for using a Chase bank card to pay for watching DirecTV on the flight I took two Thursdays ago from Boston to Houston. Did I get the discount? I still don’t know.
  12. On the personal video screens, provide flight maps with travel data such as time to destination and altitude. Love those, especially when they aren’t interrupted with duty-free promos on international flights.
  13. Avoid lock-ins with proprietary partners. Example: Zune on United: http://www.zune.net/united. Right now over half of the devices being used in this lounge are non-PCs (iPads, Androids, Macs, etc.). Why leave those people out? And, of course, Zune is a dead platform walking.

Anyway, that’s a quick brain dump in the midst of other stuff, encouraged by conversation with other passengers here. I’m looking forward to seeing how things go.


One of the things I’ve always liked about visiting Austin is listening to its radio while I’m in town. I remember discovering KGSR on my first visit in 2006, and there are always new surprises. Here’s what I blogged back then:

Great radio lives

at KGSR/107.1 in Austin. Entertainment Weekly called it “an only-in-Austin blend of alt-country, hippie jams, singer-songwriters, and lots of Willie Nelson, of course.” (Sorry, no link.) It doesn’t seem to have the non-stop funky personality of KPIG, but the music is in the same league. They don’t play anything I don’t like, or anything I’m very familiar with, which is an amazing combination.
Wow, they just played Hot Tuna, Willie Nelson (“Shotgun Willie”, an early one, from an album by the same name I’ve long since lost), Stevie Ray Vaughan (I have all his stuff, I thought, but this one wasn’t familiar to me), a new Bonnie Raitt. Creedence (“Midnight Special”). Now they’re playing a local artist; missed the name, but awfully good.
They’re not the biggest station in town: 39,000 watts at about 500 feet, from a tower 16 miles southeast of Austin, near Bastrop, the station’s actual city of license. But they put a city-grade signal over Austin. Does the job.
Says here they’re tied for #9 in all listeners 12+, but I’ll bet they’re strong in demographics that matter to advertisers. Hope they are, anyway, so they live.

On this latest trip to Austin (I was there from Thursday to Monday, March 10-14), I was worried at first when I found KGSR missing on 107.1, replaced by a Spanish station. But I quickly discovered that KGSR had moved to 93.3, and a much bigger signal. (This wasn’t KGSR’s first move. Its long history is explained in Wikipedia.) Other new and old radio finds were:

  • the variously eclectic (and very locally-focused) and , sharing time on 91.7, and on 88.7;
  • classical on 89.5;
  • alternative (101x) on 101.5;
  • landmark news/public/music on 90.5; and
  • old-fashioned “beautiful music” (aka “easy listening”) over on 91.3.

Back to KGSR. I didn’t hear them bragging, but what they have now is the biggest FM signal in town. The old 107.1 facility (now KLZT) was 49,000 watts at 499 feet above average terrain. The new 93.3 facility is 100,000 watts at 1927 feet above average terrain — only 73 feet below the legal maximum height of 2000 feet. With more than twice the power and nearly four times the height (both matter on FM), the coverage area is much bigger. Other stations in the market equal KGSR’s power, but none radiate from the same height. (There are coverage maps at both those last two links.)
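The height difference alone tells most of the story. FM is essentially line-of-sight, and a common rule of thumb puts the radio horizon at roughly 4.12 times the square root of antenna height in meters (the 4/3-earth approximation). This is only a back-of-the-envelope sketch — real FM coverage also depends on power, terrain, and the FCC's propagation curves — but it shows why the move matters:

```python
import math

# Rough estimate of why antenna height matters so much on FM.
# Rule of thumb: radio horizon (km) ≈ 4.12 * sqrt(height in meters),
# the 4/3-earth approximation. Treat as a sketch only; actual coverage
# depends on power, terrain, and the FCC's F(50,50) curves.

def radio_horizon_km(height_ft):
    height_m = height_ft * 0.3048
    return 4.12 * math.sqrt(height_m)

old = radio_horizon_km(499)    # the old 107.1 site
new = radio_horizon_km(1927)   # the new 93.3 site

print(f"old site: ~{old:.0f} km, new site: ~{new:.0f} km")
# → old site: ~51 km, new site: ~100 km
```

Roughly doubling the horizon quadruples the area inside it, before the power increase is even counted.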

Another fun find is that KUT kicks butt in the ratings. Check this out. KUT is tops in Austin in January with a 9.3 share of 12+ listening. Far as I know there are no other public stations in the country that come out #1 in the ratings, over and over, which KUT appears to be doing. KGSR is pretty far back, with a 2.3. KMFA gets a 2.4. KROX gets a 3.3. KNCT gets a 1.8. KOOP gets an 0.2. KAZI and KVRX are no-shows. KLZT, the Mexican music station that now radiates from KGSR’s old transmitter, gets a 5.3. It’s also cool to see five streams listed in the ratings, which is impressive just at the factual level.

What sent me to the ratings was this September 2009 piece in the Austin Post by Jim McNabb, about KGSR’s move to 93.3. Writes Jim, “According to Arbitron, the #1 Radio station is KLBJ AM, broadcasting news and information, recently in the news for its decision to reinstate the Todd and Don Show.  The show had been cancelled earlier this year after Don Pryor used the slur “wetback” repeated for about an hour on the air with no management stepping in to stop it.  The station is still #1 with a 7.1 rating.  The #2 station is breezy KKMJ FM.”

Used to be Arbitron didn’t publish noncommercial numbers (and I’m guessing they didn’t when Jim wrote that piece), but now they do, at least through http://radio-info.com. If you’re reading this, Jim, go here: http://www.radio-info.com/markets/austin . Lots of interesting Austin radio story fodder in that list.

For most of my life all I knew about Austin radio was that KLBJ’s story was tied up with its former owner, Lady Bird Johnson, and her husband Lyndon Baines Johnson, the former President. Writes the KLBJ history page, “In December 1942, a buyer, armed with limited capital, a dream, a journalism degree from the University of Texas, and no broadcasting experience, became the new licensee – Lady Bird Johnson.” But there’s more to that story. Here’s Wikipedia:

In January-February 1943, Ladybird Johnson spent $17,500 of her inheritance to purchase ,[3] an Austin radio station that was in debt. She bought the radio station from a three-man partnership which included a future and a future , .

She served as President of the company, LBJ Holding Co., and her husband negotiated an agreement with the CBS radio network. Lady Bird decided to expand by buying a television station in 1952 despite Lyndon’s objections, reminding him that she could do as she wished with her inheritance.[6] The station, KTBC-TV/7 (then affiliated with CBS as well), would make the Johnsons millionaires as Austin’s monopoly VHF franchise.[27] Over the years, journalists have written about how Lyndon used his influence in the Senate to influence the Federal Communications Commission into granting the monopoly license, which was in Lady Bird’s name.[28][29]

Eventually, Johnson’s initial $41,000 investment turned into more than $150 million for the LBJ Holding Company.[30] Johnson remained involved with the company until she was in her 80s.[6] She was the first president’s wife to become a millionaire in her own right.[3]

That squares with my own recollection of the story, from back when I was involved in broadcasting, in the 1970s.

KLBJ is on 590 on the AM dial, radiating 5000 watts by day and 1000 by night. The night signal is also directional, with dents (“nulls”) to the north and the southeast. From my window seat on the flight out to Houston, I spotted KLBJ’s four-tower transmitter, and got this series of pix, which I’ve posted at the Infrastructure collection on Flickr.

By day, KLBJ’s primary coverage area stretches from Waco to San Antonio, 90 miles in opposite directions. Secondary coverage includes Dallas-Fort Worth and Houston. Fringe coverage reaches across most of Texas and into Oklahoma to the north and Mexico to the south. And that’s with just 5000 watts, or 1/10th the legal limit. The reason is ground conductivity. Texas has some of the best in the country. (Here’s a station in Atlanta on the same channel with more than twice the power. And it basically covers North Georgia and that’s it.)

Here’s Jim McNabb on what has happened to KLBJ since he served as news director there 35 years ago: that it’s become another mostly-right-wing foghorn. (Here’s a schedule.) The same can be said about countless other news/talk stations, of course.

Back on FM, the most anomalous station I heard was also the most anachronistic: one out of Killeen. Its format is “beautiful music,” or what we once called “easy listening.” This was the “mood music” often disparaged as “elevator music” or “music on hold” in decades past. I didn’t miss it when it went away, but it did kinda give me the warm fuzzies to hear it again. Sadly, the station doesn’t stream, or you could sample it.

Anyway, I just wanted to dump my thoughts on Austin radio before moving on to other matters, also involving broadcasting.

An 8.9-magnitude earthquake struck Japan yesterday, and a tsunami is spreading, right now, across the Pacific Ocean. Thus we have much news that is best consumed live and uncooked. Here’s mine, right now:

aljazeera

Not many of us carry radios in our pockets any more. Small portable TVs became passé decades ago. Smartphones, tablets and other portable Net-connected devices are now the closest things we have to universal receivers and transmitters of live news. They’re what we have in our pockets, purses and carry-bags.

The quake is coming to be called the 2011 Sendai Earthquake and Tsunami, and your best portable media to keep up with it are these:

  1. Al Jazeera English, for continuous live TV coverage (interrupted by war coverage from Libya)
  2. Twitter, for continuous brief reports and pointage to sources
  3. Wikipedia, for a continuously updated static page called 2011 Sendai Earthquake and Tsunami, with links to authoritative sources

I just looked at ABC, NBC, CBS, Fox, CNN, CBC and BBC online, and all have recorded reports. None have live coverage on the Net. They are, after all, TV networks; and all TV networks are prevented from broadcasting live on the Net, either by commercial arrangements with cable and satellite TV distributors, or by laws that exclude viewing from IP addresses outside of national boundaries.

Television has become almost entirely an entertainment system, rather than a news one. Yes, news matters to TV networks, but it’s gravy. Mostly they’re entertainment businesses that also do news. This is even true (though to a lesser degree) for CNN.

At NBC.com, you won’t find that anything newsworthy has happened. The website is a bunch of promos for TV shows. Same with CBS.com, Fox.com and ABC.com. Each has a news department, of course, which you’ll find, for example, at Foxnews.com (which is currently broken, at least for me). Like CNN and BBC, these have many written and recorded reports, but no live coverage (that you can get outside the U.K., anyway, in the case of the BBC). Thus TV on the Net is no different from print media such as the New York Times. Hey, the Times has video reports too.

NPR has the same problem. You don’t get live radio from them. Still, you do get live radio from nearly all its member stations. Not true for TV. Lots of TV stations have iPhone, iPad and Android apps, but none feature live network video feeds, again because the networks don’t want anything going “over the top” (of the cable system) through Net-connected devices. This is a dumb stance, in the long run, which gets much shorter with each major breaking news story.

Here’s the take-away: emergencies such as wars and earthquakes demonstrate a simple and permanent fact of media life: that the Net is the new TV and the new radio, because it has subsumed both. It would be best for both TV and radio to normalize to the Net and quit protecting their old distribution systems.

Another angle: the Live Web has finally branched off the Static Web (as I wrote about in Linux Journal, back in 2005), and is fast becoming our primary means for viewing and listening to news. To borrow a geologic metaphor, the vast tectonic plates of TV and radio are being subsumed along their leading edges by the Live Web. Thus today’s wars and earthquakes are tectonic events for media old and new. The mountain ranges and civilizations that will build up along the new margins will be on the Live Web’s plate, not the old TV, radio and print plates.

A plug… Those  worried about how to pay for the change should support the VRM community’s development of EmanciPay. We believe the best consumers of media will become the best customers of media only by means that the consumers themselves control. For free media that’s worth more than nothing (as earthquake and war coverage certainly are), the pricing gun needs to be in the hands of the customer, not just the vendor (all of which have their own different ways of being paid, or no means at all). We need a single standard way that users can say “I like that and want to pay for it, and here’s how I’m going to do that.” Which is what EmanciPay proposes. The demand side needs its own ways and means, and those cannot (and should not) be provided only by the supply side, or it will continue to be fractured into a billion silos. (That number is a rough estimate of commercial sites on the Web.) More about all this in another post soon. (It’s at the front of my mind right now, because some of us will be meeting to talk about it here in Austin at SXSW.)
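To make the EmanciPay idea concrete, here is a purely hypothetical sketch of what a user-side “I like that and want to pay for it” signal might look like. Nothing below is a real EmanciPay specification — the field names and the function are invented for illustration — but it shows the essential inversion: the customer, not the vendor, sets the price and states the terms.

```python
# A hypothetical sketch of a user-controlled payment pledge, in the
# spirit of EmanciPay. Field names are illustrative only; there is no
# real spec behind this.

import json
from datetime import datetime, timezone

def payment_intent(media_url, amount_usd, note=""):
    """Build a pledge in which the customer, not the vendor, sets the price."""
    return {
        "media": media_url,                # what the user is paying for
        "amount": round(amount_usd, 2),    # user-chosen price
        "currency": "USD",
        "note": note,                      # why, in the user's own words
        "declared_at": datetime.now(timezone.utc).isoformat(),
    }

pledge = payment_intent("https://english.aljazeera.net/", 2.00,
                        "for the live Sendai earthquake coverage")
print(json.dumps(pledge, indent=2))
```

The point of a single standard format like this (whatever it ends up looking like) is that any publisher could accept it, so the demand side isn't fractured into a billion vendor-specific silos.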

Meanwhile, back to your irregularly unscheduled programs.

[Later…]  I’ll add notes here…

  • Joey Trotz reports that http://cnn.com/live has four live streams. And, as others say below, so does the BBC. All can be viewed on a browser with Flash, and a disabled popup window blocker. Therefore some laptops and Android devices should also be covered, to a degree; but it’s all a bit of a kludge. To me the standard is a live stream using at least a relatively open standard like .mp3 for audio and whatever-it-is that Al Jazeera is using for video (on the iPhone and iPad, at least, it can’t be Flash, so what is it?). The key: ease of viewing (fewest clicks) or listening. This means an app, usually, as of today. Note that nearly all smartphones in use today will be old hat two years from now.
  • I just downloaded and added the CNN app to my iPhone. It has “live” in its tabs, but the picture isn’t moving for me. Not sure what that means.
  • Thanks to Danilo, in the comments below, for suggesting that I make clear some distinctions that at least a couple commenters have missed. I do that in this comment here, and I’ll say it here as well. This post is not a slam on the good work that broadcasters do. Nor am I declaring the death of TV and radio as we know it. I am using AND logic here, not OR. When I say the Net is subsuming radio and TV, and that broadcasters need to normalize to the Net, I am saying that the Net is becoming the base medium. Broadcasters need to be streaming online as well as over the air and over cable. Back when he renewed his contract with SiriusXM, Howard Stern said as much about satellite radio. The new base medium for Howard’s SiriusXM channels, as well as all the other channels in the satellite radio lineup, is the Internet. Satellite distribution will become the backup live stream service, rather than the main distribution system. This is why Howard has been out stumping on TV talk shows for the SiriusXM smartphone app. Yes, it is true that the satellite system will cover many areas that the cell and wi-fi distribution system will not. But the reverse will also be true. SiriusXM on the Net is a global service, rather than one restricted to North America. The service is also not capacity-limited in the number of files and streams that can be offered, which is the case with satellite alone. Another point I’m making is that TV networks especially are restricted in their ability to stream by the deals they have with cable companies, and (in the case of, say, the BBC) by blocked use over IP addresses outside national boundaries. These are severely limiting as more and more viewing moves to hand-held devices. And those limitations need to be faced. Al Jazeera shows what can be done when the limits aren’t there.

Here’s a great idea for local TV news departments: start streaming, 24/7/365, on the Net. You don’t need to have first-rate stuff, and it doesn’t all have to be live. Loop fifteen minutes of news, weather and sports to start. Bring in local placeblog and social media volunteers. Whatever it takes: you figure it out.  Just make it constant, because that’s what TV was in the first place, and that’s what it will remain after the Internet finishes absorbing it, which will happen eventually. Now’s the time to get ahead of the curve.

Here’s why I thought of this idea:

Al Jazeera English. Far as I know it’s the only serious TV that’s live, streaming 24/7/365 on the Net. I watch it on the iPad wherever we have it… in the car, on a cabinet in the bedroom, or — in this case — on the kitchen counter, next to the stove, where I was watching it while making breakfast yesterday morning. That’s when I shot the photo.

At our place we don’t have a TV any more. Nor do a growing number of other people. Young people especially are migrating their video viewing to the Net. Meanwhile, all the national “content” producers and distributors are tied up by obligations and regulations. Try to watch NBC, CBS, ABC, TNT, BBC or any other three- or four-letter network source on a mobile device. The best you can get are short clips on apps designed not to compete with their cable channels. Most are so hamstrung by the need to stay inside paid cable distribution systems (or their own national borders) that they can’t sit at the table where Al Jazeera alone is playing the game.

That table is a whole new marketplace — one free of all the old obligations to networks and government agencies. No worries about blackouts, must-carries and crazy copyright mazes, as long as it’s all the station’s own stuff, or easily permitted from available sources (which are many).

Savor the irony here. Al Jazeera English is the only real, old-fashioned TV channel you can get on a pad or a smartphone here in the U.S. It’s also the best window on the most important stuff happening in the world today. And it’s not on cable, which is an increasingly sclerotic and soon-to-be marginalized entertainment wasteland. A smart local TV station can widen the opportunities that Al Jazeera is breaking open here.

Speaking as one viewer, I would love it if , , , , or had a live round-the-clock stream of news, sports, weather and other matters of local interest. We happen to live at a moment in history — and it won’t last long — when ordinary folks like me still look to TV stations for that kind of stuff, and want to see it on a glowing rectangle. Now is the time to satisfy that interest, on rectangles other than those hooked up to antennas or set-top boxes.

And if the TV stations don’t wake up, newspapers and radio stations have the same opportunity. Hey, already puts Dennis and Calahan on . Why not put them on the Net? And if NESN doesn’t like that (because they’re owned by Comcast), WBZ can put  on a stream. The could play here. So could and . ‘BUR already has an iPhone app. Adding video would be way cool too.

The key is to make the stations’ video streams a go-to source for info, even if the content isn’t always live. What matters is that it leverages expectations we still have of TV, while we still have them.

And hey, TV stations, think of this: you don’t have to interrupt programming for ads. Run them in the margins. Localize them. Partner with Foursquare, Groupon, Google or the local paper. Whatever. Have fun experimenting.

Yesterday , the king of local TV consultants (and a good friend) put up a post titled The Tactical Use of Beachheads. Here are his central points and recommendations:

There is, I believe, a way to drive the car and fix it at the same time, but it requires managers to step outside their comfort zone and behave more like leaders. The mission is to establish beachheads ahead of everybody else, so that when the vision materializes, they’ll be prepared to monetize it. This is a risk, of course. There’s no spreadsheet, no revenue projections to manage, no best practices, no charts and graphs, because it’s not about seeing who can outsmart, outthink or outspend the next guy; it’s all about anticipating new value and going for it. The risk, however, can be mitigated if the beachheads are based on broad trends.

This can be very tough for certain groups, because we’re so used to being able to hedge bets with facts and processes. Here, we’re leapfrogging processes to intercept a moving target. It’s Wayne Gretzky’s brilliant tactic of “skating to where the puck is going to be,” instead of following its current position.

In our war for future relevance, here are five beachheads we need to establish in order to drive our car and fix it at the same time. Four of them relate to content that, we hope, will be somehow monetized. The fifth deals specifically with enabling commerce via a form of advertising.

  1. Real Time Beach — It is absolutely essential that media companies understand that news and information is moving to real time, and that real time streams are what will really matter tomorrow. It’s already happening today, but until somebody makes big money with it, we’ll continue to emphasize that which we CAN make money with, the front-end design of our websites. These streams take place throughout the back end of the Web, and they will make their way to the front end, and soon. There are early signs of advertising in the stream, and we should be experimenting with this, too. This is an unmistakable trend, and if we don’t move and move fast, it’s one I’m afraid we’ll lose.
  2. Curation Beach — Examples like Topix above show that curation beach is really already here, although I’d call those types of applications “aggregators.” They’re dumb in that they’re simply mechanical aggregators of that which is — for the most part — being published by others. Curation is more the concept of helping customers make sense out of all the real time streams that are in place. We’re all using the streams of social media, for example, to “broadcast,” but the real value is to pay attention and curate. This is a beachhead ready for the taking.
  3. Events Beach — One of the key local niches still left for the taking is the organizing of all events into an application that helps people find and participate. The ultimate user application here will be portable, for it must meet the needs of people already on-the-go. I refer to this beachhead as “event-driven news,” and it is largely created and maintained by the community itself. Since many events dovetail with retail seasons, this is easily low-hanging beachhead fruit.
  4. Personal Branding Beach — If everybody is a media company then media is everybody. This is a fundamental reality within which we’re doing business today, and it presents a unique opportunity for us and our employees. The aggregation of personal brands is a winning formula for online media, and we should be exploiting it before somebody else does. Our people are our strongest asset for competing in the everybody’s-a-media-company world, and we have the advantage of a bully pulpit from which to advance their personal brands. This is more important than most people think, because the dynamic local news brands of tomorrow will be associated with the individual brands of the community. The time to begin establishing this beachhead is now.
  5. Proximity Advertising Beach — The mobile beachhead is both obvious and obscured, because we’re all waiting for somebody to show us how to do it. This could be a real problem, for we know what happened when we allowed the ad industry itself to commodify banner advertising. Outsiders set the value for our products. The same thing is likely to happen here, unless we stake out territory for ourselves downstream first. There are predictions that mobile CPMs will hold at between $15 and $25, and that’s enough to make any mobile content creator smile, but I would argue that the real money hasn’t even been discovered yet, because these CPMs are merely targeted display. Remember that the Mobile Web is the same Web as the one that’s wired, and it behaves the same way. The new value for mobile is proximity, and that’s where we need to be focusing. Let’s do what we can to make money with mobile content, but let’s also establish a beachhead in the proximity marketing arena, too, because that’s where this particular puck is headed.

If we approach these beachheads entirely with the question “where’s the money,” we’re likely to miss the boat. This strategy is to get us ahead of that and let the revenue grow into it. None of these will break the bank, and they’ll position us to move quickly regardless of which direction things move or how fast.
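Terry’s CPM numbers in item 5 are easy to sanity-check with back-of-the-envelope arithmetic. CPM is cost per thousand ad impressions, so revenue scales as impressions divided by a thousand, times the CPM. The two-million-impression figure below is a made-up example station, not anything from his post:

```python
# CPM = cost per thousand ad impressions, so:
#   revenue = impressions / 1000 * CPM
def monthly_revenue(impressions, cpm_dollars):
    """Ad revenue for one month at a given CPM, in dollars."""
    return impressions / 1000 * cpm_dollars

# A hypothetical station serving 2 million mobile impressions a month,
# at the $15 and $25 CPMs Terry cites:
low = monthly_revenue(2_000_000, 15)
high = monthly_revenue(2_000_000, 25)
print(low, high)  # 30000.0 50000.0
```

Real money for a small operation, though nowhere near broadcast ad revenue — which is Terry’s point about the undiscovered value beyond targeted display.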

Live local streaming on the Net is a huge beachhead. I see it on that kitchen iPad, which only gives me Al Jazeera when I want to know what’s going on in the world. The next best thing, in terms of moving images, is looking out the window while listening to the radio. Local TV can storm the beach here, and build a nice new business on the shore. And navigating the copyright mess is likely to be a lot easier locally over the Net than it is nationally over the air or cable. (Thank you, regulators and their captors.)

And hey, maybe this can give Al Jazeera some real competition. Or at least some company on TV’s new dial.

[Later…] Harl‘s comment below made me dig a little, so I’m adding some of my learnings here.

First, if you’re getting TV over the Net, you’re in a zone that phone and cable companies call “over the top,” or OTT.  ITV Dictionary defines it this way:

Over-the-top – (OTT, Over-the-top Video, Over-the-Internet Video) – Over-the-top is a general term for a service that you utilize over a network that is not offered by that network operator. It’s often referred to as “over-the-top” because these services ride on top of the service you already get and don’t require any business or technology affiliations with your network operator. Sprint is an “over-the-top” long distance service, as they primarily offer long distance over other phone companies’ phone lines. Often there are similarities between the service your network operator offers and the over-the-top provider offers.

Over-the-top services could play a significant role in the proliferation of Internet television and Internet-connected TVs.

This term has been used to (perhaps incorrectly) describe IPTV video also. See Internet (Broadband) TV.

But all the attention within the broadcast industry so far has been on something else with a similar name: over-the-top TV (not just video), which is what you get, say, with Netflix, Hulu, and Apple’s and Google’s set-top boxes. Here’s ITV Dictionary’s definition:

Over-the-top-TV – (OTT) – Over-The-Top Home Entertainment Media – Electronic device manufacturers are providing DVD players, video game consoles and TVs with built-in wireless connectivity. These devices piggyback on an existing wireless network, pull content from the Internet and deliver it to the TV set. Typically these devices need no additional wires, hardware or advanced knowledge on how to operate. Content suited for TV can be delivered via the Internet. These OTT applications include Facebook and YouTube. Also see Internet-connected TVs.

No wonder TVNewsCheck reports Over-The-Top TV at Bottom of Station Plans. Stations are still thinking inside the box, even after the box has morphed into a flat screen. That is, they still think TV is about couch potato farming. The iPhone and the iPad changed that. Android-based devices will change it a lot more. Count on it.

Since Al Jazeera English is distributed over the top by LiveStation, I checked to see what else LiveStation has. They say they have apps for CNBC, BBC World News and two other Al Jazeera channels, but on iTunes (at least here in the U.S.) only the three Al Jazeera channels are listed as LiveStation offerings. LiveStation does have its own app for computers (Linux, Mac and Windows), though; and it has a number of channels (not including CNBC) at its own site. I just tried NASA TV there on my iPhone, and it looks good.

Still, apps are the new dial, at least for now, so iPhone and Android apps remain the better beachhead for local stations looking for a new top, after their towers and cable TV get drowned by the Net.

I first heard about the “World Live Web” when my son Allen dropped the phrase casually in conversation, back in 2003. His case was simple: the Web we had then was underdeveloped and inadequate. Specifically, it was static. Yes, it changed over time, but not in a real-time way. For example, we could search in real time, but search engine indexes were essentially archives, no matter how often they were updated. So it was common for Google’s indexes, even of blogs, to be a day or more old. PubSub and other live RSS-fed search engines came along to address that issue. But they mostly covered blogs and sites with RSS feeds. (Which made sense, since blogs were the most live part of the Web back then. And RSS is still a Live Web thing.)

At the time Allen had a company that made live connections between people with questions and people with answers — an ancestor of  and @Replyz, basically. The Web wasn’t ready for his idea then, even if the Net was.

The difference between the Web and the Net is still an important one — not only because the Web isn’t fully built out (and never will be), but because our concept of the Web remains locked inside the conceptual framework of static things called sites, each with its own servers and services.

We do have live workarounds, for example with APIs, which are good for knitting together sites, services and data. But we’re still stuck inside the client-server world of requests and responses, where we — the users — play submissive roles. The dominant roles are played by the sites and site owners. To clarify this, consider your position in a relationship with a site when you click on one of these:

Your position is, literally, submissive. You know, like this:

But rather than dwell on client-server design issues, I’d rather look at ways we can break out of the submissive-dominant mold, which I believe we have to do in order for the Live Web to get built out for real. That means not inside anybody’s silo or walled garden.
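To make the contrast concrete, here’s a minimal sketch of the two postures. The class and method names are invented for illustration; this is no real service’s API. In the request-response world the client has to keep asking; in a Live Web posture the client registers interest once and the news comes to it:

```python
# Hypothetical sketch: one "site" supporting both postures.
class Site:
    def __init__(self):
        self.posts = []
        self.followers = []

    # Static-Web posture: the client requests, the server responds.
    # To stay current, the client must call this over and over.
    def request_latest(self):
        return self.posts[-1] if self.posts else None

    # Live-Web posture: register once, get notified from then on.
    def follow(self, callback):
        self.followers.append(callback)

    def publish(self, post):
        self.posts.append(post)
        for notify in self.followers:  # the event finds the reader
            notify(post)

site = Site()
received = []
site.follow(received.append)         # subscribe once...
site.publish("breaking: demo post")  # ...and the post is pushed out
print(received)                      # ['breaking: demo post']
```

The difference is who initiates: in the first posture the user is always the supplicant; in the second, the source announces and interested parties hear about it.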

I’ve written about the Live Web a number of times over the years. This Linux Journal piece in 2005 still does the best job, I think, of positioning the Live Web:

There’s a split in the Web. It’s been there from the beginning, like an elm grown from a seed that carried the promise of a trunk that forks twenty feet up toward the sky.

The main trunk is the static Web. We understand and describe the static Web in terms of real estate. It has “sites” with “addresses” and “locations” in “domains” we “develop” with the help of “architects”, “designers” and “builders”. Like homes and office buildings, our sites have “visitors” unless, of course, they are “under construction”.

One layer down, we describe the Net in terms of shipping. “Transport” protocols govern the “routing” of “packets” between end points where unpacked data resides in “storage”. Back when we still spoke of the Net as an “information highway”, we used “information” to label the goods we stored on our hard drives and Web sites. Today “information” has become passé. Instead we call it “content”.

Publishers, broadcasters and educators are now all in the business of “delivering content”. Many Web sites are now organized by “content management systems”.

The word content connotes substance. It’s a material that can be made, shaped, bought, sold, shipped, stored and combined with other material. “Content” is less human than “information” and less technical than “data”, and more handy than either. Like “solution” or the blank tiles in Scrabble, you can use it anywhere, though it adds no other value.

I’ve often written about the problems that arise when we reduce human expression to cargo, but that’s not where I’m going this time. Instead I’m making the simple point that large portions of the Web are either static or conveniently understood in static terms that reduce everything within it to a form that is easily managed, easily searched, easily understood: sites, transport, content.

The static Web hasn’t changed much since the first browsers and search engines showed up. Yes, the “content” we make and ship is far more varied and complex than the “pages” we “authored” in 1996, when we were still guided by Tim Berners-Lee’s original vision of the Web: a world of documents connected by hyperlinks. But the way we value hyperlinks hasn’t changed much at all. In fact, it was Sergey Brin’s and Larry Page’s insights about the meaning of links that led them to build Google: a search engine that finds what we want by giving maximal weighting to sites with the most inbound links from other sites that have the most inbound links. Although Google’s PageRank algorithm now includes many dozens of variables, its founding insight has proven extremely valid and durable. Links have value. More than anything else, this accounts for the success of Google and the search engines modeled on it.

Among the unchanging characteristics of the static Web is its nature as a haystack. The Web does have a rudimentary directory with the Domain Name Service (DNS), but beyond that, everything to the right of the first single slash is a big “whatever”. UNIX paths (/whatever/whatever/whatever/) make order a local option of each domain. Of all the ways there are to organize things—chronologically, alphabetically, categorically, spatially, geographically, numerically—none prevails in the static Web. Organization is left entirely up to whoever manages the content inside a domain. Outside those domains, the sum is a chaotic mass beyond human (and perhaps even machine) comprehension.

Although the Web isn’t organized, it can be searched as it is in the countless conditional hierarchies implied by links. These hierarchies, most of them small, are what allow search engines to find needles in the World Wide Haystack. In fact, search engines do this so well that we hardly pause to contemplate the casually miraculous nature of what they do. I assume that when I look up linux journal diy-it (no boolean operators, no quotes, no tricks, just those three words), any of the big search engines will lead me to the columns I wrote on that subject for the January and February 2004 issues of Linux Journal. In fact, they probably do a better job of finding old editorial than our own internal searchware. “You can look it up on Google” is the most common excuse for not providing a search facility for a domain’s own haystack.

I bring this up because one effect of the search engines’ success has been to concretize our understanding of the Web as a static kind of place, not unlike a public library. The fact that the static Web’s library lacks anything resembling a card catalog doesn’t matter a bit. The search engines are virtual librarians who take your order and retrieve documents from the stacks in less time than it takes your browser to load the next page.

In the midst of that library, however, there are forms of activity that are too new, too volatile, too unpredictable for conventional Web search to understand fully. These compose the live Web that’s now branching off the static one.

The live Web is defined by standards and practices that were nowhere in sight when Tim Berners-Lee was thinking up the Web, when the “browser war” broke out between Netscape and Microsoft, or even when Google began its march toward Web search domination. The standards include XML, RSS, OPML and a growing pile of others, most of which are coming from small and independent developers, rather than from big companies. The practices are blogging and syndication. Lately podcasting (with OPML-organized directories) has come into the mix as well.

These standards and practices are about time and people, rather than about sites and content. Of course blogs still look like sites and content to the static Web search engines, but to see blogs in static terms is to miss something fundamentally different about them: they are alive. Their live nature, and their humanity, defines the live Web.

This was before Twitter not only made the Web live, but did it in part by tying it to SMS on mobile phones. After all, phones work in the real live world.

Since then we’ve come to expect real-time performance out of websites and services. Search not only needs to be up-to-date, but up-to-now. APIs need to perform in real time. And many do. But that’s not enough. And people get that.

For example, take a piece titled Life in 2020: Your smartphone will do your laundry. It’s a good future-oriented piece, but it has two problems that go back to a Static Web view of the world. The first problem is that it sees the future being built by big companies: Ericsson, IBM, Facebook, Microsoft and Qualcomm. The second problem is that it sees the Web, ideally, as a private thing. There’s no other way to interpret this:

“What we’re doing is creating the Facebook of devices,” said IBM Director of Consumer Electronics Scott Burnett. “Everything wants to be its friend, and then it’s connected to the network of your other device. For instance, your electric car will want to ‘friend’ your electric meter, which will ‘friend’ the electric company.”

Gag me with one of these:

This social shit is going way too far. We don’t need the “Facebook” of anything besides Facebook. In fact, not all of us need it, and that’s how the world should be.

Phil Windley gagged on this too. In A Completely Connected World Depends on Loosely Coupled Architectures, he writes,

This is how these articles always are: “everything will have a network connection” and then they stop. News flash: giving something a network connection isn’t sufficient to make this network of things useful. I’ll admit the “Facebook of things” comment points to a strategy. IBM, or Qualcomm, or AT&T, or someone else would love to build a big site that all our things connect to. Imagine being at the center of that. While it might be some IBM product manager’s idea of heaven, it sounds like dystopian dyspepsia to me.

This reminds me of a May 2001 Scientific American article on the Semantic Web where Tim Berners-Lee, James Hendler, and Ora Lassila give the following scenario:

“The entertainment system was belting out the Beatles’ ‘We Can Work It Out’ when the phone rang. When Pete answered, his phone turned the sound down by sending a message to all the other local devices that had a volume control. His sister, Lucy, was on the line from the doctor’s office: …”

Sound familiar? How does the phone know what devices have volume controls? How does the phone know you want the volume to turn down? Why would you program your phone to turn down the volume on your stereo? Isn’t the more natural place to do that on the stereo? While I love the vision, the implementation and user experience is a nightmare.

The problem with the idea of a big Facebook of Things kind of site is the tight coupling that it implies. I have to take charge of my devices. I have to “friend” them. And remember, these are devices, so I’m going to be doing the work of managing them. I’m going to have to tell my stereo about my phone. I’m going to have to make sure I buy a stereo system that understands the “mute the sound” command that my phone sends. I’m going to have to tell my phone that it should send “mute the sound” commands to the phone and “pause the movie” commands to my DVR and “turn up the lights” to my home lighting system. No thanks.

The reason these visions fall short and end up sounding like nightmares instead of Disneyland is that we have a tough time breaking out of the request-response pattern of distributed devices that we’re all too familiar and comfortable with.

David Weinberger tried to get us uncomfortable early in the last decade, with his book Small Pieces Loosely Joined. One of its points: “The Web is doing more than just speeding up our interactions and communications. It’s threading and weaving our time, and giving us more control over it.” Says Phil,

…the only way these visions will come to pass is with a new model that supports more loosely coupled modes of interaction between the thousands of things I’m likely to have connected.

Consider the preceding scenario from Sir Tim modified slightly.

“The entertainment system was belting out the Beatles’ ‘We Can Work It Out’ when the phone rang. When Pete answered, his phone broadcasts a message to all local devices indicating it has received a call. His stereo responded by turning down the volume. His DVR responded by pausing the program he was watching. His sister, Lucy, …”

In the second scenario, the phone doesn’t have to know anything about other local devices. The phone need only indicate that it has received a call. Each device can interpret that message however it sees fit or ignore it altogether. This significantly reduces the complexity of the overall system because individual devices are loosely coupled. The phone software is much simpler and the infrastructure to pass messages between devices is much less complex than an infrastructure that supports semantic discovery of capabilities and commands.

Events, the messages about things that have happened are the key to this simple, loosely coupled scenario. If we can build an open, ubiquitous eventing protocol similar to the open, ubiquitous request protocol we have in HTTP, the vision of a network of things can come to pass in a way that doesn’t require constant tweaking of connections and doesn’t give any one silo (company) control it. We’ve done this before with the Web. It’s time to do it again with the network of things. We don’t need a Facebook of Things. We need an Internet of Things.

I call this vision “The Live Web.” The term was first coined by Doc Searls’ son Allen to describe a Web where timeliness and context matter as much as relevance. I’m in the middle (literally half done) of a book I’m calling The Live Web: Putting Cloud Computing to Work for People. The book describes how events and event-based systems can more easily create the Internet of Things than the traditional request-response style of building Web sites. I’m excited for it to be done. Look for a summer publishing date. In the meantime, if you’re interested I’d be happy to get your feedback on what I’ve got so far.

Again, Phil’s whole post is here.

I compiled a list of other posts that deal with various VRM issues, including Live Web ones, at the ProjectVRM blog.

If you know about other Live Web developments, list them below. Here’s the key: They can’t depend on any one company’s server or services. That is, the user — you — have to be the driver, and to be independent. This is not to say there can’t be dependencies. It is to say that we need to build out the Web that David Weinberger describes in Small Pieces. As Dave Winer says in The Internet is for Revolution, don’t just think decentralized. (Or re-decentralized, though that’s a fine thing. As is rebooting.) Think distributed. As I explained last year here,

… the Net is not centralized. It is distributed: a heterarchy rather than a hierarchy. At the most basic level, the Net’s existence relies on protocols rather than on how any .com, .org, .edu or .gov puts those protocols to use.

The Net’s protocols are not servers, clouds, wires, routers or code bases. They are agreements about how data flows to and from any one end point and any other. This makes the Internet a world of endsrather than a world of governments, companies and .whatevers. It cannot be reduced to any of those things, any more than time can be reduced to a clock. The Net is as oblivious to usage as are language and mathematics — and just as supportive of every use to which it is put. And, because of this oblivity, The Net supports all without favor to any.

Paul Baran contrasted centralized systems (such as governments), decentralized ones (such as Twitter+Facebook+Google, etc.) and distributed ones, using this drawing in 1964:

Design C became the Internet. Except the Internet is actually more like D in this version here:

Because on the Internet you don’t have to be connected all the time. And any one node can connect to any other node. Or to many nodes at once. Optionality verges on the absolute.

Time to start living. Not just submitting.

Al Jazeera story

Cable companies: Add Al Jazeera English *now* Jeff Jarvis commands, correctly, on his blog — and also in , under the headline . For me now was a few minutes ago, when I read both items on the family iPad, which has been our main news portal since the Globe quit coming and I suspended my efforts to reach them by Web or phone. (The Globe also wants a bunch of ID crap when I go there on the iPad, so they’re silent that way too.) So I went to the App Store, looked up Al Jazeera, saw something called Al Jazeera English Live was available for free, got it, and began watching live protest coverage from Cairo.

We don’t have cable here. We dumped it after network news turned to shit, and we found it was easier to watch movies on Netflix. We still like to watch sports, but cable for sports alone is too expensive, because it’s always bundled with junk we don’t want and not available à la carte. (You know, like stuff is on the Web.) When we want TV news, we go online or get local TV through an gizmo plugged into an old Mac laptop. Works well, but it’s still TV.

And so is Al Jazeera on an iPad/iPhone, Samsung Wave or a Nokia phone. (See http://english.aljazeera.net/mobile/ for details. No Android or Blackberry yet, apparently.) The difference is that real news is happening in Egypt, and if you want live news coverage in video form, Al Jazeera is your best choice. As Jeff puts it, “Vital, world-changing news is occurring in the Middle East and no one — not the xenophobic or celebrity-obsessed or cut-to-the-bone American media — can bring the perspective, insight, and on-the-scene reporting Al Jazeera English can.”

And it’s very good. , “If you’re watching Al Jazeera, you’re seeing uninterrupted live video of the demonstrations, along with reporting from people actually on the scene, and not “analysis” from people in a studio. The cops were threatening to knock down the door of one of its reporters minutes ago. Fox has moved on to anchor babies. CNN reports that the ruling party building is on fire, but Al Jazeera is showing the fire live.”

In fact six Al Jazeera journalists are now being detained (I just learned). That kind of thing happens when your news organization is actually involved in a mess like this. CNN used to be that kind of organization, but has been in decline for years, along with other U.S. network news organizations. As Jeff says, “What the Gulf War was to CNN, the people’s revolutions of the Middle East are to Al Jazeera English. But in the U.S., in a sad vestige of the era of Freedom Fries, hardly anyone can watch the channel on cable TV.”

And that’s a Good Thing, because cable is a mostly shit in a pipe, sphinctered through a “set top box” that’s actually a computer crippled in ways that maximize control by the cable company and minimize choice for the user. Fifteen years ago, the promise of TV was “five hundred channels”. We have that now, but we also have billions of sources — not just “channels” — over the Net. Cream rises to the top, and right now that cream is Al Jazeera and the top is a hand-held device.

The message cable should be getting is not just “carry Al Jazeera,” but “normalize to the Internet.” Open the pipes. Give us à la carte choices. Let us get and pay for what we want, not just what gets force-fed in bundles. Let your market — your viewers — decide what’s worth watching, and how they want to watch it. And quit calling Internet video “over the top”. The Internet is the new bottom, and old-fashioned channel-based TV is a limping legacy.

A few days ago, President Obama spoke about the country’s “Sputnik moment”. Well, that’s what Al Jazeera in Egypt is for cable TV. It’s a wake-up call from the future. In that future we’ll realize that TV is nothing more than a glowing rectangle with a boat-anchor business model. Time to cut that anchor and move on.

Here’s another message from the future, from one former cable TV viewer: I’d gladly pay for Al Jazeera. Even when I can also get it for free. All we need is the mechanism, and I’m glad to help with that.


So now KDFC is on 90.3 and 88.9, while KUSF is off the air. (Though it does have a Live365 stream.) Radio Valencia, a pirate radiating out of the Mission district on 87.9, has expressed sympathy with KUSF’s exiled volunteers, and has provided some airtime as well. The University of San Francisco, which sold the 90.3 license to the University of Southern California, currently has KUSF.org re-directing to this 9-day-old press release.

In my last post I suggested that KUSF’s volunteers apply for 87.7 as a licensed low power TV station. (As fate has it, the audio for Channel 6 TV is roughly on 87.7). I had forgotten about Radio Valencia when I wrote that. Perhaps the two groups can get together and go after 87.7, if that window is actually open.

The KUSF community (at SaveKUSF.org) remains committed to getting their frequency back. The likelihood of this rounds to zero, but I wish them luck. (They’re having some with SF supervisors.) I still think the future of radio is over the Net in any case. Going forward in that direction, a big question for KUSF’s community is how it can keep dealing with USF, which will provide the streaming, the studio, the record library and other essentials, such as the KUSF brand, which is the university’s intellectual property. I’ll be interested in hearing how that non-divorce works out.

Meanwhile there is the matter of expanding KDFC. On KQED’s Forum last week, Brenda Barnes, president of USC radio (which bought KUSF’s license and is moving KDFC there) and managing director of the Classical Public Radio Network (which will operate KDFC locally), said many times that her organizations are looking to buy a signal, or signals, in the South Bay, where KDFC can’t be heard from either of its new facilities (the old KUSF on 90.3 and the old KNDL in Angwin on 89.9).

It could be that the USC people are also already thinking about 87.7 (the Channel 6 TV hack) in the South Bay. If that radiates from one of the mountains down there, it would do a good job. (The signal would be weak, but reach far, kind of like KFJC does now). That would be the best solution, I think; but it would also foreclose the 87.7 option for KUSF-in-exile, essentially screwing them over a second time. (So, there’s an assignment for both KUSF and Radio Valencia. Hurry up and see what can be done.)

The more likely option for KDFC is finding a college or university that would rather have money than continue operating a radio station, especially when a buyer comes calling. That’s the option USF took, and it’s a certain bet that Brenda Barnes and friends are already hard at work selling the same options to one or more of these FMs in the South Bay:

  • 89.1 KCEA Atherton, owned by Menlo-Atherton High School. Broadcasts with 100 watts from a ridge near San Carlos. Small signal.
  • 89.3 KOHL Fremont, owned by Ohlone Community College. Covers the eastern part of the South Bay with 145 watts from the college campus in the foothills.
  • 89.7 KFJC Los Altos, owned by Foothill Junior College. Covers the South Bay well, from Black Mountain, with just 108 watts. This is the KUSF of the South Bay, and the station/community with the most to worry about.
  • 90.1 KZSU Stanford, owned by Stanford University. Covers Palo Alto and the central Peninsula with 500 watts from a hill on The Farm. KDFC’s 90.3 signal in San Francisco protects KZSU with a null in the direction of Stanford. The option here for the KDFC folks would be to buy KZSU and turn it into a KDFC repeater, or to take it dark and crank up the San Francisco signal. But then, there’s also…
  • 90.5 KSJS San Jose, owned by San Jose State University. This too has a community. And it covers the San Jose end of the South Bay well with 1500 watts on a high hill on the south side of town. 90.3 in The City also protects KSJS, so the same options for KDFC apply here as with KZSU.
  • 91.1 KCSM San Mateo, owned by the College of San Mateo. This is the Bay Area’s much-loved jazz station, and covers the Peninsula and Mid-South Bay pretty well, plus Oakland-Berkeley. Wattage-wise, it’s the most powerful of the options (11,000 watts), though the transmitter is not on a high site.
  • 91.5 KKUP Cupertino, owned by the Assurance Science Foundation. With 200 watts on Loma Prieta Mountain, KKUP reaches a large area, including all of Monterey Bay (Santa Cruz, Salinas, etc.) as well as the south part of the South Bay.

Another possibility for KDFC is buying a commercial station in the South Bay. There are many of those to choose from, if any is willing to sell. None will be cheap, but most would be better than the options above, with the conditional exceptions of KCSM and KFJC. For example, KCNL on 104.9, which Clear Channel unloaded last year for $5 million, would have been a good deal for the USC people. It serves the South Bay quite well with a 6,000 watt signal from the foothills near San Jose. KRTY from Los Gatos on 95.3 is another one with a similar-sized signal.

In any case, we know who is on the hunt and why. If they succeed, KDFC listeners should be happy. Listeners to the replaced station, or stations, will not be. Looking at the ratings, I am betting that there are more of the former than the latter. In the most recent rating period, KDFC was Number 7 overall (out of many dozens of signals), with a 3.9% share of Average Quarter Hour listening, which is great for any station and huge for a classical one. It also had a cumulative audience of 632,000 people, none of whom can get the station today on the signal they listened to during that ratings period.

[Later…] A February 10 post at RadioSurvivor.com.
