2009


So I just followed this tweet by Chris Messina to Mike Arrington’s The End of Hand Crafted Content. The tweet-bite: “The rise of fast food content is upon us, and it’s going to get ugly.” Meaning that FFC “will surely, over time, destroy the mom and pop operations that hand craft their content today. It’s the rise of cheap, disposable content on a mass scale, force fed to us by the portals and search engines.”

Just as an aside, I’ve been hand-crafting (actually just typing) my “content” for about twenty years now, and I haven’t been destroyed by a damn thing. I kinda don’t think FFC is going to shut down serious writers (no matter where and how they write) any more than McDonald’s killed the market for serious chefs.

Mike explains, “On one end you have AOL and their Toyota Strategy of building thousands of niche content sites via the work of cast-offs from old media. That leads to a whole lot of really, really crappy content being highlighted right on the massive AOL home page… On the other end you have Demand Media and companies like it. See Wired’s ‘Demand Media and the Fast, Disposable, and Profitable as Hell Media Model’… They push SEO juice to this content, which is made as quickly and cheaply as possible, and pray for traffic. It works like a charm, apparently.” By “works” I suppose Mike means that they make money.

His penultimate point:

My advice to readers is just this — get ready for it, because you’ll be reading McDonalds five times a day in the near future. My advice to content creators is more subtle. Figure out an even more disruptive way to win, or die. Or just give up on making money doing what you do. If you write for passion, not dollars, you’ll still have fun. Even if everything you write is immediately ripped off without attribution, and the search engines don’t give you the attention they used to. You may have to continue your hobby in the evening and get a real job, of course.

Good advice. In my own case, I sometimes make money writing, but usually I don’t. I do get paid well for my counsel (and my speaking), mostly because of what I’ve been writing in places like this. SEO for me is linking and crediting generously. That works like a charm, too. And I have fun doing what I trust is good work in the world. That has SEO qualities as well. (None of it is a hobby, though. At least I don’t think of it that way. And if I don’t, it isn’t.)

Mike concludes, “Forget fair and unfair, right and wrong. This is simply happening. The disruptors are getting disrupted, and everyone has to adapt to it or face the consequences. Hand crafted content is dead. Long live fast food content, it’s here to stay.”

Well, no. Nothing with real value is dead, so long as it can be found on the Web and there are links to it. Humans are the ones with hands. Not intermediaries. Not AOL, or TechCrunch, or HuffPo, or Google or the New York Freaking Times. The Net is the means to our ends, not The Media, whether they be new disruptors or old disruptees. The Net and the Web liberate individuals. They welcome intermediators, but they do not require them. Even in cases where we start with intermediation — and get to use really good ones — what matters most is what each of us as individuals brings to the Net’s table. Not the freight system that helps us bring it there, no matter how established or disruptive that system is.

The title of this post plays off the 1971 poem/song “The Revolution Will Not Be Televised”, by Gil Scott-Heron. The passage that stands out for me is this one:

The revolution will not be right back after a message
about a white tornado, white lightning, or white people.
You will not have to worry about a dove in your
bedroom, a tiger in your tank, or the giant in your toilet bowl.
The revolution will not go better with Coke.
The revolution will not fight the germs that may cause bad breath.
The revolution will put you in the driver’s seat.

The lyrics were not addressed to me, a white guy from the suburbs, but they spoke to me all the same. Especially that last line.

We still seem to think that progress on the Net is the work of “brands” creating and disrupting and doing other cool stuff. Those may help, but what matters most is what each of us does better than anybody or anything else. The term “content” insults the nature of that work. And of its sources.

The revolution that matters — the one that will not be intermediated — is the one that puts each of us in the driver’s seat, rather than in the back of the bus. Or on a bus at all.

Egging on

Nice piece in the Baltimore Sun (front page of the Arts section, above the fold) about The Crystal Egg, a new production we’ll be seeing this weekend. Written and directed by Colette Searls. More here.

[Photo: stpaul_paternoster]
So I’ve been out and about London the last 2-3 days. Had a great time. Beautiful city in Christmas season, even (or perhaps especially) in the rain. Not much connectivity, or time to connect, actually. The above is one of the few pix I took, before breakfast with JP Rangaswami this (or yesterday, depending) morning. Shot it with a little pocket camera. Not bad, considering. Moon over a spire of St. Paul’s Cathedral from Paternoster Square, one of my haunts there. I leave in a few hours for DC, then Boston. See y’all stateside.

I was gonna tweet this, but Twitter’s down again. #LeWeb, I guess.


Empowering the Internet One American at a Time is an excellent post by Erik Cecil, a battle-hardened telecom lawyer whose ability to see the Big Picture, and around all its curves, continues to delight me. The post first appeared on a mail list, and is addressed primarily to fellow Internet and telecom obsessives (myself included). Here are its opening paragraphs:

From this lawyer’s perspective, regulation mostly puts the legal power in the hands of carriers and regulators. The Internet puts technology in the hands of everyday people. There’s a mismatch. I’ve offered here and in other places simple ways to fix that near term, but as you may see from discussions in policy, legal, technical, and economic circles, we get into all sorts of interesting chats about history and this and that, but few actually take on the political realities and industry issues head-on. Connectivity sucks in every state because we subsidize to the tune of billions of dollars per year ancient technologies, force new ones into those shoehorns, and drive costs through the roof. Industry, particularly competitive industry is hemmed in on one side by what by any monetary measure is monopoly and on the other by regulators. Since industry is terrified of getting under the skin of the regulators (with good reason in many respects – they can be vindictive at times; happy to take anyone through any dozen briefs, recommended decisions and commission decisions), there’s a lot of dancing around the issue, but few, IMHO, really run it to ground.

Very simply: federalize regulation BUT put the rights in the hands of individuals rather than the always hyper-political state PUCs, which, as you note and as has been discussed on this list and other lists for years, tend to be self-serving in how they cut up their data. Unless and until we flatten regulation, it will continue to flatten us. The little guys cannot afford the legal and political horsepower it takes to compete. Trust me; I’ve run some of the biggest ones around (at least from the competitive side) and I still deal with this on a daily basis.

More fodder for this morning’s session at Supernova.


Yesterday the FCC released a public notice seeking comment on the “transition from circuit switched network to all-IP network.” (Here’s the .pdf. Here’s the .txt version.) Translation: from the phone system to the Internet.

This is huge. Really. Freaking. Huge.

Or maybe not. Could be it’s all just posturing or worse. But I don’t think so. Or I hope not.

Either way, it matters. For better and worse, the Internet reposes in legal as well as technical infrastructures.

The money text:

The intent of this Public Notice is to set the stage for the Commission to consider whether to issue a Notice of Inquiry (NOI) relating to the appropriate policy framework to facilitate and respond to the market-led transition in technology and services, from the circuit switched PSTN system to an IP-based communications world.

In the spirit of understanding the scope and breadth of the policy issues associated with this transition, we seek public comment to identify the relevant policy questions that an NOI on this topic should raise in order to assist the Commission in considering how best to monitor and plan for this transition.

In identifying the appropriate areas of inquiry, we seek to understand which policies and regulatory structures may facilitate, and which may hinder, the efficient migration to an all IP world. In addition, we seek to identify and understand what aspects of traditional policy frameworks are important to consider, address, and possibly modify in an effort to protect the public interest in an all-IP world.

The italics are mine.

There is a high degree of presumption here. I mean, are we really migrating to an all-IP world? All? Most of us still watch plenty of television. And, in the immortal words of Weird Al Yankovic, we all have cell phones. Neither TV nor cellular telephony is even close to an “all-IP world.” IP might be involved, but … there is some distance to cover here. And not much motivation by phone companies to make the move.

Still, we can see it happening. Your smartphone today is a data device that happens to run a lot of applications, which include both telephony and television. Yet the bill you get for using your phone (no matter how smart it is) comes from a phone company. The underlying infrastructure, including 3G, is largely a phone system. It handles data, and it’s mostly digital, but it is not fundamentally a data system. It’s a phone system built for billing by the minute. Or even the second.

Can we change phone systems into all-IP data systems? I would hope so.

But before I go any deeper, I want to plug my panel tomorrow morning (8:30am Pacific) at Supernova (#sn09). The title is Telecom as Software. Any questions you want me to ask, or topics you want me to cover, put them below.

“I make my living off the Evening News
Just give me something: something I can use
People love it when you lose
They love dirty laundry.”

Don Henley, “Dirty Laundry”

Look up “Wikipedia loses” (with the quotes) and you get 20,800 results. Look up “Wikipedia has lost” and you get 56,900. (Or at least that’s what I got this morning.) Most of those results tell a story, which is what news reports do. “What’s the story?” may be the most common question asked of reporters by their managing editors. As humans, we are interested in stories — even if they’re contrived, which is what we have with all “reality” television shows.

Lately Wikipedia itself is the subject of a story about losing editors. The coverage snowball apparently started rolling with Volunteers Log Off as Wikipedia Ages, by Julia Angwin and Geoffrey A. Fowler in The Wall Street Journal. It begins,

Wikipedia.org is the fifth-most-popular Web site in the world, with roughly 325 million monthly visitors. But unprecedented numbers of the millions of online volunteers who write, edit and police it are quitting.

That could have significant implications for the brand of democratization that Wikipedia helped to unleash over the Internet — the empowerment of the amateur.

Volunteers have been departing the project that bills itself as “the free encyclopedia that anyone can edit” faster than new ones have been joining, and the net losses have accelerated over the past year. In the first three months of 2009, the English-language Wikipedia …

That’s all you get without paying. Still, it’s enough.

Three elements make stories interesting: 1) a protagonist we know, or who is at least interesting; 2) a struggle of some kind; and 3) movement (or possible movement) toward a resolution. Struggle is at the heart of a story. There has to be a problem (what to do with Afghanistan), a conflict (a game between good teams, going to the final seconds), a mystery (wtf was Tiger Woods’ accident all about?), a wealth of complications (Brad and Angelina), a crazy success (the iPhone), failings of the mighty (Nixon and Watergate). The Journal’s Wikipedia story is of the Mighty Falling variety.

The Journal’s source is Wikipedia: A Quantitative Analysis, a doctoral thesis by José Felipe Ortega of Universidad Rey Juan Carlos in Madrid. (The graphic at the top of this post is one among many from the study.) In Wikipedia’s Volunteer Story, Erik Moeller and Erik Zachte of the Wikimedia Foundation write,

First, it’s important to note that Dr. Ortega’s study of editing patterns defines as an editor anyone who has made a single edit, however experimental. This results in a total count of three million editors across all languages.  In our own analytics, we choose to define editors as people who have made at least 5 edits. By our narrower definition, just under a million people can be counted as editors across all languages combined.  Both numbers include both active and inactive editors.  It’s not yet clear how the patterns observed in Dr. Ortega’s analysis could change if focused only on editors who have moved past initial experimentation.

Even more importantly, the findings reported by the Wall Street Journal are not a measure of the number of people participating in a given month. Rather, they come from the part of Dr. Ortega’s research that attempts to measure when individual Wikipedia volunteers start editing, and when they stop. Because it’s impossible to make a determination that a person has left and will never edit again, there are methodological challenges with determining the long term trend of joining and leaving: Dr. Ortega qualifies as the editor’s “log-off date” the last time they contributed. This is a snapshot in time and doesn’t predict whether the same person will make an edit in the future, nor does it reflect the actual number of active editors in that month.

Dr. Ortega supplements this research with data about the actual participation (number of changes, number of editors) in the different language editions of our projects. His findings regarding actual participation are generally consistent with our own, as well as those of other researchers such as Xerox PARC’s Augmented Social Cognition research group.

What do those numbers show?  Studying the number of actual participants in a given month shows that Wikipedia participation as a whole has declined slightly from its peak 2.5 years ago, and has remained stable since then. (See WikiStats data for all Wikipedia languages combined.) On the English Wikipedia, the peak number of active editors (5 edits per month) was 54,510 in March 2007. After a more significant decline by about 25%, it has been stable over the last year at a level of approximately 40,000. (See WikiStats data for the English Wikipedia.) Many other Wikipedia language editions saw a rise in the number of editors in the same time period. As a result the overall number of editors on all projects combined has been stable at a high level over recent years. We’re continuing to work with Dr. Ortega to specifically better understand the long-term trend in editor retention, and whether this trend may result in a decrease of the number of editors in the future.

They add details that amount to not much of a story, if you consider all the factors involved, including the maturity of Wikipedia itself.
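To see how much the threshold matters, here is a minimal sketch of the counting, in Python with invented sample data. The records and numbers are mine, purely for illustration, and have nothing to do with Wikimedia’s or Dr. Ortega’s actual tooling:

```python
from collections import Counter

# (username, month) pairs, one per edit. Invented data for illustration only.
edits = [
    ("alice", "2009-03"), ("alice", "2009-03"), ("alice", "2009-04"),
    ("alice", "2009-04"), ("alice", "2009-05"),
    ("bob",   "2009-03"),                       # one experimental edit, then gone
    ("carol", "2009-03"), ("carol", "2009-03"),
]

edits_per_user = Counter(user for user, _ in edits)

anyone_with_an_edit = {u for u, n in edits_per_user.items() if n >= 1}
five_plus_editors   = {u for u, n in edits_per_user.items() if n >= 5}

print(len(anyone_with_an_edit))   # 3 -- the broad definition used in the thesis
print(len(five_plus_editors))     # 1 -- Wikimedia's narrower definition (5+ edits ever)

# "Active editors" in the WikiStats sense means 5+ edits in a single month,
# which is a different count again:
edits_per_user_month = Counter(edits)
active_by_month = Counter(
    month for (user, month), n in edits_per_user_month.items() if n >= 5
)
print(active_by_month)            # Counter() -- nobody makes 5 edits in one month here
```

The same little log yields three “editors” by the one-edit definition, one by the five-edit definition, and none who count as “active” in any single month, which is roughly the gap the two Eriks are pointing at.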

As it happens I’m an editor of Wikipedia, at least by the organization’s own definitions. I’ve made fourteen contributions, starting with one in April 2006, and ending, for the moment, with one I made this morning. Most involve a subject I know something about: radio. In particular, radio stations, and rules around broadcast engineering. The one this morning involved edits to the WQXR-FM entry. The edits took a lot longer than I intended — about an hour, total — and were less extensive than I would have made, had I given the job more time and had I been more adept at editing references and citations. (It’s pretty freaking complicated.) The preview method of copy editing is also time-consuming and endlessly iterative. It was sobering to see how many times I needed to go back and forth between edits and previews before I felt comfortable that I had contributed accurate and well-written copy.

In fact, as I look back over my fourteen editing efforts, I can see that most of them were to some degree experimental. I wanted to see if I had what it took to be a dedicated Wikipedia editor, because I regard that as a High Calling. The answer so far is a qualified no. I’ll continue to help where I can. But on the whole my time is better spent doing other things, some of which also have leverage with Wikipedia, but not of the sort that Dr. Ortega measured in his study.

For example, photography.

As of today you can find 113 photos on Wikimedia Commons that I shot. Most of these have also found use in Wikipedia. (Click “Check Usage” at the top of any shot to see how it’s been used, and where.) I didn’t put any of these shots in Wikimedia Commons, nor have I put any of them in Wikipedia. Other people did all of that. To the limited degree I can bother to tell, I don’t know anybody who has done any of that work. All I do is upload shots to my Flickr site, caption and tag them as completely as time allows, and let nature take its course. I have confidence that at least some of the shots I take will be useful. And the labor involved on my part is low.

I also spent about half an hour looking through Dr. Ortega’s study. My take-away is that Wikipedia has reached a kind of maturity, and that the fall-off in participation is no big deal. This is not to say that Wikipedia doesn’t have problems. It has plenty. But I see most of those as features rather than as bugs, even if they sometimes manifest, at least superficially, as the latter. That’s not much of a story, but it’s a hell of an accomplishment.


I just posted Rupert Murdoch vs. The Web, over at Linux Journal. In it I suggest that the Murdoch story (played mostly as Bing vs Google) is a red herring, and that the real challenge is to free the Web and ourselves from dependencies on giant companies I liken to volcanoes:

We’re Pompeians, Krakatoans, Montserratans, building cities and tilling farms on the slopes of active volcanoes. Always suckers for stories, we’d rather take sides in wars between competing volcanoes than build civilization on more flat and solid ground where there’s room enough for everybody.

Google and Bing are both volcanoes. Both grace the Web’s landscape with lots of fresh and fertile ground. They are good to have in many ways. But they are not the Earth below. They are not what gives us gravity.

I think one problem here is a disconnect between belief systems about markets, and the stories that arise from them.

One system believes a free market is Your Choice of Captor. In this camp I put both the make-it/take-it mentality (where “winners” are rewarded and “losers” punished) of the Wall Street Journal (which a few months ago looked upon the regulated duopolies for Internet access as the “free market” at work) and those who see business (or corporations, or capitalism, or all three) as a problem and look to government — another monopoly — for remedy from these evils in the marketplace. In other words, I lump both the left and the right in here, along with the conflicts between them.

The other system sees markets as settings for human activity: the locations, both real and virtual, where people and their organizations meet to do business, make culture, and build civilization. Here I put nearly everybody who contributed the structural agreements that made the Internet possible, and who truly understand what it is and how it works, even if they can’t all agree on what metaphors to use for it. I also include all who have contributed, and continue to contribute, to the free and open code bases with which we are building out our networked world. While political beliefs among members of this system may sort somewhere along the right-vs.-left axis, what they do to build the world is orthogonal to that axis. That’s one big reason why that work escapes notice.

The distinction I see here aligns well with Virginia Postrel’s contrast between “stasists” and “dynamists”. The difference is that much of what gets done to make the networked world (and to support its dynamism) isn’t “dynamic” in the active and dramatic sense of the word — except in its second-order effects. For example, SMTP and IMAP are not dynamic. (Being mannerly technical agreements, protocols don’t do that.) But on those protocols (and related ones) email happened, and the world hasn’t been the same since.

With that distinction in mind, I suggest that too much oxygen gets sucked up by “wars” between the stasists (some of whom are also into the superficially dynamistic attention-suck of vendor sports — here’s an oldie but goodie that still makes my point), and not enough goes to the constructive work of geeks and entrepreneurs who quietly build the original and useful stuff that serves as solid infrastructure on which countless public goods (including wealth creation beyond measure) can be generated.

We have the same problem in most net neutrality arguments. The right hates it, the left loves it. One looks to protect the “free market” of phone and cable companies (currently a Your-Choice-of-Captor system) while the other looks to government (meet your new captor) for relief. When in fact the whole thing has happened all along within what Bob Frankston calls The Regulatorium.

The primary dynamism of the Internet — what gave us the Net in the first place, and what holds the most promise in the long run — doesn’t just come from those parties, and can’t be found in the arguments they’re having. It comes from low-box-office geekery that supports enormous new business opportunities (along with many public benefits, with or without business).

It’ll take time to see this, I guess. Just hope we don’t drown in lava in the meantime.

Bonus red herring: A lot of news really isn’t.


@robpatrob (Robert Paterson) asks (responding to this tweet and this post) “Why would GBH line up against BUR? Why have a war between 2 Pub stations in same city?” (In this tweet and this one, Dan Kennedy asks pretty much the same thing.)

The short answer is, Because it wouldn’t be a war. Boston is the world’s largest college town. There’s already a pile of home-grown, radio-ready, program-filling goods here, if one bothers to dig and develop. The standard NPR line-up could also use a challenge from other producers. WGBH is already doing that in the mornings by putting The Takeaway up against Morning Edition. That succeeds for me because now I have more choices. I can jump back and forth between those two (which I do, and Howard Stern as well).

The longer answer is that it gives GBH a start on the inevitable replacement of signal-based radio by multiple streams and podcast line-ups. WGBH has an exemplary record as a producer of television programming, but it’s not setting the pace in other media, including radio. The story is apparent in the first four paragraphs of its About page (which is sure to change):

WGBH is PBS’s single largest producer of content for television (prime-time and children’s programs) and the Web. Some of your favorite series and websites — Nova, Masterpiece, Frontline, Antiques Roadshow, Curious George, Arthur, and The Victory Garden, to name a few — are produced here in our Boston studios.

WGBH also is a major supplier of programs heard nationally on public radio, including The World. And we’re a pioneer in educational multimedia and in media access technologies for people with hearing or vision loss.

Our community ties run deep. We’re a local public broadcaster serving southern New England, with 11 public television services and three public radio services — and productions (from Greater Boston to Jazz with Eric in the Evening) that reflect the issues and cultural riches of our region. We’re a member station of PBS and an affiliate of both NPR and PRI.

In today’s fast-changing media landscape, we’re making sure you can find our content when and where you choose — on TV, radio, the Web, podcasts, vodcasts, streaming audio and video, iPhone applications, groundbreaking teaching tools, and more. Our reach and impact keep growing.

Note the order: TV first, radio second, the rest of it third. But where WGBH needs to lead in the future is with #3: that last paragraph. Look at WGBH’s annual report. It’s very TV-heavy. Compare its radio productions to those of Chicago Public Radio or WNYC. Very strong in classical music (now moving over to WCRB, at least on the air), and okay-but-not-great in other stuff.

Public TV has already become a ghetto of geezers and kids, while the audience between those extremes is diffusing across cable TV and other media. An increasingly negligible number of people watch over-the-air (OTA) TV. Here WGBH lost out too. Its old signal on Channel 2 was huge, reaching more households than any other in New England. Now it’s just another UHF digital signal — like its own WGBX/44, with no special advantages. Public radio is in better shape, for now, because its band isn’t the ever-growing accordion file that cable TV has become; and because most of it still lives in a regulated protectorate at the bottom fifth of the FM band. It also helps public radio that the rest of both the FM and the AM bands suck so royally. (Only sports and political talk are holding their own. Music programming is losing to file sharing and iPods. All-news stations are yielding to iPhone programs that offer better news, weather and traffic reporting. In Boston WBZ is still a landmark news station, but it has to worry a bit with WGBH going in the same direction.)

So the timing is right. WGBH needs to start sinking new wells into the aquifer of smart, talented and original people and organizations here in the Boston area — and taking the lead in producing great new programming with what they find. I’ll put in another plug for Chris Lydon’s Open Source, which is currently available only in podcast/Web form. And there is much more, including Cambridge-based PRX’s enormous portfolio of goods. (Disclosure: my work with the Berkman Center is partially funded through PRX — and those folks, like Chris, are good friends.)

In the long run what will matter are sources, listeners, and the finite amount of time the latter can devote to the former. Not old-fashioned signals.

P.S. to Dan Kennedy’s tweeted question, “Is there another city in the country where two big-time public radio stations go head-to-head on news? Can’t think of one.” Here are a few (though I’d broaden the answer beyond “news,” since WBUR isn’t just that):

All with qualifications, of course. In some cases you can add in Pacifica (which, even though my hero Larry Josephson once called it a “foghorn for political correctness,” qualifies as competition). Still, my point is that there is room for more than one mostly-talk (or news) public radio station in most well-populated regions. Even in Boston, where WBUR has been king of the hill for many years. Hey, other things being equal (and they never are), the biggest signal still tends to win. And in Boston, WGBH has a bigger signal than WBUR: almost 100,000 watts vs. 12,000 watts. WBUR radiates from a higher elevation, but its signal is directional. On AM that means it’s stronger than the listed power in some directions and weaker in others; but on FM it means no more than the listed power in the maximum directions and less everywhere else. The FCC’s relative field polar plot shows how WBUR’s signal is dented in every direction other than a stretch from just west of North to Southeast. In other words, toward all but about a third of its coverage area. To sum up, WGBH has a much punchier signal. I’m sure the GBH people also have this in mind when they think about how they’ll compete with BUR.
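For anyone who wants to check the signal arithmetic: FM directional patterns are filed as relative field values, and radiated power in a given direction goes as the square of the field. Here is a rough sketch in Python; the pattern numbers are made up for illustration (not WBUR’s actual filing), and the comparison simply assumes one directional and one non-directional signal:

```python
# Back-of-the-envelope only, and not the FCC's official method. FM directional
# patterns are filed as relative field values (1.0 toward the maximum), and
# effective radiated power (ERP) in a direction scales with the square of the field.
# The pattern values below are invented for illustration.

def erp_toward(max_erp_watts: float, relative_field: float) -> float:
    """ERP in one direction, given the licensed maximum ERP and the antenna's
    relative field (0.0 to 1.0) in that direction."""
    return max_erp_watts * relative_field ** 2

directional_max = 12_000      # a 12,000-watt directional signal
nondirectional  = 100_000     # a non-directional 100,000-watt signal, for comparison

made_up_pattern = {0: 1.0, 90: 0.8, 180: 0.5, 270: 0.6}   # azimuth -> relative field

for azimuth, field in made_up_pattern.items():
    print(f"{azimuth:3d} deg: {erp_toward(directional_max, field):8,.0f} W directional "
          f"vs {nondirectional:,} W non-directional")
# Toward 180 degrees this made-up pattern delivers only 3,000 W of the listed
# 12,000 -- the kind of dent the polar plot shows.
```

The squaring is why a modest-looking dent in the polar plot turns into a big drop in effective power toward that direction, and why a big non-directional signal covers so much more ground.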


The longest thread in the history of this blog belongs to Why WQXR is better off as a public radio station, which I posted on July 26, and still has comments this month. The post followed a complex deal by which the New York Times divested its legacy classical music station, WQXR — and by which the station’s format, call letters, record library and some of its personnel survived as a noncommercial outlet of WNYC, on a different channel with a weaker signal. From the comments one might gather that more listeners were unhappy than happy with the deal. My post mostly presented the upside.

Now here in Boston a similar move is underway. WGBH, “Boston’s NPR arts and culture station,” will go the way of WNYC-FM, which phased out classical music starting in 2002, eventually shunting it to HD side-channels and Internet streams while populating the FM signal (as well as its AM one) with news and information programming, which tends to be more popular and to attract more money in listener contributions. By saving WQXR, WNYC returned classical music to the airwaves (although the city was still down one classical station, or two if you want to go back to the very late WNCN). WGBH clearly had the same thing in mind when it bought WCRB, which was already weakened in the Boston metro when it moved from its old local channel (102.5) to its current channel (99.5) in Lowell. (Wikipedia has good background poop on WCRB’s own long saga.) While both WCRB signals have about the same range, the old 102.5 signal radiates from the Boston FM and TV antenna farm in nearby Needham, while the new one on 99.5 comes from a hill overlooking the I-495/I-93 intersection, far to the north near the New Hampshire border.

So now WGBH plans to move its classical programming to WCRB, which will become a non-commercial station (as did WQXR), and to do more news and information programming on its own home signal (89.7), which is grandfathered at 100,000 watts on Great Blue Hill (hence the call letters) in Milton, on the south side of Boston. In terms of wattage alone, WGBH is New England’s most powerful station. (The largest coverage belongs to WHOM/94.9 on Mt. Washington in New Hampshire, which puts out 49,000 watts from the highest peak in the Northeast.) As a result WGBH can go head-to-head with WBUR/90.9, which is the incumbent public radio leader in Boston. (I’ve looked at the ratings, and WBUR has kicked WGBH’s butt for years — a fact that I am sure has rankled the latter.)

Still, many listeners are not happy. And not just about losing classical music.

WGBH is doing its best to gloss over the signal loss for classical (and other arts & culture) listeners, especially in the southern reaches of Eastern Massachusetts, where WGBH has a very strong signal and WCRB is mostly absent. To demonstrate, here is a comparison of coverage for WGBH, WCRB and WBUR, calculated by Radio-Locator.com:

[Image: gbh-crb-bur, coverage maps for WGBH, WCRB and WBUR]

Click on the image for a legible full-size version.

Still, my own take in the WGBH/WCRB case is the same as it was for WNYC/WQXR: this is the best that could be done for classical music on Boston airwaves — and it offers opportunities not possible for WCRB had it remained a commercial station. Go back to that first link if you want to see what those are.

As for me, I expect to listen to a ’GBH-run noncommercial WCRB more than I did to the commercial one. First, the commercials were (and, at this writing, still are) annoying. Second, the WCRB repertoire was pretty close to all-hits, rather than the more varied and challenging fare found on WGBH. There should be a happy medium between the two, and I’m sure ’GBH will work hard to find it.

But I’m privileged to live on the north side of the metro, so I get WCRB just fine. I think it’s a safe bet that more than one half of WGBH’s listening area won’t get a useful signal out of WCRB. And the area within which listeners can get WGBH’s HD stream is a subset of WGBH’s coverage area.

A digressive word about HD radio. I got one recently — a $99 Teac unit — at Costco. The tuner is remarkably good, and it gets most local stations’ HD side-channels. But “tuning” HD is a counter-intuitive chore. You tune in the parent station, wait for the HD symbol to appear, and then tune to the one or two HD channels of the station. It’s a multi-step selection process, with delays along the way. I’d be curious to know if anybody (besides those who pick a channel and stay put) has had a positive experience with tuning it.

For those who want to compare apples with apples, here’s some data:

One last thing. I for one (and I am sure there are many more) would love to hear Chris Lydon return to Boston’s airwaves. He has been a podcasting pioneer with an outstanding show. But having him on a live station would be fabulous.

Hey, how about Larry Josephson too?


Catching up

I’m back in Boston after a great few days in Utah at the Kynetx Impact conference, where VRM and related stuff was brought up and discussed at length. It was an inaugural effort by Kynetx, which has what I think is a novel and profound take on the future of the Web.

The only bad thing that happened on the trip was a crash on my laptop that trashed my email and some other files. One result is that much of the email sent to my Berkman address (at cyber.law.harvard.edu) since late Monday was lost. (Glad I back up almost constantly here at home. I do offsite as well, but lacked the connectivity speed during the trip to fix the problem.)

So if you sent me any email that mattered during that time, please send it again. Thanks.
