problems


I’ve written a lot of stuff on the Web, and when I need to find some of it, Google is where I go. Lately, however, the going hasn’t been quite as good, because most of the time Google asks me if I want to spell my surname differently. For example, if I look up searls infrastructure, I get “Did you mean: Searles infrastructure?” I never used to get that. Now I do.

The former brings up 251,000 results, by the way, while the latter brings up 11,600. And the top result is a guy named Searle.

On that search, by the way, Bing does a better job. At least for me. Same with Yahoo.

[Later…] See the comments below. Looks like we got some debugging of sorts done here. And thanks to Matt and Pandu for responding, and so quickly. Well done.

I’m a born researcher. Studying stuff is a lot of what I do, whether I’m looking out the window of an airplane, asking a question at a meeting, browsing through the Web and correspondence, or digging through books and journals in libraries.

Most of my library work, however, isn’t in library buildings. I work on my own screen. And there, much of what I’ve been studying lately is in Google scans of books.

I appreciate that Google has done Google Books. I also find the Google Books searching and reading process difficult in much the same way that looking at microfiche is difficult. The difference is that microfiche was in its time the best that could be done, while Google Books is great technology crippled by necessary compromise.

Much of that compromise — still ongoing — is around protecting both libraries and copyright holders. Contention around that topic has been large and complicated. A couple weeks back I hung out at Alternative Approaches to Open Digital Libraries in the Shadow of the Google Book Search Settlement: An Open Workshop at Harvard Law School, and left it better informed and less settled than ever.

In the Huffington Post, Pamela Samuelson, one of the world’s top copyright authorities, has a piece titled The Audacity of the Google Book Search Settlement, that begins,

  Sorry, Kindle. The Google Book Search settlement will be, if approved, the most significant book industry development in the modern era. Exploiting an opportunity made possible by lawsuits brought by a small number of plaintiffs on one narrow issue, Google has negotiated a settlement agreement designed to give it a compulsory license to all books in copyright throughout the world forever. This settlement will transform the future of the book industry and of public access to the cultural heritage of mankind embodied in books. How audacious is that?

She adds,

  Under the settlement, the Authors Guild and AAP are tasked with creating a new collecting society, the Book Rights Registry, which is supposed to find class members, sign them up, and pay them from a revenue stream that Google intends to generate from its commercialization of these books…
  Google will pay to the Registry 63 percent of the revenues it receives from its commercialization efforts of out-of-print books. After deducting its expenses, the Registry will pay royalties to those who have registered with it. Yet, the agreement also authorizes the Registry to pay out unclaimed funds from orphan and other unregistered works to registered owners, even though they are neither the authors nor the publishers of potentially millions of books.

It gets far more icky and complicated than that. Pamela continues,

  However, much larger questions call into question whether the settlement should be approved. One is whether the Authors Guild and AAP fairly represented the interests of all authors and publishers of in-copyright books during the negotiations that led up to the settlement agreement. A second is whether going forward, they and the newly created Registry to which they will give birth will fairly represent the interests of those on whose behalf the Registry will be receiving revenues from Google. As well-intentioned as they may be, the Authors Guild and AAP have negotiated an agreement that serves the interests of the core members of their organizational constituencies, not the thousands of times larger and more diverse class of authors and publishers of books from all over the world.

In What the Google Books Settlement Agreement Says About Privacy, Eric Hellman writes,

  Google, as presently constituted, has every reason to be concerned about user privacy and guard it vigilantly; its business would be severely compromised by any perception that it intrudes on the privacy of its users. As Larry Lessig pointed out at the Berkman workshop, that doesn’t mean that the Google of the future will behave similarly. Privacy concerns should be addressed; the main question has been how and where to address them. My reading of the settlement agreement is that it may be possible to address these concerns through the agreement’s Security Standard review mechanism, through oversight of the Registry, and through state and federal laws governing library patron privacy.

There’s a story this morning on NPR about how Google is building “the prospect of a virtual super-library”. Privacy is the angle on that one too. It’s also been the angle of the EFF for a long time. They’re looking for legally binding privacy guarantees. Google thinks a copyright conflict agreement would be a “weird” place to put those guarantees.

It is a fortuitous but odd conflation. As Todd Carpenter tweets, “I don’t dismiss privacy concerns (have disabled WhysperSync on my #kindle for privacy) There are just bigger issues at stake.” Todd runs NISO, a publishing standards organization (he is also, by small-world coincidence in this thread — since, oddly, we’ve hardly talked about it, at least so far — my son-in-law). He also blogs here.

Here’s the larger issue for me: Google is a monopoly. One example. I’m looking right now at an AR&D case study (a .pdf I can’t find on the Web at the moment) of Jerry Damson Automotive Group, which the report says is the largest automobile dealer in Alabama. Here’s an excerpt:

  So where is the Damson group’s focus, if not on local media?
  “Every minute of every day is spent thinking about the consequences of our decisions as it relates to Google.” This remarkable statement is one that more advertisers will be making as they, too, grow in their understanding of the Web and how advertising works in a hyperconnected universe. Boles is far ahead of most, but others will not be far behind, for people like him are paving the way for a future generation of strategies and tactics that enable commerce. “We begin each chunk (morning, mid-day, afternoon and evening) of the day with Google Analytics.”

Substitute libraries for “local media”, and you get a sense of the impact here.

Here at Harvard we have Hollis, one of the world’s largest searchable library catalogs. Maybe the largest, I dunno. But it’s a big one, and it matters. When I search through the Hollis catalog, which I do nearly every day through a search thing in my browser toolbar, many of the results are accompanied by a book cover graphic and a link that reads, “Discover more in Google Books”. That pops me out of Hollis and into Google Books itself. In other searches (through the new catalog, which is fancier), I get no mention of Google Books, but when I click on the picture of a book cover, Google Books is where I go. It’s in a different window, but still I get the impression that Google Books is part of Hollis. And that creeps me out a bit, handy as it is in some ways.

Siva Vaidhyanathan is writing a book called The Googlization of Everything: How one company is disrupting culture, commerce and community — and why we should worry. He spoke at the workshop as well, and has lots of deep and good things to say.

Lessig says this settlement moves books down the path of documentary films: access encumbered by a bunch of agreements, without a guarantee of future access. It is “worse than a digital bookstore.” It brings us to “an excessive permission culture” produced by “a structure of oligopolies”. A “tendency to access” but not free access. He suggests that we are turning our culture over to tigers when they still look like kittens.

There is not an easy answer. Or set of answers. So I’ll stand right now on the questions raised at the end of this Seth Finkelstein essay in The Guardian:

  Amid all the reactions, an overall lesson should be how little can be determined by legalism, and how much remains unsettled as new technology causes shifts in markets and power. There’s some value in enemy-of-my-enemy opposition, where the interests of an advertising near-monopoly are a counterweight to a content cartel. But battles between behemoth businesses should not be mistaken for friendship to libraries, authors or public interest.

2close2nstar

Mark Finnern has a great idea: Wikipedia papers. Specifically,

Every student that takes a class has to create or improve a Wikipedia page on the topic of the class. It shouldn’t be the only deliverable, but an important one.

The Wikimedia organization could help the professors with tools that highlight the changes a certain user has made on a page. You only pass when the professor is satisfied with the scientific validity of the page. One could even mark the pages that went through this vetting process differently.

Instead of creating papers that end up in a drawer, you would create pages that you even feel ownership of and would make sure that they stay current and don’t get vandalized. You could even link to them on your LinkedIn profile.

It would make an enormous difference to the quality of Wikipedia year over year. One can think of wiki-how and other pages that could be improved using the same model.

There are other reasons. For example, Wikipedia has holes. Not all of these line up with classes being taught, but some might. Let’s take one example…

811

Wikipedia has an entry for 5-1-1, the phone number one calls in some U.S. states for road conditions. It also has an entry for 9-1-1, the number one calls in North America for emergency services. And, while it has an entry for 8-1-1, the “call before you dig” number in the U.S., it’s kinda stale. One paragraph:

All 811 services in the U.S. will end up using 611 by early 2007, as the United States Federal Communications Commission (FCC) in March 2005 made 811 the universal number for the 71 regional services that coordinate location services for underground public utilities in the U.S.[1][dated info] Currently, each of these “call before you dig” services, has its own 800 number, and the FCC and others want to make it as easy as possible for everyone planning an excavation to call first. This safety measure not only prevents damage that interrupts telecommunications, but also the cutting of electricity, water mains, and natural gas pipes. Establishment of an abbreviated dialing number for this purpose was required by the Pipeline Safety Improvement Act of 2002.

That last link takes you to one of those “Wikipedia does not have an article with this exact name” places. The “call before you dig” link redirects to Utility location. There you’ll find this paragraph:

One-call, Miss Utility, or Underground Service Alert are services that allow construction workers to contact utility companies, who will then denote where underground utilities are located via color-coding those locations. As required by law and assigned by the FCC, the 8-1-1 telephone number will soon be used for this purpose across the United States.

Well, it’s already being used. And it’s way freaking complicated, because there’s this very uneven overlap of entities — federal government, state governments, regional associations, and commercial entities, to name a few — that all have something to say.

Take, for example, the U.S. Department of Transportation’s Pipeline and Hazardous Materials Safety Administration, or PHMSA. Right on their front page, they tell you April is Safe Digging Month. Good to know. April of what year? Next to a blurred emblem with an 811 over a shovel (a poor version of the above, which comes from the Utility Notification Center of Colorado) and a horribly blurred graphic proclaiming WE SUPPORT SAFE DIGGING MONTH, a Call Before You Dig link leads to a page that explains,

Guidance for implementing safe and effective damage prevention for underground utilities was established by the Common Ground Alliance (CGA), a national organization representing all underground utility stakeholders. Calling before you dig is the first rule to remember when conducting underground related activities, no matter what the job is. The law requires you to phone the “One-Call” center at 8-1-1 at least two days prior to conducting any form of digging activity.

No link to the Common Ground Alliance. That org (a domain squatter has its .org URL, so it’s a .com) explains that it’s “a member-driven association dedicated to ensuring public safety, environmental protection, and the integrity of services by promoting effective damage prevention practices.” Its news page mentions that, among other things, August 11 is “8-11 Day”. It has a press release template in Word format. It also has news that “MGH Hired as CGA 811 Awareness Contractor”, in .pdf. Within that is MGH’s website URL, where one finds that the agency is @mghus, which may be the hippest thing in this whole mess.

Digging further, one finds that there is a call811.com, which appears to be another face of the Common Ground Alliance. (If you’re interested, here are its “sponsors and ambassadors”.)

Also involved is the American Public Works Association. Apparently the APWA is the outfit behind what LAonecall (one of a zillion of these with similar names) calls “the ULCC Uniform Color Code using the ANSI standard Z53.1 Safety Colors”. APWA must have published it at one point, but you won’t find it on its website. Neither does Google, though it does find lots of other sites that have it. Most are local or regional governmental entities. Or utilities like, say, Panhandle Energy. Here’s the graphic:

[Image: the APWA uniform color code chart for marking underground utilities]

Here in New England (all of it other than Connecticut, anyway), the public face of this is Dig Safe System, Inc., which appears to be a nonprofit association, but there’s nothing on the site that says wtf it is — though it is informative in other respects. It does say, on its index page,

What is Dig Safe ®?

State laws require anyone who digs to notify utility companies before starting, and for good reason. Digging can be dangerous and costly without knowing where underground facilities are located.

Dig Safe System, Inc. is a communication network, assisting excavators, contractors and property owners in complying with state law by notifying the appropriate utilities before digging. Dig Safe®, a free service, notifies member companies of proposed excavation projects. In turn, these member utilities respond to the work area and identify the location of underground facilities. Callers are given a permit number as confirmation.

Member utilities, or contracted private locators, use paint, stakes or flags to identify the location of buried facilities. Color coding is used to identify the type of underground facilities… (and the same color coding as above)

I found out all of this — and much more — while I was researching my column in the November issue of Linux Journal, which has Infrastructure as its theme. I’m leveraging my leftovers here, closing one tab after another in my browser.

I’m also interested in approximately everything, including the official-looking public graffiti on the ground all over the place. These are known locally as “dig safe markings”. That piece of the scattered one-call/call-before-you-dig/8-1-1 branding effort, at least, has taken root here.

Anyway, I’d love to see a Wikipedia entry or two that pulls all this together. Maybe I should write it, but I’m busy. Hey, I’ve done this much already. Some actual experts ought to pick up the ball and post with it.

Which brings us back to Mark’s suggestion in the first place. Have a class do it.

Hey, @mghus, since you’re in Baltimore, how about suggesting a Wikipedia page project to The Civil & Environmental Engineering Department at UMBC?

Maybe for 8-11 Day?


Test #2

Well, the first try at the other blog failed. Let’s see if I unscrewed what I lost at this blog. Yep. Did. Backups are a good thing to have.

Okay, just imported all my categories. That was cool too. I think I’ll stop pressing my luck now. It’s good just to have the outliner working again.

One of the reasons I liked Dish Network (to the extent anybody can like a purely commercial entertainment utility) was that their satellite receivers included an over-the-air tuner. It nicely folded your over-the-air (OTA) stations in with others in the system’s channel guide. Here’s how it looked:

[Image: the Dish channel guide, with over-the-air channels folded in alongside the satellite lineup]

Well, the week before last I discovered that our Dish receiver was having trouble seeing and using its broadband connection — and, for that matter, the phone line as well. That receiver was this one here…

[Image: rear panel of the Dish ViP 622 receiver]

… a ViP 622. Vintage 2006. Top of Dish’s line at the time. Note the round jack on the far left of the back side. That’s where your outside (or inside) over-the-air antenna plugged in. We’ll be revisiting the subject shortly.

So Dish sent a guy out. He replaced the ViP 622 with Dish’s latest (or so he said): a ViP 722. I looked it up on the Web and ran across “DISH Network’s forthcoming DVRs get detailed: hints of Sling all over“, by Darren Murph, posted May 18th 2008. Among other things it said, “The forthcoming ViP 722 will be the first HD DVR from the outfit with loads of Sling technology built in — not too shocking considering the recent acquisition. Additionally, the box is said to feature an all new interface and the ability to browse to (select) websites, double as a SlingCatcher and even handle Clip & Sling duties.”

So here it was, July 2009, and I had a ViP 722 hooked up to my nice Sony flat screen, and … no hint of anything remotely suggestive of a Sling feature. When I asked the Dish guy about it, he didn’t have a clue. Sling? What’s that? Didn’t matter anyway, because the thing couldn’t use our broadband. The guy thought it might be my firewall, but I don’t have one of those. Just a straight Net connection, through a router and a switch in a wiring closet that works fine for every other Net-aware device hooked up to it. We tested the receiver’s connection with a laptop: 18Mb down, 4Mb up. No problems. The receiver gets an IP address from the router (and can display it), and lights blink by the ethernet jack. But… it doesn’t communicate. The Dish guy said the broadband is only used for pay-per-view, and since we don’t care about that, it doesn’t much matter. But we do care about customer support. Dish has buttons and menu choices for that, but—get this—it has to dial out on a phone line to get the information you want. I had thought this was just a retro feature of the old ViP 622, but when I called Dish they said no, it’s still a feature of ALL Dish receivers.

It’s 2009, and these things are still dialing out. On a land line. Amazing.

So a couple days ago my wife called me from the house (I’m back in Boston) and said that the ViP 722 was dead. Tot. Mort. We tried resetting it, unplugging it and plugging it back in. Nothing. Then yesterday Dish came out to fix the thing, found it was indeed croaked, and put in a new one: a ViP 722k, Dish’s “advanced, state-of-the-art” receiver of the moment.

Well, it may be advanced in lots of ways, but it’s crippled in one that royally pisses me off: no over-the-air tuner. That jack in the back I pointed out above? Not there. So, no longer can I plug in my roof antenna to watch over-the-air TV. To do that I’ll have to bypass the receiver and plug the antenna cable straight into the TV. (That has never worked either, because Sony makes the channel-tuning impossible to understand, much less operate. On that TV, switching between satellite and anything else, such as the DVD, is a freaking ordeal.) Oh, and I won’t be able to record over-the-air programs, either. Unless I get a second DVR that’s not Dish’s.

Okay, so I just did some looking around, and found through this video that the ViP 722K has an optional “MT2 OTA module” that gets you over-the-air TV on the ViP 722k. Here’s some more confusing shit about it. Here’s more from Dishuser.org. Here’s the product brochure (pdf). Digging in, I see it’s two ATSC (digital TV) tuners in one, with two antenna inputs, and it goes in a drawer in the back of the set. It costs $30. I don’t think the Dish installer even knew about it. He told me that the feature had been eliminated on the 722K, and that I was SOL.

Bonus bummer: The ViP 722k also features a much more complicated remote control. This erodes another long-standing advantage of Dish: remote controls so simple to use that you could operate them in the dark. Bye to that too.

So. Why did Dish subtract value like that? I can think of only two reasons. One is that approximately nobody still watches over-the-air TV. (This is true. I’m one of the very few exceptions. Color me retro.) The other is that Dish charges $5.99/month for local channels. They did that before, but now they can force the purchase. “Yes, we blew off your antenna, but now you can get the same channels over satellite for six bucks a month.” Except for us it’s not the same channels. We live in Santa Barbara, but can’t get the local over-the-air channels. Instead we watch San Diego’s. Dish doesn’t offer us those, at any price.

The final irony is that the ViP 722k can’t use our broadband or our phone line either. Nobody ever figured out that problem. That means this whole adventure was for worse than naught. We’d have been better off with our old ViP 622. There was nothing wrong with it that isn’t still wrong with its replacements.

Later my wife shared a conversation she had with a couple other people in town who had gone through similar craziness at their homes. “What happened to TV?” one of them said. “It’s gotten so freaking complicated. I just hate it.”

What’s happening is a dying industry milking its customers. That much is clear. The rest is all snow.


It helps to recognize that the AP is exactly what its name denotes: an association of presses. Specifically, newspapers. Fifteen hundred of them. Needless to say, newspapers are having a hard time. (Hell, I gave them some, myself, yesterday.) So we might cut them a little slack for getting kinda testy and paranoid.

Reading the AP’s paranoid jive brings to mind Jim Clark on stage at the first (only?) Netscape conference. Asked by an audience member why he said stuff about Microsoft that might have a “polarizing effect”, Jim rose out of his chair and yelled at the questioner, “THEY’RE TRYING TO KILL US. THAT HAS A POLARIZING EFFECT!” I sometimes think that’s the way the AP feels toward bloggers. Hey, when you’re being eaten alive, everything looks like a piranha.

But last week the AP, probably without intending it, did something cool. You can read about it in “Associated Press to build news registry to protect content“, a press release that manages to half-conceal some constructive open source possibilities within a pile of prose that seems mostly to be about locking down content and tracking down violators of AP usage policies. Ars Technica unpacks some of the possibilities. Good piece.

Over in Linux Journal I just posted AP Launches Open Source Ascribenation Project, in which I look at how the AP’s “tracking and tagging” technology, which is open source, can help lay the foundations for a journalistic world where everybody gets credit for what they contribute to the greater sphere of news and comment — and can get paid for it too, easily — if readers feel like doing that.

The process of giving credit where due we call ascribenation, and the system by which readers (or listeners, or viewers) choose to pay for it has a name of its own.

Regardless of what we call it, that’s where we’re going to end up. The system that began when the AP was formed in 1846 isn’t going to go away, but it will have to adapt. And adopt. It’s good to see it doing the latter. The former will be harder. But it has to be done.

I’d say more here, but I already said it over there.


“Saving newspapers” is beginning to look like saving caterpillars. Or worse, like caterpillars saving themselves. That was the message I got from Rick Edmonds’ API Report to Exec Summit: Paid Content Is the Future for News Web Sites, in Poynter, back in early June. In The Nichepaper Manifesto Umair Haque points toward a possible future butterfly stage for newspapers. Sez Umair, “Nichepapers aren’t a new product, service, or business model. They are a new institution.”

He gives examples: Talking Points Memo. Huffington Post. Perez Hilton. Business Insider. He’s careful to say that these may not be the first or the best but are “avenues that radical innovators are already exploring to reconceive news for the 21st century.”

These, however, are limited as news sites, and not the best models of future nichepapers. Yes, they’re interesting and in some cases valuable sources of information; but they all also have axes to grind. In this sense they’re more like the old model (papers always had axes too) than the new one(s).

To help think about where news is going, let’s talk about one cause of serious news: wildfires. In Southern California we have lots of wildfires. They flare up quickly, then threaten to wipe out dozens, hundreds or thousands of homes, and too often do exactly that. Look up San Diego Fire, Day Fire, Gap Fire, Tea Fire, Jesusita Fire. The results paint a mosaic, or perhaps even a pointillist, picture of news sourced, reported, and re-reported by many different people, organizations and means. These are each portraits of an emerging ecosystem within which newspapers must adapt or die.

Umair says, “In the 21st century, it’s time, again for newspapers to learn how to profit with stakeholders — instead of extracting profits from them. The 21st century’s great challenge isn’t selling the same old “product” better: it’s learning to make radically better stuff in the first place.”

Exactly. And that “making” will be as radically different as crawling and flying.


[Later, on 1 October 2009… This matter has been resolved. The charge for going over has been dropped, the service restored and good will along with it. Thanks to both @sprintcares and the chat person at My Sprint.]

So I just got a “courtesy call” from Sprint, a company I’ve been talking up for a couple years because I’ve had nothing but positive experience with my Sprint EvDO data card.

Well, that’s over. The call was to inform me that I’d gone over the 5GB monthly usage limit for my data card, to the tune of 10,241,704.22KB, for which I was to be charged $500, on top of my $59.99 (plus $1.24 tax) monthly charge.
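For what it’s worth, those numbers pencil out to about a nickel per megabyte. Here’s a quick back-of-the-envelope sketch in Python (my arithmetic, not Sprint’s), assuming “KB” means kilobytes, a 1024 conversion, and that the $500 covers only the overage:

  # Back-of-the-envelope check of the overage numbers from Sprint's call.
  # Assumptions (mine, not Sprint's): "KB" means kilobytes and 1 MB = 1024 KB.
  overage_kb = 10_241_704.22           # amount over the 5 GB cap, per the call
  overage_mb = overage_kb / 1024       # roughly 10,000 MB
  overage_gb = overage_mb / 1024       # roughly 9.8 GB over the cap
  charge = 500.00                      # dollars billed for the overage
  rate_per_mb = charge / overage_mb    # implied overage rate, about $0.05/MB

  print(f"{overage_gb:.2f} GB over the cap; implied rate ${rate_per_mb:.3f} per MB")

In other words, the bill works out to roughly five cents for every megabyte past the cap.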

I didn’t know about the 5GB limit. (In fact, I believed Sprint had an unlimited data plan, which is one reason I used them.) Kent German in CNET explains why in Sprint to limit data usage on Everything plans. He begins,

When is unlimited not unlimited? Apparently when it comes from Sprint. Though the carrier has been very active about touting its new “simply everything” plan, which includes unlimited mobile Internet and messaging, it plans to place a cap on monthly data usage next month. Sprint will limit its simply everything customers to 5GB of data usage per month, plus 300MB per month for off-network data roaming.

A Sprint representative told BetaNews that the cap is needed to ensure a great customer experience.

O ya. By “great” they must mean bill size. Kent continues,

“The use of voice and data roaming by a small minority of customers is generating a disproportionately large level of operating expense for the company,” the representative said. “This limit is well within the range of what a typical customer would normally use each month.”…

BetaNews said Sprint began notifying customers in monthly bills that were mailed this week. The change will go into effect 30 days after customers receive the note. Also, the carrier said it will call customers next month to make sure they’re aware of the changes.

Well, I don’t read my bills. They go to my bookkeeper, who pays them and tosses whatever BS comes along inside the envelopes. I also don’t have a Sprint phone, or phone number. Maybe that’s why I never got that call.

Why did I go over? Possibly because I had little or no reliable landline (cable) Internet connectivity at my house in Santa Barbara for weeks after I got back there in June. I wrote about that here, here, here, here and here. So I used my Sprint datacard a lot. In fact it was something of a life-saver.

Earth to Sprint: that “small minority of customers” is the future of your company. You should invest in them, and in your relationships with them.

The Sprint person on the “courtesy call” knocked $350 off the bill. That was because she was ready to “work” with me on the matter. I asked her how she arrived at that number. She said she couldn’t say.

I hope they work zero into their future calculations. Because that’s what they’re getting from me as soon as I find a better deal elsewhere.

I’m not sure how to price the good will they’ve lost. In fact, I’m not sure that has a price.


In his comment to my last post about the sale of WQXR to WNYC (and in his own blog post here), Sean Reiser makes an important point:

One of the unique things about the QXR was its relationship with the Times. The Times owned QXR before the FCC regulations prohibiting newspaper ownership of a radio station were enacted. Because of this relationship, QXR’s newsroom was located in the NY Times building and news gathering resources were shared. In a precursor to newspaper reporters doing podcasts, Times columnists and arts reporters would often appear on the air doing segments.

It’s true. The Times selling WQXR seems a bit like the New Yorker dropping poetry, or GE (née RCA) closing the Rainbow Room. (Which has already happened… how many times?) To cultured veteran New Yorkers, it seems more like a partial lobotomy than a heavy heirloom being thrown off a sinking ship.

For much of the history of both, great newspapers owned great radio stations. The Times had WQXR. The Chicago Tribune had (and still has) WGN (yes, “World’s Greatest Newspaper”). The Washington Post had WTOP. (In fact, the Post got back into the radio game with Washington Post Radio, on WTOP’s legacy 50,000-watt signal at 1500 AM. That lasted from 2006 to 2008.) Trust me, the list is long.

The problem is, both newspapers and radio stations are suffering. Most newspapers are partially (or, in a few cases — such as this one — totally) lobotomized versions of their former selves. Commercial radio’s golden age passed decades ago. WQXR, its beloved classical format, and its staff have been on life support for years. Most other cities have lost their legacy commercial classical stations (e.g. WFMR in Milwaukee), or lucked out to various degrees when the call letters and formats were saved by moving to lesser signals, sometimes on the market’s outskirts (e.g. WCRB in Boston). In most of the best cases classical formats were saved by moving to noncommercial channels and becoming public radio stations. In Los Angeles, KUSC took over for KFAC (grabbing the latter’s record library) and KOGO/K-Mozart. In Raleigh, WCPE took over for WUNC and WDBS. In Washington, WETA took over for WGMS. Not all of these moves were pretty, but all of them kept classical music alive on their cities’ FM bands.

In some cases, however, “saved” is an understatement. KUSC, for example, has a bigger signal footprint, and far more to offer, than KFAC and its commercial successors did. In addition to a first-rate signal in Los Angeles, KUSC is carried on full-size stations in Palm Springs, Thousand Oaks, Santa Barbara and San Luis Obispo — giving it strong coverage of more population than any other station in Los Angeles, including the city’s substantial AM stations. KUSC also runs HD programs on the same channels, has an excellent live stream on the Web, and is highly involved in Southern California’s cultural life.

I bring that up because the substantial advantages of public radio over commercial radio — especially for classical music — are largely ignored amidst all the hand-wringing (thick with completely wrong assumptions) by those who lament the loss — or threatened loss — of a cultural landmark such as WQXR. So I thought I’d list some of the advantages of public radio in the classical music game.

  1. No commercials. Sure, public radio has its pitches for funding, but those tend to be during fund drives rather than between every music set.
  2. More room for coverage growth. The rules for signals in the noncommercial end of the FM band (from 88 to 92 MHz) are far more flexible than those in the commercial band. And noncommercial signals in the commercial band (such as WQXR’s new one at 105.9) can much more easily be augmented by translators at the fringes of their coverage areas — and beyond. Commercial stations can only use translators within their coverage areas. Noncommercial stations can stick them anywhere in the whole country. If WNYC wants to be aggressive about it, you might end up hearing WQXR in Maine and Montana. (And you can bet it’ll be on the Public Radio Player, meaning you can get it wherever there’s a cell signal.)
  3. Life in a buyer’s market. Noncommercial radio stations are taking advantage of bargain prices for commercial stations. That’s what KUSC did when it bought what’s now KESC on 99.7FM in San Luis Obispo. It’s what KCLU did when it bought 1340AM in Santa Barbara.
  4. Creative and resourceful engineering. While commercial radio cheaps out as advertising revenues slump away, noncommercial radio is pioneering all over the place. They’re doing it with HD Radio, with webcasting (including multiple streams for many stations), with boosters and translators, with RDS — to name just a few. This is why I have no doubt that WNYC will expand WQXR’s reach even if they can’t crank up the power on the Empire State Building transmitter.
  5. Direct listener involvement. Commercial radio has had a huge disadvantage for the duration: its customers and its consumers are different populations. As businesses, commercial radio stations are primarily accountable to advertisers, not to listeners. Public radio is directly accountable to its listeners, because those are also its customers. As public stations make greater use of the Web, and of the growing roster of tools available for listener engagement (including tools on the listeners’ side, such as those we are developing at ProjectVRM), this advantage over commercial radio will only grow. This means WQXR’s listeners have more opportunity to contribute positively to the station’s growth than they ever had when it was a commercial station. (Or if, like WCRB, it lived on as a lesser commercial station.) So, if you’re a loyal WQXR listener, send a few bucks to WNYC. Tell them thanks for saving the station, and tell them what you’d like them to do with the station as well.

I could add more points (and maybe I will later), but that should suffice for now. I need to crash and then get up early for a quick round trip to northern Vermont this morning. Meanwhile, hope that helps.


How Teenagers Consume Media: the report that shook the City carries approximately no news for anybody who watches the changing tastes and habits of teenagers. What makes it special is that it was authored by a fifteen-year-old intern at Morgan Stanley in London, and then published by the company.

It says teens like big TVs, dislike intrusive advertising, find a fun side to viral marketing, blow off Twitter, ignore all but the free tabloid newspapers, watch anime on YouTube and so on.

All these are momentary arrangements of patterns on the surface of a growing ocean of bits. (For why it grows, see Kevin Kelly.) What’s most productive to contemplate, I think, is how we will learn to thrive in a vast and growing bit-commons whilst (to borrow a favorite preposition of this teen) trying to make money in the midst.

Which brings me to Chris Anderson‘s new book, Free: The Future of a Radical Price. Malcolm Gladwell dissed it in The New Yorker, while Seth Godin said Malcolm is Wrong and Virginia Postrel gave it a mixed review in The New York Times. But I’m holding off for the simple reason that I haven’t finished reading it. If I write something about it afterward, it will likely be along the lines of what I wrote in Linux Journal as a long response to Tom Friedman’s The World is Flat. (Here are Part I and Part II, totaling more than 10,000 words.)

