Quote


In Chatbots were the next big thing: what happened?, Justin Lee (@justinleejw) nicely unpacks how chatbots were overhyped to begin with and continue to fail their Turing tests, especially since humans in nearly all cases would rather talk to humans than to mechanical substitutes.

There’s also a bigger and more fundamental reason why bots still aren’t a big thing: we don’t have them. If we did, they’d be our robot assistants, going out to shop for us, to get things fixed, or to do whatever.

Why didn’t we get bots of our own?

I can pinpoint the exact time and place where bots of our own failed to happen, and all conversation and development went sideways, away from the vector that takes us to bots of our own (hashtag: #booo), and instead toward big companies doing more than ever to deal with us robotically, mostly to sell us shit.

The time was April 2016, and the place was Facebook’s F8 conference. It was on stage there that Mark Zuckerberg introduced “messenger bots”. He began,

Now that Messenger has scaled, we’re starting to develop ecosystems around it. And the first thing we’re doing is exploring how you can all communicate with businesses.

Note his use of the second person you. He’s speaking to audience members as individual human beings. He continued,

You probably interact with dozens of businesses every day. And some of them are probably really meaningful to you. But I’ve never met anyone who likes calling a business. And no one wants to have to install a new app for every service or business they want to interact with. So we think there’s gotta be a better way to do this.

We think you should be able to message a business the same way you message a friend. You should get a quick response, and it shouldn’t take your full attention, like a phone call would. And you shouldn’t have to install a new app.

This promised pure VRM: a way for a customer to relate to a vendor. For example, to issue a service request, or to intentcast for bids on a new washing machine or a car.
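
(To make that concrete, here is a rough sketch, in Python, of what an intentcast might carry as a structured message. Nothing here is a real or proposed standard; every field name is invented for illustration.)

```python
# Hypothetical sketch of an intentcast: a customer broadcasting an intent
# to buy, on her own terms, to any vendor willing to listen. All field
# names are illustrative, not a real or proposed standard.

import json

intentcast = {
    "type": "intentcast",
    "category": "washing-machine",
    "requirements": {"capacity_kg": 8, "energy_rating": "A++"},
    "budget_max_usd": 900,
    "respond_by": "2016-05-01",
    "terms": "no tracking; respond only to this request",
    "reply_to": "customer-agent.example/inbox",  # the customer's own agent, not a vendor silo
}

# A vendor-side agent could filter incoming intentcasts like this:
def matches(offer_categories, cast):
    return cast["type"] == "intentcast" and cast["category"] in offer_categories

print(json.dumps(intentcast, indent=2))
print(matches({"washing-machine", "dryer"}, intentcast))  # True
```

The point of the sketch is the direction of flow: demand signals supply, and the reply address belongs to the customer.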

So at this point Mark seemed to be talking about a new communication channel that could relieve the typical pains of being a customer while also opening the floodgates of demand notifying supply when it’s ready to buy. Now here’s where it goes sideways:

So today we’re launching Messenger Platform. So you can build bots for Messenger.

By “you” Zuck now means developers. He continues,

And it’s a simple platform, powered by artificial intelligence, so you can build natural language services to communicate directly with people. So let’s take a look.

See the shift there? Up until that last sentence, he seemed to be promising something for people, for customers, for you and me: a better way to deal with business. But alas, it’s just shit:

CNN, for example, is going to be able to send you a daily digest of stories, right into messenger. And the more you use it, the more personalized it will get. And if you want to learn more about a specific topic, say a Supreme Court nomination or the zika virus, you just send a message and it will send you that information.

And right there the opportunity was lost, along with all the promise up at the top of the hype cycle. Note how Aaron Batalion uses the word “reach” in ‘Bot’ is the wrong name…and why people who think it’s silly are wrong, written not long after Zuck’s F8 speech: “In a micro app world, you build one experience on the Facebook platform and reach 1B people.”

What we needed, and still need, is for reach to go the other way: a standard bot design that would let lots of developers give us better ways to reach businesses. Today lots of developers compete to give us better ways to use the standards-based tools we call browsers and email clients. The same should be true of bots.

In Market intelligence that flows both ways, I describe one such approach, based on open source code, that doesn’t require locating your soul inside a giant personal data extraction business.

Here’s a diagram that shows how one person (me in this case) can relate to a company whose moccasins he owns:


The moccasins have their own pico: a cloud on the Net for a thing in the physical world, one that becomes a standard-issue conduit between customer and company.

A pico of this type might come into being when the customer assigns a QR code to the moccasins and scans it. The customer and company can then share records about the product, or notify each other when there’s a problem, a bargain on a new pair, or whatever. It’s tabula rasa: wide open.

The current code for this is called Wrangler. It’s open source and on GitHub. For the curious, Phil Windley explains how picos work in Reactive Programming With Picos.
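
Wrangler and picos themselves are programmed in KRL, which I won't try to reproduce here. But as a conceptual sketch, in Python, of the conduit pattern just described — all class and method names invented for illustration, not Wrangler's actual API:

```python
# Conceptual sketch of a pico as a conduit between customer and company.
# Real picos are written in KRL and hosted by a pico engine; everything
# below (class and method names included) is invented for illustration.

class Pico:
    """An event-driven 'cloud for a thing', with one channel per party."""
    def __init__(self, thing):
        self.thing = thing
        self.subscribers = {}   # channel name -> callback
        self.records = []       # shared history both parties can read

    def open_channel(self, party, callback):
        self.subscribers[party] = callback

    def raise_event(self, from_party, event, payload):
        entry = {"from": from_party, "event": event, "payload": payload}
        self.records.append(entry)
        # Deliver the event to every party except the sender.
        for party, notify in self.subscribers.items():
            if party != from_party:
                notify(entry)

# The QR-code scan would create the pico and open the two channels:
moccasins = Pico("moccasins-sku-1234")
moccasins.open_channel("customer", lambda e: print("customer sees:", e))
moccasins.open_channel("company", lambda e: print("company sees:", e))

moccasins.raise_event("customer", "service_request", {"issue": "sole separating"})
moccasins.raise_event("company", "offer", {"deal": "20% off a new pair"})
```

Either side can raise an event; neither side owns the conduit. That symmetry is the whole point.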

It’s not bots yet, but it’s a helluva lot better place to start re-thinking and re-developing what bots should have been in the first place. Let’s start developing there, and not inside giant silos.

[Note: the image at the top is from this 2014 video by Capgemini explaining #VRM. Maybe now that Facebook is doing face-plants in the face of the GDPR, and privacy is finally a thing, the time is ripe, not only for #booos, but for the rest of the #VRM portfolio of unfinished and un-begun work on the personal side.]


On a mailing list that obsesses about All Things Networking, another member cited what he called “the Doc Searls approach” to something. Since it was a little off (though kind and well-intended), I responded with this (lightly edited):

The Doc Searls approach is to put as much agency as possible in the hands of individuals first, and self-organized groups of individuals second. In other words, equip demand to engage and drive supply on customers’ own terms and in their own ways.

This is supported by the wide-open design of TCP/IP in the first place, which at least models (even if providers don’t fully give us) an Archimedean place to stand, and a wide-open market for levers that help us move the world—one in which the practical distance between everyone and everything rounds to zero.

To me this is a greenfield that has been mostly fallow for the duration. There are exceptions (and encouraging those is my personal mission), but mostly what we live with are industrial age models that assume from the start that the most leveraged agency is central, and that all the most useful intelligence (lately with AI and ML being the most hyper-focused on and fantasized about) should naturally be isolated inside corporate giants with immense data holdings and compute factories.

Government oversight of these giants and what they do is nigh unthinkable, much less do-able. While regulators aplenty know and investigate the workings of oil refineries and nuclear power plants, there are no equivalents for Google’s, Facebook’s or Amazon’s vast refineries of data and plants doing AI, ML and much more. All the expertise is working for those companies or selling their skills in the marketplace. (The public-minded work in universities, I suppose.) I don’t lament this, by the way. I just note that it pretty much can’t happen.

More importantly, we have seen, over and over, that compute powers of many kinds will be far more leveraged for all when individuals can apply them. We saw that when computing got personal, when the Internet gave everybody a place to operate on a common network that spanned the world, and when both could fit in a hand-held rectangle.

The ability for each of us to not only drive prices individually, but to retrieve the virtues of the bazaar to the networked marketplace, will eventually win out. In the meantime it appears the best we can do is imagine that the full graces of computing and networks are what only big companies can do for (and to) us.

Bonus link: a talk I gave last week in Munich.

So I thought it might be good to surface that here. At least it partly explains why I’ve been working more and blogging less lately.

Fort Lee has been in the news lately. Seems traffic access to the George Washington Bridge from Fort Lee was sphinctered for political purposes, at the spot marked “B” on this map here:

(This was later the place where “bridgegate” took place.) The spot marked “A” is the site of my first home: 2063 Hoyt Avenue. Here’s how it looked in 1920:

My grandfather, George W. Searls, built it in 1900 or so. He and grandma, Ethel F. (née Englert) Searls, raised three children there: Ethel M. Searls, born in 1905, Allen H. Searls (my father), born in 1908, and Grace (née Searls) Apgar, born in 1912. Grandpa died in 1935, but Grandma and Aunt Ethel lived there until 1955, when I was eight years old.

It was in a fine old neighborhood of similar mansard-roofed homes, most of which were built before the George Washington Bridge showed up and became the town’s landmark feature. Pop, who grew up climbing the Palisades and had no fear of heights, helped build the bridge, mostly by rigging cables.

Not long after finding a place to stay in New York in Fall of 2012, my wife and I took a walk across the bridge to visit the old neighborhood. I knew the old house was gone, the land under it paved over by Bruce Reynolds Boulevard. What I didn’t expect was finding that the entire neighborhood had been erased. See the brown area on the map above, between the highway and Main Street? That was it. Palisade Avenue, behind Hoyt, is now a house-less strip of rotting pavement flanked and veined by wild grass. The only animal life we spotted was a large groundhog that ran to an old storm drain when we approached.

Little of the Fort Lee I knew as a kid is still there. The only familiar sights on Main Street are City Hall and the old fire station. Dig this: City Hall also shows up in the background of this shot of Mom with my cousin Paul and me, when we were both a few months old, in April 1948. This street too has been obliterated: replaced by stores and parking lots, with no trace of its old self.

When I was a kid in the ’50s, my grandparents’ generation — all born in the second half of the 19th Century — was still going strong. One relative I remember well was great-aunt Eva Quackenbush, Grandpa Searls’ older sister. Here she is with Mom, and the baby me. Eva was born in 1853, and was twelve years old when President Lincoln was shot — an event she talked about. She visited often from her home in St. Louis, and died just a few days short of 100 years old, in 1953. Living long is a Searls family trait. Grandma made it to 107 and Aunt Grace to 101 (she passed just last month, fun and lucid to the end).

So to me the world before cars, electricity and other modern graces was a familiar one, because I heard so many stories about it. Grandma grew up in The Bronx, at 742 East 142nd Street, when it looked like this:

Today, according to Google’s StreetView, it looks like this:

The red A marks 742. On the left, behind that wall, is a “towed car” lot. It sits atop a mound of rubble that was once “old Lincoln Hospital”:

According to the Wikipedia article on Lincoln Hospital, “In 1895, after more than half a century of occupying various sites in Manhattan, the Board of Trustees purchased a large lot in the South Bronx—then a semi-rural area of the city—at the corner of 141st Street and Southern Boulevard.” This is a morning view, lit from the southeast, looking north across 141st Street. Grandma’s place was on the back side of the hospital. Amazing to think that this scene came and went between the two shots above it.

Grandma’s father, Henry Roman Englert, was the head of the Steel and Copper Plate Engravers Union in the city. His trade was also destroyed by industrial progress, but was an art in its time. Here he is, as a sharp young man with a waxed mustache:

Henry was a fastidious dude, meaning highly disciplinary as well as disciplined. Grandma told a story about how her father, on arriving home from work, would summon his four daughters to appear and stand in a row. (A fifth daughter, Grace, died at age 1. My aunt Grace, mentioned above, granddaughter of Henry and Kitty, lived to 101.) He would then run his white glove over some horizontal surface and wipe it on a white shoulder of a daughter’s dress, expecting no dust to leave a mark on either glove or girl. Henry was the son of German immigrants: Christian Englert and Jacobina Rung, both of Alsace, now part of France. They were brewers, and had a tavern on the east side of Manhattan on 110th Street. (Though an 1870 census page calls Christian a laborer.) Jacobina was a Third Order Carmelite nun, and was buried in its brown robes. Both were born in 1825. Christian is said to have died in 1886 while picking hops in Utica. Jacobina died in 1904.

Grandma (Ethel F. Englert) met Grandpa (George W. Searls) in 1903, when she was twenty and he was forty. She was working as a cleaning woman in the Fort Lee boarding house where Grandpa lived while he worked as a carpenter. One day she saw him lying asleep, and bent down to kiss him. He woke, reached up, pulled her down and kissed her back. Romance commenced.

Grandma was embarrassed about having done cleaning work, insisting always that she was “lace curtain Irish,” to distinguish her family (or her Mom’s side) from “shanty Irish.” When ethnic matters came up in conversation over dinner, she would often say “All for the Irish stand up,” and everybody would rise. Her mother, Catherine “Kitty” Trainor, died at 39. Henry later married an Italian woman and produced more progeny, only one of whom was ever mentioned by Grandma. That was Harry, who died at age five. The largest framed photograph in Grandma’s house was one of Harry, looking up and holding a toy.

Kitty’s dad was Thomas Trainor, who came over from Ireland in 1825 at age 15 to escape England’s harsh penal laws. (He shipped out of Letterkenny with an uncle, but the Trainors were from south of there. Trainor was anglicized from the Gaelic Tréinfhir, meaning “strong man.”) Thomas worked as an indentured servant in the carriage trade, and married Catherine McLaughlin, the daughter of his boss. Thomas then prospered in the same business, building and fixing carriages at his shop at the south end of Broadway. His two daughters were Kitty and “Aunt Mag” Meyer, whom Grandma often quoted. The line I best remember is, “You’ve got it in your hand. Now put it away.” Mag taught Grandma how to walk quietly while large numbers of other people in the house were sleeping. Grandma passed the same advice to her grandkids, including me: “Walk on the balls of your feet, toes first.” The Trainors also had a son, who ran away to fight in the Civil War. When the war ended and the boy didn’t come home, Thomas went down to Washington and found his son in a hospital there, recovering from a wound. The doctors said the boy would be home by Christmas. And, when Christmas came, the boy indeed arrived, in a coffin. Or so the story went.

An interesting fact about Fort Lee: it was the original Hollywood. The Searls family, like most of the town, was involved. Grandpa was D.W. Griffith’s head carpenter, building film sets such as this one here. Here he is (bottom right) with his crew. Here’s a link for the Fort Lee Film Commission, featuring samples of the silent movies made there. Among the extras are family members. Lillian Gish and Lon Chaney both boarded upstairs at 2063 Hoyt. So did the dad of the late Elliot Richardson, a cabinet member in the Nixon and Ford administrations.

Time flies, and so do people, places and memories. My parents’ generation is now gone, and family members of my own generation are starting to move on. I can count ten places I used to live that are now gone as well, including my high school. Kevin Kelly told me a couple years ago that none of us, even the famous, will be remembered in a thousand years. I’m sure he’s right.

But I still feel the urge to pour as much as I can of what I know into the public domain, which is what you’re witnessing now, if you’re still with me at the bottom of a long post. I believe it helps to see what was, as well as what is.

For example, this view up Hoyt Avenue from the site of the old Searls place, in 2012, is now filled with a high-rise that is almost complete. The little bridge-less town where my grandparents met and my father and his sisters grew up is now a henge of high-rises. Fort Lee itself is also known as Fort Lee Koreatown. In this constantly shifting urban context, the current scandal seems a drop in the bucket of time.

 


Cool

Personal data and independence

  • The Independent Purchase Decision Support Test, by Adrian Gropper, M.D. Pull quote: “What I need is an Agent that’s independent of my ‘provider’ institution EHR and communicates with that EHR using the Stage 2 guidelines without any interference from the EHR vendor or the ‘provider’. It’s my choice who gets the Direct messages, it’s my choice if I want to ask my doctor about the alternatives and it’s my doctor’s choice to open up or ignore the Direct messages I send.” (EHR is Electronic Health Record.)
  • Your data is your interface. By Jarno Mikael Koponen in Pando Daily. Pull quote: “Before solving the ‘Big Data’ we should figure out the ‘small’ personal part. Algorithms alone can’t make me whole. Different services need my continuous contribution to understand who I really am and what I want. And I believe that apps and services that openly share their data to provide me a better user experience are not far off.”
  • Jarno is also the father of Futureful (@futureful), which Zak Stone of Co.Exist (in Fast Company) says “hopes to bring serendipitous browsing back to the web experience by providing a design-heavy platform for content discovery.” Just downloaded it.

Media

  • The rebirth of OMNI — and its vibe. Subhead: Glenn Fleishman on the imminent reboot of the legendary science and science fiction magazine. In BoingBoing. Two bonus links on the OMNI topic:
  • Jeff Bezos buys the Washington Post. This is either wonderful for journalism or horrifying. By Sarah Lacy in Pando Daily. Pull quote: “John Doerr…described an entrepreneur with uncommon focus and discipline around what the customer wants. I guess the future of the Post will ride on who Bezos sees as ‘the customer’ and what’s in his best interest.”
  • Donald Graham’s Choice, by David Remnick in The New Yorker.
  • Here’s Why I Think Jeff Bezos Bought The Washington Post. By Henry Blodget in Business Insider. Pull-quote:
    • First, I’d guess that Jeff Bezos thinks that owning the Washington Post will be fun, interesting, and cool. And my guess is that, if that is all it ever turns out to be, Jeff Bezos will be fine with that. This is a man who invests in rockets and atomic clocks, after all. He doesn’t necessarily make these investments for the money. Or bragging rights. Or strategic synergies.
    • Second, I’d guess that Jeff Bezos thinks that there are some similarities between the digital news business and his business (ecommerce) that no one in the news business has really capitalized on yet.
  • The Natives Are Feckless: Part One Of Three. By Bob Garfield in MediaPost. Pull-quotage:
    • Well done, media institutions. You have whored yourselves to a hustler. Your good name, such that it remains, is diminished accordingly, along with your trustworthiness, integrity and any serious claim to be serving the public. Indeed, by bending over for commercially motivated third parties who masquerade as bona fide editorial contributors, you evince almost as little respect for the public as you do for yourself.
    • There’s your native advertising for you. There’s the revenue savior being embraced by Forbes, the Atlantic, The Washington Post, The Guardian, Business Insider and each week more and more of the publishing world.
    • According to the Pew Research Center for the People and the Press, sponsored content of various kinds was a $1.56 billion category in 2012 and growing fast.
  • Future of TV might not include TV. By Shalini Ramachandran and Martin Peers in The Wall Street Journal. It begins, “Predicting that transmission of TV will move to the Internet eventually, Cablevision Systems Corp Chief Executive James Dolan says ‘there could come a day’ when his company stops offering television service, making broadband its primary offering.” And wow:
    • In a 90-minute interview on Friday, the usually media-shy 58-year-old executive also talked about his marriage, his relationship with his father Chuck and his after-hours role as a singer and songwriter. He said his rock band, JD & the Straight Shot, toured with the Eagles last month.
    • Mr. Dolan said that on the rare occasions he watches TV, it is often with his young children, who prefer to watch online video service Netflix, using Cablevision broadband.
    • He added that the cable-TV industry is in a ‘bubble’ with its emphasis on packages of channels that people are required to pay for, predicting it will mature ‘badly’ as young people opt to watch online video rather than pay for traditional TV services.
  • Making TVs smart: why Google and Netflix want to reinvent the remote control. By Janko Roettgers in Gigaom.
  • Hulu, HBO, Pandora coming to Chromecast. By Steve Smith in MediaPost. Pull-quote: “A battle over content clearly is brewing between Google and Apple. Apple TV has recently expanded its offerings of content providers to include HBO Go, Sky TV, ESPN and others. The two companies are pursuing different delivery models as they try to edge their way onto the TV. Apple TV is a set-top box with apps, while Chromecast relies on apps that are present on mobile devices to which the dongle connects.”
  • Setting TV Free. By yours truly in Linux Journal.

Tech

Retail

Legal

Handbaskets to hell

When you see an ad for Budweiser on TV, you know who paid for it and why it’s there. You also know it isn’t personal, because it’s brand advertising.

But when you see an ad on a website, do you know what it’s doing there? Do you know if it’s there just for you, or if it’s for anybody? Hard to tell.

However, if it’s an ad for a camera showing up right after you visited some photography sites, it’s a pretty good guess you’re being tracked. It’s also likely you are among millions who are creeped out by the knowledge that they’re being tracked.

On the whole, the tracking-driven online advertising business (aka “adtech”) assumes that you have given permission to be followed, at least implicitly. This is one reason tracking users and targeting them with personalized ads is more normative than ever online today. But there is also a growing concern that personal privacy lines are not only being crossed, but trampled.

Ad industry veterans are getting creeped out too, because they know lawmakers and regulators will be called on for protection. That’s the case George Simpson — an ad industry insider — makes in Suicide by Cookies, where he starts with the evidence:

Evidon measured sites across the Internet and found the number of web-tracking tags from ad servers, analytics companies, audience-segmenting firms, social networks and sharing tools up 53% in the past year. (The ones in Mandarin were probably set by the Chinese army.) But only 45% of the tracking tools were added to sites directly by publishers. The rest were added by publishers’ partners, or THEIR partners’ partners.

Then he correctly forecasts government intervention, and concludes with this:

I have spent the better part of the last 15 years defending cookie-setting and tracking to help improve advertising. But it is really hard when the prosecution presents the evidence, and it has ad industry fingerprints all over it — every time. There was a time when “no PII” was an acceptable defense, but now that data is being compiled and cross-referenced from dozens, if not hundreds, of sources, you can no longer say this with a straight face. And we are way past the insanity plea.

I know there are lots of user privacy initiatives out there to discourage the bad apples and get all of the good ones on the same page. But clearly self-regulation is not working the way we promised Washington it would.

I appreciate the economics of this industry, and know that it is imperative to wring every last CPM out of every impression — but after a while, folks not in our business simply don’t care anymore, and will move to kill any kind of tracking that users don’t explicitly opt in to.

And when that happens, you can’t say, “Who knew?”

To get ahead of the regulatory steamroller, the ad business needs two things. One is transparency. There isn’t much today. (See Bringing Manners to Marketing at Customer Commons.) The other is permission. It can’t only be presumed. It has to be explicit.

We — the targets of adtech — need to know the provenance of an ad, at a glance. It should be as clear as possible when an ad is personal or not, when it is tracking-based or not, and whether it’s permitted. That is, welcomed. (More about that below.)

This can be done symbolically. How about these:

 means personalized.

↳ means tracking-based.

☌ means permitted.

I picked those out of a character viewer. There are hundreds of these kinds of things. It really doesn’t matter what they are, so long as people can easily, after a while, grok what they mean.
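
To show how little machinery the labeling would take, here is a minimal sketch in Python of how a publisher or a browser extension might turn an ad's declared provenance into that kind of at-a-glance label. The flag names and glyphs here are mine, for illustration only; no real ad format declares them:

```python
# Minimal sketch: map an ad's disclosed provenance to at-a-glance symbols.
# The flags and glyphs are illustrative; no real ad format defines them.

SYMBOLS = {
    "personalized": "✱",    # stand-in glyph for "personalized"
    "tracking_based": "↳",  # tracking-based
    "permitted": "☌",       # permitted (welcomed)
}

def provenance_label(ad):
    """Return the glyphs for whichever provenance flags an ad declares."""
    return "".join(sym for flag, sym in SYMBOLS.items() if ad.get(flag))

ad = {"creative": "camera-banner.png", "personalized": True,
      "tracking_based": True, "permitted": False}

print(provenance_label(ad))  # ✱↳ — personalized and tracking-based, not permitted
```

The hard part isn't the code; it's getting advertisers to disclose those flags honestly in the first place.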

People are already doing their own policy development anyway, by identifying and blocking both ads and tracking, through browser add-ons and extensions. Here are mine for Firefox, on just one of my computers:

All of these, in various ways, give me control over what gets into my browser. (In fact the Evidon research cited above was gained by Ghostery, which is an Evidon product installed in millions of browsers. So I guess I helped, in some very small way.)

Speaking of permission, now would be a good time to revisit Permission Marketing, which Seth Godin published in May 1999, about the same time The Cluetrain Manifesto also went up. Here’s how Seth compressed the book’s case nine years later.

Permission marketing is the privilege (not the right) of delivering anticipated, personal and relevant messages to people who actually want to get them.

It recognizes the new power of the best consumers to ignore marketing. It realizes that treating people with respect is the best way to earn their attention.

Pay attention is a key phrase here, because permission marketers understand that when someone chooses to pay attention they are actually paying you with something precious. And there’s no way they can get their attention back if they change their mind. Attention becomes an important asset, something to be valued, not wasted.

Real permission is different from presumed or legalistic permission. Just because you somehow get my email address doesn’t mean you have permission. Just because I don’t complain doesn’t mean you have permission. Just because it’s in the fine print of your privacy policy doesn’t mean it’s permission either.

Real permission works like this: if you stop showing up, people complain, they ask where you went.

Real permission is what’s needed here. It’s what permission marketing has always been about. And it’s what VRM (Vendor Relationship Management) is about as well.

Brand advertising is permitted in part because it’s not personal. Sometimes it is even liked. The most common example of that is Super Bowl TV ads. But a better example is magazines made thick with brand ads that are as appealing to readers as the editorial content. Fashion magazines are a good example of that.

Adtech right now is not in a demand market on the individual’s side. In fact, judging from the popularity of ad-blocking browser extensions, there is a lot of negative demand. According to ClarityRay, 9.23% of all ads were blocked by users surveyed a year ago. That number is surely much higher today.

At issue here is what economists call signaling — a subject about which Don Marti has written a great deal over the last couple of years. I visit the subject (with Don’s help) in this post at Wharton’s Future of Advertising site, where contributors are invited to say where they think advertising will be in the year 2020. My summary paragraph:

Here is where this will lead by 2020: The ability of individuals to signal their intentions in the marketplace will far exceed the ability of corporations to guess at those intentions, or to shape them through advertising. Actual relationships between people and processes on both sides of the demand-supply relationship will out-perform today’s machine-based guesswork by advertisers, based on “big data” gained by surveillance. Advertising will continue to do what it has always done best, which is to send clear signals of the advertiser’s substance. And it won’t be confused with its distant relatives in the direct response marketing business.

I invite everybody reading this to go there and jump in.

Meanwhile, consider this one among many olive branches that need to be extended between targets — you and me — and the advertisers targeting us.

 

My son remembers what I say better than I do. One example is this:

I uttered it in some context while wheezing my way up a slope somewhere in the Reservation.

Except it wasn’t there. Also I didn’t say that. Exactly. Or alone. He tells me it came up while we were walking across after getting some hang time after Mass. He just told me the preceding while looking over my shoulder at what I’m writing. He also explains that the above is compressed from dialog between the two of us, at the end of which he said it should be a bumper sticker, which he later designed, sent to me and you see above.

What I recall about the exchange, incompletely (as all recall is, thanks to the graces and curses of short term memory), is that I was thinking about the imperatives of invention, and why my nature is native to Silicon Valley, which exists everywhere ideas and ambition combine and catch fire.

I was in the midst of late edits on The Intention Economy this afternoon, wondering if I should refer to Steve Jobs in the past tense. I didn’t want to, but I knew he’d be gone by the time the book comes out next April, if he wasn’t gone already. So I decided to make the changes, and stopped cold before the first one. I just couldn’t go there.

Then the bad news came a few minutes ago, through an AP notification on my iPhone. Tonight we all have to go there.

Thirteen years, one month and one day ago, I wrote an email to Dave Winer, in response to a DaveNet post on Steve’s decision to kill off Apple’s clones. (Dave had also posted notes from an interview with Steve himself.) Dave published the email. Here’s the part that matters:

So Steve Jobs just shot the cloners in the head, indirectly doing the same to the growing percentage of Mac users who preferred cloned Mac systems to Apple’s own. So his message to everybody was no different than it was at Day One: all I want from the rest of you is your money and your appreciation for my Art.

It was a nasty move, but bless his ass: Steve’s art has always been first class, and priced accordingly. There was nothing ordinary about it. The Mac “ecosystem” Steve talks about is one that rises from that Art, not from market demand or other more obvious forces. And that art has no more to do with developers, customers and users than Van Gogh’s has to do with Sotheby’s, Christie’s and art collectors.

See, Steve is an elitist and an innovator, and damn good at both. His greatest achievements are novel works of beauty and style. The Apple I and II were Works of Woz; but Lisa, Macintosh, NeXT and Pixar were all Works of Jobs. Regardless of their market impact (which in the cases of Lisa and NeXT were disappointing), all four were remarkable artistic achievements. They were also inventions intended to mother necessity — and reasonably so. That’s how all radical innovations work. (Less forward marketers, including Bill Gates, wait for necessity to mother invention, and the best of those invent and implement beautifully, even though that beauty is rarely appreciated.)

To Steve, clones are the drag of the ordinary on the innovative. All that crap about cloners not sharing the cost of R&D is just rationalization. Steve puts enormous value on the engines of innovation. Killing off the cloners just eliminates a drag on his own R&D, as well as a way to reposition Apple as something closer to what he would have made the company if he had been in charge through the intervening years.

The simple fact is that Apple always was Steve’s company, even when he wasn’t there. The force that allowed Apple to survive more than a decade of bad leadership, cluelessness and constant mistakes was the legacy of Steve’s original Art. That legacy was not just an OS that was 10 years ahead of the rest of the world, but a Cause that induced a righteousness of purpose centered around a will to innovate — to perpetuate the original artistic achievements. And in Steve’s absence Apple did some righteous innovation too. Eventually, though, the flywheels lost mass and the engine wore out.

In the end, by the time too many of the innovative spirits first animated by Steve had moved on to WebTV and Microsoft, all that remained was that righteousness, and Apple looked and worked like what it was: a church wracked by petty politics and a pointless yet deeply felt spirituality.

Now Steve is back, and gradually renovating his old company. He’ll do it his way, and it will once again express his Art.

These things I can guarantee about whatever Apple makes from this point forward:

  1. It will be original.
  2. It will be innovative.
  3. It will be exclusive.
  4. It will be expensive.
  5. Its aesthetics will be impeccable.
  6. The influence of developers, even influential developers like you, will be minimal. The influence of customers and users will be held in even higher contempt.
  7. The influence of fellow business artisans such as Larry Ellison (and even Larry’s nemesis, Bill Gates) will be significant, though secondary at best to Steve’s own muse.

Turns out Steve’s muse was the best in the history of business. No one-hit wonders. We’re talking about world-changing stuff. Again and again and again.

Watch this clip from Robert X. Cringely’s “Triumph of the Nerds” public TV special, filmed back when Steve was still running NeXT. This one too. Then look at what Steve did after coming back. Not just the iPod, iPhone, iPad, Pixar and the laptops we see with glowing apples all over the place. Look at the Apple Stores. I’ve been told that Apple Stores are top-grossing retail shops in every mall they occupy. Even if that’s not true, it’s believable.

I’ve also been told that Apple Stores were Steve’s idea. I don’t know if that’s true either, but it makes sense, because they succeeded where nearly every other attempt at the same thing failed. To get there, Steve and Apple had to look past the smoking corpses of Gateway, Circuit City, Computerland, Radio Shack and all the other computer stores that had failed, and do something very different and much better. And they did.

I was wrong about one thing in my list above. I don’t think Steve regarded customers and users with contempt, except in the sense that he believed he knew better than they did. As an elitist, he also knew that calling the smartest and most employable Apple users “geniuses” was great bait for employment serving customers at Apple Stores.

There is no shortage of quotes by and about Steve Jobs tonight. But the best quote is the one he uttered so long ago I can’t find a source for it (maybe one of y’all can): The journey is the reward.

His first hit was the Apple II, but it was the Macintosh that was billed as “the computer for the rest of us.” So now is his legacy.


So I’d like to find authoritative sources for two quotes. Here’s the first:

“I prefer the company of younger men. Their stories are shorter.”

No idea where I got that one. It’s too right not to be real, but I can’t find a source yet. (That’s a job I’m giving y’all.)

The second quote I memorized instantly while reading a book, though I don’t remember which one. This is what Hughes said Parker wrote in a guest book at William Randolph Hearst’s San Simeon, when the old man was still living with his consort, the actress Marion Davies:

“Upon my honor
I saw a madonna
standing in a niche,
above the door
of the private whore
of the world’s worst
son of a bitch”

Could be I’m wrong about that one too. Dunno. Sources and corrections, anyone?


Nick Carr is ahead of his time again. The Big Switch nailed computing as a utility, long before “the cloud” came to mean pretty much the same thing. His latest book, The Shallows, explored the changes in our lives and minds caused by moving too much of both online — again before others began noticing how much the Net was starting to look like a handbasket.

Thus The Shallows comes to mind when I read Alice Gregory’s Sad as Hell in n+1. An excerpt:

I have the sensation, as do my friends, that to function as a proficient human, you must both “keep up” with the internet and pursue more serious, analog interests. I blog about real life; I talk about the internet. It’s so exhausting to exist on both registers, especially while holding down a job. It feels like tedious work to be merely conversationally competent. I make myself schedules, breaking down my commute to its most elemental parts and assigning each leg of my journey something different to absorb: podcast, Instapaper article, real novel of real worth, real magazine of dubious worth. I’m pretty tired by the time I get to work at 9 AM.

In-person communication feels binary to me now: subjects are either private, confessional, and soulful or frantically current, determined mostly by critical mass, interesting only in their ephemeral status. Increasingly these modes of talk seem mutually exclusive. You can pull someone aside—away from the party, onto the fire escape—and confess to a foible or you can stay inside with the group and make a joke about something everyone’s read online. “Maybe you keep the wrong company,” my mother suggests. Maybe. But I like my friends! We can sympathize with each other and feel reassured that we’re not alone in our overeager consumption, denigrated self-control, and anxiety masked as ambition.

Here’s Nick:

On the Net, we face many information faucets, all going full blast. Our little thimble overflows as we rush from tap to tap. We transfer only a small jumble of drops from different faucets, not a continuous, coherent stream.

Psychologists refer to the information flowing into our working memory as our cognitive load. When the load exceeds our mind’s ability to process and store it, we’re unable to retain the information or to draw connections with other memories. We can’t translate the new material into conceptual knowledge. Our ability to learn suffers, and our understanding remains weak. That’s why the extensive brain activity that Small discovered in Web searchers may be more a cause for concern than for celebration. It points to cognitive overload.

The Internet is an interruption system. It seizes our attention only to scramble it. There’s the problem of hypertext and the many different kinds of media coming at us simultaneously. There’s also the fact that numerous studies—including one that tracked eye movement, one that surveyed people, and even one that examined the habits displayed by users of two academic databases—show that we start to read faster and less thoroughly as soon as we go online. Plus, the Internet has a hundred ways of distracting us from our onscreen reading. Most email applications check automatically for new messages every five or 10 minutes, and people routinely click the Check for New Mail button even more frequently. Office workers often glance at their inbox 30 to 40 times an hour. Since each glance breaks our concentration and burdens our working memory, the cognitive penalty can be severe.

The penalty is amplified by what brain scientists call switching costs. Every time we shift our attention, the brain has to reorient itself, further taxing our mental resources. Many studies have shown that switching between just two tasks can add substantially to our cognitive load, impeding our thinking and increasing the likelihood that we’ll overlook or misinterpret important information. On the Internet, where we generally juggle several tasks, the switching costs pile ever higher.

The Net’s ability to monitor events and send out messages and notifications automatically is, of course, one of its great strengths as a communication technology. We rely on that capability to personalize the workings of the system, to program the vast database to respond to our particular needs, interests, and desires. We want to be interrupted, because each interruption—email, tweet, instant message, RSS headline—brings us a valuable piece of information. To turn off these alerts is to risk feeling out of touch or even socially isolated. The stream of new information also plays to our natural tendency to overemphasize the immediate. We crave the new even when we know it’s trivial.

And so we ask the Internet to keep interrupting us in ever more varied ways. We willingly accept the loss of concentration and focus, the fragmentation of our attention, and the thinning of our thoughts in return for the wealth of compelling, or at least diverting, information we receive. We rarely stop to think that it might actually make more sense just to tune it all out.

Try writing about the Net and tuning it out at the same time. Clearly Nick can do that, because he’s written a bunch of books about the Net (and related matters) while the Net’s been an available distraction. Meanwhile I’ve spent most of the past year writing just one book, fighting and often losing against constant distraction. It’s very hard for me to put the blinders on and just write the thing. In the last few months what I’ve succeeded in doing, while wearing the blinders and getting most of my book writing done, is participating far less in many things that I help sustain, or that sustain me, including projects I’m working on, time with my wife, kids and grandkids, and this very blog. (Lotta white spaces on the calendar to the right there.)

On the whole I’ve been dismissive of theories (including Nick’s) about how the Net changes us for the worse, mostly because my own preoccupations, including my distractions, tend to be of the intellectually nutritive sort — or so I like to believe. That is, I’m curious about all kinds of stuff, and like enlarging the sum of what I know, and how well I know it. The Net rocks for that. Still, I see the problem. I can triangulate on that problem just from my own struggles plus Alice’s and Nick’s.

“Great minds discuss ideas, mediocre minds discuss events, and small minds discuss people,” the old saying goes. (Attributed, with some dispute, to Eleanor Roosevelt.) The Net feeds all three, but at the risk of dragging one’s mind from the great to the small. “What else are we doing on the internet if not asserting our rank?” Alice writes. (Would we ask the same about what we’re doing in a library?) Later she adds,

Sometimes I can almost visualize parts of myself, the ones I’m most proud of, atrophying. I wish I had an app to monitor it! I notice that my thoughts are homeopathic, that they mirror content I wish I weren’t reading. I catch myself performing hideous, futuristic gestures, like that “hilarious” moment three seconds into an intimate embrace in which I realize I’m literally rubbing my iPhone screen across his spine. Almost every day at 6 PM my Google Alert tells me that an “Alice Gregory” has died. It’s a pretty outdated name, and most of these obituaries, from family newsletters and local papers, are for octogenarians. I know I’m being tidy-minded even to feel a pang from this metaphor, but still . . .

It’s hard not to think “death drive” every time I go on the internet. Opening Safari is an actively destructive decision. I am asking that consciousness be taken away from me. Like the lost time between leaving a party drunk and materializing somehow at your front door, the internet robs you of a day you can visit recursively or even remember. You really want to know what it is about 20-somethings? It’s this: we live longer now. But we also live less. It sounds hyperbolic, it sounds morbid, it sounds dramatic, but in choosing the internet I am choosing not to be a certain sort of alive. Days seem over before they even begin, and I have nothing to show for myself other than the anxious feeling that I now know just enough to engage in conversations I don’t care about.

The internet’s most ruinous effect on literacy may not be the obliteration of long-format journalism or drops in hardcover sales; it may be the destruction of the belief that books can be talked and written about endlessly. There are fewer official reviews of novels lately, but there are infinitely more pithily captioned links on Facebook, reader-response posts on Tumblr, punny jokes on Twitter. How depressing, to have a book you just read and loved feel so suddenly passé, to feel—almost immediately—as though you no longer have any claim to your own ideas about it. I started writing this piece when the book came out at the end of July, and I started unwriting it almost immediately thereafter. Zeno’s Paradox 2.0: delete your sentences as you read their approximations elsewhere. How will future fiction work? Will details coalesce into aphorism? I wonder if instead of scribbling down in my notebook all the familiar aspects of girls I see on the street, as I used to, I’ll continue doing what I do now: snapping a picture and captioning it, in the words of Shteyngart, “so media.”

I’ll grant that we have problems here, but is literacy actually being ruined? Is long-format journalism actually obliterated? The New Yorker is as thick as ever with six-to-eight-thousand-word essays. Books still move through stores online and off. Our fourteen-year-old kid still reads piles of books, even as he spends more time online, watching funny YouTube videos and chatting with a friend three time zones away. Is he worse for that? Maybe, but I don’t think so. Not yet, anyway.

What I am sure about is this: Twitter, Facebook and Tumblr are temporary constructions on the Web, like World’s Fairs used to be, when we still had them. The Internet is a world where all four seasons happen at once. New sites and services are like plants that germinate, grow, bud, bloom and die, over and over. Even the big trees don’t grow to the sky. We need their fruit, their shade, their wood and the humus to which they return. Do we need the other crap that comes along with those stages? Maybe not, but we go for it anyway.

Last Tuesday Alice Marwick gave an excellent Berkman Lunch talk titled Status Update: Celebrity, Publicity and Self-Branding in Web 2.0. The summary:

In the mid-2000s, journalists and businesspeople heralded “Web 2.0” technologies such as YouTube, Twitter, and Facebook as signs of a new participatory era that would democratize journalism, entertainment, and politics. By the decade’s end, this idealism had been replaced by a gold-rush mentality focusing on status and promotion. While the rhetoric of Web 2.0 as democratic and revolutionary persists, I will contend that a primary use of social media is to boost user status and popularity, maintaining hierarchy rather than diminishing it. This talk focuses on three status-seeking techniques that emerged with social media: micro-celebrity, self-branding, and life-streaming. I examine interactions between social media and social life in the San Francisco “tech scene” to show that Web 2.0 has become a key aspect of social hierarchy in technologically mediated communities.

I’ve been in and out of that scene since 1985, and I know personally a large percentage of Alice’s sources. One of them, Tara Hunt, provided Alice with some terrific insights about how the status system works. Tara also punched out of that system not long ago, moving to Montreal and starting a company. She has also been very active in the VRM development community, for which I am very grateful. She’s on a helluva ride.

Listening to the two Alices, Emily Dickinson comes to mind:

A Route of Evanescence,
With a revolving Wheel –
A Resonance of Emerald
A Rush of Cochineal –
And every Blossom on the Bush
Adjusts it’s tumbled Head –
The Mail from Tunis – probably,
An easy Morning’s Ride –

Speaking of which, here’s Bill Hicks on life’s ride:

The World is like a ride in an amusement park, and when you choose to go on it you think it’s real, because that’s how powerful our minds are. And the ride goes up and down and round and round, and it has thrills and chills and is very brightly colored, and it’s very loud. And it’s fun, for a while.

Some people have been on the ride for a long time, and they’ve begun to question, ‘Is this real, or is this just a ride?’, and other people have remembered, and they’ve come back to us and they say ‘Hey, don’t worry. Don’t be afraid, ever, because this is just a ride.’ and we KILL THOSE PEOPLE.

“Shut him up! We have a lot invested in this ride! SHUT HIM UP! Look at my furrows of worry. Look at my big bank account, and my family. This has to be real.”

It’s just a ride.

But we always kill those good guys who try and tell us that. You ever noticed that? And let the demons run amok. But it doesn’t matter, because … It’s just a ride.

And we can change it anytime we want. It’s only a choice. No effort, no work, no job, no savings of money. A choice, right now, between fear and love. The eyes of fear want you to put bigger locks on your door, buy guns, close yourself off. The eyes of love, instead, see all of us as one.

(Watch the video. It’s better.)

Social media, social networking — all of it — is just practice. It’s just scaffolding for the roller coaster we keep re-building, riding on, falling off, and re-building. That’s what we’ve been making and re-making of civilization, especially since Industry won the Industrial Revolution. (That’s why we needed world’s fairs, to show off how Industry was doing.)

You go back before that and, on the whole, life didn’t change much, anywhere. Most of our ancestors, for most of the Holocene, lived short, miserable lives that were little different than those of generations prior or hence.

Back in the ’70s I lived in a little community called Oxbow, north of Chapel Hill. My house was one off what’s now called Wild Primrose Lane, in this map here. In those days the bare area in the center of that map was a farm that was plowed fresh every spring. One day while we were walking there, I picked up a six-inch spear point (or hand-held scraper) that now resides at the (one county over):

I brought it to friends in the anthropology department at UNC — associates of the great Joffre Coe — who told me it was a Guilford point, from the Middle Archaic period, which ran from 6000 to 3000 B.C. (The original color was gray, as you can see from the chipped parts. The surface color comes from what’s called patination.)

What fascinates me about this date range, which is similar to the range for other kinds of points everywhere in the world, is how little technology changed over such a long period of time. Generation after generation made the same kinds of stone tools, the same way, for thousands of years. Today we change everything we make, pretty much constantly. There was no Moore’s Law operating among the Guilford people, or anywhere, in 5000 B.C. Today Moore’s Law sometimes seems slow.

I don’t have a conclusion here, other than to say that maybe Nick and both Alices are right, and the Net is not so ideal as some of us (me especially) tend to think it is. But I also think the Net is something we make, and not just something that makes us.

Clearly, we could do a better job. We have the tools, and we can make many more.

 

The question on Quora goes, What lessons can be learned from the first browser war between Microsoft and Netscape?

I covered that war when it broke out, more than fifteen years ago. No magazine was interested in my writing then. Blogging was several years off in the future. All we had were websites, and that was good enough. The following is what I put up on mine — in as much of the original HTML as can survive WordPress’ HTML-rewriting mill. I’ll continue below the piece…


MICROSOFT+NETSCAPE

WHY THE PRESS NEEDS TO SNAP OUT OF ITS WAR-COVERAGE TRANCE

By Doc Searls
December 11, 1995


Star Wars?

Am I wrong here, or has the Web turned into a Star Wars movie?

I learn from the papers that the desktop world has fallen under the iron grip of the most wealthy and powerful warlord in the galaxy. With a boundless greed for money and control, Bill Gates of Microsoft now seeks to extend his evil empire across all of cyberspace.

The galaxy’s only hope is a small but popular rebel force called Netscape. Led by a young pilot (Marc Andreessen as Luke Skywalker), a noble elder (Jim Clark as Obi-Wan Kenobi) and a cocky veteran (Jim Barksdale as Han Solo), Netscape’s mission is joined by the crafty and resourceful Java People from Sun.

Heavy with portent, the headlines tromp across the pages (cue the Death Star music — dum dum dum, dum da dum, dum da dummm)…

  • “MICROSOFT TAKES WAR TO THE NET: Software giant plots defensive course based on openness”
  • “MICROSOFT UNVEILS INTERNET STRATEGY: Stage set for battle with Netscape.”
  • “MICROSOFT, SUN FACE OFF IN INTERNET RING”
  • “MICROSOFT STORMS THE WEB”

The mind’s eye conjures a vision of The Emperor, deep in the half-built Death Star of Microsoft’s new Internet Strategy, looking across space at the Rebel fleet, his face twisted with contempt. “Your puny forces cannot win against this fully operational battle station!” he growls.

But the rebels are confident. “In a fight between a bear and an alligator, what determines the victor is the terrain,” Marc Andreessen says. “What Microsoft just did was move into our terrain.”

And Microsoft knows its strengths. December 7th, The Wall Street Journal writes, Bill Gates “issued a thinly veiled warning to Netscape and other upstarts that included a reference to the Pearl Harbor attack on the same date in 1941.”

Exciting stuff. But is there really a war going on? Should there be?

What are the facts?

After reading all these alarming headlines, I decided to fire up my own copy of Netscape Navigator and search out a transcript of Bill’s December 7th speech.

I started at Microsoft’s own site, but got an “access forbidden” message. Then I went up to the internet level of the site’s directory, but found the Netscape view was impaired. (“Best viewed with Microsoft Explorer,” it said.) I finally found a Netscape-friendly copy at Dave Winer’s site. It appears to be the original, verbatim:

MR. GATES: Well, good morning. I was realizing this morning that December 7th is kind of a famous day. (Laughter.) Fifty-four years ago or something. And I was trying to think if there were any parallels to what was going on here. And I really couldn’t come up with any. The only connection I could think of at all was that probably the most intelligent comment that was made on that day wasn’t made on Wall Street, or even by any type of that analyst; it was actually Admiral Yamomoto, who observed that he feared they had awakened a sleeping giant. (Laughter.)

I see. The “veiled threat” was Bill’s opening laugh line. Even if this was “a veiled threat,” it was made in good humor. The rest of the talk hardly seemed hostile. Instead, Bill showed a substantial understanding of how both competition and cooperation work to build markets, and of the roles played by users, developers, leaders and followers in creating the Internet. In his final sentence, Bill says, “We believe that integration and continuity are going to be valuable to end users and developers…”

Of course, I wish he’d pay a little more attention to Macintosh users and developers, but I don’t blame him for avoiding them. I blame Apple, which dissed and sued Microsoft for years, to no positive effect. Apple played a zero-sum game and — sure enough — ended up with zero. Brilliant strategy.

Think how much farther along we would be today if this relationship was still Apple plus Microsoft, rather than Apple vs. Microsoft.

The truth is that the Web will be better served by Microsoft plus Netscape than by Microsoft vs. Netscape. Plus is what most of us want, and it’s probably what we’ll get, regardless of how the press plays the story.

Let’s give a big AND to the Web

So what is the best way to characterize Microsoft, if not as the Heaviest of Heavies?

I think Release 1.0‘s Jerry Michalski gets closest to it when he says: “Microsoft thinks more broadly than any other company about what it’s doing. Its plans include global telecommunications, information creation, applications — even community building.” That tells us a lot more than “Microsoft goes to war.”

Markets are more than battlefields. The OR logic of war and sports gets us excited, but tells us little of real substance. For that we also need the AND logic of cooperation, choice, partnership and working together. What we all want most — love — is hardly an OR proposition. Imagine a lover saying “there’s only room in this relationship for one of us, baby.”

But the press is caught in an OR trance. Blind to the AND logic that gives markets their full color, the press reduces every hot story to the black vs. white metaphors of war and sports. Why cover the Web as the strange, unprecedented place it is, when you can play it as yet another story about two guys trying to beat the crap out of each other? Especially when the antagonists are a little good guy and a big bad guy?

Look, the Internet didn’t take off because Netscape showed up; and it wasn’t slowed down because Microsoft didn’t. It took off because millions of people added their creative energies to something that welcomed them — which was mostly each other. Death-fight competition didn’t make the Web we know now, and it won’t make the Web that’s coming, either.

That’s because every site on the Web is AND logic at work. So is every vendor/developer relationship that ever produced a product or created a market. So is the near-infinite P/E ratio Netscape enjoys today.

So, what IS Microsoft doing?

“Embrace and extend,” Bill Gates called it in his December 7th talk. That’s what he said Microsoft will do with products from Oracle, Spyglass, Compuserve and Sun. Is this an AND strategy? Or is it yet another example of what Gary Reback, Judge Sporkin and other Microsoft enemies call a “lock and leverage” strategy, intended to drive out competition and let Microsoft charge tolls to every traveler on the Information Highway?

We’ll see.

It should be clear by now that the Web does not welcome OR strategies. Microsoft Network was an OR strategy, and it didn’t work. If history repeats itself (as it usually does with Microsoft), the company will learn from this experience (as Apple learned earlier from its eWorld failure) and move on to do the Right Thing.

Not that most of the press would notice. To them Microsoft is The Empire and Bill is its gold-armored emperor. But reporters are the ones putting clothes on this emperor. To the people who make Microsoft’s markets — the users and developers — “billg” is as naked as a newborn.

Take away the war-front headlines, the play-by-play reporting, the color commentary by industry analysts, the infatuation with personal wealth — and you see Bill as an extremely competitive guy who’s also trying to do right by users and developers. And hiding little in the process. Is he a bully? Sometimes. Is this bad? No, it’s typical of big companies since the dawn of business. It looks to me more like a personality trait than a business strategy. And what makes Microsoft win is far more strategic than personal.

George Gilder puts it this way in Forbes ASAP (“Angst & Awe on the Internet”):

Blinded by the robber-baron image assigned in U.S. history courses to the heroic builders of American capitalism, many critics see Bill Gates as a menacing monopolist. They mistake for greed the gargantuan tenacity of Microsoft as it struggles to assure the compatibility of its standard with tens of thousands of applications and peripherals over generations of dynamically changing technology.

How to win users and influence developers

How does Bill express that tenacity? As Dave Winer puts it in “The Platform is a Chinese Household,” Bill “sends flowers.” Bill courts developers and delivers for customers, who return the favor by buying Microsoft products.

Markets are conversations, and there isn’t a more willing conversational participant than Bill. That’s why I’m not surprised when Dave says “the only big company that’s responsive to my needs is Microsoft.” And Dave, by the way, is a pillar of the Macintosh community. To my knowledge, he hasn’t developed a DOS-compatible product since the original ThinkTank.

Users and developers don’t need to hear vendors talk about how much their competition sucks. No good ever comes of it. Is it just coincidence that Microsoft almost never bad-mouths its competition? Though Bill is hardly innocent of the occasional raspberry, he’s a long way from matching the nasty remarks made about him and his company by leaders at Sun, Apple, Netscape and Novell, just to name an obvious few.

It especially saddens me to hear competition-bashing from Guy Kawasaki, whose positive energies Apple desperately needs right now. As a customer and user of both Apple and Microsoft products, I see Guy’s “how to drive your competition crazy” rap as OR logic at its antiproductive worst.

At the opposite end of the diplomacy scale, I like the way Gordon Eubanks of Symantec has consistently been fair and constructive in his public remarks about Bill and Microsoft (and has reaped ample rewards in the process).

What makes markets work is a combination of AND and OR processes that deserve thoughtful and observant journalism. They also call for vendors who can drop their fists, open their minds and look at opportunities from users’ and developers’ points of view. This is how Microsoft came to change its Internet strategy. And this is what makes Microsoft the most adaptive company in the business, regardless of size. No wonder the laws of Darwin have been kind to them.

A new breed of life

Urge and urge and urge,
Always the procreant urge of the world.
Out of the dimness opposite equals advance…
Always substance and increase,
Always a knit of identity… always distinction…
Always a breed of life.
—Walt Whitman

Where the language of war fails, perhaps the language of Whitman can succeed.

By the great poet’s lights, the Web is a new breed of life. An original knit of identity. Its substance increases when opposite equals like Netscape and Microsoft advance out of the dimness and obey their procreant urges — not their will to kill.

The Web is a product of relationships, not of victors and victims. Not one dime Netscape makes is at Microsoft’s expense. And Netscape won’t bleed to death if Microsoft produces a worthy browser. The Web as we know it won’t be the same in six weeks, much less six months or six years. As a “breed of life,” it is original, crazy and already immense. It is not like anything. To describe it with cheap-shot war and sports metaphors is worse than wrong — it is bad journalism.

A week after this experience, I went back to the Microsoft site and found its whole Internet Strategy directory much more Netscape-friendly and nicely organized. Every presentation is there, including all the slides. Though the slides are in PowerPoint 4.0 for Windows, my Mac is able to view them with the Mac version of the program. [Back to *]

George Gilder’s Forbes ASAP article archives are at his Telecosm site.

Dave Winer’s provocative “rants” come out every few days, and accumulate at his DaveNet site. Check out “The User’s Software Company,” which inspired this essay.


One might look back on this and say “Yeah, but Microsoft still killed Netscape.” I don’t think so. Netscape had many advantages, including one it tried too late to save the company, though not too late to save the browser and keep it competitive: open-sourcing the Mozilla code. Five years after I wrote the above, I wrote a piece in Linux Journal describing Netscape’s mistakes:

For a year or two, Netscape looked like it could do no wrong. It was a Miata being chased down a mountain road by a tractor trailer. As long as it moved fast and looked ahead, there was no problem with the truck behind. But at some point, Netscape got fixated on the rear-view mirror. That’s where they were looking when they drove off the cliff.

Why did they do that?

  1. They forgot where they came from: the hacker community that had for years been developing the Net as a free and open place—one hospitable to business, but not constrained by anybody’s business agenda. The browser was born free, like Apache, Sendmail and other developments that framed the Net’s infrastructure. The decision to charge for the browser—especially while still offering it for free—put Netscape in a terminal business from the start.
  2. They got caught up in the market’s transient fashions, which were all about leveraging pre-Web business models into an environment that wouldn’t support them. Mostly, they changed the browser from a tool of Demand (browsing) to an instrument of Supply. They added channels during the “push” craze. They portalized their web site. They turned the location bar into a search term window for a separate domain directory, to be populated by the identities of companies that paid to be put there (a major insult to the user’s intentions). Worst of all, they bloated the browser from a compact, single-purpose tool to an immense contraption that eventually included authoring software, a newsgroup reader, a conferencing system and an e-mail client—all of which were done better by stand-alone applications.
  3. They became arrogant and presumptuous about their advantages. At one point, Marc Andreessen said an OS was “just a device driver”.
  4. Their engineering went to hell. By the time Netscape was sold (at top dollar) to AOL, the dirty secret was that its browser code was a big kluge and had been for a long time. Jamie Zawinski (one of the company’s first and best-known engineers) put it bluntly: “Netscape was shipping garbage, and shipping it late.” Not exactly competitive.
  5. They lost touch with their first and best market: those customers who had actually paid for that damn browser.

So, back to the original question. What have we learned, now that IE is still around, and most of its competitors are either open source or based on open source code? Here’s a quick list:

  1. The browser was never a product in the sense of something that could be charged for and paid for as a scarce good. It wanted to be open source in the first place.
  2. The war metaphor is distracting and misleading, even when it’s appropriate.
  3. No browser is even close to perfect, and none will ever be.

Feel free to add more of your own, here or on Quora. (I’m very curious to see how Quora evolves.)
