history


Since I’m done with fighting in the red ocean of the surveillance-dominated Web, I’ve decided, while busy working in the blue ocean (on what for now we’re calling i-commerce), to bring back, in this blog, some of the hundreds of things I’ve written over the last 30+ years. I’m calling it the Redux series. To qualify, these should still ring true today, or at least provide some history. This early one is still on the Web, here at BuzzPhraser.com. I’ve made only two small edits, regarding dates. (And thanks to Denise Caruso for reminding me that this thing started out on paper, very long ago.)


The original BuzzPhraser was created in 1990, or perhaps earlier, as a spreadsheet, then a HyperCard stack; and it quickly became one of the most-downloaded files on AOL and CompuServe. For years after that it languished, mostly because I didn’t want to re-write the software. But when the Web came along, I knew I had to find a way to re-create it. The means didn’t find that end, however, until Charles Roth grabbed the buzzwords by their serifs and made it happen, using a bit of clever JavaScript. Once you start having fun with the new BuzzPhraser, I’m sure you’ll thank him as much as I do.

The story that follows was written for the original BuzzPhraser. I thought it would be fun to publish it unchanged.

—Doc, sometime in the late ’90s

BuzzPhrases are built with TechnoLatin, a non-language that replaces plain English nouns with vague but precise-sounding substitutes.  In TechnoLatin, a disk drive is a “data management solution.”  A network is a “workgroup productivity platform.”  A phone is a “telecommunications device.”

The virtue of TechnoLatin is that it describes just about anything technical.  The vice of TechnoLatin is that it really doesn’t mean anything.  This is because TechnoLatin is composed of words that are either meaningless or have been reduced to that state by frequent use.  Like the blank tiles in Scrabble, you can put them anywhere, but they have no value.  The real value of TechnoLatin is that it sounds precise while what it says is vague as air.  And as easily inflated.

Thanks to TechnoLatin, today’s technology companies no longer make chips, boards, computers, monitors or printers.  They don’t even make products.  Today everybody makes “solutions” that are described as “interoperable,” “committed,” “architected,” “seamless” or whatever.  While these words sound specific, they describe almost nothing.  But where they fail as description they succeed as camouflage: they conceal meaning, vanish into surroundings and tend to go unnoticed.

Take the most over-used word in TechnoLatin today: solution.  What the hell does “solution” really mean?  Well, if you lift the camouflage, you see it usually means “product.”  Try this: every time you run across “solution” in a technology context, substitute “product.”  Note that the two are completely interchangeable.  The difference is, “product” actually means something, while “solution” does not.  In fact, the popularity of “solution” owes to its lack of specificity.  While it presumably suggests the relief of some “problem,” it really serves only to distance what it labels from the most frightening risk of specificity: the clarity of actual limits.

The fact is, most vendors of technology products don’t like to admit that their creations are limited in any way.  Surely, a new spreadsheet — the labor of many nerd/years — is something more than “just a spreadsheet.”  But what?  Lacking an available noun, it’s easy to build a suitable substitute with TechnoLatin.  Call it an “executive information matrix.”  Or a “productivity enhancement engine.”  In all seriousness, many companies spend months at this exercise.  Or even years.  It’s incredible.

There is also a narcotic appeal to buzzphrasing in TechnoLatin.  It makes the abuser feel as if he or she is really saying something, while in fact the practice only mystifies the listener or reader.  And since buzzphrasing is so popular, it gives the abuser a soothing sense of conformity, like teenagers get when they speak slang.  But, like slang, TechnoLatin feels better than it looks.  In truth, it looks suspicious.  And with good reason.  TechnoLatin often does not mean what it says, because the elaborate buzzphrases it builds are still only approximations.

But who cares? Buzzphrasing is epidemic.  You can’t get away from it.  Everybody does it.  There is one nice thing about Everybody, however: they’re a big market.

So, after studying this disease for many years, I decided, like any self-respecting doctor, to profit from the problem.  And, like any self-respecting Silicon Valley entrepreneur, I decided to do this with a new product for which there was absolutely no proven need, in complete faith that people would buy it.  Such is the nature of marketing in the technology business.

But, lacking the investment capital required to generate demand where none exists, I decided on a more generous approach: to give it away, in hope that even if I failed to halt the epidemic, at least I could get people to talk about it.

With this altruistic but slightly commercial goal in mind, I joined farces with Ray Miller of Turtlelips Services to create a product that would encourage and support the narcotic practice of buzzphrasing.  Being the brilliant programmer he is, Ray hacked it into a stack in less time than it took for me to write this prose.  And now here it is, free as flu, catching on all over the damn place.

What made BuzzPhraser possible as a product is that the practice of buzzphrasing actually has rules.  Like English, TechnoLatin is built around nouns.  It has adjectives to modify those nouns.  And adverbs to modify the adjectives.  It also has a class of nouns that modify other nouns — we call them “adnouns.”  And it has a nice assortment of hyphenated prefixes and suffixes (such as “multi-” and “-driven”) that we call “hyphixes.”

Since the TechnoLatin lexicon is filled with meaningless words in all those categories, the words that comprise TechnoLatin buzzphrases can be assembled in just about any number or order, held together as if by velcro.  These are the rules:

  • adverbs modify adjectives
  • adjectives modify adnouns, nouns or each other
  • adnouns modify nouns or other adnouns
  • nouns are modified by adnouns or adjectives
  • prefixes modify all adjectives
  • suffixes qualify all adnouns

Here is a diagram that shows how the rules work:

As with English, there are many exceptions.  But, as with programming, we don’t make any.  So cope with it.
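For the technically curious, the whole generator comes down to picking words from each of those categories and snapping them together in rule order. Below is a minimal sketch in Python (not the original HyperCard or JavaScript code, and with a tiny made-up lexicon) just to show how little machinery the rules require:

    import random

    # A tiny, hypothetical lexicon; the real BuzzPhraser tables are far larger.
    ADVERBS    = ["backwardly", "incrementally", "primarily", "evidently"]
    ADJECTIVES = ["architected", "intelligent", "interactive", "distinguished"]
    ADNOUNS    = ["analysis", "inference", "workgroup", "leverage", "technology"]
    NOUNS      = ["leader", "module", "solution", "environment", "platform"]
    PREFIXES   = ["hyper-", "multi-"]     # hyphixes that attach to adjectives
    SUFFIXES   = ["-driven", "-enabled"]  # hyphixes that attach to adnouns

    def buzzphrase(n_adverbs=0, n_adjectives=1, n_adnouns=1,
                   use_prefix=False, use_suffix=False):
        """Assemble one buzzphrase in rule order:
        adverbs, then adjectives, then adnouns, ending on a noun."""
        words = random.sample(ADVERBS, n_adverbs)        # adverbs modify adjectives
        adjectives = random.sample(ADJECTIVES, n_adjectives)
        if use_prefix:                                   # glue a hyphix onto an adjective
            adjectives[0] = random.choice(PREFIXES) + adjectives[0]
        words += adjectives
        adnouns = random.sample(ADNOUNS, n_adnouns)      # adnouns modify the noun
        if use_suffix:                                   # or qualify one with a suffix
            adnouns[-1] += random.choice(SUFFIXES)
        words += adnouns
        words.append(random.choice(NOUNS))               # every phrase ends on a noun
        return " ".join(words)

    if __name__ == "__main__":
        for _ in range(5):
            print(buzzphrase(n_adverbs=1, n_adjectives=1, n_adnouns=2, use_prefix=True))

Every run spits out something plausibly TechnoLatin, which is the whole point.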

With one adverb, one adjective, two adnouns, a noun and a prefix, you get “backwardly architected hyper-intelligent analysis inference leader.”  With an adjective and two nouns, you get “interactive leverage module.”  Put together buzzphrases of almost any shape and length:

  • “Breakthrough-capable technology market”
  • “Primarily distinguished optional contingency philosophy control power environment”
  • “Executive inference server”
  • “Evidently complete key business manipulation capacity method”
  • “Incrementally intelligent workgroup process topology vendor”

The amazing thing is that all of these sound, as we say in TechnoLatin, “virtually credible.”  And one nice thing about the computer business is — thanks largely to the brain-softening results of prolonged TechnoLatin abuse — “virtually credible” is exactly what it means in plain English: close enough.

BuzzPhraser makes “close enough” easy to reach by substituting guesswork for thinking.  Just keep hitting the button until the right buzzphrase comes along.  Then use that buzzphrase in faith that at least it sounds like you know what you’re saying.  And hey, in this business, isn’t that virtually credible?

Acknowledgements

Thanks to:

Stewart Alsop II, who published “Random Strings of TechnoLatin” along with the original Generic Description Table in both the Preceedings and Proceedings of Agenda 90; and who would like an e-mail front end that automatically discards any message with too many TechnoLatin words and buzzphrases.

Spencer F. Katt of PC Week, who devoted parts of two consecutive rumor columns to the Table, and posted it on the magazine’s CompuServe bulletin board, from which so many people copied it that I thought there might be something going on here.

Guy Kawasaki, who told me “this needs to be a product.”

Bob LeVitus, who told me “you ought to get this hacked into a stack.”

And Ray Miller, who did it.  Beautifully.

Doc Searls
Palo Alto, California
March 7, 1991

Have you ever wondered why you have to consent to terms required by the websites of the world, rather than the other way around? Or why you have no record of what you have accepted or agreed to?

Blame the cookie.

Have you wondered why you have no more privacy on the Web than what other parties grant you (which is none at all), and that you can only opt in or out of choices that others provide—while the only controls you have over your privacy are to skulk around like a criminal (thank you, Edward Snowden and Russell Brand, for that analogy) or to stay offline completely?

Blame the cookie.

And have you paused to wonder why Europe’s GDPR regards you as a mere “data subject” while assuming that the only parties qualified to be “data controllers” and “data processors” are the sites and services of the world, leaving you with little more agency than those sites and services allow, or provide you?

Blame the cookie.

Or why California’s CCPA regards you as a mere “consumer” (not a producer, much less a complete human being), and only gives you the right to ask the sites and services of the world to give back data they have gathered about you, or not to “sell” that personal data, whatever the hell that means?

Blame the cookie.

There are more examples, but you get the point: this situation has become so established that it’s hard to imagine any other way for the Web to operate.

Now here’s another point: it didn’t have to be that way.

The World Wide Web that Tim Berners-Lee invented didn’t have cookies. It also didn’t have websites. It had pages one could publish or read, at any distance across the Internet.

This original Web was simple and peer-to-peer. It was meant to be personal as well, meaning an individual could publish with a server or read with a browser. One could also write pages easily with an HTML editor, which was also easy to invent and deploy.

It should help to recall that the Apache Web server, which has published most of the world’s Web pages across most of the time the Web has been around, was meant originally to work as a personal server. That’s because the original design assumption was that anyone, from individuals to large enterprises, could have a server of their own, and publish whatever they wanted on it. The same went for people reading pages on the Web.

Back in the ’90s my own website, searls.com, ran on a box under my desk. It could do that because, even though my connection was just dial-up speed, it was on full time over its own static IP address, which I easily rented from my ISP. In fact, I had sixteen of those addresses, so I could operate another server in my office for storing and transferring articles and columns I wrote for Linux Journal. Every night a cron utility would push what I wrote to the magazine itself. Both servers ran Apache. And none of this was especially geeky. (I’m not a programmer and the only code I know is Morse.)

My point here is that the Web back then was still peer-to-peer and welcoming to individuals who wished to operate at full agency. It even stayed that way through the Age of Blogs in the early ’00s.

But gradually a poison disabled personal agency. That poison was the cookie.

Technically a cookie is a token—a string of text—left by one computer program with another, to help the two remember each other. These are used for many purposes in computing.

But computing for the Web got a special kind of cookie called the HTTP cookie. This, Wikipedia says (at that link)

…is a small piece of data stored on the user‘s computer by the web browser while browsing a website. Cookies were designed to be a reliable mechanism for websites to remember stateful information (such as items added in the shopping cart in an online store) or to record the user’s browsing activity (including clicking particular buttons, logging in, or recording which pages were visited in the past). They can also be used to remember pieces of information that the user previously entered into form fields, such as names, addresses, passwords, and payment card numbers.

It also says,

Cookies perform essential functions in the modern web. Perhaps most importantly, authentication cookies are the most common method used by web servers to know whether the user is logged in or not, and which account they are logged in with.

This, however, was not the original idea, which Lou Montulli came up with in 1994. Lou’s idea was just for a server to remember the last state of a browser’s interaction with it. But that one move—a server putting a cookie inside every visiting browser—crossed a privacy threshold: a personal boundary that should have been clear from the start but was not.
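To make that concrete, here is a minimal sketch of the state-remembering move Montulli had in mind, using nothing but Python’s standard library: the server leaves a Set-Cookie header with the browser on the browser’s first visit, and the browser hands the token back in a Cookie header on every visit after that. (The cookie name, value and port here are arbitrary illustrations, not anything from the original Netscape code.)

    from http.server import BaseHTTPRequestHandler, HTTPServer
    import uuid

    class CookieDemo(BaseHTTPRequestHandler):
        def do_GET(self):
            # Whatever cookie the browser sends back arrives in the "Cookie" request header.
            returned = self.headers.get("Cookie")
            self.send_response(200)
            if returned is None:
                # First visit: the server leaves a token with the browser via "Set-Cookie".
                self.send_header("Set-Cookie", f"session={uuid.uuid4().hex}")
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(f"Cookie presented by your browser: {returned}\n".encode())

    if __name__ == "__main__":
        # Visit http://localhost:8080 twice; the second request carries the cookie back.
        HTTPServer(("localhost", 8080), CookieDemo).serve_forever()

Everything that followed, from shopping carts and logins to cross-site tracking, is elaboration on that one round trip.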

Once that boundary was crossed, and the number and variety of cookies increased, a snowball started rolling, and whatever chance we had to protect our privacy behind that boundary was lost.

Today that snowball is so large that nearly all personal agency on the Web happens within the separate silos of every website, and is compromised by the countless cookies and other tracking methods used to keep track of, and to follow, the individual.

This is why most of the great stuff you can do on the Web is by grace of Google, Apple, Facebook, Amazon, Twitter, WordPress and countless others, including those third parties.

Bruce Schneier calls this a feudal system:

Some of us have pledged our allegiance to Google: We have Gmail accounts, we use Google Calendar and Google Docs, and we have Android phones. Others have pledged allegiance to Apple: We have Macintosh laptops, iPhones, and iPads; and we let iCloud automatically synchronize and back up everything. Still others of us let Microsoft do it all. Or we buy our music and e-books from Amazon, which keeps records of what we own and allows downloading to a Kindle, computer, or phone. Some of us have pretty much abandoned e-mail altogether … for Facebook.

These vendors are becoming our feudal lords, and we are becoming their vassals.

Bruce wrote that in 2012, about the time we invested hope in Do Not Track, which was designed as a polite request one could turn on in a browser, and servers could obey.

Alas, the tracking-based online advertising business and its dependents in publishing dismissed Do Not Track with contempt.

Starting in 2013, we serfs fought back, by the hundreds of millions, blocking ads and tracking: the biggest boycott in world history. This, however, did nothing to stop what Shoshana Zuboff calls Surveillance Capitalism and Brett Frischmann and Evan Selinger call Re-engineering Humanity.

Today our poisoned minds can hardly imagine having native capacities of our own that can operate at scale across all the world’s websites and services. To have that ability would also be at odds with the methods and imperatives of personally targeted advertising, which requires cookies and other tracking methods. One of those imperatives is making money: $Trillions of it.

The business itself (aka adtech) is extremely complex and deeply corrupt: filled with fraud, botnets and malware. Most of the money spent on adtech also goes to intermediaries and not to the media you (as they like to say) consume. It’s a freaking fecosystem, and every participant’s dependence on it is extreme.

Take, for example, Vizio TVs. As Samuel Axon puts it in Ars Technica, “Vizio TV buyers are becoming the product Vizio sells, not just its customers”: Vizio’s ads, streaming, and data business grew 133 percent year over year.

Without cookies and the cookie-like trackers by which Vizio and its third parties can target customers directly, that business wouldn’t be there.

As a measure of how far this poisoning has gone, dig this: FouAnalytics PageXray says the Ars Technica story above comes to your browser with all this spyware you don’t ask for or expect when you click on that link:

Adserver Requests: 786
Tracking Requests: 532
Other Requests: 112

I’m also betting that nobody reporting for a Condé Nast publication will touch that third rail, which I have been challenging journalists to do in 139 posts, essays, columns and articles, starting in 2008.

(Please prove me wrong, @SamuelAxon—or any reporter other than Farhad Manjoo, who so far is the only journalist from a major publication I know to have bitten the robotic hand that feeds them. I also note that the hand in his case is The New York Times‘, and that it has backed off a great deal in the amount of tracking it does. Hats off for that.)

At this stage of the Web’s moral devolution, it is nearly impossible to think outside the cookie-based fecosystem. If we could, we would get back the agency we lost, and the regulations we’re writing would respect and encourage that agency as well.

But that’s not happening, in spite of all the positive privacy moves Apple, Brave, Mozilla, Consumer Reports, the EFF and others are making.

My hat’s off to all of them, but let’s face it: the poisoning is too far advanced. After fighting it for more than 22 years (dating from publishing The Cluetrain Manifesto in 1999), I’m moving on.

To here.

Historic milestones don’t always line up with large round numbers on our calendars. For example, I suggest that the 1950s ended with the assassination of JFK in late 1963, and the rise of British Rock, led by the Beatles, in 1964. I also suggest that the 1960s didn’t end until Nixon resigned, and disco took off, in 1974.

It has likewise been suggested that the 20th century actually began with the assassination of Archduke Ferdinand and the start of WWI, in 1914. While that and my other claims might be arguable, you might at least agree that there’s no need for historic shifts to align with two or more zeros on a calendar—and that in most cases they don’t.

So I’m here to suggest that the 21st century began in 2020 with the Covid-19 pandemic and the fall of Donald Trump. (And I mean that literally. Social media platforms were the man’s stage, and the whole of them dropped him, as if through a trap door, on the occasion of the storming of the U.S. Capitol by his supporters on January 6, 2021. Whether you liked that or not is beside the facticity of it.)

Things are not the same now. For example, over the coming years, we may never hug, shake hands, or comfortably sit next to strangers again.

But I’m bringing this up for another reason: I think the future we wrote about in The Cluetrain Manifesto, in World of Ends, in The Intention Economy, and in other optimistic expressions during the first two decades of the 21st Century may finally be ready to arrive.

At least that’s the feeling I get when I listen to an interview I did with Christian Einfeldt (@einfeldt) at a San Diego tech conference in April 2004—and that I just recently discovered in the Internet Archive. The interview was for a film to be called “Digital Tipping Point.” Here are its eleven parts, all just a few minutes long:

01 https://archive.org/details/e-dv038_doc_…
02 https://archive.org/details/e-dv039_doc_…
03 https://archive.org/details/e-dv038_doc_…
04 https://archive.org/details/e-dv038_doc_…
05 https://archive.org/details/e-dv038_doc_…
06 https://archive.org/details/e-dv038_doc_…
07 https://archive.org/details/e-dv038_doc_…
08 https://archive.org/details/e-dv038_doc_…
09 https://archive.org/details/e-dv038_doc_…
10 https://archive.org/details/e-dv039_doc_…
11 https://archive.org/details/e-dv039_doc_…

The title is a riff on Malcolm Gladwell‘s book The Tipping Point, which came out in 2000, same year as The Cluetrain Manifesto. The tipping point I sensed four years later was, I now believe, a foreshadow of now, and only suggested by the successes of the open source movement and independent personal publishing in the form of blogs, both of which I was high on at the time.

What followed in the decade after the interview was the rise of social networks, of smart mobile phones and of what we now call Big Tech. While I don’t expect those to end in 2021, I do expect that we will finally see the rise of personal agency and of constructive social movements, which I felt swelling in 2004.

Of course, I could be wrong about that. But I am sure that we are now experiencing the millennial shift we expected when civilization’s odometer rolled past 2000.

Northern Red-Tail Hawk

On Quora the question went, “If you went from an IQ of 135+ to 100, how would it feel?”

Here’s how I answered:

I went through that as a kid, and it was no fun.

In Kindergarten, my IQ score was at the top of the bell curve, and they put me in the smart kid class. By 8th grade my IQ score was down at the middle of the bell curve, my grades sucked, and my other standardized test scores (e.g. the Iowa) were terrible. So the school system shunted me from the “academic” track (aimed at college) to the “general” one (aimed at “trades”).

To the school I was a failure. Not a complete one, but enough of one for the school to give up on aiming me toward college. So, instead of sending me on to a normal high school, they wanted to send me to a “vocational-technical” school where boys learned to operate machinery and girls learned “secretarial” skills.

But in fact the school failed me, as it did countless other kids who adapted poorly to industrialized education: the same industrial system that still has people believing IQ tests are a measure of anything other than how well somebody answers a bunch of puzzle questions on a given day.

Fortunately, my parents believed in me, even though the school had given up. I also believed in myself, no matter what the school thought. Like Walt Whitman, I believed “I was never measured, and never will be measured.” Walt also gifted everyone with these perfect lines (from Song of Myself):

I know I am solid and sound.
To me the converging objects of the universe
perpetually flow.

All are written to me,
and I must get what the writing means…
I know this orbit of mine cannot be swept
by a carpenter’s compass,

I know that I am august,
I do not trouble my spirit to vindicate itself
or be understood.
I see that the elementary laws never apologize.

Whitman argued for the genius in each of us that moves in its own orbit and cannot be encompassed by industrial measures, such as standardized tests that serve an institution that would rather treat students like rats in their mazes than support the boundless appetite for knowledge with which each of us is born—and that we keep if it doesn’t get hammered out of us by normalizing systems.

It amazes me that, half a century after I escaped from compulsory schooling’s dehumanizing wringer, the system is largely unchanged. It might even be worse. (“Study says standardized testing is overwhelming nation’s public schools,” writes The Washington Post.)

To detox ourselves from belief in industrialized education, the great teacher John Taylor Gatto gives us The Seven Lesson Schoolteacher, which summarizes what he was actually paid to teach:

  1. Confusion — “Everything I teach is out of context. I teach the un-relating of everything. I teach disconnections. I teach too much: the orbiting of planets, the law of large numbers, slavery, adjectives, architectural drawing, dance, gymnasium, choral singing, assemblies, surprise guests, fire drills, computer languages, parents’ nights, staff-development days, pull-out programs, guidance with strangers my students may never see again, standardized tests, age-segregation unlike anything seen in the outside world….What do any of these things have to do with each other?”
  2. Class position — “I teach that students must stay in the class where they belong. I don’t know who decides my kids belong there but that’s not my business. The children are numbered so that if any get away they can be returned to the right class. Over the years the variety of ways children are numbered by schools has increased dramatically, until it is hard to see the human beings plainly under the weight of numbers they carry. Numbering children is a big and very profitable undertaking, though what the strategy is designed to accomplish is elusive. I don’t even know why parents would, without a fight, allow it to be done to their kids. In any case, again, that’s not my business. My job is to make them like it, being locked in together with children who bear numbers like their own.”
  3. Indifference — “I teach children not to care about anything too much, even though they want to make it appear that they do. How I do this is very subtle. I do it by demanding that they become totally involved in my lessons, jumping up and down in their seats with anticipation, competing vigorously with each other for my favor. It’s heartwarming when they do that; it impresses everyone, even me. When I’m at my best I plan lessons very carefully in order to produce this show of enthusiasm. But when the bell rings I insist that they stop whatever it is that we’ve been working on and proceed quickly to the next work station. They must turn on and off like a light switch. Nothing important is ever finished in my class, nor in any other class I know of. Students never have a complete experience except on the installment plan. Indeed, the lesson of the bells is that no work is worth finishing, so why care too deeply about anything?
  4. Emotional dependency — “By stars and red checks, smiles and frowns, prizes, honors and disgraces I teach kids to surrender their will to the predestined chain of command. Rights may be granted or withheld by any authority without appeal, because rights do not exist inside a school — not even the right of free speech, as the Supreme Court has ruled — unless school authorities say they do. As a schoolteacher, I intervene in many personal decisions, issuing a pass for those I deem legitimate, or initiating a disciplinary confrontation for behavior that threatens my control. Individuality is constantly trying to assert itself among children and teenagers, so my judgments come thick and fast. Individuality is a contradiction of class theory, a curse to all systems of classification.”
  5. Intellectual dependency — “Good people wait for a teacher to tell them what to do. It is the most important lesson, that we must wait for other people, better trained than ourselves, to make the meanings of our lives. The expert makes all the important choices; only I, the teacher, can determine what you must study, or rather, only the people who pay me can make those decisions which I then enforce… This power to control what children will think lets me separate successful students from failures very easily.
  6. Provisional self-esteem — “Our world wouldn’t survive a flood of confident people very long, so I teach that your self-respect should depend on expert opinion. My kids are constantly evaluated and judged. A monthly report, impressive in its precision, is sent into students’ homes to signal approval or to mark exactly, down to a single percentage point, how dissatisfied with their children parents should be. The ecology of “good” schooling depends upon perpetuating dissatisfaction just as much as the commercial economy depends on the same fertilizer.”
  7. No place to hide — “I teach children they are always watched, that each is under constant surveillance by myself and my colleagues. There are no private spaces for children, there is no private time. Class change lasts three hundred seconds to keep promiscuous fraternization at low levels. Students are encouraged to tattle on each other or even to tattle on their own parents. Of course, I encourage parents to file their own child’s waywardness too. A family trained to snitch on itself isn’t likely to conceal any dangerous secrets. I assign a type of extended schooling called “homework,” so that the effect of surveillance, if not that surveillance itself, travels into private households, where students might otherwise use free time to learn something unauthorized from a father or mother, by exploration, or by apprenticing to some wise person in the neighborhood. Disloyalty to the idea of schooling is a Devil always ready to find work for idle hands. The meaning of constant surveillance and denial of privacy is that no one can be trusted, that privacy is not legitimate.”

Gatto won multiple teaching awards because he refused to teach any of those lessons. I succeeded in life by refusing to learn them as well.

All of us can succeed by forgetting those seven lessons—especially the one teaching that your own intelligence can be measured by anything other than what you do with it.

You are not a number. You are a person like no other. Be that, and refuse to contain your soul inside any institutional framework.

More Whitman:

Long enough have you dreamed contemptible dreams.
Now I wash the gum from your eyes.
You must habit yourself to the dazzle of the light and of every moment of your life.

Long have you timidly waded,
holding a plank by the shore.
Now I will you to be a bold swimmer,
To jump off in the midst of the sea, and rise again,
and nod to me and shout,
and laughingly dash with your hair.

I am the teacher of athletes.
He that by me spreads a wider breast than my own
proves the width of my own.
He most honors my style
who learns under it to destroy the teacher.

Do I contradict myself?
Very well then. I contradict myself.
I am large. I contain multitudes.

I concentrate toward them that are nigh.
I wait on the door-slab.

Who has done his day’s work
and will soonest be through with his supper?
Who wishes to walk with me?

The spotted hawk swoops by and accuses me.
He complains of my gab and my loitering.

I too am not a bit tamed. I too am untranslatable.
I sound my barbaric yawp over the roofs of the world.

Be that hawk.

For many decades, one of the landmark radio stations in Washington, DC was WMAL-AM (now re-branded WSPN), at 630 on (what in pre-digital times we called) the dial. As AM listening faded, so did WMAL, which moved its talk format to 105.9 FM in Woodbridge and its signal to a less ideal location, far out to the northwest of town.

They made the latter move because the 75 acres of land under the station’s four towers in Bethesda had become far more valuable than the signal. So, like many other station owners with valuable real estate under legacy transmitter sites, Cumulus Media sold the old site for $74 million. Nice haul.

I’ve written at some length about this here and here in 2015, and here in 2016. I’ve also covered the whole topic of radio and its decline here and elsewhere.

I only bring the whole mess up today because it’s a five-year story that ended this morning, when WMAL’s towers were demolished. The Washington Post wrote about it here, and provided the video from which I pulled the screen-grab above. Pedestrians.org also has a much more complete video on YouTube, here. WRC-TV, channel 4, has a chopper view (best I’ve seen yet) here. Spake the Post,

When the four orange and white steel towers first soared over Bethesda in 1941, they stood in a field surrounded by sparse suburbs emerging just north of where the Capital Beltway didn’t yet exist. Reaching 400 feet, they beamed the voices of WMAL 630 AM talk radio across the nation’s capital for 77 years.

As the area grew, the 75 acres of open land surrounding the towers became a de facto park for runners, dog owners and generations of teenagers who recall sneaking smokes and beer at “field parties.”

Shortly after 9 a.m. Wednesday, the towers came down in four quick controlled explosions to make way for a new subdivision of 309 homes, taking with them a remarkably large piece of privately owned — but publicly accessible — green space. The developer, Toll Brothers, said construction is scheduled to begin in 2021.

Local radio buffs say the Washington region will lose a piece of history. Residents say they’ll lose a public play space that close-in suburbs have too little of.

After seeing those towers fall, I posted this to a private discussion among broadcast engineers (a role I once played, briefly and inexpertly, many years ago):

It’s like watching a public execution.

I’m sure that’s how many of us who have spent our lives looking at and maintaining these things feel at a sight like this.

It doesn’t matter that the AM band is a century old, and that nearly all listening today is to other media. We know how these towers make waves that spread like ripples across the land and echo off invisible mirrors in the night sky. We know from experience how the inverse square law works, how nulls and lobes are formed, how oceans and prairie soils make small signals large and how rocky mountains and crappy soils are like mud to a strong signal’s wheels. We know how and why it is good to know these things, because we can see an invisible world where other people only hear songs, talk and noise.

We also know that, in time, all these towers are going away, or repurposed to hold up antennas sending and receiving radio frequencies better suited for carrying data.

We know that everything ends, and in that respect AM radio is no different than any other medium.

What matters isn’t whether it ends with a bang (such as here with WMAL’s classic towers) or with a whimper (as with so many other stations going dark or shrinking away in lesser facilities). It’s that there’s still some good work and fun in the time this old friend still has left.

In the library of Earth’s history, there are missing books, and within books there are missing chapters, written in rock that is now gone. John Wesley Powell recorded the greatest example of gone rock in 1869, on his expedition by boat through the Grand Canyon. Floating down the Colorado River, he saw the canyon’s mile-thick layers of reddish sedimentary rock resting on a basement of gray non-sedimentary rock, the layers of which were cocked at an angle from the flatness of every layer above. Observing this, he correctly assumed that the upper layers did not continue from the bottom one, because time had clearly passed between the time when the basement rock was beveled flat, against its own grain, and when the floors of rock above it were successively laid down. He didn’t know how much time had passed between basement and flooring, and could hardly guess. The answer turned out to be more than a billion years. The walls of the Grand Canyon say nothing about what happened during that time. Geology calls that nothing an unconformity.

In the decades since Powell made his notes, the same gap has been found all over the world, and is now called the Great Unconformity. Because of that unconformity, geology knows close to nothing about what happened in the world through stretches of time up to 1.6 billion years long.

All of those stretches end abruptly with the Cambrian Explosion, which began about 541 million years ago, when the Cambrian period arrived, and with it an amplitude of history, written in stone.

Many theories attempt to explain what erased such a large span of Earth’s history, but the prevailing guess is perhaps best expressed in “Neoproterozoic glacial origin of the Great Unconformity”, published on the last day of 2018 by nine geologists writing for the National Academy of Sciences. Put simply, they blame snow. Lots of it: enough to turn the planet into one giant snowball, informally called Snowball Earth. A more accurate name for this time would be Glacierball Earth, because glaciers, all formed from accumulated snow, apparently covered most or all of Earth’s land during the Great Unconformity—and most or all of the seas as well. Every continent was a Greenland or an Antarctica.

The relevant fact about glaciers is that they don’t sit still. They push immensities of accumulated ice down on landscapes and then spread sideways, pulverizing and scraping against adjacent landscapes, bulldozing their ways seaward through mountains and across hills and plains. In this manner, glaciers scraped a vastness of geological history off the Earth’s continents and sideways into ocean basins, where plate tectonics could hide the evidence. (A fact little known outside of geology is that nearly all the world’s ocean floors are young: born in spreading centers and killed by subduction under continents or piled up as debris on continental edges here and there. Example: the Bay Area of California is ocean floor that wasn’t subducted into a trench.) As a result, the stories of Earth’s missing history are partly told by younger rock that remembers only that a layer of moving ice had erased pretty much everything other than a signature on its work.

I bring all this up because I see something analogous to Glacierball Earth happening right now, right here, across our new worldwide digital sphere. A snowstorm of bits is falling on the virtual surface of our virtual sphere, which itself is made of bits even more provisional and temporary than the glaciers that once covered the physical Earth. Nearly all of this digital storm, vivid and present at every moment, is doomed to vanish, because it lacks even a glacier’s talent for accumulation.

The World Wide Web is also the World Wide Whiteboard.

Think about it: there is nothing about a bit that lends itself to persistence, other than the media it is written on. Form follows function; and most digital functions, even those we call “storage”, are temporary. The largest commercial facilities for storing digital goods are what we fittingly call “clouds”. By design, these are built to remember no more of what they once contained than does an empty closet. Stop paying for cloud storage, and away goes your stuff, leaving no fossil imprints. Old hard drives, CDs and DVDs might persist in landfills, but people in the far future may look at a CD or a DVD the way a geologist today looks at Cambrian zircons: as hints of digital activities that may have happened during an interval about which otherwise nothing is known. If those fossils speak of what’s happening now at all, it will be of a self-erasing Digital Earth that was born in the late 20th century.

This theory actually comes from my wife, who has long claimed that future historians will look on our digital age as an invisible one, because it sucks so royally at archiving itself.

Credit where due: the Internet Archive is doing its best to make sure that some stuff will survive. But what will keep that archive alive, when all the media we have for recalling bits—from spinning platters to solid state memory—are volatile by nature?

My own future unconformity is announced by the stack of books on my desk, propping up the laptop on which I am writing. Two of those books are self-published compilations of essays I wrote about technology in the mid-1980s, mostly for publications that are long gone. The originals are on floppy disks that can be read only by PCs and apps of that time, some of which are buried in lower strata of boxes in my garage. I just found a floppy with some of those essays. (It’s the one with a blue edge in the wood case near the right end of the photo above.) If those still retain readable files, I am sure there are ways to recover at least the raw ASCII text. But I’m still betting the paper copies of the books under this laptop will live a lot longer than will these floppies or my mothballed PCs, all of which are likely bricked by decades of un-use.

As for other media, the prospect isn’t any better.

At the base of my video collection is a stratum of VHS videotapes, atop of which are strata of MiniDV and Hi8 tapes, and then one of digital stuff burned onto CDs and stored in hard drives, most of which have been disconnected for years. Some of those drives have interfaces and connections (e.g. FireWire) no longer supported by any computers being made today. Although I’ve saved machines to play all of them, none I’ve checked still work. One choked to death on a CD I stuck in it. That was a failure that stopped me from making Christmas presents of family memories recorded on old tapes and DVDs. I meant to renew the project sometime before the following Christmas, but that didn’t happen. Next Christmas? The one after that? I still hope, but odds are against it.

Then there are my parents’ 8mm and 16mm movies filmed between the 1930s and the 1960s. In 1989, my sister and I had all of those copied over to VHS tape. We then recorded our mother annotating the tapes onto companion cassette tapes while we all watched the show. I still have the original film in a box somewhere, but I haven’t found any of the tapes. Mom died in 2003 at age 90, and her whole generation is now gone.

The base stratum of my audio past is a few dozen open reel tapes recorded in the 1950s and 1960s. Above those are cassette and micro-cassette tapes, plus many Sony MiniDiscs recorded in ATRAC, a proprietary compression algorithm now used by nobody, including Sony. Although I do have ways to play some (but not all) of those, I’m cautious about converting any of them to digital formats (Ogg, MPEG or whatever), because all digital storage media are likely to become obsolete, dead, or both—as will formats, algorithms and codecs. Already I have dozens of dead external hard drives in boxes and drawers. And, since no commercial cloud service is committed to digital preservation in the absence of payment, my files saved in clouds are sure to be flushed once neither my heirs nor I are paying for their preservation. I assume my old open reel and cassette tapes are okay, but I can’t tell right now because both my Sony TCWE-475 cassette deck (high end in its day) and my Akai 202D-SS open reel deck (a quadrophonic model from the early ’70s) are in need of work, since some of their rubber parts have rotted.

Same goes for my photographs. My printed photos—countless thousands of them dating from the late 1800s to 2004—are stored in boxes and albums of photos, negatives and Kodak slide carousels. My digital photos are spread across a mess of duplicated back-up drives totaling many terabytes, plus a handful of CDs. About 60,000 photos are exposed to the world on Flickr’s cloud, where I maintain two Pro accounts (here and here) for $50/year apiece. More are in the Berkman Klein Center’s pro account (here) and Linux Journal’s (here). I doubt any of those will survive after those entities stop getting paid their yearly fees. SmugMug, which now owns Flickr, has said some encouraging things about photos such as mine, all of which are Creative Commons-licensed to encourage re-use. But, as Geoffrey West tells us, companies are mortal. All of them die.

As for my digital works as a whole (or anybody’s), there is great promise in what the Internet Archive and Wikimedia Commons do, but there is no guarantee that either will last for decades more, much less for centuries or millennia. And neither is able to archive everything that matters (much as they might like to).

It should also be sobering to recognize that nobody truly “owns” a domain on the internet. All those “sites” with “domains” at “locations” and “addresses” are rented. We pay a sum to a registrar for the right to use a domain name for a finite period of time. There are no permanent domain names or IP addresses. In the digital world, finitude rules.

So the historic progression I see, and try to illustrate in the photo at the top of this post, is from hard physical records through digital ones we hold for ourselves, and then up into clouds… that go away. Everything digital is snow falling and disappearing on the waters of time.

Will there ever be a way to save for the very long term what we ironically call our digital “assets?” Or is all of it doomed by its own nature to disappear, leaving little more evidence of its passage than a Digital Unconformity, when everything was forgotten?

I can’t think of any technical questions more serious than those two.


The original version of this post appeared in the March 2019 issue of Linux Journal.

In The Web and the New Reality, which I posted on December 1, 1995 (and again a few days ago), I called that date “Reality 1.995.12,” and made twelve predictions. In this post I’ll visit how those have played out over the quarter century since then.

1. As more customers come into direct contact with suppliers, markets for suppliers will change from target populations to conversations.

Well, both. While there are many more direct conversations between demand and supply than there were in the pre-Internet world, we are more targeted than ever, now personally and not just as populations. This has turned into a gigantic problem that many of us have been talking about for a decade or more, to sadly insufficient effect.

2. Travel, ticket, advertising and PR agencies will all find new ways to add value, or they will be subtracted from market relationships that no longer require them.

I don’t recall why I grouped those four things, so let’s break them apart:

  • Little travel agencies went to hell. Giant Net-based ones thrived. See here.
  • Tickets are now almost all digital. I don’t know what a modern ticket agency does, if any exist.
  • Advertising agencies went digital and became malignant. I’ve written about that a lot, here. All of those writings could be compressed to a pull quote from Separating Advertising’s Wheat and Chaff: “Madison Avenue fell asleep, direct response marketing ate its brain, and it woke up as an alien replica of itself.”
  • PR agencies, far as I know (and I haven’t looked very far) are about the same.

3. Within companies, marketing communications will change from peripheral activities to core competencies. New media will flourish on the Web, and old media will learn to live with the Web and take advantage of it.

If we count the ascendance of the Chief Marketing Officer (CMO) as a success, this was a bulls-eye. However, most CMOs are all about “digital,” by which they generally mean direct response marketing. And if you didn’t skip to this item you know what I think about that.

4. Retail space will complement cyber space. Customer and technical service will change dramatically, as 800 numbers yield to URLs and hard copy documents yield to soft copy versions of the same thing… but in browsable, searchable forms.

Yep. All that happened.

5. Shipping services of all kinds will bloom. So will fulfillment services. So will ticket and entertainment sales services.

That too.

The web’s search engines will become the new yellow pages for the whole world. Your fingers will still do the walking, but they won’t get stained with ink. Same goes for the white pages. Also the blue ones.

And that.

6. The scope of the first person plural will enlarge to include the whole world. “We” may mean everybody on the globe, or any coherent group that inhabits it, regardless of location. Each of us will swing from group to group like monkeys through trees.

Oh yeah.

7. National borders will change from barricades and toll booths into speed bumps and welcome mats.

Mixed success. When I wrote this, nearly all Internet access was through telcos, so getting online away from home still required a local phone number. That’s pretty much gone. But the Internet itself is being broken into pieces. See here.

8. The game will be over for what teacher John Taylor Gatto labels “the narcotic we call television.” Also for the industrial relic of compulsory education. Both will be as dead as the mainframe business. In other words: still trucking, but not as the anchoring norms they used to be.

That hasn’t happened; but self-education, home-schooling and online study of all kinds are thriving.

9. Big Business will become as anachronistic as Big Government, because institutional mass will lose leverage without losing inertia.

Well, this happened. So, no.

10. Domination will fail where partnering succeeds, simply because partners with positive sums will combine to outproduce winners and losers with zero sums.

Here’s what I meant by that.
I think more has happened than hasn’t. But, visiting the particulars requires a whole ‘nuther post.

11. Right will make might.

Nope. And this one might never happen. Hey, in 25 years one tends to become wiser.

12. And might will be mighty different.

That’s true, and in some ways that depresses me.

So, on the whole, not bad.

I posted this essay in my own pre-blog, Reality 2.0, on December 1, 1995. I think maybe now, in this long moment after we’ve hit a pause button on our future, we can start working on making good the unfulfilled promises that first gleamed in our future a quarter century ago.



Reality 2.0

The import of the Internet is so obvious and extreme that it actually defies valuation: witness the stock market, which values Netscape so far above that company’s real assets and earnings that its P/E ratio verges on the infinite.

Whatever we’re driving toward, it is very different from the anchoring certainties that have grounded us for generations, if not for the duration of our species. It seems we are on the cusp of a new and radically different reality. Let’s call it Reality 2.0.

The label has a millennial quality, and a technical one as well. If Reality 2.0 is Reality 2.000, this month we’re in Reality 1.995.12.

With only a few revisions left before Reality 2.0 arrives, we’re in a good position to start seeing what awaits. Here are just a few of the things this writer is starting to see…

  1. As more customers come into direct contact with suppliers, markets for suppliers will change from target populations to conversations.
  2. Travel, ticket, advertising and PR agencies will all find new ways to add value, or they will be subtracted from market relationships that no longer require them.
  3. Within companies, marketing communications will change from peripheral activities to core competencies. New media will flourish on the Web, and old media will learn to live with the Web and take advantage of it.
  4. Retail space will complement cyber space. Customer and technical service will change dramatically, as 800 numbers yield to URLs and hard copy documents yield to soft copy versions of the same thing… but in browsable, searchable forms.
  5. Shipping services of all kinds will bloom. So will fulfillment services. So will ticket and entertainment sales services.
  6. The web’s search engines will become the new yellow pages for the whole world. Your fingers will still do the walking, but they won’t get stained with ink. Same goes for the white pages. Also the blue ones.
  7. The scope of the first person plural will enlarge to include the whole world. “We” may mean everybody on the globe, or any coherent group that inhabits it, regardless of location. Each of us will swing from group to group like monkeys through trees.
  8. National borders will change from barricades and toll booths into speed bumps and welcome mats.
  9. The game will be over for what teacher John Taylor Gatto labels “the narcotic we call television.” Also for the industrial relic of compulsory education. Both will be as dead as the mainframe business. In other words: still trucking, but not as the anchoring norms they used to be.
  10. Big Business will become as anachronistic as Big Government, because institutional mass will lose leverage without losing inertia. Domination will fail where partnering succeeds, simply because partners with positive sums will combine to outproduce winners and losers with zero sums.
  11. Right will make might.
  12. And might will be mighty different.

Polyopoly

The Web is the board for a new game Phil Salin called “Polyopoly.” As Phil described it, Polyopoly is the opposite of Monopoly. The idea is not to win a fight over scarce real estate, but to create a farmer’s market for the boundless fruits of the human mind.

It’s too bad Phil didn’t live to see the web become what he (before anyone, I believe) hoped to create with AMIX: “the first efficient marketplace for information.” The result of such a marketplace, Phil said, would be polyopoly.

In Monopoly, what mattered were the three Ls of real estate: “location, location and location.”

On the web, location means almost squat.

What matters on the web are the three Cs: content, connections and convenience. These are what make your home page a door the world beats a path to when it looks for the better mouse trap that only you sell. They give your webfront estate its real value.

If commercial interests have their way with the Web, we can also add a fourth C: cost. But how high can costs go in a polyopolistic economy? Not very. Because polyopoly creates…

An economy of abundance

The goods of Polyopoly and Monopoly are as different as love and lug nuts. Information is made by minds, not factories; and it tends to make itself abundant, not scarce. Moreover, scarce information tends to be worthless information.

Information may be bankable, but traditional banking, which secures and contains scarce commodities (or their numerical representations) does not respect the nature of information.

Because information abhors scarcity. It loves to reproduce, to travel, to multiply. Its natural habitats are wires and airwaves and disks and CDs and forums and books and magazines and web pages and hot links and chats over cappuccinos at Starbucks. This nature lends itself to polyopoly.

Polyopoly’s rules are hard to figure because the economy we are building with it is still new, and our vocabulary for describing it is sparse.

This is why we march into the Information Age hobbled by industrial metaphors. The “information highway” is one example. Here we use the language of freight forwarding to describe the movement of music, love, gossip, jokes, ideas and other communicable forms of knowledge that grow and change as they move from mind to mind.

We can at least say that knowledge, even in its communicable forms, is not reducible to data. Nor is the stuff we call “intellectual property.” A song and a bank account do not propagate the same ways. But we are inclined to say they do (and should), because we describe both with the same industrial terms.

All of which is why there is no more important work in this new economy than coining the new terms we use to describe it.

The Age of Enlightenment finally arrives

The best place to start looking for help is at the dawn of the Industrial Age. Because this was when the Age of Reason began. Nobody knew more about the polyopoly game — or played it better — than those champions of reason from whose thinking our modern republics are derived: Thomas Paine, Thomas Jefferson and Benjamin Franklin.

As Jon Katz says in “The Age of Paine” (Wired, May 1995), Thomas Paine was the “moral father of the Internet.” Paine said “my country is the world,” and sought as little compensation as possible for his work, because he wanted it to be inexpensive and widely read. Paine’s thinking still shapes the politics of the U.S., England and France, all of which he called home.

Thomas Jefferson wrote the first rule of Polyopoly: “He who receives an idea from me receives instruction himself without lessening mine; as he who lights his taper at mine, receives light without darkening me.”

He also left a live bomb for modern intellectual property law: “Inventions then cannot, in nature, be a subject of property.” The best look at the burning fuse is John Perry Barlow’s excellent essay “The Economy of Ideas,” in the March 1994 issue of Wired. (I see that Jon Katz repeats it in his paean to Paine. Hey, if someone puts it to song, who gets the rights?)

If Paine was the moral father of the Internet, Ben Franklin’s paternity is apparent in Silicon Valley. Today he’d fit right in, inventing hot products, surfing the Web and spreading his wit and wisdom like a Johnny Cyberseed. Hell, he even has the right haircut.

Franklin left school at 10 and was barely 15 when he ran his brother’s newspaper, writing most of its content and getting quoted all over Boston. He was a self-taught scientist and inventor while still working as a writer and publisher. He also found time to do pioneering work on electricity, organize America’s postal service, invent a heap of handy products and serve as a politician and diplomat.

Franklin’s biggest obsession was time. He scheduled and planned constantly. He even wrote his famous epitaph when he was 22, six decades before he died. “The work shall not be lost,” it reads, “for it will (as he believed) appear once more in a new and more elegant edition, revised and edited by the author.”

One feels the ghost of Franklin today, editing the web.

Time to subtract the garbage

Combine Jefferson and Franklin and you get the two magnetic poles that tug at every polyopoly player: information that only gets more abundant, and time that only gets more scarce.

As Alain Couder of Groupe Bull puts it, “we treat time as a constant in all these formulas — revolutions per minute, instructions per second — yet we experience time as something that constantly decreases.”

After all, we’re born with an unknown sum of time, and we need to spend it all before we die. The notion of “saving” it is absurd. Time can only be spent.

So: to play Polyopoly well, we need to waste as little time as possible. This is not easy in a world where the sum of information verges on the infinite.

Which is why I think Esther Dyson might be our best polyopoly player.

“There’s too much noise out there anyway,” she says in ‘Esther Dyson on DaveNet‘ (12/1/94). “The new wave is not value added, it’s garbage-subtracted.”

Here’s a measure of how much garbage she subtracts from her own life: her apartment doesn’t even have a phone.

Can she play this game, or what?

So what’s left?

I wouldn’t bother to ask Esther if she watches television, or listens to the radio. I wouldn’t ask my wife, either. To her, television is exactly what Fred Allen called it forty years ago: “chewing gum for the eyes.” Ours heats up only for natural disasters and San Jose Sharks games.

Dean Landsman, a sharp media observer from the broadcast industry, tells me that John Grisham books are cutting into time that readers would otherwise spend watching television. And that’s just the beginning of a tide that will swell as every medium’s clients weigh more carefully what they do with their time.

Which is why it won’t be long before those clients wad up their television time and stick it under their computer. “Media will eat media,” Dean says.

The computer is looking a lot hungrier than the rest of the devices out there. Next to connected computing, television is AM radio.

Fasten your seat belts.

Web of the free, home of the Huns

Think of the Industrial world — the world of Big Business and Big Government — as a modern Roman Empire.

Now think of Bill Gates as Attila the Hun.

Because that’s exactly how Bill looks to the Romans who still see the web, and everything else in the world, as a monopoly board. No wonder Bill doesn’t have a senator in his pocket (as Mark Stahlman told us in ‘Off to the Slaughter House,’ DaveNet, 3/14/94).

Sadly for the Romans, their empire is inhabited almost entirely by Huns, all working away on their PCs. Most of those Huns don’t have a problem with Bill. After all, Bill does a fine job of empowering his people, and they keep electing him with their checkbooks, credit cards and purchase orders.

Which is why, when they go forth to tame the web, these tough-talking Captains of Industry and Leaders of Government look like animated mannequins in Armani Suits: clothes with no emperor. Their content is emulation. They drone about serving customers and building architectures and setting standards and being open and competing on level playing fields. But their game is still control, no matter what else they call it.

Bill may be our emperor, but ruling Huns is not the same as ruling Romans. You have to be naked as a fetus and nearly as innocent. Because polyopoly does not reward the dark tricks that used to work for industry, government and organized crime. Those tricks worked in a world where darkness had leverage, where you could fool some of the people some of the time, and that was enough.

But polyopoly is a positive-sum game. Its goods are not produced by huge industries that control the world, but by smart industries that enable the world’s inhabitants. Like the PC business that thrives on it, information grows up from individuals, not down from institutions. Its economy thrives on abundance rather than scarcity. Success goes to enablers, not controllers. And you don’t enable people by fooling them. Or by manipulating them. Or by muscling them.

In fact, you don’t even play to win. As Craig Burton of The Burton Group puts it, “the goal isn’t win/win, it’s play/play.”

This is why Bill does not “control” his Huns the way IBM controlled its Romans. Microsoft plays by winning support, where IBM won by dominating the play. Just because Microsoft now holds a controlling position does not mean that a controlling mentality got them there. What I’ve seen from IBM and Apple looks far more Monopoly-minded and controlling than anything I’ve seen from Microsoft.

Does this mean that Bill’s manners aren’t a bit Roman at times? No. Just that the support Microsoft enjoys is a lot more voluntary on the part of its customers, users and partners. It also means that Microsoft has succeeded by playing Polyopoly extremely well. When it tries to play Monopoly instead, the Huns don’t like it. Bill doesn’t need the Feds to tell him when that happens. The Huns tell him soon enough.

A market is a conversation

No matter how Roman Bill’s fantasies might become, he knows his position is hardly more substantial than a conversation. In fact, it IS a conversation.

I would bet that Microsoft is engaged in more conversations, more of the time, with more customers and partners, than any other company in the world. Like or hate their work, the company connects. I submit that this, as much as anything else, accounts for its success.

In the Industrial Age, a market was a target population. Goods rolled down a “value chain” that worked like a conveyor belt. Raw materials rolled into one end and finished products rolled out the other. Customers bought the product or didn’t, and customer feedback was limited mostly to the money they spent.

To encourage customer spending, “messages” were “targeted” at populations, through advertising, PR and other activities. The main purpose of these one-way communications was to stimulate sales. That model is obsolete. What works best today is what Normann & Ramirez (Harvard Business Review, July/August 1993) call a “value constellation” of relationships that include customers, partners, suppliers, resellers, consultants, contractors and all kinds of people.

The Web is the star field within which constellations of companies, products and markets gather themselves. And what binds them together, in each case, are conversations.

How it all adds up

What we’re creating here is a new economy — an information economy.

Behind the marble columns of big business and big government, this new economy stands in the lobby like a big black slab. The primates who work behind those columns don’t know what this thing is, but they do know it’s important and good to own. The problem is, they can’t own it. Nobody can. Because it defies the core value in all economies based on physical goods: scarcity.

Scarcity ruled the stone hearts and metal souls of every zero-sum value system that ever worked — usually by producing equal quantities of gold and gore. And for dozens of millennia, we suffered with it. If Tribe A crushed Tribe B, it was too bad for Tribe B. Victors got the spoils.

This win/lose model has been in decline for some time. Victors who used to get spoils now just get responsibilities. Cooperation and partnership are now more productive than competition and domination. Why bomb your enemy when you can get him on the phone and do business with him? Why take sides when the members of “us” and “them” constantly change?

The hard evidence is starting to come in. A recent Wharton Impact report said, “Firms which specified their objectives as ‘beating our competitors’ or ‘gaining market share’ earned substantially lower profits over the period.” We’re reading stories about women-owned businesses doing better, on the whole, because women are better at communicating and less inclined to waste energy by playing sports and war games in their marketplaces.

From the customer’s perspective, what we call “competition” is really a form of cooperation that produces abundant choices. Markets are created by addition and multiplication, not just by subtraction and division.

In my old Mac IIci, I can see chips and components from at least 11 different companies and 8 different countries. Is this evidence of war among Apple’s suppliers? Do component vendors succeed by killing each other and limiting choices for their customers? Did Apple’s engineers say, “Gee, let’s help Hitachi kill Philips on this one?” Were they cheering for one “side” or another? The answer should be obvious.

But it isn’t, for two reasons. One is that the “Dominator Model,” as anthropologist (and holocaust survivor) Riane Eisler calls it, has been around for 20,000 years, and until recently has reliably produced spoils for victors. The other is that conflict always makes great copy. To see how seductive conflict-based thinking is, try to find a hot business story that isn’t filled with sports and war metaphors. It isn’t easy.

Bound by the language of conflict, most of us still believe that free enterprise runs on competition between “sides” driven by urges to dominate, and that the interests of those “sides” are naturally opposed.

To get to the truth here, just ask this: which has produced more — the U.S. vs. Japan, or the U.S. + Japan? One produced World War II and a lot of bad news. The other produced countless marvels — from cars to consumer electronics — on which the whole world depends.

Now ask this: which has produced more — Apple vs. Microsoft or Apple + Microsoft? One profited nobody but the lawyers, and the other gave us personal computing as we know it today.

The Plus Paradigm

What brings us to Reality 2.0 is the Plus Paradigm.

The Plus Paradigm says that our world is a positive construction, and that the best games produce positive sums for everybody. It recognizes the power of information and the value of abundance. (Think about it: the best information may have the highest power to abound, and its value may vary as the inverse of its scarcity.)

Over the last several years, mostly through discussions with client companies that are struggling with changes that invalidate long-held assumptions, I have built a table of old (Reality 1.0) vs. new (Reality 2.0) paradigms. The difference between these two realities, one client remarked, is that the paradigm on the right is starting to work better than the paradigm on the left.

 

Paradigm | Reality 1.0 | Reality 2.0
Means to ends | Domination | Partnership
Cause of progress | Competition | Collaboration
Center of interest | Personal | Social
Concept of systems | Closed | Open
Dynamic | Win/Lose | Play/Play
Roles | Victor/Victim | Partner/Ally
Primary goods | Capital | Information
Source of leverage | Monopoly | Polyopoly
Organization | Hierarchy | Flexiarchy
Roles | Victor/Victim | Server/Client
Scope of self-interest | Self/Nation | Self/World
Source of power | Might | Right
Source of value | Scarcity | Abundance
Stage of growth | Child (selfish) | Adult (social)
Reference valuables | Metal, Money | Life, Time
Purpose of boundaries | Protection | Limitation

Changes across the paradigms show up as positive “reality shifts.” The shift is from OR logic to AND logic, from Vs. to +:

 

Reality 1.0 | Reality 2.0
Man vs nature | Man + nature
Labor vs management | Labor + management
Public vs private | Public + private
Men vs women | Men + women
Us vs them | Us + them
Majority vs minority | Majority + minority
Party vs party | Party + party
Urban vs rural | Urban + rural
Black vs white | Black + white
Business vs govt. | Business + govt.


For more about this whole way of thinking, see Bernie DeKoven’s ideas about “the ME/WE” at his “virtual playground.”

This may sound sappy, but information works like love: when you give it away, you still get to keep it. And when you give it back, it grows.

Which has always been the case. But in Reality 2.0, it should become a lot more obvious.

Journalism’s biggest problem (as I’ve said before) is what it’s best at: telling stories. That’s what Thomas B. Edsall (of Columbia and The New York Times) does in Trump’s Digital Advantage Is Freaking Out Democratic Strategists, published in today’s New York Times. He tells a story. Or, in the favored parlance of our time, a narrative, about what he sees as Republicans’ superior use of modern methods for persuading voters:

Experts in the explosively growing field of political digital technologies have developed an innovative terminology to describe what they do — a lexicon that is virtually incomprehensible to ordinary voters. This language provides an inkling of the extraordinarily arcane universe politics has entered:

geofencing
mass personalization
dark patterns
identity resolution technologies
dynamic prospecting
geotargeting strategies
location analytics
geo-behavioural segment
political data cloud
automatic content recognition
dynamic creative optimization

Geofencing and other emerging digital technologies derive from microtargeting marketing initiatives that use consumer and other demographic data to identify the interests of specific voters or very small groups of like-minded individuals to influence their thoughts or actions.

In fact the “arcane universe” he’s talking about is the direct marketing playbook, which was born offline as the junk mail business. In that business, tracking individuals and bothering them personally is a fine and fully rationalized practice. And let’s face it: political campaigning has always wanted to get personal. It’s why we have mass mailings, mass callings, mass textings and the rest of it—all to personal addresses, numbers and faces.
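
Strip away the buzzphrasing and “geofencing” is just a point-in-radius (or point-in-polygon) test against a device’s reported location: if your phone pings from inside the fence, your advertising ID goes into a segment. Here is a minimal sketch in Python; the venue coordinates, radius and advertising IDs are all made up for illustration, not anyone’s actual system:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

# Hypothetical fence: a 2 km radius around some rally venue (made-up coordinates).
FENCE_CENTER = (42.3601, -71.0589)
FENCE_RADIUS_KM = 2.0

# Hypothetical location pings: (advertising_id, latitude, longitude).
pings = [
    ("ad-id-001", 42.3614, -71.0571),   # a few blocks from the venue
    ("ad-id-002", 42.4410, -71.2290),   # miles away
]

# Any device seen inside the fence goes into a targeting segment.
segment = {
    ad_id
    for ad_id, lat, lon in pings
    if haversine_km(lat, lon, *FENCE_CENTER) <= FENCE_RADIUS_KM
}
print(segment)  # {'ad-id-001'}
```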

Coincidence: I just got this:

There is nothing new here other than (at the moment) the Trump team doing it better than any Democrat. (Except maybe Bernie.) Obama’s team was better at it in ’08 and ’12. Trump’s was better at it in ’16 and is better again in ’20.*

However, debating which candidates do the best marketing misdirects our attention away from the destruction of personal privacy by constant tracking of our asses online—including tracking of asses by politicians. This, I submit, is a bigger and badder issue than which politicians do the best direct marketing. It may even be bigger than who gets elected to what in November.

As issues go, personal privacy is soul-deep. Who gets elected, and how, are not.

As I put it here,

Surveillance of people is now the norm for nearly every website and app that harvests personal data for use by machines. Privacy, as we’ve understood it in the physical world since the invention of the loincloth and the door latch, doesn’t yet exist. Instead, all we have are the “privacy policies” of corporate entities participating in the data extraction marketplace, plus terms and conditions they compel us to sign, either of which they can change on a whim. Most of the time our only choice is to deny ourselves the convenience of these companies’ services or live our lives offline.

Worse is that these are proffered on the Taylorist model, meaning mass-produced.

There is a natural temptation to want to fix this with policy. This is a mistake for two reasons:

  1. Policy-makers are themselves part of the problem. Hell, most of their election campaigns are built on direct marketing. And law enforcement (which carries out certain forms of policy) has always regarded personal privacy as a problem to overcome rather than a solution to anything. Example.
  2. Policy-makers often screw things up. Exhibit A: the EU’s GDPR, which has done more to clutter the Web with insincere and misleading cookie notices than it has to advance personal privacy tech online. (I’ve written about this a lot. Here’s one sample.)

We need tech of our own. Terms and policies of our own. In the physical world, we have privacy tech in the forms of clothing, shelter, doors, locks and window shades. We have policies in the form of manners, courtesies, and respect for privacy signals we send to each other. We lack all of that online. Until we invent it, the most we’ll do to achieve real privacy online is talk about it, and inveigh for politicians to solve it for us. Which they won’t.

If you’re interested in solving personal privacy at the personal level, take a look at Customer Commons. If you want to join our efforts there, talk to me.

_____________
*The Trump campaign also has the enormous benefit of an already-chosen Republican ticket. The Democrats have a mess of candidates and a split in the party between young and old, socialists and moderates, and no candidate as interesting as is Trump. (Also, I’m not Joyce.)

At this point, it’s no contest. Trump is the biggest character in the biggest story of our time. (I explain this in Where Journalism Fails.) And he’s on a glide path to winning in November, just as I said he was in 2016.

Here’s the popover that greets visitors on arrival at Rolling Stone‘s website:

Our Privacy Policy has been revised as of January 1, 2020. This policy outlines how we use your information. By using our site and products, you are agreeing to the policy.

That policy is supplied by Rolling Stone’s parent (PMC) and weighs more than 10,000 words. In it the word “advertising” appears 68 times. Adjectives modifying it include “targeted,” “personalized,” “tailored,” “cookie-based,” “behavioral” and “interest-based.” All of that is made possible by, among other things—

Information we collect automatically:

Device information and identifiers such as IP address; browser type and language; operating system; platform type; device type; software and hardware attributes; and unique device, advertising, and app identifiers

Internet network and device activity data such as information about files you download, domain names, landing pages, browsing activity, content or ads viewed and clicked, dates and times of access, pages viewed, forms you complete or partially complete, search terms, uploads or downloads, the URL that referred you to our Services, the web sites you visit after this web site; if you share our content to social media platforms; and other web usage activity and data logged by our web servers, whether you open an email and your interaction with email content, access times, error logs, and other similar information. See “Cookies and Other Tracking Technologies” below for more information about how we collect and use this information.

Geolocation information such as city, state and ZIP code associated with your IP address or derived through Wi-Fi triangulation; and precise geolocation information from GPS-based functionality on your mobile devices, with your permission in accordance with your mobile device settings.

The “How We Use the Information We Collect” section says they will—

Personalize your experience to Provide the Services, for example to:

  • Customize certain features of the Services,
  • Deliver relevant content and to provide you with an enhanced experience based on your activities and interests
  • Send you personalized newsletters, surveys, and information about products, services and promotions offered by us, our partners, and other organizations with which we work
  • Customize the advertising on the Services based on your activities and interests
  • Create and update inferences about you and audience segments that can be used for targeted advertising and marketing on the Services, third party services and platforms, and mobile apps
  • Create profiles about you, including adding and combining information we obtain from third parties, which may be used for analytics, marketing, and advertising
  • Conduct cross-device tracking by using information such as IP addresses and unique mobile device identifiers to identify the same unique users across multiple browsers or devices (such as smartphones or tablets), in order to save your preferences across devices and analyze usage of the Service.
  • using inferences about your preferences and interests for any and all of the above purposes
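
That “cross-device tracking” item deserves a translation. Stripped of the legal prose, it means grouping activity records by whatever identifiers your devices happen to share, so that separate browsers and phones resolve to one inferred person. A minimal, hypothetical sketch of that kind of matching (the log format, identifiers and device names are invented for illustration; real “identity resolution” presumably runs at far larger scale with probabilistic matching layered on):

```python
from collections import defaultdict

# Hypothetical activity log: (identifier_type, identifier_value, device, page).
events = [
    ("email_hash", "a1b2c3",      "laptop-chrome", "/politics"),
    ("ip",         "203.0.113.7", "laptop-chrome", "/politics"),
    ("ip",         "203.0.113.7", "phone-app",     "/music"),
    ("ad_id",      "ad-id-001",   "phone-app",     "/music"),
]

# Step 1: collect the devices seen with each identifier.
id_to_devices = defaultdict(set)
for id_type, id_value, device, _page in events:
    id_to_devices[(id_type, id_value)].add(device)

# Step 2: greedily merge overlapping device sets into "profiles"
# (a real system would need something sturdier, e.g. union-find).
profiles = []
for devices in id_to_devices.values():
    for profile in profiles:
        if profile & devices:
            profile |= devices
            break
    else:
        profiles.append(set(devices))

print(profiles)  # one profile linking 'laptop-chrome' and 'phone-app'
```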

For a look at what Rolling Stone, PMC and their third parties are up to, Privacy Badger’s browser extension “found 73 potential trackers on www.rollingstone.com” (a sketch of how to run a similar census yourself follows the list):

tagan.adlightning.com
 acdn.adnxs.com
 ib.adnxs.com
 cdn.adsafeprotected.com
 static.adsafeprotected.com
 d.agkn.com
 js.agkn.com
 c.amazon-adsystem.com
 z-na.amazon-adsystem.com
 display.apester.com
 events.apester.com
 static.apester.com
 as-sec.casalemedia.com
 ping.chartbeat.net
 static.chartbeat.com
 quantcast.mgr.consensu.org
 script.crazyegg.com
 dc8xl0ndzn2cb.cloudfront.net
cdn.digitru.st
 ad.doubleclick.net
 securepubads.g.doubleclick.net
 hbint.emxdgt.com
 connect.facebook.net
 adservice.google.com
 pagead2.googlesyndication.com
 www.googletagmanager.com
 www.gstatic.com
 static.hotjar.com
 imasdk.googleapis.com
 js-sec.indexww.com
 load.instinctiveads.com
 ssl.p.jwpcdn.com
 content.jwplatform.com
 ping-meta-prd.jwpltx.com
 prd.jwpltx.com
 assets-jpcust.jwpsrv.com
 g.jwpsrv.com
pixel.keywee.co
 beacon.krxd.net
 cdn.krxd.net
 consumer.krxd.net
 www.lightboxcdn.com
 widgets.outbrain.com
 cdn.permutive.com
 assets.pinterest.com
 openbid.pubmatic.com
 secure.quantserve.com
 cdn.roiq.ranker.com
 eus.rubiconproject.com
 fastlane.rubiconproject.com
 s3.amazonaws.com
 sb.scorecardresearch.com
 p.skimresources.com
 r.skimresources.com
 s.skimresources.com
 t.skimresources.com
launcher.spot.im
recirculation.spot.im
 js.spotx.tv
 search.spotxchange.com
 sync.search.spotxchange.com
 cc.swiftype.com
 s.swiftypecdn.com
 jwplayer.eb.tremorhub.com
 pbs.twimg.com
 cdn.syndication.twimg.com
 platform.twitter.com
 syndication.twitter.com
 mrb.upapi.net
 pixel.wp.com
 stats.wp.com
 www.youtube.com
 s.ytimg.com
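
The count itself is not magic. Privacy Badger watches the third-party requests a page makes as it loads; you can run a rough version of the same census from a HAR file (the network log a browser’s developer tools will export). A minimal sketch, assuming a capture saved as rollingstone.har (a hypothetical filename); note it counts third-party hosts, which is only a crude proxy for Privacy Badger’s own tracker heuristics:

```python
import json
from urllib.parse import urlparse

FIRST_PARTY = "rollingstone.com"

def is_first_party(host: str) -> bool:
    return host == FIRST_PARTY or host.endswith("." + FIRST_PARTY)

# Load a HAR file exported from the browser's developer tools.
with open("rollingstone.har", encoding="utf-8") as f:
    har = json.load(f)

third_party_hosts = set()
for entry in har["log"]["entries"]:
    host = urlparse(entry["request"]["url"]).hostname or ""
    if host and not is_first_party(host):
        third_party_hosts.add(host)

print(f"{len(third_party_hosts)} third-party hosts contacted")
for host in sorted(third_party_hosts):
    print(" ", host)
```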

This kind of shit is why we have the EU’s GDPR (General Data Protection Regulation) and California’s CCPA (California Consumer Privacy Act). (No, it’s not just because of Google and Facebook.) If publishers and the adtech industry (those third parties) hadn’t turned the commercial Web into a target-rich environment for suckage by data vampires, we’d never have had either law. (In fact, both laws are still new: the GDPR went into effect in May 2018 and the CCPA a few days ago.)

I’m in California, where the CCPA gives me the right to shake down the vampiretariat for all the information about me they’re harvesting, sharing, selling or giving away to or through those third parties.* But apparently Rolling Stone and PMC don’t care about that.

Others do, and I’ll visit some of those in later posts. Meanwhile I’ll let Rolling Stone and PMC stand as examples of bad acting by publishers that remains rampant, unstopped and almost entirely unpunished, even under these new laws.

I also suggest following and getting involved with the fight against the plague of data vampirism in the publishing world. These will help:

  1. Reading Don Marti’s blog, where he shares expert analysis and advice on the CCPA and related matters. Also People vs. Adtech, a compilation of my own writings on the topic, going back to 2008.
  2. Following what the browser makers are doing with tracking protection (alas, differently†). Shortcuts: Brave, Google’s Chrome, Ghostery’s Cliqz, Microsoft’s Edge, Epic, Mozilla’s Firefox.
  3. Following or joining communities working to introduce safe forms of nourishment for publishers and better habits for advertisers and their agencies. Those include Customer Commons, Me2B Alliance, MyData Global and ProjectVRM.

______________

*The bill (AB 375) begins,

The California Constitution grants a right of privacy. Existing law provides for the confidentiality of personal information in various contexts and requires a business or person that suffers a breach of security of computerized data that includes personal information, as defined, to disclose that breach, as specified.

This bill would enact the California Consumer Privacy Act of 2018. Beginning January 1, 2020, the bill would grant a consumer a right to request a business to disclose the categories and specific pieces of personal information that it collects about the consumer, the categories of sources from which that information is collected, the business purposes for collecting or selling the information, and the categories of 3rd parties with which the information is shared. The bill would require a business to make disclosures about the information and the purposes for which it is used. The bill would grant a consumer the right to request deletion of personal information and would require the business to delete upon receipt of a verified request, as specified. The bill would grant a consumer a right to request that a business that sells the consumer’s personal information, or discloses it for a business purpose, disclose the categories of information that it collects and categories of information and the identity of 3rd parties to which the information was sold or disclosed…

Don Marti has a draft letter one might submit to the brokers and advertisers who use all that personal data. (He also tweets a caution here.)

†This will be the subject of my next post.
