Internet


Historic milestones don’t always line up with large round numbers on our calendars. For example, I suggest that the 1950s ended with the assassination of JFK in late 1963, and the rise of British Rock, led by the Beatles, in 1964. I also suggest that the 1960s didn’t end until Nixon resigned, and disco took off, in 1974.

It has likewise been suggested that the 20th century actually began with the assassination of Archduke Ferdinand and the start of WWI, in 1914. While that and my other claims might be arguable, you might at least agree that there’s no need for historic shifts to align with two or more zeros on a calendar—and that in most cases they don’t.

So I’m here to suggest that the 21st century began in 2020 with the Covid-19 pandemic and the fall of Donald Trump. (And I mean that literally. Social media platforms were the man’s stage, and the whole of them dropped him, as if through a trap door, on the occasion of the storming of the U.S. Capitol by his supporters on January 6, 2021. Whether you liked that or not is beside the facticity of it.)

Things are not the same now. For example, over the coming years, we may never hug, shake hands, or comfortably sit next to strangers again.

But I’m bringing this up for another reason: I think the future we wrote about in The Cluetrain Manifesto, in World of Ends, in The Intention Economy, and in other optimistic expressions during the first two decades of the 21st Century may finally be ready to arrive.

At least that’s the feeling I get when I listen to an interview I did with Christian Einfeldt (@einfeldt) at a San Diego tech conference in April, 2004—and that I just discovered recently in the Internet Archive. The interview was for a film to be called “Digital Tipping Point.” Here are its eleven parts, all just a few minutes long:

01 https://archive.org/details/e-dv038_doc_…
02 https://archive.org/details/e-dv039_doc_…
03 https://archive.org/details/e-dv038_doc_…
04 https://archive.org/details/e-dv038_doc_…
05 https://archive.org/details/e-dv038_doc_…
06 https://archive.org/details/e-dv038_doc_…
07 https://archive.org/details/e-dv038_doc_…
08 https://archive.org/details/e-dv038_doc_…
09 https://archive.org/details/e-dv038_doc_…
10 https://archive.org/details/e-dv039_doc_…
11 https://archive.org/details/e-dv039_doc_…

The title is a riff on Malcolm Gladwell‘s book The Tipping Point, which came out in 2000, same year as The Cluetrain Manifesto. The tipping point I sensed four years later was, I now believe, a foreshadow of now, and only suggested by the successes of the open source movement and independent personal publishing in the form of blogs, both of which I was high on at the time.

What followed in the decade after the interview was the rise of social networks, of smart mobile phones, and of what we now call Big Tech. While I don’t expect those to end in 2021, I do expect that we will finally see the rise of personal agency and of constructive social movements, which I felt swelling in 2004.

Of course, I could be wrong about that. But I am sure that we are now experiencing the millennial shift we expected when civilization’s odometer rolled past 2000.

“Give me a lever long enough and a fulcrum on which to place it, and I shall move the world,” Archimedes is said to have said.

For almost all of the last four years, Donald Trump was one hell of an Archimedes. With the U.S. presidency as his lever and Twitter as his fulcrum, the 45th President leveraged an endless stream of news-making utterances into a massive following and near-absolute domination of news coverage, worldwide. It was an amazing show, the like of which we may never see again.

Big as it was, that show ended on January 8, when Twitter terminated the @RealDonaldTrump account. Almost immediately after that, Trump was “de-platformed” from all these other services as well: PayPal, Reddit, Shopify, Snapchat, Discord, Amazon, Twitch, Facebook, TikTok, Google, Apple, Twitter, YouTube and Instagram. That’s a lot of fulcrums to lose.

What makes them fulcrums is their size. All are big, and all are centralized: run by one company. As members, users and customers of these centralized services, we are also at their mercy: no less vulnerable to termination than Trump.

So here is an interesting question: What if Trump had his own fulcrum from the start? For example, say he took one of the many Trump domains he probably owns (or should have bothered to own, long ago), and made it a blog where he said all the same things he tweeted, and that site had the same many dozens of millions of followers today? Would it still be alive?

I’m not sure it would. Because, even though the base protocols of the Internet and the Web are peer-to-peer and end-to-end, all of us are dependent on services above those protocols, and at the mercy of those services’ owners.

That to me is the biggest lesson the de-platforming of Donald Trump has for the rest of us. We can talk “de-centralization” and “distribution” and “democratization” along with peer-to-peer and end-to-end, but we are still at the mercy of giants.

Yes, there are work-arounds. The parler.com website, de-platformed along with Trump, is back up and, according to @VickerySec (Chris Vickery), “routing 100% of its user traffic through servers located within the Russian Federation.” Adds @AdamSculthorpe, “With a DDos-Guard IP, exactly as I predicted the day it went offline. DDoS Guard is the Russian equivalent of CloudFlare, and runs many shady sites. RiTM (Russia in the middle) is one way to think about it.” Encrypted services such as Signal and Telegram also provide ways for people to talk and be social. But those are also platforms, and we are at their mercy too.

I bring all this up as a way of thinking out loud toward the talk I’ll be giving in a few hours (also see here), on the topic “Centralized vs. Decentralized.” Here’s the intro:

Centralised thinking is easy. Control sits on one place, everything comes home, there is a hub, the corporate office is where all the decisions are made and it is a power game.

Decentralised thinking is complex. TCP/IP and HTTP created a fully decentralised fabric for packet communication. No-one is in control. It is beautiful. Web3 decentralised ideology goes much further but we continually run into conflicts. We need to measure, we need to report, we need to justify, we need to find a model and due to regulation and law, there are liabilities.

However, we have to be doing both. We have to centralise some aspects and at the same time decentralise others. Whilst we hang onto an advertising model that provides services for free we have to have a centralised business model. Apple with its new OS is trying to break the tracking model and in doing so could free us from the barter of free, is that the plan which has nothing to do with privacy or are the ultimate control freaks. But the new distributed model means more risks fall on the creators as the aggregators control the channels and access to a model. Is our love for free preventing us from seeing the value in truly distributed or are those who need control creating artefacts that keep us from achieving our dreams? Is distributed even possible with liability laws and a need to justify what we did to add value today?

So here is what I think I’ll say.

First, we need to respect the decentralized nature of humanity. All of us are different, by design. We look, sound, think and feel different, as separate human beings. As I say in How we save the world, “no being is more smart, resourceful or original than a human one. Again, by design. Even identical twins, with identical DNA from a single sperm+egg, can be as different as two primary colors. (Examples: Laverne Cox and M. Lamar; Nicole and Jonas Maines.)”

This simple fact of our distributed souls and talents has had scant respect from the centralized systems of the digital world, which would rather lead than follow us, and rather guess about us than understand us. That’s partly because too many of them have become dependent on surveillance-based personalized advertising (which is awful in ways I’ve detailed in 136 posts, essays and articles compiled here). But it’s mostly because they’re centralized and can’t think or work outside their very old and square boxes.

Second, advertising, subscriptions and donations through the likes of (again, centralized) Patreon aren’t the only possible ways to support a site or a service. Those are industrial age conventions leveraged in the early decades of the digital age. There are other approaches we can implement as well, now that the pendulum has started to swing back from the centralized extreme. For example, the fully decentralized EmanciPay. A bunch of us came up with that one at ProjectVRM way back in 2009. What makes it decentralized is that the choice of what to pay, and how, is up to the customer. (No, it doesn’t have to be scary.) Which brings me to—

Third, we need to start thinking about solving business problems, market problems, technical problems, from our side. Here is how Customer Commons puts it:

There is … no shortage of business problems that can only be solved from the customer’s side. Here are a few examples:

  1. Identity. Logins and passwords are burdensome leftovers from the last millennium. There should be (and already are) better ways to identify ourselves, and to reveal to others only what we need them to know. Working on this challenge is the SSI—Self-Sovereign Identity—movement. The solution here for individuals is tools of their own that scale.
  2. Subscriptions. Nearly all subscriptions are pains in the butt. “Deals” can be deceiving, full of conditions and changes that come without warning. New customers often get better deals than loyal customers. And there are no standard ways for customers to keep track of when subscriptions run out, need renewal, or change. The only way this can be normalized is from the customers’ side.
  3. Terms and conditions. In the world today, nearly all of these are ones companies proffer; and we have little or no choice about agreeing to them. Worse, in nearly all cases, the record of agreement is on the company’s side. Oh, and since the GDPR came along in Europe and the CCPA in California, entering a website has turned into an ordeal typically requiring “consent” to privacy violations the laws were meant to stop. Or worse, agreeing that a site or a service provider spying on us is a “legitimate interest.”
  4. Payments. For demand and supply to be truly balanced, and for customers to operate at full agency in an open marketplace (which the Internet was designed to be), customers should have their own pricing gun: a way to signal—and actually pay willing sellers—as much as they like, however they like, for whatever they like, on their own terms. There is already a design for that, called Emancipay.
  5. Internet of Things. What we have so far are the Apple of things, the Amazon of things, the Google of things, the Samsung of things, the Sonos of things, and so on—all silo’d in separate systems we don’t control. Things we own on the Internet should be our things. We should be able to control them, as independent customers, as we do with our computers and mobile devices. (Also, by the way, things don’t need to be intelligent or connected to belong to the Internet of Things. They can be, or have, picos.)
  6. Loyalty. All loyalty programs are gimmicks, and coercive. True loyalty is worth far more to companies than the coerced kind, and only customers are in position to truly and fully express it. We should have our own loyalty programs, to which companies are members, rather than the reverse.
  7. Privacy. We’ve had privacy tech in the physical world since the inventions of clothing, shelter, locks, doors, shades, shutters, and other ways to limit what others can see or hear—and to signal to others what’s okay and what’s not. Instead, all we have are unenforced promises by others not to watch our naked selves, or to report what they see to others. Or worse, coerced urgings to “accept” spying on us and distributing harvested information about us to parties unknown, with no record of what we’ve agreed to.
  8. Customer service. There are no standard ways to call for service yet, or to get it. And there should be.
  9. Advertising. Our main problem with advertising today is tracking, which is failing because it doesn’t work. (Some history: ad blocking has been around since 2004, and it took off in 2013, when the advertising and publishing industries gave the middle finger to Do Not Track, which was never more than a polite request in one’s browser not to be tracked off a site. By 2015, ad blocking alone was the biggest boycott in world history. And in 2018 and 2019 we got the GDPR and the CCPA, two laws meant to thwart tracking and unwanted data collection, and which likely wouldn’t have happened if we hadn’t been given that finger.) We can solve that problem from the customer side with intentcasting. This is where we advertise to the marketplace what we want, without risk that our personal data will be misused. (Here is a list of intentcasting providers on the ProjectVRM Development Work list.) A rough sketch of what an intentcast might look like follows this list.
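Here is that rough sketch of an intentcast. It is only an illustration: the field names, the terms, and the idea of publishing a small JSON notice to willing sellers are assumptions of mine, not anything specified by ProjectVRM or Customer Commons.

```python
# A hypothetical intentcast: the customer broadcasts what she wants, on her own
# terms, without handing over an identity. All field names are illustrative.
import json
import uuid
from datetime import datetime, timezone

intentcast = {
    "id": str(uuid.uuid4()),                        # an anonymous handle, not an identity
    "issued": datetime.now(timezone.utc).isoformat(),
    "want": "winter tires, 225/65R17, installed",
    "where": "within 25 km of postal code 02138",
    "offer": {"max_price": 600, "currency": "USD"},
    "terms": "no tracking; respond only to this handle; expires in 7 days",
}

print(json.dumps(intentcast, indent=2))             # what willing sellers would receive
```

The point is the direction of the signal: from the customer outward, on the customer’s terms, rather than from trackers inward.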

We already have examples of personal solutions working at scale: the Internet, the Web, email and telephony. Each provides single, simple and standards-based ways any of us can scale how we deal with others—across countless companies, organizations and services. And they work for those companies as well.

Other solutions, however, are missing—such as ones that solve the nine problems listed above.

They’re missing for the best of all possible reasons: it’s still early. Digital living is still new—decades old at most. And it’s sure to persist for many decades, centuries or millennia to come.

They’re also missing because businesses typically think all solutions to business problems are ones for them. Thinking about customers solving business problems is outside that box.

But much work is already happening outside that box. And there already exist standards and code for building many customer-side solutions to problems shared with businesses. Yes, these are not yet as many, or as good, as we need; but there are enough to get started.

A lot of levers there.

For those of you attending this event, I’ll talk with you shortly. For the rest of you, I’ll let you know how it goes.

Let’s say the world is going to hell. Don’t argue, because my case isn’t about that. It’s about who saves it.

I suggest everybody. Or, more practically speaking, a maximized assortment of the smartest and most helpful anybodies.

Not governments. Not academies. Not investors. Not charities. Not big companies and their platforms. Any of those can be involved, of course, but we don’t have to start there. We can start with people. Because all of them are different. All of them can learn. And teach. And share. Especially since we now have the Internet.

To put this in perspective, start with Joy’s Law: “No matter who you are, most of the smartest people work for someone else.” Then take Todd Park‘s corollary: “Even if you get the best and the brightest to work for you, there will always be an infinite number of other, smarter people employed by others.” Then take off the corporate-context blinders, and note that smart people are actually far more plentiful among the world’s customers, readers, viewers, listeners, parishioners, freelancers and bystanders.

Hundreds of millions of those people also carry around devices that can record and share photos, movies, writings and a boundless assortment of other stuff. Ways of helping now verge on the boundless.

We already have millions (or billions) of them reporting on everything by taking photos and recording videos with their mobiles, obsolescing journalism as we’ve known it since the word came into use (specifically, around 1830). What matters with the journalism example, however, isn’t what got disrupted. It’s how resourceful and helpful (and not just opportunistic) people can be when they have the tools.

Because no being is more smart, resourceful or original than a human one. Again, by design. Even identical twins, with identical DNA from a single sperm+egg, can be as different as two primary colors. (Examples: Laverne Cox and M. Lamar. Nicole and Jonas Maines.)

Yes, there are some wheat/chaff distinctions to make here. To thresh those, I dig Carlo Cipolla‘s Basic Laws on Human Stupidity (.pdf here) which stars this graphic:

The upper right quadrant has how many people in it? Billions, for sure.

I’m counting on them. If we didn’t have the Internet, I wouldn’t.

In Internet 3.0 and the Beginning of (Tech) History, @BenThompson of @Stratechery writes this:

The Return of Technology

Here technology itself will return to the forefront: if the priority for an increasing number of citizens, companies, and countries is to escape centralization, then the answer will not be competing centralized entities, but rather a return to open protocols. This is the only way to match and perhaps surpass the R&D advantages enjoyed by centralized tech companies; open technologies can be worked on collectively, and forked individually, gaining both the benefits of scale and inevitability of sovereignty and self-determination.

—followed by this graphic:

If you want to know what he means by “Politics,” read the piece. I take it as something of a backlash by regulators against big tech, especially in Europe. (With global scope. All those cookie notices you see are effects of European regulations.) But the bigger point is where that arrow goes. We need infrastructure there, and it won’t be provided by regulation alone. Tech needs to take the lead. (See what I wrote here three years ago.) But our tech, not big tech.

The wind is at our backs now. Let’s sail with it.

Bonus links: Cluetrain, New Clues, World of Ends, Customer Commons.

And a big HT to my old buddy Julius R. Ruff, Ph.D., for turning me on to Cipolla.

[Later…] Seth Godin calls all of us “indies.” I like that. HT to @DaveWiner for flagging it.

If you listen to Episode 49: Parler, Ownership, and Open Source of the latest Reality 2.0 podcast, you’ll learn that I was blindsided at first by the topic of Parler, which has lately become a thing. But I caught up fast, even getting a Parler account not long after the show ended. Because I wanted to see what’s going on.

Though self-described as “the world’s town square,” Parler is actually a centralized social platform built for two purposes: 1) completely free speech; and 2) creating and expanding echo chambers.

The second may not be what Parler’s founders intended (see here), but that’s how social media algorithms work. They group people around engagements, especially likes. (I think, for our purposes here, that algorithmically nudged engagement is a defining feature of social media platforms as we understand them today. That would exclude, for example, Wikipedia or a popular blog or newsletter with lots of commenters. It would include, say, Reddit and LinkedIn, because algorithms.)
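To make that defining feature concrete, here is a toy sketch of the nudge at work. This is not Parler’s, Twitter’s or anyone else’s actual code; the data and the scoring are invented only to show how ranking by prior engagement hands a group more of whatever it already likes.

```python
# A toy engagement-ranked feed: items on topics my circle already liked float up,
# unfamiliar topics sink. The data below is made up for illustration.
from collections import Counter

my_likes = Counter({"topic_a": 12, "topic_b": 3})    # prior engagement in my echo chamber

candidate_posts = [
    {"id": 1, "topic": "topic_a"},
    {"id": 2, "topic": "topic_b"},
    {"id": 3, "topic": "topic_c"},                    # never engaged with, so it sinks
]

feed = sorted(candidate_posts,
              key=lambda post: my_likes[post["topic"]],
              reverse=True)

print([post["id"] for post in feed])                  # [1, 2, 3]: more of the same
```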

Let’s start with recognizing that the smallest echo chamber in these virtual places is our own, comprised of the people we follow and who follow us. Then note that our visibility into other virtual spaces is limited by what’s shown to us by algorithmic nudging, such as by Twitter’s trending topics.

The main problem with this is not knowing what’s going on, especially inside other echo chambers. There are also lots of reasons for not finding out. For example, my Parler account sits idle because I don’t want Parler to associate me with any of the people it suggests I follow, as soon as I show up:

I also don’t know what to make of this, which is the only other set of clues on the index page:

Especially since clicking on any of them brings up the same or similar top results, which seem to have nothing to do with the trending # topic. Example:

Thus endeth my research.

But serious researchers should be able to see what’s going on inside the systems that produce these echo chambers, especially Facebook’s.

The problem is that Facebook and other social networks are shell games, designed to make sure nobody knows exactly what’s going on, but feels okay with it, because they’re hanging with others who agree on the basics.

The design principle at work here is obscurantism—”the practice of deliberately presenting information in an imprecise, abstruse manner designed to limit further inquiry and understanding.”

To put the matter in relief, consider a nuclear power plant:

(Photo of Kraftwerk Grafenrheinfeld, 2013, by Avda. Licensed CC BY-SA 3.0.)

Nothing here is a mystery. Or, if there is one, professional inspectors will be dispatched to solve it. In fact, the whole thing is designed from the start to be understandable, and its workings accountable to a dependent public.

Now look at a Facebook data center:

What it actually does is pure mystery, by design, to those outside the company. (And hell, to most, maybe all, of the people inside the company.) No inspector arriving to look at a rack of blinking lights in that place is going to know either. What Facebook looks like to you, to me, to anybody, is determined by a pile of discoveries, both on and off of Facebook’s site and app, about who you are and what, to the machines, you seem interested in, and by an algorithmic process that is not accountable to you, and impossible for anyone, perhaps including Facebook itself, to fully explain.

All societies, and groups within societies, are echo chambers. And, because they cohere in isolated (and isolating) ways, it is sometimes hard for societies to understand each other, especially when they already have prejudicial beliefs about each other. Still, without the further influence of social media, researchers can look at and understand what’s going on.

Over in the digital world, which overlaps with the physical one, we at least know that social media amplifies prejudices. But, though it’s obvious by now that this is what’s going on, doing something to reduce or eliminate the production and amplification of prejudices is damn near impossible when the mechanisms behind it are obscure by design.

This is why I think these systems need to be turned inside out, so researchers can study them. I don’t know how to make that happen; but I do know there is nothing else so large and consequential in the world that is also closed to academic inquiry. And that ain’t right.

BTW, if Facebook, Twitter, Parler or other social networks actually are opening their algorithmic systems to academic researchers, let me know and I’ll edit this piece accordingly.

I just got this email today:

Which tells me, from a sample of one (after another, after another) that Zoom is to video conferencing in 2020 what Microsoft Windows was to personal computing in 1999. Back then one business after another said they would only work with Windows and what was left of DOS: Microsoft’s two operating systems for PCs.

What saved the personal computing world from being absorbed into Microsoft was the Internet—and the Web, running on the Internet. The Internet, based on a profoundly generative protocol, supported all kinds of hardware and software at an infinitude of end points. And the Web, based on an equally generative protocol, manifested on browsers that ran on Mac and Linux computers, as well as Windows ones.

But video conferencing is different. Yes, all the popular video conferencing systems run in apps that work on multiple operating systems, and on the two main mobile device OSes as well. And yes, they are substitutable. You don’t have to use Zoom (unless, as in my case, talking to my doctors requires it). There’s still Skype, Webex, Microsoft Teams, Google Hangouts and the rest.

But all of them have a critical dependency through their codecs. Those are the ways they code and decode audio and video. While there are some open source codecs, all the systems I just named use proprietary (patent-based) codecs. The big winner among those is H.264, aka AVC-1, which Wikipedia says “is by far the most commonly used format for the recording, compression, and distribution of video content, used by 91% of video industry developers as of September 2019.” Also,

H.264 is perhaps best known as being the most commonly used video encoding format on Blu-ray Discs. It is also widely used by streaming Internet sources, such as videos from Netflix, Hulu, Prime Video, Vimeo, YouTube, and the iTunes Store, Web software such as the Adobe Flash Player and Microsoft Silverlight, and also various HDTV broadcasts over terrestrial (ATSC, ISDB-T, DVB-T or DVB-T2), cable (DVB-C), and satellite (DVB-S and DVB-S2) systems.

H.264 is protected by patents owned by various parties. A license covering most (but not all) patents essential to H.264 is administered by a patent pool administered by MPEG LA.

The commercial use of patented H.264 technologies requires the payment of royalties to MPEG LA and other patent owners. MPEG LA has allowed the free use of H.264 technologies for streaming Internet video that is free to end users, and Cisco Systems pays royalties to MPEG LA on behalf of the users of binaries for its open source H.264 encoder.

This is generative, clearly, but not as generative as the Internet and the Web, which are both end-to-end by design.
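If you are curious which codec your own recordings actually lean on, one way to check is to inspect a file. A minimal sketch, assuming the ffprobe tool (part of FFmpeg) is installed; “meeting.mp4” is a placeholder for whatever recording you have on hand.

```python
# Ask ffprobe (assumed installed) which codec the first video stream uses.
import subprocess

result = subprocess.run(
    ["ffprobe", "-v", "error",
     "-select_streams", "v:0",
     "-show_entries", "stream=codec_name",
     "-of", "csv=p=0",
     "meeting.mp4"],                  # placeholder filename
    capture_output=True, text=True, check=True,
)

print(result.stdout.strip())          # typically "h264" for the services named above
```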

More importantly, AVC-1 in effect slides the Internet and the Web into the orbit of companies that have taken over what used to be telephony and television, which are now mooshed together. In the Columbia Doctors example, Zoom is the new PBX. The new classroom is every teacher and kid on her or his own rectangle, “zooming” with each other through the new telephony. The new TV is Netflix, Disney, Comcast, Spectrum, Apple, Amazon and many others, all competing for wedges of our Internet access and entertainment budgets.

In this new ecosystem, you are less the producer than you were, or would have been, in the early days of the Net and the Web. You are the end user, the consumer, the audience, the customer. Not the producer, the performer. Sure, you can audition for those roles, and play them on YouTube and TikTok, but those are somebody else’s walled gardens. You operate within them at their grace. You are not truly free.

And maybe none of us ever were, in those early days of the Net and the Web. But it sure seemed that way. And it does seem that we have lost something.

Or maybe just that we are slowly losing it, in the manner of boiling frogs.

Do we have to? I mean, it’s still early.

The digital world is how old? Decades, at most.

And how long will it last? At the very least, more than that. Centuries or millennia, probably.

So there’s hope.

[Later…] For some of that, dig OBS—Open Broadcaster Software’s OBS Studio: free and open source software for video recording and live streaming. HT: Joel Grossman (@jgro).

Also, though unrelated, why is Columbia Doctors’ Telehealth leaking patient data to advertisers? See here.

In New Digital Realities; New Oversight Solutions, Tom Wheeler, Phil Verveer and Gene Kimmelman suggest that “the problems in dealing with digital platform companies” strip the gears of antitrust and other industrial era regulatory machines, and that what we need instead is “a new approach to regulation that replaces industrial era regulation with a new more agile regulatory model better suited for the dynamism of the digital era.” For that they suggest “a new Digital Platform Agency should be created with a new, agile approach to oversight built on risk management rather than micromanagement.” They provide lots of good reasons for this, which you can read in depth here.

I’m on a list where this is being argued. One of those participating is Richard Shockey, who often cites his eponymous law, which says, “The answer is money. What is the question?” I bring that up as background for my own post on the list, which I’ll share here:

The Digital Platform Agency proposal seems to obey a law like Shockey’s that instead says, “The answer is policy. What is the question?”

I think it will help, before we apply that law, to look at modern platforms as something newer than new. Nascent. Larval. Embryonic. Primitive. Epiphenomenal.

It’s not hard to think of them that way if we take a long view on digital life.

Start with this question: is digital tech ever going away?

Whether yes or no, how long will digital tech be with us, mothering boundless inventions and necessities? Centuries? Millennia?

And how long have we had it so far? A few decades? Hell, Facebook and Twitter have only been with us since the late ’00s.

So why start to regulate what can be done with those companies from now on, right now?

I mean, what if platforms are just castles—headquarters of modern duchies and principalities?

Remember when we thought IBM, AT&T and the PTTs in Europe would own and run the world forever?

Remember when the BUNCH was around, and we called IBM “the environment?” Remember EBCDIC?

Remember when Microsoft ruled the world, and we thought they had to be broken up?

Remember when Kodak owned photography, and thought their enemy was Fuji?

Remember when recorded music had to be played by rolls of paper, lengths of tape, or on spinning discs and disks?

Remember when “social media” was a thing, and all the world’s gossip happened on Facebook and Twitter?

Then consider the possibility that all the dominant platforms of today are mortally vulnerable to obsolescence, to collapse under their own weight, or both.

Nay, the certainty.

Every now is a future then, and every “is” a “was.” And trees don’t grow to the sky.

It’s an easy bet that every platform today is as sure to be succeeded as were stone tablets by paper, scribes by movable type, letterpress by offset, and all of it by xerography, ink jet, laser printing and whatever comes next.

Sure, we do need regulation. But we also need faith in the mortality of every technology that dominates the world at any moment in history, and in the march of progress and obsolescence.

Another thought: if the only answer is policy, the problem is the question.

This suggests yet another law (really an aphorism, but whatever): “The answer is obsolescence. What is the question?”

As it happens, I wrote about Facebook’s odds for obsolescence two years ago here. An excerpt:

How easy do you think it is for Facebook to change: to respond positively to market and regulatory pressures?

Consider this possibility: it can’t.

One reason is structural. Facebook is comprised of many data centers, each the size of a Walmart or few, scattered around the world and costing many $billions to build and maintain. Those data centers maintain a vast and closed habitat where more than two billion human beings share all kinds of revealing personal shit about themselves and each other, while providing countless ways for anybody on Earth, at any budget level, to micro-target ads at highly characterized human targets, using up to millions of different combinations of targeting characteristics (including ones provided by parties outside Facebook, such as Cambridge Analytica, which have deep psychological profiles of millions of Facebook members). Hey, what could go wrong?

In three words, the whole thing.

The other reason is operational. We can see that in how Facebook has handed fixing what’s wrong with it over to thousands of human beings, all hired to do what The Wall Street Journal calls “The Worst Job in Technology: Staring at Human Depravity to Keep It Off Facebook.” Note that this is not the job of robots, AI, ML or any of the other forms of computing magic you’d like to think Facebook would be good at. Alas, even Facebook is still a long way from teaching machines to know what’s unconscionable. And can’t in the long run, because machines don’t have a conscience, much less an able one.

You know Goethe’s (or hell, Disney’s) story of The Sorcerer’s Apprentice? Look it up. It’ll help. Because Mark Zuckerberg is both the sorcerer and the apprentice in the Facebook version of the story. Worse, Zuck doesn’t have the mastery level of either one.

Nobody, not even Zuck, has enough power to control the evil spirits released by giant machines designed to violate personal privacy, produce echo chambers beyond counting, amplify tribal prejudices (including genocidal ones) and produce many $billions for itself in an advertising business that depends on all of that—while also trying to correct, while they are doing what they were designed to do, the massively complex and settled infrastructural systems that make all of it work.

I’m not saying regulators should do nothing. I am saying that gravity still works, the mighty still fall, and these are facts of nature it will help regulators to take into account.

There is latency to everything. Pain, for example. Nerve impulses from pain sensors travel at about two feet per second. That’s why we wait for the pain when we stub a toe. The crack of a bat on a playing field takes half a second before we hear it in the watching crowd. The sunlight we see on Earth is eight minutes old. Most of this doesn’t matter to us, or if it does we adjust to it.

Likewise with how we adjust to the inverse square law. That law is why the farther away something is, the smaller it looks or the fainter it sounds. How much smaller or fainter is something we intuit more than we calculate. What matters is that we understand the law with our bodies. In fact we understand pretty much everything with our bodies.
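For anyone who does want to calculate, the back-of-the-envelope arithmetic behind those examples is short. The 150 meters from home plate to the stands is a rough assumption of mine; the rest are standard figures.

```python
# Rough numbers for the latencies above, plus the inverse-square law itself.
SPEED_OF_SOUND_M_S = 343            # in air, at roughly room temperature
SPEED_OF_LIGHT_M_S = 299_792_458
SUN_DISTANCE_M = 1.496e11           # mean Earth-Sun distance

print(150 / SPEED_OF_SOUND_M_S)                     # ~0.44 s: the crack of the bat, heard in the stands
print(SUN_DISTANCE_M / SPEED_OF_LIGHT_M_S / 60)     # ~8.3 minutes: the age of the sunlight we see

def relative_intensity(distance, reference=1.0):
    """Inverse square: double the distance and the same light looks a quarter as bright."""
    return (reference / distance) ** 2

print(relative_intensity(2))                        # 0.25
```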

All our deepest, most unconscious metaphors start with our bodies. That’s why we grasp, catch, toss around, or throw away an idea. It’s also why nearly all our prepositions pertain to location or movement. Over, under, around, through, with, beside, within, alongside, on, off, above and below only make sense to us because we have experienced them with our bodies.

So: How are we to make full sense of the Web, or the Internet, where we are hardly embodied at all?

We may say we are on the Web, because we need it to make sense to us as embodied beings. Yet we are only looking at a manifestation of it.

The “it” is the hypertext protocol (http) that Tim Berners-Lee thought up in 1990 so high energy physicists, scattered about the world, could look at documents together. That protocol ran on another one: TCP/IP. Together they were mannered talk among computers about how to show the same document across any connection over any collection of networks between any two end points, regardless of who owned or controlled those networks. In doing so, Tim rubbed a bottle of the world’s disparate networks. Out popped the genie we call the Web, ready to grant boundless wishes that only began with document sharing.
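To see how plain that mannered talk is, here is a minimal sketch of one computer asking another for a document: nothing but the hypertext protocol riding on a TCP connection. The host example.com is just a placeholder; any server still answering plain HTTP on port 80 would do.

```python
# One computer asking another for a document: TCP/IP does the carrying,
# the hypertext protocol does the asking.
import socket

host = "example.com"                                 # placeholder host
request = (
    "GET / HTTP/1.1\r\n"
    f"Host: {host}\r\n"
    "Connection: close\r\n"
    "\r\n"
)

with socket.create_connection((host, 80)) as conn:
    conn.sendall(request.encode("ascii"))
    reply = b""
    while chunk := conn.recv(4096):
        reply += chunk

print(reply.decode("utf-8", errors="replace")[:200])  # the status line and first headers
```

That little exchange, repeated across any networks willing to carry it, is the genie at work.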

This was a miracle beyond the scale of loaves and fish: one so new and so odd that the movie Blade Runner, which imagined in 1982 that Los Angeles in 2019 would feature floating cars, off-world colonies and human replicants, failed to foresee a future when anyone could meet with anyone else, or any group, anywhere in the world, on wish-granting slabs they could put on their desks, laps, walls or hold in their hands. (Instead Blade Runner imagined there would still be pay phones and computers with vacuum tubes for screens.)

This week I attended Web Science 20 on my personal slab in California, instead of, as originally planned, at the University of Southampton in the UK. It was still a conference, but now a virtual one, comprised of many people on many slabs, all over the world, each with no sense of distance any more meaningful than those imposed by the inconvenience of time zones.

Joyce (my wife, who is also the source of much wisdom for which her husband gets the credit) says our experience on the Web is one of absent distance and gravity—and that this experience is still so new to us that we have only begun to make full sense of it as embodied creatures. We’ll adjust, she says, much as astronauts adjust to the absence of gravity; but it will take more time than we’ve had so far. We may become expert at using the likes of Zoom, but that doesn’t mean we operate in full comprehension of the new digital environment we co-occupy.

My own part in WebSci20 was talking with five good people, plus others asking questions in a chat, during the closing panel of the conference. (That’s us, at the top of this post.) The title of our session was The Future of Web Science. To prep for that session I wrote the first draft of what follows: a series of thoughts I hoped to bring up in the session, and some of which I actually did.

The first thought is the one I just introduced: The Web, like the Net it runs on, is both new and utterly vexing to understand in terms we’ve developed for making sense of embodied existence.

Here are some more.

The Web is a whiteboard.

In the beginning we thought of the Web as something of a library, mostly because it was comprised of sites with addresses and pages that were authored, published, syndicated, browsed and read. A universal resource locator, better known as a URL, would lead us through what an operating system calls a path or a directory, much as a card catalog did before library systems went digital. It also helped that we understood the Web as real estate, with sites and domains that one owned and others could visit.
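That library-and-real-estate metaphor is baked into the address itself, as a small sketch with Python’s standard library shows. The URL below is a made-up placeholder.

```python
# Splitting an address into its "estate" and its directory-like path.
from urllib.parse import urlparse

url = "https://example.com/library/shelf/page.html#paragraph-3"
parts = urlparse(url)

print(parts.netloc)             # example.com: the "estate" one rents from a registrar
print(parts.path.split("/"))    # ['', 'library', 'shelf', 'page.html']: the card-catalog path
print(parts.fragment)           # paragraph-3: a spot on the page
```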

The metaphor of the Web as a library, though useful, also misdirects our attention and understanding away from its nature as a collection of temporary manifestations. Because, for all we attempt to give the Web a sense of permanence, it is evanescent, temporary, ephemeral. We write and publish there as we might on snow, sand or a whiteboard. Even the websites we are said to “own” are in fact only rented. Fail to pay the registrar and off it goes.

The Web is not what’s on it.

It is not Google, or Facebook, dot-anything or dot-anybody. It is the manifestation of documents and other non-stuff we call “content,” presented to us in browsers and whatever else we invent to see and deal with what the hypertext protocol makes possible. Here is how David Weinberger and I put it in World of Ends, more than seventeen years ago:

1. The Internet isn’t complicated
2. The Internet isn’t a thing. It’s an agreement.
3. The Internet is stupid.
4. Adding value to the Internet lowers its value.
5. All the Internet’s value grows on its edges.
6. Money moves to the suburbs.
7. The end of the world? Nah, the world of ends.
8. The Internet’s three virtues:
a. No one owns it
b. Everyone can use it
c. Anyone can improve it
9. If the Internet is so simple, why have so many been so boneheaded about it?
10. Some mistakes we can stop making already

That was a follow-up of sorts to The Cluetrain Manifesto, which we co-wrote with two other guys four years earlier. We followed up both five years ago with an appendix to Cluetrain called New Clues. While I doubt we’d say any of that stuff the same ways today, the heart of it beats the same.

The Web is free.

The online advertising industry likes to claim the “free Internet” is a grace of advertising that is “relevant,” “personalized,” “interest-based,” “interactive” and other adjectives that misdirect us away from what those forms of advertising actually do, which is track us like marked animals.

That claim, of course, is bullshit. Here’s what Harry Frankfurt says about that in his canonical work, On Bullshit (Cambridge University Press, 1988): “The realms of advertising and public relations, and the nowadays closely related realm of politics, are replete with instances of bullshit so unmitigated that they can serve among the most indisputable and classic paradigms of the concept.” Boiled down, bullshit is what Wikipedia (at the moment, itself being evanescent) calls “speech intended to persuade without regard for truth.” Another distinction: “The liar cares about the truth and attempts to hide it; the bullshitter doesn’t care if what they say is true or false, but rather only cares whether their listener is persuaded.”

Consider for a moment Win Bigly: Persuasion in a World Where Facts Don’t Matter, a 2017 book by Scott Adams that explains, among other things, how a certain U.S. tycoon got his ass elected President. The world Scott talks about is the Web.

Nothing in the history of invention is more supportive of bullshit than the Web. Nor is anything more supportive of truth-telling, education and damned near everything else one can do in the civilized world. And we’re only beginning to discover and make sense of all those possibilities.

We’re all digital now

Meaning not just physical. This is what’s new, not just to human experience, but to human existence.

Marshall McLuhan calls our technologies, including our media, extensions of our bodily selves. Consider how, when you ride a bike or drive a car, those are your wheels and your brakes. Our senses extend outward to suffuse our tools and other technologies, making them parts of our larger selves. Michael Polanyi called this process indwelling.

Think about how, although we are not really on or through the Web, we do dwell in it when we read, write, speak, watch and perform there. That is what I am doing right now, while I type what I see on a screen in San Marino, California, as a machine, presumably in Cambridge, Massachusetts, records my keystrokes and presents them back to me, and now you are reading it, somewhere else in (or on, or choose your preposition) the world. Dwell may be the best verb for what each of us is doing in the non-here we all co-occupy in this novel (to the physical world) non-place and times.

McLuhan also said media revolutions are formal causes. Meaning that they form us. (He got that one from Aristotle.) In different ways we were formed and re-formed by speech, writing, printing, and radio and television broadcasting.

I submit that we are far more formed by digital technologies, and especially by the Internet and the Web, than by any other prior technical revolution. (A friend calls our current revolution “the biggest thing since oxygenation.”)

But this is hard to see because, as McLuhan puts it, every one of these major revolutions becomes a ground on which everything else dances as figures. But it is essential to recognize that the figures are not the ground. This, I suggest, is the biggest challenge for Web Science.

It’s damned hard to study ground-level formal causes such as digital tech, the Net and the Web. Because what they are technically is not what they do formally. They are rising tides that float all boats, oblivious to the boats themselves.

I could say more, and I’m sure I will; but I want to get this much out there before the panel.

 

 

A few days ago, in Figuring the Future, I sourced an Arnold Kling blog post that posed an interesting pair of angles toward outlook: a 2×2 with Fragile <—> Robust on one axis and Essential <—> Inessential on the other. In his sort, essential + fragile are hospitals and airlines. Inessential + fragile are cruise ships and movie theaters. Robust + essential are tech giants. Inessential + robust are sports and entertainment conglomerates, plus major restaurant chains. It’s a heuristic, and all of it is arguable (especially given the gray along both axes), which is the idea. Cases must be made if planning is to have meaning.
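Just to hold the heuristic in one place, here is Arnold’s sort expressed as data. The placements are the ones named above, and they remain as arguable as the post says; the structure is merely a sketch of mine.

```python
# Arnold Kling's 2x2, as data. The placements are from the post, not mine.
kling_2x2 = {
    ("essential", "fragile"):   ["hospitals", "airlines"],
    ("inessential", "fragile"): ["cruise ships", "movie theaters"],
    ("essential", "robust"):    ["tech giants"],
    ("inessential", "robust"):  ["sports and entertainment conglomerates",
                                 "major restaurant chains"],
}

for (need, durability), examples in kling_2x2.items():
    print(f"{need:11s} + {durability:7s}: {', '.join(examples)}")
```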

Now, haul Arnold’s template over to The U.S. Labor Market During the Beginning of the Pandemic Recession, by Tomaz Cajner, Leland D. Crane, Ryan A. Decker, John Grigsby, Adrian Hamins-Puertolas, Erik Hurst, Christopher Kurz, and Ahu Yildirmaz, of the University of Chicago, and lay it on this item from page 21:

The highest employment drop, in Arts, Entertainment and Recreation, leans toward inessential + fragile. The second, in Accommodation and Food Services, is more on the essential + fragile side. The lowest employment changes, from Construction on down to Utilities, all tend toward essential + robust.

So I’m looking at those bottom eight essential + robust categories and asking a couple of questions:

1) What percentage of workers in each essential + robust category are now working from home?

2) How much of this work is essentially electronic? Meaning, done by people who live and work through glowing rectangles, connected on the Internet?

Hard to say, but the answers will have everything to do with the transition of work, and life in general, into a digital world that coexists with the physical one. This was the world we were gradually putting together when urgency around COVID-19 turned “eventually” into “now.”

In Junana, Bruce Caron writes,

“Choose One” was extremely powerful. It provided a seed for everything from language (connecting sound to meaning) to traffic control (driving on only one side of the road). It also opened up to a constructivist view of society, suggesting that choice was implicit in many areas, including gender.

Choose One said to the universe, “There are several ways we can go, but we’re all going to agree on this way for now, with the understanding that we can do it some other way later, thank you.” It wasn’t quite as elegant as “42,” but it was close. Once you started unfolding with it, you could never escape the arbitrariness of that first choice.

In some countries, an arbitrary first choice to eliminate or suspend personal privacy allowed intimate degrees of contact tracing to help hammer flat the infection curve of COVID-19. Not arbitrary, perhaps, but no longer escapable.

Other countries face similar choices. Here in the U.S., there is an argument that says “The tech giants already know our movements and social connections intimately. Combine that with what governments know and we can do contact tracing to a fine degree. What matters privacy if in reality we’ve lost it already and many thousands or millions of lives are at stake—and so are the economies that provide what we call our ‘livings.’ This virus doesn’t care about privacy, and for now neither should we.” There is also an argument that says, “Just because we have no privacy yet in the digital world is no reason not to have it. So, if we do contact tracing through our personal electronics, it should be disabled afterwards and obey old or new regulations respecting personal privacy.”

Those choices are not binary, of course. Nor are they outside the scope of too many other choices to name here. But many of those are “Choose Ones” that will play out, even if our choice is avoidance.

In The Web and the New Reality, which I posted on December 1, 1995 (and again a few days ago), I called that date “Reality 1.995.12,” and made twelve predictions. In this post I’ll visit how those have played out over the quarter century since then.

1. As more customers come into direct contact with suppliers, markets for suppliers will change from target populations to conversations.

Well, both. While there are many more direct conversations between demand and supply than there were in the pre-Internet world, we are more targeted than ever, now personally and not just as populations. This has turned into a gigantic problem that many of us have been talking about for a decade or more, to sadly insufficient effect.

2. Travel, ticket, advertising and PR agencies will all find new ways to add value, or they will be subtracted from market relationships that no longer require them.

I don’t recall why I grouped those four things, so let’s break them apart:

  • Little travel agencies went to hell. Giant Net-based ones thrived. See here.
  • Tickets are now almost all digital. I don’t know what a modern ticket agency does, if any exist.
  • Advertising agencies went digital and became malignant. I’ve written about that a lot, here. All of those writings could be compressed to a pull quote from Separating Advertising’s Wheat and Chaff: “Madison Avenue fell asleep, direct response marketing ate its brain, and it woke up as an alien replica of itself.”
  • PR agencies, far as I know (and I haven’t looked very far) are about the same.

3. Within companies, marketing communications will change from peripheral activities to core competencies. New media will flourish on the Web, and old media will learn to live with the Web and take advantage of it.

If we count the ascendance of the Chief Marketing Officer (CMO) as a success, this was a bulls-eye. However, most CMOs are all about “digital,” by which they generally mean direct response marketing. And if you didn’t skip to this item you know what I think about that.

4. Retail space will complement cyber space. Customer and technical service will change dramatically, as 800 numbers yield to URLs and hard copy documents yield to soft copy versions of the same thing… but in browsable, searchable forms.

Yep. All that happened.

5. Shipping services of all kinds will bloom. So will fulfillment services. So will ticket and entertainment sales services.

That too.

The web’s search engines will become the new yellow pages for the whole world. Your fingers will still do the walking, but they won’t get stained with ink. Same goes for the white pages. Also the blue ones.

And that.

6. The scope of the first person plural will enlarge to include the whole world. “We” may mean everybody on the globe, or any coherent group that inhabits it, regardless of location. Each of us will swing from group to group like monkeys through trees.

Oh yeah.

7. National borders will change from barricades and toll booths into speed bumps and welcome mats.

Mixed success. When I wrote this, nearly all Internet access was through telcos, so getting online away from home still required a local phone number. That’s pretty much gone. But the Internet itself is being broken into pieces. See here.

8. The game will be over for what teacher John Taylor Gatto labels “the narcotic we call television.” Also for the industrial relic of compulsory education. Both will be as dead as the mainframe business. In other words: still trucking, but not as the anchoring norms they used to be.

That hasn’t happened; but self-education, home-schooling and online study of all kinds are thriving.

9. Big Business will become as anachronistic as Big Government, because institutional mass will lose leverage without losing inertia.

Well, this happened. So, no.

10. Domination will fail where partnering succeeds, simply because partners with positive sums will combine to outproduce winners and losers with zero sums.

Here’s what I meant by that.
I think more has happened than hasn’t. But, visiting the particulars requires a whole ‘nuther post.

11. Right will make might.

Nope. And this one might never happen. Hey, in 25 years one tends to become wiser.

12. And might will be mighty different.

That’s true, and in some ways that depresses me.

So, on the whole, not bad.

I posted this essay in my own pre-blog, Reality 2.0, on December 1, 1995. I think maybe now, in this long moment after we’ve hit a pause button on our future, we can start working on making good the unfulfilled promises that first gleamed in our future a quarter century ago.



Reality 2.0

The import of the Internet is so obvious and extreme that it actually defies valuation: witness the stock market, which values Netscape so far above that company’s real assets and earnings that its P/E ratio verges on the infinite.

Whatever we’re driving toward, it is very different from anchoring certainties that have grounded us for generations, if not for the duration of our species. It seems we are on the cusp of a new and radically different reality. Let’s call it Reality 2.0.

The label has a millennial quality, and a technical one as well. If Reality 2.0 is Reality 2.000, this month we’re in Reality 1.995.12.

With only a few revisions left before Reality 2.0 arrives, we’re in a good position to start seeing what awaits. Here are just a few of the things this writer is starting to see…

  1. As more customers come into direct contact with suppliers, markets for suppliers will change from target populations to conversations.
  2. Travel, ticket, advertising and PR agencies will all find new ways to add value, or they will be subtracted from market relationships that no longer require them.
  3. Within companies, marketing communications will change from peripheral activities to core competencies. New media will flourish on the Web, and old media will learn to live with the Web and take advantage of it.
  4. Retail space will complement cyber space. Customer and technical service will change dramatically, as 800 numbers yield to URLs and hard copy documents yield to soft copy versions of the same thing… but in browsable, searchable forms.
  5. Shipping services of all kinds will bloom. So will fulfillment services. So will ticket and entertainment sales services.
  6. The web’s search engines will become the new yellow pages for the whole world. Your fingers will still do the walking, but they won’t get stained with ink. Same goes for the white pages. Also the blue ones.
  7. The scope of the first person plural will enlarge to include the whole world. “We” may mean everybody on the globe, or any coherent group that inhabits it, regardless of location. Each of us will swing from group to group like monkeys through trees.
  8. National borders will change from barricades and toll booths into speed bumps and welcome mats.
  9. The game will be over for what teacher John Taylor Gatto labels “the narcotic we call television.” Also for the industrial relic of compulsory education. Both will be as dead as the mainframe business. In other words: still trucking, but not as the anchoring norms they used to be.
  10. Big Business will become as anachronistic as Big Government, because institutional mass will lose leverage without losing inertia. Domination will fail where partnering succeeds, simply because partners with positive sums will combine to outproduce winners and losers with zero sums.
  11. Right will make might.
  12. And might will be mighty different.

Polyopoly

The Web is the board for a new game Phil Salin called “Polyopoly.” As Phil described it, Polyopoly is the opposite of Monopoly. The idea is not to win a fight over scarce real estate, but to create a farmer’s market for the boundless fruits of the human mind.

It’s too bad Phil didn’t live to see the web become what he (before anyone, I believe) hoped to create with AMIX: “the first efficient marketplace for information.” The result of such a marketplace, Phil said, would be polyopoly.

In Monopoly, what mattered were the three Ls of real estate: “location, location and location.”

On the web, location means almost squat.

What matters on the web are the three Cs: content, connections and convenience. These are what make your home page a door the world beats a path to when it looks for the better mouse trap that only you sell. They give your webfront estate its real value.

If commercial interests have their way with the Web, we can also add a fourth C: cost. But how high can costs go in a polyopolistic economy? Not very. Because polyopoly creates…

An economy of abundance

The goods of Polyopoly and Monopoly are as different as love and lug nuts. Information is made by minds, not factories; and it tends to make itself abundant, not scarce. Moreover, scarce information tends to be worthless information.

Information may be bankable, but traditional banking, which secures and contains scarce commodities (or their numerical representations) does not respect the nature of information.

Because information abhors scarcity. It loves to reproduce, to travel, to multiply. Its natural habitats are wires and airwaves and disks and CDs and forums and books and magazines and web pages and hot links and chats over cappuccinos at Starbucks. This nature lends itself to polyopoly.

Polyopoly’s rules are hard to figure because the economy we are building with it is still new, and our vocabulary for describing it is sparse.

This is why we march into the Information Age hobbled by industrial metaphors. The “information highway” is one example. Here we use the language of freight forwarding to describe the movement of music, love, gossip, jokes, ideas and other communicable forms of knowledge that grow and change as they move from mind to mind.

We can at least say that knowledge, even in its communicable forms, is not reducible to data. Nor is the stuff we call “intellectual property.” A song and a bank account do not propagate the same ways. But we are inclined to say they do (and should), because we describe both with the same industrial terms.

All of which is why there is no more important work in this new economy than coining the new terms we use to describe it.

The Age of Enlightenment finally arrives

The best place to start looking for help is at the dawn of the Industrial Age. Because this was when the Age of Reason began. Nobody knew the polyopoly game, or played it, better than those champions of reason from whose thinking our modern republics are derived: Thomas Paine, Thomas Jefferson and Benjamin Franklin.

As Jon Katz says in “The Age of Paine” (Wired, May 1995), Thomas Paine was the “moral father of the Internet.” Paine said “my country is the world,” and sought as little compensation as possible for his work, because he wanted it to be inexpensive and widely read. Paine’s thinking still shapes the politics of the U.S., England and France, all of which he called home.

Thomas Jefferson wrote the first rule of Polyopoly: “He who receives an idea from me receives instruction himself without lessening mine; as he who lights his taper at mine, receives light without darkening me.”

He also left a live bomb for modern intellectual property law: “Inventions then cannot, in nature, be a subject of property.” The best look at the burning fuse is John Perry Barlow’s excellent essay “The Economy of Ideas,” in the March 1994 issue of Wired. (I see that Jon Katz repeats it in his paean to Paine. Hey, if someone puts it to song, who gets the rights?)

If Paine was the moral father of the Internet, Ben Franklin’s paternity is apparent in Silicon Valley. Today he’d fit right in, inventing hot products, surfing the Web and spreading his wit and wisdom like a Johnny Cyberseed. Hell, he even has the right haircut.

Franklin left school at 10 and was still in his teens when he ran his brother’s newspaper, writing much of its content and getting quoted all over Boston. He was a self-taught scientist and inventor while still working as a writer and publisher. He also found time to show that lightning is electricity, organize the American postal service, invent a heap of handy products and serve as a politician and diplomat.

Franklin’s biggest obsession was time. He scheduled and planned constantly. He even wrote his famous epitaph when he was 22, six decades before he died. “The work shall not be lost,” it reads, “for it will (as he believed) appear once more in a new and more elegant edition, revised and edited by the author.”

One feels the ghost of Franklin today, editing the web.

Time to subtract the garbage

Combine Jefferson and Franklin and you get the two magnetic poles that tug at every polyopoly player: information that only gets more abundant, and time that only gets more scarce.

As Alain Couder of Groupe Bull puts it, “we treat time as a constant in all these formulas — revolutions per minute, instructions per second — yet we experience time as something that constantly decreases.”

After all, we’re born with an unknown sum of time, and we need to spend it all before we die. The notion of “saving” it is absurd. Time can only be spent.

So: to play Polyopoly well, we need to waste as little time as possible. This is not easy in a world where the sum of information verges on the infinite.

Which is why I think Esther Dyson might be our best polyopoly player.

“There’s too much noise out there anyway,” she says in ‘Esther Dyson on DaveNet’ (12/1/94). “The new wave is not value added, it’s garbage-subtracted.”

Here’s a measure of how much garbage she subtracts from her own life: her apartment doesn’t even have a phone.

Can she play this game, or what?

So what’s left?

I wouldn’t bother to ask Esther if she watches television, or listens to the radio. I wouldn’t ask my wife, either. To her, television is exactly what Fred Allen called it forty years ago: “chewing gum for the eyes.” Ours heats up only for natural disasters and San Jose Sharks games.

Dean Landsman, a sharp media observer from the broadcast industry, tells me that John Grisham books are cutting into time that readers would otherwise spend watching television. And that’s just the beginning of a tide that will swell as every medium’s clients weigh more carefully what they do with their time.

Which is why it won’t be long before those clients wad up their television time and stick it under their computer. “Media will eat media,” Dean says.

The computer is looking a lot hungrier than the rest of the devices out there. Next to connected computing, television is AM radio.

Fasten your seat belts.

Web of the free, home of the Huns

Think of the Industrial world — the world of Big Business and Big Government — as a modern Roman Empire.

Now think of Bill Gates as Attila the Hun.

Because that’s exactly how Bill looks to the Romans who still see the web, and everything else in the world, as a monopoly board. No wonder Bill doesn’t have a senator in his pocket (as Mark Stahlman told us in ‘Off to the Slaughter House,’ DaveNet, 3/14/94).

Sadly for the Romans, their empire is inhabited almost entirely by Huns, all working away on their PCs. Most of those Huns don’t have a problem with Bill. After all, Bill does a fine job of empowering his people, and they keep electing him with their checkbooks, credit cards and purchase orders.

Which is why, when they go forth to tame the web, these tough-talking Captains of Industry and Leaders of Government look like animated mannequins in Armani Suits: clothes with no emperor. Their content is emulation. They drone about serving customers and building architectures and setting standards and being open and competing on level playing fields. But their game is still control, no matter what else they call it.

Bill may be our emperor, but ruling Huns is not the same as ruling Romans. You have to be naked as a fetus and nearly as innocent. Because polyopoly does not reward the dark tricks that used to work for industry, government and organized crime. Those tricks worked in a world where darkness had leverage, where you could fool some of the people some of the time, and that was enough.

But polyopoly is a positive-sum game. Its goods are not produced by huge industries that control the world, but by smart industries that enable the world’s inhabitants. Like the PC business that thrives on it, information grows up from individuals, not down from institutions. Its economy thrives on abundance rather than scarcity. Success goes to enablers, not controllers. And you don’t enable people by fooling them. Or by manipulating them. Or by muscling them.
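To make that positive-sum point concrete, here is a toy sketch in Python. It is not from the original essay; the numbers and function names are invented purely for illustration. In a zero-sum trade, one player's gain is exactly the other's loss, so the total never grows. In a positive-sum exchange, such as two people each sharing information the other lacked, both end up ahead.

```python
# Toy illustration (not from the essay): zero-sum vs. positive-sum exchanges.

def zero_sum_trade(a: float, b: float, amount: float) -> tuple[float, float]:
    """One player's gain is exactly the other's loss; the total stays fixed."""
    return a + amount, b - amount

def positive_sum_exchange(a: float, b: float, surplus: float) -> tuple[float, float]:
    """Both players come out ahead, as when each shares information the other lacked."""
    return a + surplus, b + surplus

if __name__ == "__main__":
    a, b = 10.0, 10.0
    print(zero_sum_trade(a, b, 3.0))         # (13.0, 7.0)  -> total still 20
    print(positive_sum_exchange(a, b, 3.0))  # (13.0, 13.0) -> total grows to 26
```

The point of the sketch is only the sign of the total: controllers can shuffle a fixed pot, while enablers grow it.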

In fact, you don’t even play to win. As Craig Burton of The Burton Group puts it, “the goal isn’t win/win, it’s play/play.”

This is why Bill does not “control” his Huns the way IBM controlled its Romans. Microsoft plays by winning support, where IBM won by dominating the play. Just because Microsoft now holds a controlling position does not mean that a controlling mentality got them there. What I’ve seen from IBM and Apple looks far more Monopoly-minded and controlling than anything I’ve seen from Microsoft.

Does this mean that Bill’s manners aren’t a bit Roman at times? No. Just that the support Microsoft enjoys is a lot more voluntary on the part of its customers, users and partners. It also means that Microsoft has succeeded by playing Polyopoly extremely well. When it tries to play Monopoly instead, the Huns don’t like it. Bill doesn’t need the Feds to tell him when that happens. The Huns tell him soon enough.

A market is a conversation

No matter how Roman Bill’s fantasies might become, he knows his position is hardly more substantial than a conversation. In fact, it IS a conversation.

I would bet that Microsoft is engaged in more conversations, more of the time, with more customers and partners, than any other company in the world. Like or hate their work, the company connects. I submit that this, as much as anything else, accounts for its success.

In the Industrial Age, a market was a target population. Goods rolled down a “value chain” that worked like a conveyor belt. Raw materials rolled into one end and finished products rolled out the other. Customers bought the product or didn’t, and customer feedback was limited mostly to the money they spent.

To encourage customer spending, “messages” were “targeted” at populations, through advertising, PR and other activities. The main purpose of these one-way communications was to stimulate sales. That model is obsolete. What works best today is what Normann & Ramirez (Harvard Business Review, June/July 1993) call a “value constellation” of relationships that include customers, partners, suppliers, resellers, consultants, contractors and all kinds of people.

The Web is the star field within which constellations of companies, products and markets gather themselves. And what binds them together, in each case, are conversations.

How it all adds up

What we’re creating here is a new economy — an information economy.

Behind the marble columns of big business and big government, this new economy stands in the lobby like a big black slab. The primates who work behind those columns don’t know what this thing is, but they do know it’s important and good to own. The problem is, they can’t own it. Nobody can. Because it defies the core value in all economies based on physical goods: scarcity.

Scarcity ruled the stone hearts and metal souls of every zero-sum value system that ever worked — usually by producing equal quantities of gold and gore. And for dozens of millennia, we suffered with it. If Tribe A crushed Tribe B, it was too bad for Tribe B. Victors got the spoils.

This win/lose model has been in decline for some time. Victors who used to get spoils now just get responsibilities. Cooperation and partnership are now more productive than competition and domination. Why bomb your enemy when you can get him on the phone and do business with him? Why take sides when the members of “us” and “them” constantly change?

The hard evidence is starting to come in. A recent Wharton Impact report said, “Firms which specified their objectives as ‘beating our competitors’ or ‘gaining market share’ earned substantially lower profits over the period.” We’re reading stories about women-owned businesses doing better, on the whole, because women are better at communicating and less inclined to waste energy by playing sports and war games in their marketplaces.

From the customer’s perspective, what we call “competition” is really a form of cooperation that produces abundant choices. Markets are created by addition and multiplication, not just by subtraction and division.

In my old Mac IIci, I can see chips and components from at least 11 different companies and 8 different countries. Is this evidence of war among Apple’s suppliers? Do component vendors succeed by killing each other and limiting choices for their customers? Did Apple’s engineers say, “Gee, let’s help Hitachi kill Philips on this one?” Were they cheering for one “side” or another? The answer should be obvious.

But it isn’t, for two reasons. One is that the “Dominator Model,” as anthropologist (and Holocaust survivor) Riane Eisler calls it, has been around for 20,000 years, and until recently has reliably produced spoils for victors. The other is that conflict always makes great copy. To see how seductive conflict-based thinking is, try to find a hot business story that isn’t filled with sports and war metaphors. It isn’t easy.

Bound by the language of conflict, most of us still believe that free enterprise runs on competition between “sides” driven by urges to dominate, and that the interests of those “sides” are naturally opposed.

To get to the truth here, just ask this: which has produced more — the U.S. vs. Japan, or the U.S. + Japan? One produced World War II and a lot of bad news. The other produced countless marvels — from cars to consumer electronics — on which the whole world depends.

Now ask this: which has produced more — Apple vs. Microsoft or Apple + Microsoft? One profited nobody but the lawyers, and the other gave us personal computing as we know it today.

The Plus Paradigm

What brings us to Reality 2.0 is the Plus Paradigm.

The Plus Paradigm says that our world is a positive construction, and that the best games produce positive sums for everybody. It recognizes the power of information and the value of abundance. (Think about it: the best information may have the highest power to abound, and its value may vary as the inverse of its scarcity.)

Over the last several years, mostly through discussions with client companies that are struggling with changes that invalidate long-held assumptions, I have built a table of old (Reality 1.0) vs. new (Reality 2.0) paradigms. The difference between these two realities, one client remarked, is that the paradigm on the right is starting to work better than the paradigm on the left.

 

Paradigm | Reality 1.0 | Reality 2.0
Means to ends | Domination | Partnership
Cause of progress | Competition | Collaboration
Center of interest | Personal | Social
Concept of systems | Closed | Open
Dynamic | Win/Lose | Play/Play
Roles | Victor/Victim | Partner/Ally
Primary goods | Capital | Information
Source of leverage | Monopoly | Polyopoly
Organization | Hierarchy | Flexiarchy
Roles | Victor/Victim | Server/Client
Scope of self-interest | Self/Nation | Self/World
Source of power | Might | Right
Source of value | Scarcity | Abundance
Stage of growth | Child (selfish) | Adult (social)
Reference valuables | Metal, Money | Life, Time
Purpose of boundaries | Protection | Limitation

Changes across the paradigms show up as positive “reality shifts.” The shift is from OR logic to AND logic, from Vs. to +:

 

Reality 1.0 | Reality 2.0
Man vs. nature | Man + nature
Labor vs. management | Labor + management
Public vs. private | Public + private
Men vs. women | Men + women
Us vs. them | Us + them
Majority vs. minority | Majority + minority
Party vs. party | Party + party
Urban vs. rural | Urban + rural
Black vs. white | Black + white
Business vs. govt. | Business + govt.

Again: the Plus Paradigm comprehends the world as a positive construction, and sees that the best games produce positive sums for everybody.

For more about this whole way of thinking, see Bernie DeKoven’s ideas about “the ME/WE” at his “virtual playground.”

This may sound sappy, but information works like love: when you give it away, you still get to keep it. And when you give it back, it grows.
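A companion sketch, also mine and purely illustrative, makes that non-rivalry concrete: handing over a physical coin leaves the giver with one less, while sharing an idea copies it, just as Jefferson's taper lights another without being darkened.

```python
# Toy illustration (not from the essay): rivalrous vs. non-rivalrous goods.

def give_coin(giver: int, receiver: int) -> tuple[int, int]:
    """A coin is rivalrous: giving it away means no longer having it."""
    return giver - 1, receiver + 1

def share_idea(giver_notes: list[str], receiver_notes: list[str], idea: str) -> None:
    """An idea is non-rivalrous: sharing it copies it, and the giver keeps it."""
    receiver_notes.append(idea)  # the receiver gains the idea; giver_notes are untouched

if __name__ == "__main__":
    print(give_coin(5, 0))                 # (4, 1): the giver now has less
    mine, yours = ["light a taper at mine"], []
    share_idea(mine, yours, "light a taper at mine")
    print(mine, yours)                     # both now hold the idea; nothing was lessened
```

Either way, the giver's copy is never lessened.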

Which has always been the case. But in Reality 2.0, it should become a lot more obvious.
