Business

You are currently browsing the archive for the Business category.

Yesterday my 15-year-old son and I made a brief stop at the Micro Center in Cambridge, looking at what it might take (and cost) to build a Linux/Windows desktop computer from the ground up—something that had been an interest of his for the last couple of years. (Mine too, actually.) The answer, price-wise (at least there), was more than we wanted to spend, so we decided to stop looking and head out.

But there were plenty of distractions in the store, so I paused at a few counters, tables and bins to examine stuff like cheap ($2.99) optical mice, flat-screen monitors (mine are old and fading), and various kinds of outboard drives. (Two of mine crapped out last week, and our first stop of the day had been dropping them off at a repair shop.)

As it happens we were on our way back from a hike in the Blue Hills Reservation, where the kid’s patience had already been stretched by my tendency to pause to munch and reminisce at every huckleberry patch (I grew up spending summers among them), plus a half-hour stop at the observatory and science center at the crest of Blue Hill itself. So he was glad when we finally walked out of the store, and, presumably, to the car and then home.

But there was a Trader Joe’s in the same lot, and I wanted to make a quick stop there to pick up a few supplies. The kid groaned.

“I promise to shop like a man,” I said. “Fast as I can.” Then I began to sing that rhyme to the tune of the Four Seasons’ “Walk Like a Man.” Shop like a man, fast as you can. Shop like a man, my son…

He replied, “The true man shops without stopping.”

It then took us less than two minutes to get what we wanted (yogurt for smoothies, peppermint tea, a couple other things), check out and leave. The kid calls this “speed shopping.” Also “powering through” a store. Rob Becker, whose Defending the Caveman is a required theater experience (go as a couple—it’ll help), puts it this way in an interview:

Q: What does the title of your show refer to?

A: The show is about an average guy’s response to all the anger that is coming at him. It goes back to the beginning of time. The image of the caveman is that of a guy bopping a woman on the head and dragging her back to his cave. But no serious anthropologist believes that. The caveman thought women were magical. But the caveman, to me, became a symbol of man being misunderstood.

Q: What were our primitive roles, and what effect do they have on our behavior today?

A: Men were hunters; women were gatherers. The hunter locks in on one thing, which is why guys have a narrow focus, whether it’s watching TV, reading the newspaper or driving. They block everything else out because, as hunters, they had to focus on the rear end of an animal. On the other hand, women, as gatherers, had to take in the whole landscape. Their field of vision is wider.

Q: How do these differences manifest themselves in a shopping mall?

A: The hunter tracks one thing. If I need a shirt, I go and kill a shirt with my credit card and drag it home. The gatherer doesn’t know what she’s going for because she doesn’t know what’s going to be ripe or in bloom. She’s open to the environment. When I go shopping with my wife, I keep bugging her about what she’s looking for, and she says, “Don’t bother me; I’ll know it when I see it.”

Q: Do men and women respond differently to an empty bowl of potato chips?

A: Women cooperate, men negotiate. If six women are sitting around a bowl and it gets low, they all get up and go to the kitchen as a pack. And while they’re there, they’ll make more dip. With six guys, it’s completely different. One guy will say, “It’s my house; I’m not going to refill it.” Another will say, “Yeah, but I bought ’em.” Another will say, “But we used my car.” I’ve seen it come down to their using a tape measure so the guy closest to the kitchen had to go.

The italics are mine. The kid’s too.


The Web as we know it today was two years old in June 1997, when the page below went up. It lasted, according to Archive.org, until October 2010. When I ran across it back then, it blew my mind — especially the passage I have boldfaced in the long paragraph near the end.

The Internet is a table for two. Any two, anywhere. All attempts to restrict it and lock it down will fail to alter the base fact that the Net’s protocols are designed to eliminate the functional distance, as far as possible, between any two points, any two devices, any two people. This is the design principle for a World of Ends. That last link goes to a piece David Weinberger and I wrote in 2003, to as little effect, I suspect, as @Man’s piece had in 1997. I doubt any of the three of us would write the same things the same ways today. But the base principle, that table-for-two-ness, is something I believe all of us respect. It won’t go away. That’s why I thought it best to disinter @Man’s original and run it again here.

I have another reason. Searching for @Man is Michael O’Connor Clarke’s last blog post before falling ill in June. I don’t know who or where @Man is today. I did correspond with him briefly when we were writing The Cluetrain Manifesto in 1999, but all my emails from that time were trashed years ago. So I’m clueless on this one. If you’re out there and reading this, @Man, get in touch. Thanks.


Attention, Fat Corporate Bastards!

by @Man

Attention, Fat Corporate Bastards!
Attention, Fat Corporate Bastards in your three piece suits!

Attention Fat Congressional Bastards!
Attention, Fat Congressional Bastards in your three piece suits!

We know about your plans for the Internet. Although you won’t listen, we would like to point out how wrong you are now, so we can point out gleefully how right we were later.

According to a presentation given by Nicholas Negroponte at the Sheraton Hotel in downtown Toronto, called “The Information Age: Transforming Technology to Strategy,” here is what you Fat Corporate Bastards think we want:

  1. Movies on demand (94% executive approval)
  2. Home shopping (89% approval)
  3. On-line video games (89% approval)

Here’s what you think we don’t want:

  1. educational services
  2. access to government information

Here’s a clue: you can stick the first set up your bum, sideways.

Here’s what we really want. Don’t bother paying attention; I want you to learn the hard way, by wasting lots of time and money.

Desired Internet Service Attributes:

  1. Cheap, unlimited flat-rate international communication
  2. Hands off: No censorship, no advertisements, no lawsuits
  3. Respect
  4. Privacy

Desired Internet Services:

  1. Email, WWW, Usenet, IRC, FTP
  2. Explicit adult material
  3. Access to government and corporate information for oversight purposes
  4. Educational services
  5. Free networked multiplayer games

Guess what? We already have all the things we want. As soon as we’re ready for something new, we get it — for free. Why? Because the traditional consumer/producer relationship doesn’t exist on the Internet. Don’t you think that if we really wanted the things you think we want, we would have already developed them some time in the past 20 years for free? Free! Free! It’s so much fun to be able to use that word you hate. Take your margins with you and stick to trying to shove ads onto PBS and NPR.

You almost certainly think of the Internet as an audience of some type–perhaps somewhat captive. If you actually had even the faintest glimmering of what reality on the net is like, you’d realize that the real unit of currency isn’t dollars, data, or digicash. It’s reputation and respect. Think about how that impacts your corporate strategy. Think about how you’d feel if a guy sat down at your lunch table one afternoon when you were interviewing an applicant for a vice-president’s position and tried to sell the two of you a car, and wouldn’t go away. Believe it or not, what you want to do with the Internet is very similar. Just as you have a reasonable expectation of privacy and respect when you’re at a table for two in a public place, so too do the users of the Internet have a reasonable expectation of privacy and respect. When you think of the Internet, don’t think of Mack trucks full of widgets destined for distributorships, whizzing by countless billboards. Think of a table for two.

If you don’t understand right now, don’t worry. You’ll learn it the hard way. We’ll be there to help you learn, you filthy corporate guttersnipes.

With bile and premonitions of glee,

@Man


@Man, World-Class Data Snuggler

My son remembers what I say better than I do. One example is this:

I uttered it in some context while wheezing my way up a slope somewhere in the Reservation.

Except it wasn’t there. Also I didn’t say that. Exactly. Or alone. He tells me it came up while we were walking across after getting some hang time after Mass at the . He just told me the preceding while looking over my shoulder at what I’m writing. He also explains that the above is compressed from dialog between the two of us, at the end of which he said it should be a bumper sticker, which he later designed and sent to me, and which you see above.

What I recall about the exchange, incompletely (as all recall is, thanks to the graces and curses of short term memory), is that I was thinking about the imperatives of invention, and why my nature is native to Silicon Valley, which exists everywhere ideas and ambition combine and catch fire.

I fired up Searls.com in early 1995, and began publishing on it immediately. A lot of that writing is at a subdomain called Reality 2.0. Here is one piece from that early list, which I put up just days before Bill Gates famously (at the time) “declared war” on the browser market (essentially, Netscape). Interesting to look back on what happened and what didn’t. — Doc


THE WEB
AND THE NEW REALITY
By Doc Searls
December 1, 1995 



Reality 2.0

The import of the Internet is so obvious and extreme that it actually defies valuation: witness the stock market, which values Netscape so far above that company’s real assets and earnings that its P/E ratio verges on the infinite.
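The P/E point can be made concrete with arithmetic: price-to-earnings is just share price divided by per-share earnings, so as earnings shrink toward zero the ratio "verges on the infinite." A minimal sketch, using made-up numbers (not Netscape's actual 1995 financials):

```python
def pe_ratio(share_price: float, earnings_per_share: float) -> float:
    """Price-to-earnings ratio: how many dollars the market pays
    per dollar of annual earnings. Blows up as earnings approach zero."""
    return share_price / earnings_per_share

# Hold the price fixed and shrink earnings: the multiple explodes.
for eps in (5.0, 1.0, 0.10, 0.01):
    print(f"price $100, EPS ${eps:.2f} -> P/E {pe_ratio(100.0, eps):,.0f}")
```

A stock priced at $100 with $5 of earnings trades at a sober 20x; the same price on a penny of earnings is a 10,000x multiple, which is the sense in which a valuation stops measuring assets and earnings and starts measuring something else.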

Whatever we’re driving toward, it is very different from anchoring certainties that have grounded us for generations, if not for the duration of our species. It seems we are on the cusp of a new and radically different reality. Let’s call it Reality 2.0.

The label has a millennial quality, and a technical one as well. If Reality 2.0 is Reality 2.000, this month we’re in Reality 1.995.12.

With only a few revisions left before Reality 2.0 arrives, we’re in a good position to start seeing what awaits. Here are just a few of the things this writer is starting to see…

  1. As more customers come into direct contact with suppliers, markets for suppliers will change from target populations to conversations.
  2. Travel, ticket, advertising and PR agencies will all find new ways to add value, or they will be subtracted from market relationships that no longer require them.
  3. Within companies, marketing communications will change from peripheral activities to core competencies. New media will flourish on the Web, and old media will learn to live with the Web and take advantage of it.
  4. Retail space will complement cyber space. Customer and technical service will change dramatically, as 800 numbers yield to URLs and hard copy documents yield to soft copy versions of the same thing… but in browsable, searchable forms.
  5. Shipping services of all kinds will bloom. So will fulfillment services. So will ticket and entertainment sales services.
  6. The web’s search engines will become the new yellow pages for the whole world. Your fingers will still do the walking, but they won’t get stained with ink. Same goes for the white pages. Also the blue ones.
  7. The scope of the first person plural will enlarge to include the whole world. “We” may mean everybody on the globe, or any coherent group that inhabits it, regardless of location. Each of us will swing from group to group like monkeys through trees.
  8. National borders will change from barricades and toll booths into speed bumps and welcome mats.
  9. The game will be over for what teacher John Taylor Gatto labels “the narcotic we call television.” Also for the industrial relic of compulsory education. Both will be as dead as the mainframe business. In other words: still trucking, but not as the anchoring norms they used to be.
  10. Big Business will become as anachronistic as Big Government, because institutional mass will lose leverage without losing inertia. Domination will fail where partnering succeeds, simply because partners with positive sums will combine to outproduce winners and losers with zero sums.
  11. Right will make might.
  12. And might will be mighty different.

Polyopoly

The Web is the board for a new game Phil Salin called “Polyopoly.” As Phil described it, Polyopoly is the opposite of Monopoly. The idea is not to win a fight over scarce real estate, but to create a farmer’s market for the boundless fruits of the human mind.

It’s too bad Phil didn’t live to see the web become what he (before anyone, I believe) hoped to create with AMIX: “the first efficient marketplace for information.” The result of such a marketplace, Phil said, would be polyopoly.

In Monopoly, what mattered were the three Ls of real estate: “location, location and location.”

On the web, location means almost squat.

What matters on the web are the three Cs: content, connections and convenience. These are what make your home page a door the world beats a path to when it looks for the better mouse trap that only you sell. They give your webfront estate its real value.

If commercial interests have their way with the Web, we can also add a fourth C: cost. But how high can costs go in a polyopolistic economy? Not very. Because polyopoly creates…

An economy of abundance

The goods of Polyopoly and Monopoly are as different as love and lug nuts. Information is made by minds, not factories; and it tends to make itself abundant, not scarce. Moreover, scarce information tends to be worthless information.

Information may be bankable, but traditional banking, which secures and contains scarce commodities (or their numerical representations) does not respect the nature of information.

Because information abhors scarcity. It loves to reproduce, to travel, to multiply. Its natural habitats are wires and airwaves and disks and CDs and forums and books and magazines and web pages and hot links and chats over cappuccinos at Starbucks. This nature lends itself to polyopoly.

Polyopoly’s rules are hard to figure because the economy we are building with it is still new, and our vocabulary for describing it is sparse.

This is why we march into the Information Age hobbled by industrial metaphors. The “information highway” is one example. Here we use the language of freight forwarding to describe the movement of music, love, gossip, jokes, ideas and other communicable forms of knowledge that grow and change as they move from mind to mind.

We can at least say that knowledge, even in its communicable forms, is not reducible to data. Nor is the stuff we call “intellectual property.” A song and a bank account do not propagate the same ways. But we are inclined to say they do (and should), because we describe both with the same industrial terms.

All of which is why there is no more important work in this new economy than coining the new terms we use to describe it.

The Age of Enlightenment finally arrives

The best place to start looking for help is at the dawn of the Industrial Age. Because this was when the Age of Reason began. Nobody knew more about the polyopoly game — or played it — better than those champions of reason from whose thinking our modern republics are derived: Thomas Paine, Thomas Jefferson and Benjamin Franklin.

As Jon Katz says in “The Age of Paine” (Wired, May 1995), Thomas Paine was the “moral father of the Internet.” Paine said “my country is the world,” and sought as little compensation as possible for his work, because he wanted it to be inexpensive and widely read. Paine’s thinking still shapes the politics of the U.S., England and France, all of which he called home.

Thomas Jefferson wrote the first rule of Polyopoly: “He who receives an idea from me receives instruction himself without lessening mine; as he who lights his taper at mine, receives light without darkening me.”

He also left a live bomb for modern intellectual property law: “Inventions then cannot, in nature, be a subject of property.” The best look at the burning fuse is John Perry Barlow’s excellent essay “The Economy of Ideas,” in the March 1994 issue of Wired. (I see that Jon Katz repeats it in his paean to Paine. Hey, if someone puts it to song, who gets the rights?)

If Paine was the moral father of the Internet, Ben Franklin’s paternity is apparent in Silicon Valley. Today he’d fit right in, inventing hot products, surfing the Web and spreading his wit and wisdom like a Johnny Cyberseed. Hell, he even has the right haircut.

Franklin left school at 10 and was barely 15 when he ran his brother’s newspaper, writing most of its content and getting quoted all over Boston. He was a self-taught scientist and inventor while still working as a writer and publisher. He also found time to discover electricity, create the world’s first postal service, invent a heap of handy products and serve as a politician and diplomat.

Franklin’s biggest obsession was time. He scheduled and planned constantly. He even wrote his famous epitaph when he was 22, six decades before he died. “The work shall not be lost,” it reads, “for it will (as he believed) appear once more in a new and more elegant edition, revised and edited by the author.”

One feels the ghost of Franklin today, editing the web.

Time to subtract the garbage

Combine Jefferson and Franklin and you get the two magnetic poles that tug at every polyopoly player: information that only gets more abundant, and time that only gets more scarce.

As Alain Couder of Groupe Bull puts it, “we treat time as a constant in all these formulas — revolutions per minute, instructions per second — yet we experience time as something that constantly decreases.”

After all, we’re born with an unknown sum of time, and we need to spend it all before we die. The notion of “saving” it is absurd. Time can only be spent.

So: to play Polyopoly well, we need to waste as little time as possible. This is not easy in a world where the sum of information verges on the infinite.

Which is why I think Esther Dyson might be our best polyopoly player.

“There’s too much noise out there anyway,” she says in ‘Esther Dyson on DaveNet’ (12/1/94). “The new wave is not value added, it’s garbage-subtracted.”

Here’s a measure of how much garbage she subtracts from her own life: her apartment doesn’t even have a phone.

Can she play this game, or what?

So what’s left?

I wouldn’t bother to ask Esther if she watches television, or listens to the radio. I wouldn’t ask my wife, either. To her, television is exactly what Fred Allen called it forty years ago: “chewing gum for the eyes.” Ours heats up only for natural disasters and San Jose Sharks games.

Dean Landsman, a sharp media observer from the broadcast industry, tells me that John Grisham books are cutting into time that readers would otherwise spend watching television. And that’s just the beginning of a tide that will swell as every medium’s clients weigh more carefully what they do with their time.

Which is why it won’t be long before those clients wad up their television time and stick it under their computer. “Media will eat media,” Dean says.

The computer is looking a lot hungrier than the rest of the devices out there. Next to connected computing, television is AM radio.

Fasten your seat belts.

Web of the free, home of the Huns

Think of the Industrial world — the world of Big Business and Big Government — as a modern Roman Empire.

Now think of Bill Gates as Attila the Hun.

Because that’s exactly how Bill looks to the Romans who still see the web, and everything else in the world, as a monopoly board. No wonder Bill doesn’t have a senator in his pocket (as Mark Stahlman told us in ‘Off to the Slaughter House,’ DaveNet, 3/14/94).

Sadly for the Romans, their empire is inhabited almost entirely by Huns, all working away on their PCs. Most of those Huns don’t have a problem with Bill. After all, Bill does a fine job of empowering his people, and they keep electing him with their checkbooks, credit cards and purchase orders.

Which is why, when they go forth to tame the web, these tough-talking Captains of Industry and Leaders of Government look like animated mannequins in Armani Suits: clothes with no emperor. Their content is emulation. They drone about serving customers and building architectures and setting standards and being open and competing on level playing fields. But their game is still control, no matter what else they call it.

Bill may be our emperor, but ruling Huns is not the same as ruling Romans. You have to be naked as a fetus and nearly as innocent. Because polyopoly does not reward the dark tricks that used to work for industry, government and organized crime. Those tricks worked in a world where darkness had leverage, where you could fool some of the people some of the time, and that was enough.

But polyopoly is a positive-sum game. Its goods are not produced by huge industries that control the world, but by smart industries that enable the world’s inhabitants. Like the PC business that thrives on it, information grows up from individuals, not down from institutions. Its economy thrives on abundance rather than scarcity. Success goes to enablers, not controllers. And you don’t enable people by fooling them. Or by manipulating them. Or by muscling them.

In fact, you don’t even play to win. As Craig Burton of The Burton Group puts it, “the goal isn’t win/win, it’s play/play.”

This is why Bill does not “control” his Huns the way IBM controlled its Romans. Microsoft plays by winning support, where IBM won by dominating the play. Just because Microsoft now holds a controlling position does not mean that a controlling mentality got them there. What I’ve seen from IBM and Apple looks far more Monopoly-minded and controlling than anything I’ve seen from Microsoft.

Does this mean that Bill’s manners aren’t a bit Roman at times? No. Just that the support Microsoft enjoys is a lot more voluntary on the part of its customers, users and partners. It also means that Microsoft has succeeded by playing Polyopoly extremely well. When it tries to play Monopoly instead, the Huns don’t like it. Bill doesn’t need the Feds to tell him when that happens. The Huns tell him soon enough.

A market is a conversation

No matter how Roman Bill’s fantasies might become, he knows his position is hardly more substantial than a conversation. In fact, it IS a conversation.

I would bet that Microsoft is engaged in more conversations, more of the time, with more customers and partners, than any other company in the world. Like or hate their work, the company connects. I submit that this, as much as anything else, accounts for its success.

In the Industrial Age, a market was a target population. Goods rolled down a “value chain” that worked like a conveyor belt. Raw materials rolled into one end and finished products rolled out the other. Customers bought the product or didn’t, and customer feedback was limited mostly to the money they spent.

To encourage customer spending, “messages” were “targeted” at populations, through advertising, PR and other activities. The main purpose of these one-way communications was to stimulate sales. That model is obsolete. What works best today is what Normann & Ramirez (Harvard Business Review, June/July 1993) call a “value constellation” of relationships that include customers, partners, suppliers, resellers, consultants, contractors and all kinds of people.

The Web is the star field within which constellations of companies, products and markets gather themselves. And what binds them together, in each case, are conversations.

How it all adds up

What we’re creating here is a new economy — an information economy.

Behind the marble columns of big business and big government, this new economy stands in the lobby like a big black slab. The primates who work behind those columns don’t know what this thing is, but they do know it’s important and good to own. The problem is, they can’t own it. Nobody can. Because it defies the core value in all economies based on physical goods: scarcity.

Scarcity ruled the stone hearts and metal souls of every zero-sum value system that ever worked — usually by producing equal quantities of gold and gore. And for dozens of millennia, we suffered with it. If Tribe A crushed Tribe B, it was too bad for Tribe B. Victors got the spoils.

This win/lose model has been in decline for some time. Victors who used to get spoils now just get responsibilities. Cooperation and partnership are now more productive than competition and domination. Why bomb your enemy when you can get him on the phone and do business with him? Why take sides when the members of “us” and “them” constantly change?

The hard evidence is starting to come in. A recent Wharton Impact report said, “Firms which specified their objectives as ‘beating our competitors’ or ‘gaining market share’ earned substantially lower profits over the period.” We’re reading stories about women-owned businesses doing better, on the whole, because women are better at communicating and less inclined to waste energy by playing sports and war games in their marketplaces.

From the customer’s perspective, what we call “competition” is really a form of cooperation that produces abundant choices. Markets are created by addition and multiplication, not just by subtraction and division.

In my old Mac IIci, I can see chips and components from at least 11 different companies and 8 different countries. Is this evidence of war among Apple’s suppliers? Do component vendors succeed by killing each other and limiting choices for their customers? Did Apple’s engineers say, “Gee, let’s help Hitachi kill Philips on this one?” Were they cheering for one “side” or another? The answer should be obvious.

But it isn’t, for two reasons. One is that the “Dominator Model,” as anthropologist (and holocaust survivor) Riane Eisler calls it, has been around for 20,000 years, and until recently has reliably produced spoils for victors. The other is that conflict always makes great copy. To see how seductive conflict-based thinking is, try to find a hot business story that isn’t filled with sports and war metaphors. It isn’t easy.

Bound by the language of conflict, most of us still believe that free enterprise runs on competition between “sides” driven by urges to dominate, and that the interests of those “sides” are naturally opposed.

To get to the truth here, just ask this: which has produced more — the U.S. vs. Japan, or the U.S. + Japan? One produced World War II and a lot of bad news. The other produced countless marvels — from cars to consumer electronics — on which the whole world depends.

Now ask this: which has produced more — Apple vs. Microsoft or Apple + Microsoft? One profited nobody but the lawyers, and the other gave us personal computing as we know it today.

The Plus Paradigm

What brings us to Reality 2.0 is the Plus Paradigm.

The Plus Paradigm says that our world is a positive construction, and that the best games produce positive sums for everybody. It recognizes the power of information and the value of abundance. (Think about it: the best information may have the highest power to abound, and its value may vary as the inverse of its scarcity.)

Over the last several years, mostly through discussions with client companies that are struggling with changes that invalidate long-held assumptions, I have built a table of old (Reality 1.0) vs. new (Reality 2.0) paradigms. The difference between these two realities, one client remarked, is that the paradigm on the right is starting to work better than the paradigm on the left.

Paradigm                 Reality 1.0        Reality 2.0
Means to ends            Domination         Partnership
Cause of progress        Competition        Collaboration
Center of interest       Personal           Social
Concept of systems       Closed             Open
Dynamic                  Win/Lose           Play/Play
Roles                    Victor/Victim      Partner/Ally
Primary goods            Capital            Information
Source of leverage       Monopoly           Polyopoly
Organization             Hierarchy          Flexiarchy
Roles                    Victor/Victim      Server/Client
Scope of self-interest   Self/Nation        Self/World
Source of power          Might              Right
Source of value          Scarcity           Abundance
Stage of growth          Child (selfish)    Adult (social)
Reference valuables      Metal, Money       Life, Time
Purpose of boundaries    Protection         Limitation

Changes across the paradigms show up as positive “reality shifts.” The shift is from OR logic to AND logic, from Vs. to +:


Reality 1.0            Reality 2.0
Man vs nature          Man + nature
Labor vs management    Labor + management
Public vs private      Public + private
Men vs women           Men + women
Us vs them             Us + them
Majority vs minority   Majority + minority
Party vs party         Party + party
Urban vs rural         Urban + rural
Black vs white         Black + white
Business vs govt.      Business + govt.


For more about this whole way of thinking, see Bernie DeKoven’s ideas about “the ME/WE” at his “virtual playground.”

This may sound sappy, but information works like love: when you give it away, you still get to keep it. And when you give it back, it grows.

Which has always been the case. But in Reality 2.0, it should become a lot more obvious.


… I’ll be speaking about The Intention Economy at the Hyatt Regency Santa Clara, in the Winchester Ballroom, courtesy of the good people at Weber Shandwick. Here’s a link to the invite. (It’s open and free, but ya gotta RSVP.)

The book covers a lot of topics, and the one I’m going to focus on tonight is marketing. Right now the big bux in marketing are going toward Big Data, with a lesser emphasis on Big Engagement. This needs to be reversed.

What marketing needs to do now is get personal, and not just social. Marketing needs to start truly listening and interacting with customers on a personal level. Crunching numbers to improve guesswork won’t cut it anymore.

I’ve got more to say about that, but I’m saving it for tonight. Look forward to seeing you there.

Looks like IBM and I are in agreement. Last week the first image you saw at IBM’s site (at least here in the U.S.) was a larger version of the one on the left, with the headline “Meet the new Chief Executive Customer. That’s who’s driving the new science of marketing.”*

At the “learn more” link, the headline reads, “The new CMO and the science of giving people what they want.” In the copy there’s this:

In this highly connected world of commerce and communication, you can no longer market broadly to a demographic. A consumer doesn’t want to be a “segment.” She’s an individual. To capture and keep her business, she must be treated as one.

The onus of this evolution has landed on the doorstep of the Chief Marketing Officer. And that means that the mind-set, as well as the skill set, of a CMO has to evolve right along with it. IBM has identified the three mandates for the new CMO.

The first of those is “Harness data to paint a predictive picture of each customer as an individual—on a massive scale.” The second is “Create ‘systems of engagement’ so you do more than shape desire—you predict it.” The third is “Design your culture and brand so they are authentically one.”

Above that last one it says this:

Your brand is tested in every interaction. Today, the same transparency that allows you to understand each customer as an individual; conversely allows each customer to understand everything about your company. And gaps between what the brand promises and what it delivers are known―not just by those who experience them, but by others in their social network. Thus how authentically a culture lives its brand becomes the measure of success. This is the heart of becoming a social business. Marketing’s role is to close the gaps by building a system so that in every interaction brand and culture are one.

Two problems with that. Also two opportunities:

  1. Transparency isn’t what allows a company to understand each customer as an individual. Direct interaction is. Better yet, direct interaction that the customer drives, in her own way.
  2. “Becoming a social business” is very 2011. Business was personal in the first place, and it will be personal again. What the hell is a Chief Executive Customer if she doesn’t have direct personal influence with the company?

IBM is familiar with CRM: Customer Relationship Management. Now it needs to get familiar with VRM: Vendor Relationship Management. Because it’s with VRM tools and services that customers will have the means to tell companies exactly what IBM’s headline welcomes: what they want.

Meanwhile, here’s the bad news for Big Data: what customers don’t want, most of the time, is to be told constantly what they want. Or to be told that their Chief Executive status with a company derives from a “predictive picture” derived from “harnessed data” about one’s individual self — least of all “on a massive scale” in which desire is not only “shaped” but “predicted.” IBM continues,

Today’s abundance of data helps companies understand each customer in multiple dimensions. This leads to insights which, when combined, help build a clearer understanding of each customer as an individual. With that, marketers can make better decisions about the mix that will serve customers more completely—based on needs, desire, likely next action, opinions. Today’s marketing practice requires building this capability of understanding customers as individuals across millions of interactions.

There is no clearer sign that a relationship has gone bad than this statement: “We don’t need to talk. I already know what you’re going to say.” Or worse, “I can also shape your desire.” Hell, that’s a relationship headed for divorce, and it’s hardly begun.

But that’s what Big Data marketing is about — so far — and why it will fail if the customer is not truly involved as an independent and autonomous human being, and not just as a “million points of data” (IBM’s term), and then as a target for messages and offers, based on the crunching of that data.

On that same page IBM posts a short pile of Big Data stats.

Earth to IBM and CMOs: The next era isn’t social. It’s personal. No amount of marketing analytics will outperform knowing exactly what the customer wants, intends, or wishes to contribute to the company’s intelligence about the marketplace — in her own ways, and on her own terms.

If a brand wants to be fully understood and respected — and if it deserves both — it needs to be ready for customers to truly engage, and not just be told what they’re like, and then guessed at.

The means for that will be provided by both sides, not just by one. Until IBM and CMOs welcome independent customers, operating at full agency, outside any company’s silo or walled garden, all this mandating will be the sound of one hand shaking.

*The links have 404’d or changed. Here’s what I can find in March, 2016:

 


Apple TV (whatever it ends up being called) will kill cable. It will also give TV new life in a new form.

It won’t kill the cable companies, which will still carry data to your house, and which will still get a cut of the content action, somehow. But the division between cable content and other forms you pay for will be exposed for the arbitrary thing it is, in an interactive world defined by the protocols of the Internet, rather than by the protocols of television. It will also contain whatever deals Apple does for content distribution.

These deals will be motivated by a shared sense that Something Must Be Done, and by knowing that Apple will make TV look and work better than anybody else ever could. The carriers have seen this movie before, and they’d rather have a part in it than be outside of it. For a view of the latter, witness the fallen giants called Sony and Nokia. (A friend who worked with the latter called them “a tree laying on the ground,” adding “They put out leaves every year. But that doesn’t mean they’re standing up.”)

I don’t know anything about Apple’s plans. But I know a lot about Apple, as do most of us. Here are the operative facts as they now stand (or at least as I see them):

  1. Apple likes to blow up categories that are stuck. They did it with PCs, laptops, printers, mp3 players, smartphones, music distribution and retailing. To name a few.
  2. TV display today is stuck in 1993. That’s when the ATSC (which defined HDTV standards) settled on the 16:9 format, with 1080 pixels (then called “lines”) of vertical resolution, and with picture clarity and sound quality contained within the data carrying capacity of a TV channel 6MHz wide. This is why all “Full HD” screens remain stuck at 1080 pixels high, no matter how physically large those screens might be. It’s also why more and more stand-alone computer screens are now 1920 x 1080. They’re made for TV. Would Steve Jobs settle for that? No way.
  3. Want a window into the future where Apple makes a TV screen that’s prettier than all others sold? Look no farther than what Apple says about the new iPad‘s resolution:
  4. Cable, satellite and over-the-air channels are still stuck at 6MHz of bandwidth (in the original spectrum-based meaning of that word). They’re also stuck with a need to maximize the number of channels within a finite overall bandwidth. This has resulted in lowered image quality on most channels, even though the images are still, technically, “HD”. That’s another limitation that surely vexed Steve.
  5. The TV set makers (Sony, Vizio, Samsung, Panasonic, all of them) have made operating a simple thing woefully complicated, with controls (especially remotes) that defy comprehension. The set-top-box makers have all been nearly as bad for the duration. Same goes for the makers of VCR, DVD, PVR and other media players. Home audio-video system makers too. It’s a freaking mess, and has been since the ’80s.
  6. Steve at AllThingsD on 2 June 2010: “The only way that’s ever going to change is if you can really go back to square one and tear up the set-top-box and redesign it from scratch with a consistent UI, with all these different functions, and get it to the consumer in a way they are willing to pay for. We decided, what product do you want most? A better TV or a better phone? A better TV or a tablet? … The TV will lose until there is a viable go-to-market strategy. That’s the fundamental problem.” He also called Apple TV (as it then stood) a “hobby”, for that reason. But Apple is bigger now, and has far more market reach and clout. In some categories it’s nearly a monopoly already, with at least as much leverage as Microsoft ever had. And you know that Apple hasn’t been idle here.
  7. Steve Jobs was the largest stockholder in Disney. He’s gone, but the leverage isn’t. Disney owns ABC and ESPN.
  8. The main thing that keeps cable in charge of TV content is not the carriers, but ESPN, which represents up to 40% of your cable bill, whether you like sports or not. ESPN isn’t going to bypass cable — they’ve got that distribution system locked in, and vice versa. The whole pro sports system, right down to those overpaid athletes in baseball and the NBA, depends on TV revenues, which in turn rest on advertising to eyeballs over a system made to hold those eyeballs still in real time. “There are a lot of entrenched interests,” says Peter Kafka in this On the Media segment. The only thing that will de-entrench them is serious leverage from somebody who can make go-to-market, UI, quality, and money-flow work. Can Apple do that without Steve? Maybe not. But it’s still the way to bet.
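The 1993-vintage numbers in the list above can be checked with a little arithmetic. Here’s a sketch; the derived 1920-pixel width and the third-generation iPad’s 2048 × 1536 display figures are my additions, not from the post itself:

```python
# Back-of-envelope arithmetic for the resolution limits described above.

def total_pixels(width: int, height: int) -> int:
    """Total pixels on a width x height display."""
    return width * height

# The ATSC's 16:9 format at 1080 vertical lines fixes the width,
# which is why "Full HD" screens are 1920 x 1080:
full_hd_width = 1080 * 16 // 9   # -> 1920

full_hd = total_pixels(full_hd_width, 1080)
ipad = total_pixels(2048, 1536)  # the "new iPad" display, for comparison

print(full_hd_width)  # 1920
print(full_hd)        # 2073600 (~2.07 megapixels)
print(ipad)           # 3145728 (~3.15 megapixels, ~50% more than Full HD)
```

So a 2012 tablet already out-resolved any “Full HD” screen of any size, which is the point of the iPad comparison above.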

Cable folks have a term for video distribution on the Net. They call it “over the top”. Of them, that is, and their old piped content system.

That’s actually what many — perhaps most — viewers would prefer: an à la carte choice of “content” (as we have now all come to say). Clearly the end state is one in which you’ll pay for some stuff while other stuff is free. Some of it will be live, and some of it recorded. That much won’t be different. The cable companies will also still make money for keeping you plugged in. That is, you’ll pay for data in any case. You’ll just pay more for some content. Much of that content will be what we now pay for on cable: HBO, ESPN and the rest. We’ll just do away with the whole bottom/top thing because there will be no need for a bottom other than a pipe to carry the content. We might still call some sources “channels”, and surfing through those might still have a TV-like UI. But only if Apple decides to stick with the convention. Which they won’t, if they come up with a better way to organize things and make content easy to choose and pay for.

This is why the non-persuasiveness of Take My Money, HBO doesn’t matter. Not in the long run. The ghost of Steve is out there, waiting. You’ll be watching TV his way. Count on it.

We’ll still call it TV, because we’ll still have big screens by that name in our living rooms. But what we watch and listen to won’t be contained by standards set in 1993, or by carriers and other “stakeholders” who never could think outside the box.

Of course, I could be wrong. But no more wrong than the system we have now.

Bonus link.

Another.

I was interviewed for a story recently. (It’s still in the mill.) In the correspondence that followed, the reporter asked me to clarify a statement: “that the idea of selling your data is nuts.” I didn’t remember exactly what I said, so I responded,

I think what I meant was this:

1) The use value of personal data so far exceeds its sale value that it’s insane to compare the two.

Especially because …

2) There never has been a market for selling personal data, and to create one now, just because marketers are sneakily getting that data for free, doesn’t mean there should be one.

Especially because …

3) The sums paid by marketers for personal data are actually tiny on a per-person basis.

4) Selling one’s personal data amounts to marketing exposure of one’s self. It’s like stripping, only less sexy. And for a lot less money.

And added a pointer to For personal data, use value beats sale value.

In When bubbles burst…, Dave writes,

When any hamster-based startup can raise $50 million on a $1 billion market cap, there’s not much market for new ideas. Why bother, when the same-old-stuff can make you rich. But when the bubble fades, it’s time to get creative. Because tech will reboot. The question is, what’s the next wave.

I followed the link to FACEBOOK FALLOUT: Y Combinator’s Paul Graham Just Emailed Portfolio Companies Warning Of ‘Bad Times’ In Silicon Valley, in which Nicholas Carlson begins,

Facebook has flopped on the public markets, and now we have vivid evidence of how badly Silicon Valley is reeling in the fallout.
Paul Graham, cofounder of Silicon Valley’s most important startup incubator, Y Combinator, has sent an email to portfolio companies warning them “bad times” may be ahead.

He warns: “The bad performance of the Facebook IPO will hurt the funding market for earlier stage startups.

“No one knows yet how much. Possibly only a little. Possibly a lot, if it becomes a vicious circle.”

Among the comments is this one:

Adam Lavine:

One dinner with a dour VC does not a Silicon Valley liquidity crisis make.

With that said: would be nice for all of these startups to find CUSTOMERS willing to PAY for their services. The fact that startups have “easy money built into their models” is an obvious bubble sign in itself.

To which I replied,

@Adam Lavine:

Exactly. The tightening of VC sphincters is an issue, but it’s a lesser one than the paucity of VC-funded business models that make companies accountable to users as customers.

Facebook, Google and Twitter have consumers and customers that are different populations. Users are the consumers, and advertisers are the customers. This does work as a business: for commercial broadcasting it has worked for the duration. But it works at the cost of having minimized accountability to the millions of individuals who use the service but pay nothing for it. Ever tried to get personal service from Google or Facebook? Good luck with that.

Our dependency on Google alone today verges on the absolute. Facebook envies Google’s user containment systems (e.g. Gmail, Google Docs) and wants to operate on the same scale. But neither company is financially accountable to its users (only to its advertisers and stockholders), neither has worthy competitors, and that’s not good for the markets they contain either.

The whole ad-supported commercial Web we have today is a collection of monocultural silos, each of which is a bubble in itself. (Think of every giant silo as a single point of failure and therefore a giant bubble.)

Another angle: every company deals with two markets — one for its goods and services, and one for itself. In Silicon Valley the latter has overrun the former, time and again. Now is no exception.

Bonus link: http://www.linuxjournal.com/magazine/eof-google-exposure

Just wanted to share that here, and not just there.

Markets are conversations, they say. So yesterday I had one with MRoth, head of product for Bitly, the company whose service changes the other day caused a roar of negative buzz, including some from me, here.

Users were baffled by complexities where simplicities used to be. Roger Ebert lamented an “incomprehensible and catastrophic redesign” and explained in his next tweet, “I want to shorten a link, tweet it, and see how many hits and retweets it got. That’s it. Bit.ly now makes it an ordeal.”

That was my complaint as well. And it was heard. A friend with Bitly connections made one between MRoth and me, and good conversation followed for an hour.

We spent much of that time going over work flows. Turns out Roger’s and mine are not the only kind Bitly enables, or cares about, and that’s a challenge for the company. Compiling, curating and sharing bookmarks (which they now call “bitmarks”) is as important for some users as simply shortening URLs is for others. Bitly combined the two in this re-design, and obviously ran into problems. They are now working hard to solve those.

I won’t go into the particulars MRoth shared, because I didn’t take notes and don’t remember them well enough in any case. What matters is that it’s clear to me that Bitly is reaching out, listening, and doing their best to follow up with changes. “Always make new mistakes,” Esthr says, and they’re making them as fast and well as they can.

I will share something I suggested, however, and that’s to look at the work flows around writing, and not just tweeting and other forms of “social” sharing.

We need more and better tools for writing linky text on the Web. Much as I like and appreciate what WordPress and Drupal do, I’m not fond of either as writing systems, mostly because “content management” isn’t writing, and those are content management systems first, and writing systems second.

As an art and a practice, writing is no less a product of its instruments than are music and painting. We not only need pianos, drums and brushes, but Steinways, Ludwigs and Langnickels. Microsoft doesn’t cut it. (Word produces horrible HTML.) Adobe had a good early Web writing tool with GoLive, but killed it in favor of Dreamweaver, which is awful. There are plenty of fine text editors, including old standbys (e.g. vi and emacs) that work in command shells. Geeky wizards can do wonders with them, but there should be many other instruments for many other kinds of artists.

Far as I know, the only writer and programmer working on a portfolio of writing and publishing instruments today is Dave Winer, and he’s been on the case for thirty years or more. (I believe I first met Dave at the booth at Comdex in Atlanta in 1982, when the program was available only on the Apple II.) One of these days, months or years, writers and publishers are going to appreciate Dave’s pioneering work with outlining, sharing links, flowing news and other arts. I’m sure they do to some extent today (where would we all be without RSS?), but what they see is exceeded by what they don’t. Yet.

The older I get, the earlier it seems. For artist-grade writing and publishing tools, it’s clear to me that we’re at the low narrow left end of the adoption curve: not far past the beginning. That spells opportunity for lots of new development projects and companies, including Bitly.

I think the main thing standing in everybody’s way right now is the belief that writing and publishing need to be “social,” as defined by Facebook and Twitter, rather than by society as a whole, which was plenty social before those companies came along. Also plenty personal. Remember personal computing? We hardly talk about that any more, because it’s a redundancy, like personal phoning, or personal texting. But personal, as an adjective, has taken a back seat while social drives.

Here’s a distinction that might help us get back in the driver’s seat: Publishing is social, but writing is personal. The latter is no less a greenfield today than it was in 1982. The difference is that it’s now as big as the Net.
