regulation

When some big outfit with a vested interest in violating your privacy says they are only trying to save small business, grab your wallet. Because the game they’re playing is misdirection away from what they really want.

The most recent case in point is Facebook, which ironically holds the world’s largest database on individual human interests while also failing to understand jack shit about personal boundaries.

This became clear when Facebook placed the ad above and others like it in major publications recently, and mostly made bad news for itself. We saw the same kind of thing in early 2014, when the IAB ran a similar campaign against Mozilla, using ads like this:

That one was to oppose Mozilla’s decision to turn on Do Not Track by default in its Firefox browser. Never mind that Do Not Track was never more than a polite request to websites not to infect one’s browser with a beacon, like those worn by marked animals, so one can be tracked away from the website. Had the advertising industry and its dependents in publishing simply listened to that signal, and respected it, we might never have had the GDPR or the CCPA, both of which are still failing at the same mission. (But, credit where due: the GDPR and the CCPA have at least forced websites whose lawyers are scared of violating the letter—but never the spirit—of those and other privacy laws to put up insincere and misleading opt-out popovers.)

The IAB succeeded in its campaign against Mozilla and Do Not Track; but the victory was Pyrrhic, because users installed ad blockers instead, a practice that by 2015 amounted to the largest boycott in human history. Plus a raft of privacy laws, with more in the pipeline.

We also got Apple on our side. That’s good, but not good enough.

What we need are working tools of our own. Examples: Global Privacy Control (and all the browsers and add-ons mentioned there), Customer Commons’ #NoStalking term, the IEEE’s P7012 – Standard for Machine Readable Personal Privacy Terms, and other approaches to solving business problems from our side—rather than always from the corporate one.
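Both Do Not Track and Global Privacy Control are, at bottom, just HTTP request headers that a site could choose to respect: `DNT: 1` and `Sec-GPC: 1`. As a rough, hypothetical sketch (the function name and structure are mine, not taken from any of the projects above), here is how a server might read those signals:

```python
# A minimal sketch of how server-side code could honor browser privacy
# signals. Do Not Track arrives as the "DNT" request header; Global
# Privacy Control arrives as "Sec-GPC". Both carry "1" when the user
# has opted out of tracking.

def user_opted_out(headers: dict) -> bool:
    """Return True if the request carries a DNT or GPC opt-out signal."""
    # Header names are case-insensitive, so normalize before checking.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("dnt") == "1" or normalized.get("sec-gpc") == "1"

# Example: a GPC-enabled browser's request headers
print(user_opted_out({"Sec-GPC": "1", "User-Agent": "ExampleBrowser"}))  # → True
print(user_opted_out({"DNT": "0"}))  # → False
```

The whole point of both signals is that honoring them takes this little code. The hard part was never technical.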

In those movies, we’ll win.

Because if only Apple wins, we still lose.

Dammit, it’s still about what The Cluetrain Manifesto said in the first place, in this “one clue” published almost 21 years ago:

we are not seats or eyeballs or end users or consumers.
we are human beings — and our reach exceeds your grasp.
deal with it.

We have to make them deal. All of them. Not just Apple. We need code, protocols and standards, and not just regulations.

All the projects linked to above can use some help, plus others I’ll list here too if you write to me with them. (Comments here only work for Harvard email addresses, alas. I’m doc at searls dot com.)

In New Digital Realities; New Oversight Solutions, Tom Wheeler, Phil Verveer and Gene Kimmelman suggest that “the problems in dealing with digital platform companies” strip the gears of antitrust and other industrial era regulatory machines, and that what we need instead is “a new approach to regulation that replaces industrial era regulation with a new more agile regulatory model better suited for the dynamism of the digital era.” For that they suggest “a new Digital Platform Agency should be created with a new, agile approach to oversight built on risk management rather than micromanagement.” They provide lots of good reasons for this, which you can read in depth here.

I’m on a list where this is being argued. One of those participating is Richard Shockey, who often cites his eponymous law, which says, “The answer is money. What is the question?” I bring that up as background for my own post on the list, which I’ll share here:

The Digital Platform Agency proposal seems to obey a law like Shockey’s that instead says, “The answer is policy. What is the question?”

I think it will help, before we apply that law, to look at modern platforms as something newer than new. Nascent. Larval. Embryonic. Primitive. Epiphenomenal.

It’s not hard to think of them that way if we take a long view on digital life.

Start with this question: is digital tech ever going away?

Whether yes or no, how long will digital tech be with us, mothering boundless inventions and necessities? Centuries? Millennia?

And how long have we had it so far? A few decades? Hell, Facebook and Twitter have only been with us since the late ’00s.

So why lock in, right now, regulation of what can be done with those companies from now on?

I mean, what if platforms are just castles—headquarters of modern duchies and principalities?

Remember when we thought IBM, AT&T and the PTTs in Europe would own and run the world forever?

Remember when the BUNCH was around, and we called IBM “the environment?” Remember EBCDIC?

Remember when Microsoft ruled the world, and we thought they had to be broken up?

Remember when Kodak owned photography, and thought their enemy was Fuji?

Remember when recorded music had to be played by rolls of paper, lengths of tape, or on spinning discs and disks?

Remember when “social media” was a thing, and all the world’s gossip happened on Facebook and Twitter?

Then consider the possibility that all the dominant platforms of today are mortally vulnerable to obsolescence, to collapse under their own weight, or both.

Nay, the certainty.

Every now is a future then; every “is” is a “was.” And trees don’t grow to the sky.

It’s an easy bet that every platform today is as sure to be succeeded as were stone tablets by paper, scribes by movable type, letterpress by offset, and all of it by xerography, ink jet, laser printing and whatever comes next.

Sure, we do need regulation. But we also need faith in the mortality of every technology that dominates the world at any moment in history, and in the march of progress and obsolescence.

Another thought: if the only answer is policy, the problem is the question.

This suggests yet another law (really an aphorism, but whatever): “The answer is obsolescence. What is the question?”

As it happens, I wrote about Facebook’s odds for obsolescence two years ago here. An excerpt:

How easy do you think it is for Facebook to change: to respond positively to market and regulatory pressures?

Consider this possibility: it can’t.

One reason is structural. Facebook is composed of many data centers, each the size of a Walmart or a few of them, scattered around the world and costing many $billions to build and maintain. Those data centers maintain a vast and closed habitat where more than two billion human beings share all kinds of revealing personal shit about themselves and each other, while providing countless ways for anybody on Earth, at any budget level, to micro-target ads at highly characterized human targets, using up to millions of different combinations of targeting characteristics (including ones provided by parties outside Facebook, such as Cambridge Analytica, which have deep psychological profiles of millions of Facebook members). Hey, what could go wrong?

In three words, the whole thing.

The other reason is operational. We can see that in how Facebook has handed fixing what’s wrong with it over to thousands of human beings, all hired to do what The Wall Street Journal calls “The Worst Job in Technology: Staring at Human Depravity to Keep It Off Facebook.” Note that this is not the job of robots, AI, ML or any of the other forms of computing magic you’d like to think Facebook would be good at. Alas, even Facebook is still a long way from teaching machines to know what’s unconscionable. And can’t in the long run, because machines don’t have a conscience, much less an able one.

You know Goethe’s (or hell, Disney’s) story of The Sorcerer’s Apprentice? Look it up. It’ll help. Because Mark Zuckerberg is both the sorcerer and the apprentice in the Facebook version of the story. Worse, Zuck doesn’t have the mastery level of either one.

Nobody, not even Zuck, has enough power to control the evil spirits released by giant machines designed to violate personal privacy, produce echo chambers beyond counting, amplify tribal prejudices (including genocidal ones) and produce many $billions for the company in an advertising business that depends on all of that—let alone to correct, while those machines do what they were designed to do, the massively complex and settled infrastructural systems that make all of it work.

I’m not saying regulators should do nothing. I am saying that gravity still works, the mighty still fall, and these are facts of nature it will help regulators to take into account.