In a provocative OuiShareFest talk titled You Are the Product, Aral Balkan says this:
I think we are at the point where we have to ask ourselves the very uncomfortable question: What do we call the business of selling everything else about you, that makes you who you are, apart from your physical body? And why, if this is our business, is it not regulated?
While I think regulations too often protect yesterday from last Thursday, I'm in sympathy with Aral on this one. I've been working for years on simple means to signal, for example, whether or not we wish to be tracked when we leave a website, but I'm not sure those signals will be respected unless backed by the force of law.
But my mind is open about it.
So there are two questions on the floor here:
- What do we call the unwanted harvesting of personal data (our digital body parts) online?
- What policies, if any, would we recommend to back the expressed wishes of people not to be followed when they are online?
Thanks in advance.
Sorry, but I really don’t like the way this blog post tries to glom on to the morally repugnant Planned Parenthood story.
Digisection. Which Berkman’s lobbying client Google is expert at.
I understand what you are getting at here but I take issue with the slavery comparison. Here’s why:
The slavery depicted in the image at the top of your post involved the actual capture, moving, and purchasing of physical bodies. It happened for over a century and the repercussions are still being felt, namely in the same kind of selling and sharing of information that determines access to credit and means of subsistence.
In answering Aral’s important question I think we are best served by not making comparisons to historical slavery but instead historically situating the winners and losers of the information economy as a continuation of the same sorts of domination. We can draw a pretty clear line from slavery, to Jim Crow, to the new Jim Crow that is built upon the deft and precise withholding of intangibles like services and credit, enabled in part by the surveillance of bodies.
I like the general approach Zeynep Tufekci takes in her work, and The Black Box Society by Frank Pasquale. Both, in the most general terms, say that these are categorically more powerful and precise technologies that allow strong actors to keep doing what they’ve always done: maintaining their own power and inequality.
Some candidate names: info reaping; data sowing; beware the bearers of ‘free’; spendy via privacy; Skynet (the ‘net’ here being a fishing net designed to catch your personal info); privacy decimation (it’s cool, you still have 90% of it, for now). Skynetting is a term many Googlers and other Silicon Valley types actually use (at least in private).
Legislation can of course help with respecting people’s wishes. Engaging the folks who work at these monolithic companies helps too. Many of them want privacy respected as well (many employees have ‘confirmed’ what were once theories). They ‘do’ have info that can help make this better, and they are willing to share some of that inside info.
Some companies ‘are’ bad actors; those can be ignored while real privacy concerns are worked out with mixed-bag companies like Google, MS, etc. /me is not saying they are trustworthy as companies, just that the world is shades of gray at times. Black-and-white treatments are best left for non-debatable things like child abuse, actual Godwin issues, etc. We can force the privacy issue by leaving a place at the table for some of these large corporations to be actually involved in this talk. I’m not foolish; I don’t trust them. But if we as consumers demand these issues be talked about, we can decide who gets to sit at the “grownup table.” Companies will try to make their job as easy as possible, and we let them at times. There are many examples I can point to of effecting change when a solid (fact-filled) effort is made.
(It’s really appreciated that you removed the pic from the original post; learning is a lifelong process, and sometimes a quick response helps promote our message.) Your questions are legit, and the manner in which you ask them is a great addition to the conversation about identity online. 🙂
There’s a positive feedback loop in that the regulatory default position on privacy is opt-out.
- Drones carrying Stingray cell phone access points are assumed legal until explicitly banned.
- Wall-penetrating FLIR and RADAR to spy on you inside your home are assumed legal until explicitly banned.
- Automated bulk license-plate readers are assumed legal until explicitly banned.
- Use of EZ-Pass to monitor in-city traffic is assumed legal until explicitly banned.
- AdTech and spyware are assumed legal until explicitly banned, even when they infect your device, use up your bandwidth, consume all your storage, and eat up your battery.
These things accrue power in an uneven distribution that grossly favors those already holding a power advantage – vendors, governments, mass media, etc. Those most in position to change this have the least possible incentive to do so and each passing day it becomes exponentially harder to change.
So, yeah, signals won’t be respected without the force of law, but there’s no chance of getting those laws passed as long as laws are for sale and those with sufficient funds to purchase them have no incentive to do so.
THEFT! in the most optimistic sense. No amount of disclaimers and third-party nonsense can paint this as anything other than THEFT.
2. Make it illegal, with fines starting at $10,000.00 per occurrence.
No quibbling, no half measures. No selling of data for free tickets or cat videos.
Designers who write this code or put trackers in Flash get banned from the net FOREVER.
Website owners who use these operators get their domains taken away. FOREVER.
No, I don’t want a vendor relationship, or a swap market.
Folks got shit to sell? Put up a website and make their case.
Leave us the fuck alone.
I will let you know if I am interested in parakeet diapers, or pet rocks.
“What policies, if any, would we recommend to back the expressed wishes of people not to be followed when they are online?”
I’m sure you noticed EFF.org’s recent release of a perhaps better set of do-not-track rules than the W3C standard proposes. But the key verb in your question is “back”: how do we back a DNT request or commitment? That is, what gives it the needed teeth?
What about a community blacklist of sites that do not honor DNT requests, using a reputational scoring system? Sort of “This URL publishes material that ignores DNT, use it at your own risk.” Little browser widgets like Disconnect and Privacy Badger are becoming popular enough to give me new hope for D-I-Y privacy enhancement strategies.
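For concreteness: a DNT signal is just an HTTP request header (`DNT: 1`) that the browser sends with each request. Here is a minimal sketch, in Python, of how a site that chose to honor it might check for the header before loading trackers. The function name and the plain-dict headers are hypothetical, not from any standard or framework; this only illustrates how simple the signal itself is — the hard part, as noted above, is getting sites to act on it.

```python
def visitor_wants_no_tracking(headers):
    """Return True if the client sent the explicit DNT: 1 opt-out signal.

    `headers` is a plain dict of HTTP request headers, as most web
    frameworks expose them. Only the exact value "1" counts as an
    opt-out; an absent or unset header means no preference expressed.
    """
    return headers.get("DNT", "").strip() == "1"


# Hypothetical use inside a request handler:
# if visitor_wants_no_tracking(request.headers):
#     skip_third_party_trackers()
```

Reputation systems like the blacklist suggested above would then score sites on whether their behavior matches this check, which is exactly what tools like Privacy Badger approximate by watching for tracking behavior directly.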