Jane and I are here with a great group of presenters and attendees at a conference at Loyola University Chicago School of Law, Privacy in a Data Collection Society. I’m speaking this afternoon on the folly of information sharing as a means of improving cybersecurity, and I’ll post a cleaned-up draft of my remarks here (which I hope will eventually become an essay). And I’ll try to post some ad hoc updates on what the speakers have to say.
Update 1: Here is Jane’s abstract:
All Life Is an Experiment. (Sometimes It’s a Controlled Experiment.)
What the Facebook Emotion Contagion Study Can Teach Us About the Policy and Public Perception of Research
Thesis: Our unexamined instincts about social science research lead us to craft laws and public opinions that are backwards. Our disapprobation and legal restrictions apply most strongly to research that is performed by academics and other neutral investigators, that is more methodologically sound, that distributes its burdens more evenhandedly, and that shares its insights with the general public.
Update 2: Meg Jones, on the Right to be Forgotten
- Google v. Spain – Spanish newspaper had a right to process information on Costeja González, but Google did not.
- Google assesses individuals’ claims under national law
- Lauber / Werle v. Wikipedia – brothers convicted of murdering an actor; they sought to have references to themselves removed from Web sites covering the crime.
- Martin v. Hearst – Connecticut erasure statute nullified Martin’s arrest. She sued the newspaper for publishing about her arrest. Second Circuit: the newspaper’s truth is different from her truth.
- Clash of values between Europe and U.S. over forgetting
- [shows clip of Phineas and Ferb “Cyberspace Rules of the Road”]
- Link rot and other ways that information disappears
- Digital immortality? Internet is not the perfect memory we’re afraid of
- Poll on whether Americans ought to have the right to remove irrelevant information from search results (39% Yes; 21% No, too hard to define; 18% No, it’s a public record; 15% Yes, but only for minors; 6% Yes, except for public figures)
Update 3: Felix Wu, How EU Right to be Forgotten Relates to US Law
- Conventional wisdom: EU approach is crazy and would never work in US
- Felix: less incompatible than we think, and the incompatibility is different from what is commonly believed
- US does have areas where information is removed: Fair Credit Reporting Act (bankruptcies drop off after 10 years)
- Key is sectoral vs. over-arching approach
- We would be surprised to see the US adopt, as its first omnibus right, a right to be forgotten
- Why not adopt a sector-specific RtbF?
- HIPAA – already specifies certain sensitive information where access is restricted (though HIPAA applies only to covered entities)
- How to think about Google in this context? Is it a new sort of credit report?
- Credit report is defined, in part, by use – Google is used for commercial and non-commercial purposes
- Removal in certain contexts as intermediate step
- Mention of data obscurity as term rather than RtbF! Hailing Woody Hartzog!
- How do we know about periods of data retention by companies?
Update 4: Jane Bambauer, All Life Is an Experiment
- Using the Facebook emotional contagion study as a vehicle for examining our instincts and laws about research
- Reactions most harsh when research most legitimate – we criticize academics far more than industry
- Sanctions are strongest when study authors disclose results to public
- Facebook’s alteration of the emotional content in users’ News Feeds affected the emotional tone of those users’ own postings
- Why did this experiment engender controversy, rather than “poke to vote,” for example?
- Objections to ethics of research: lack of informed consent, surreptitious intervention, violation of Common Rule
- FB study undoubtedly violates FIPPs (respect for context)
- God punishes King David with plague for taking a census – only God is to know that information
- Good research requires repurposing data – Google has identified unreported side effects of drugs this way
- Piketty repurposed tax data for his book on wealth distribution
- Surreptitious manipulation of Newsfeed
- Standard part of metrics-driven research
- Bricks-and-mortar retail observes traffic to optimize shelf display
- Individual physicians may select among equally effective treatment options for each patient – may be useful to formalize the experiment since it has better controls
- Sunstein’s “50 Shades of Manipulation” – promotes self interest of manipulator, and designed to bypass cognitive reasoning
- Downstream use of research can fit within this definition, but the research itself does not – it’s a cost to the company, and the company does not know if it bypasses reasoning
- How do we know status quo is preferable?
- Research is less self-serving when it’s shared publicly
- Researchers at Cornell were the ones who took the real hit, but Cornell’s IRB says the study is in compliance
- Even if their research were not exempt from IRB review, it would have qualified for expedited review and a waiver of informed consent
- The most legally exposed people were the researchers, not FB or the journal
- Problematic outcomes
- Companies are at a disadvantage when they work with neutral / academic researchers
- Firms benefit when they avoid formally testing hypotheses and assumptions using randomized controlled trials
- It’s safer to avoid sharing results with media / public
- Sensible to reform Common Rule
- Require IRB review when intervention would create physical or legal risk if performed for non-research purposes
Update 5: Brett Frischmann, Being Human in the Twenty-First Century: How Social and Technological Tools Are Reshaping Humanity
- Machines and technologies steer us in ways that make us increasingly predictable and manipulable, and ultimately less human
- Post-WWII: concerns about computers overtaking humans – Turing test as exemplar
- We want to be humans who use computers, not humans who are computers
- When does technology replace or diminish our humanity? Can we detect it?
- Hard definitional baseline – what is human?
- Interconnected sensor networks, Internet of Things, Big Data will expand scale / scope of human engineering – ubiquity is key
- Technology / humanity are abstract and complex
- 3 parts to project
- Humans and tools – technological dehumanization
- Human-focused Turing-type tests
- Applications (critique of nudging) – each incremental nudge can be justified, but path of nudging itself may be unjustifiable
- Focus is techno-social engineering of humans: influence, manipulate, construct
- Internet has transformed environments within which we live our lives
- Demand for Big Data is dependent upon sensors on / around humans
- IBM’s Watson as an example of technology approaching Turing line
- Brett is interested in whether humans are approaching Turing line – conditions under which they’re indistinguishable from a machine
- What happens if human passes test and appears machine? Consequences?
- On-line contracting: designed to nudge you to click “I Agree”
Update 6: Deven Desai, Associational Freedom and Data Hoarding
- FBI has a stated preference for using a warrant for GPS tracking
- Concern for associational freedom and interplay with Fourth Amendment
- Freedom to develop ideas before speaking – vital to self-governance
- Sedition Act criminalized speech and assembly separately
- Meet-ups and activists are current incarnations of assembly concerns – fear of backward-looking surveillance
- Protect precursors to speech
- Bugging in public places undercuts associational freedom
- Digital data can be hoarded, and lack of rules on law enforcement use leads inexorably to accumulation
- Key limits
- Apply limits retrospectively as well for searches in data troves
- Return – government must return or delete data
Update 7: Helen Nissenbaum, Big Data’s End Run Around Informed Consent
- Full title of paper: Big Data’s End Run Around Anonymity and Consent
- Big Data: epistemological paradigm – faith in power of data to produce knowledge
- Ethics of big data – what happens when the data is about individuals?
- Anonymity breaks link between data and identifiable individual
- Thesis: big data poses insurmountable challenges to anonymity and consent, rendering them ineffective in the quest for privacy
- Notice & consent enshrined in U.S. privacy regulation (FIPPS, GLBA, FERPA, VPPA, GLB, notice and opt-out requirements)
- Require consent from subjects if one deviates from substantive rules
- Notice and choice regime of ToS online
- GLBA gives you very little chance to opt out
- Critiques of notice and consent both as a theoretical matter and in its operational challenges
- Challenges to N&C increasing
- More actors, information, flow
- Impossible to predict future uses or consequences
- Transparency dilemma: impossible to have a policy that is both comprehensible and comprehensive
- Public lives of others: inferences based on network analysis, social networks, and representative samples
- Informed consent may have to be abandoned, which is acceptable because informed consent is a means rather than an end (which is privacy)
- Privacy as control over information is wrong definitional approach
- Instead, privacy as contextual integrity
- Ideal informational norms: settle competing interests / preferences / desires best; promote ethical and political values; promote context-specific ends and values for social integrity
- Patient consent operates as permission for limited departures from standards / expectations
- Key role of background assumptions and societal constraints
- Privacy policies should wane in importance, and societal limitations should wax, as constraints on information flow