
IP addresses are PII


In a 2007 letter to the FTC, several privacy organizations, including the Electronic Frontier Foundation, Center for Democracy and Technology, and World Privacy Forum, suggested adopting definitions of common online privacy terms. Their proposed definition of “personally identifiable information” is different from that of most search engines. According to the privacy organizations, PII includes not only names, addresses, and social security numbers, but also IP addresses and “unique or non-unique identifying elements associated with a particular individual.” Information counts as PII if it can “permit a set of behaviors or actions to be consistently associated with a particular individual or computer user,” even if the individual is not identified by name or in any other way.

Additionally, the letter defined non-personally identifiable information as “aggregated data not associated with any individual or any individual identifier,” and sensitive data as PII concerning health, finances, sexual orientation, social security numbers, insurance numbers, or government-issued ID numbers. Behavioral tracking was defined as “the practice of collecting and compiling a record of individual consumers’ activities, interests, preferences, and/or communications over time,” and behavioral targeting as “using behavioral tracking to serve advertisements and/or otherwise market to a consumer based on his or her behavioral record.”

I agree with these definitions. Even though an IP address may not be PII for a search engine alone, it certainly is for an Internet service provider. Any record of an individual’s behavior allows that person to be profiled and targeted, regardless of whether the record is tied to a name, postal address, or social security number. I also agree that it is important for companies, consumers, and the government to agree on the definitions of commonly used terms such as these; otherwise, consumers cannot be fully informed of websites’ privacy practices.
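
To make the argument concrete, here is a minimal sketch (in Python, over an invented log format) of why an IP address works as an identifier: grouping raw request logs by IP is enough to associate a consistent set of behaviors with one user or household, even though no name appears anywhere in the data.

```python
from collections import defaultdict

# Hypothetical access-log entries, keyed only by IP address.
# There are no names, email addresses, or account numbers anywhere.
log_entries = [
    ("203.0.113.7", "/search?q=diabetes+symptoms"),
    ("203.0.113.7", "/search?q=cheap+insulin"),
    ("198.51.100.4", "/search?q=flights+to+boston"),
    ("203.0.113.7", "/search?q=job+openings+near+boston"),
]

# Grouping by IP address yields, per address, exactly the kind of record
# the letter describes: a set of behaviors "consistently associated with
# a particular individual or computer user."
profiles = defaultdict(list)
for ip, path in log_entries:
    profiles[ip].append(path)

for ip, paths in profiles.items():
    print(ip, "->", paths)
```

An ISP, which can tie the address back to a subscriber account, or an ad network, which sees the same address across many sites, can build and act on such a profile without ever learning a name, which is why the letter treats the address itself as PII.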

Some other principles proposed in the letter include:

  1. Websites cannot help themselves to data from users’ computers and should respect users’ choices to delete cookies by not continuing to set new cookies each time a user visits the site (a sketch of what honoring this might look like follows this list).
  2. Websites shouldn’t bury important information in long, confusing privacy policies.
  3. If a website puts software on a user’s computer that the user does not want, the user should be able to delete the software. 
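
The first principle is easy to picture in code. Below is a hypothetical server-side sketch only; the opt-out cookie, the tracking_id cookie, and the consent flag are my own illustrative names (in the spirit of NAI-style opt-out cookies), not anything prescribed by the letter. If the user has opted out, or has cleared the tracking cookie and not consented again, no new tracking cookie is issued.

```python
from http.cookies import SimpleCookie
import uuid

def tracking_headers(cookie_header: str, user_consented: bool) -> dict:
    """Decide whether to issue a tracking cookie for one request.

    cookie_header is the raw Cookie header sent by the browser;
    user_consented is a hypothetical flag the site sets only after
    the visitor explicitly agrees to be tracked.
    """
    cookies = SimpleCookie(cookie_header)

    # Honor an explicit opt-out cookie unconditionally.
    if "optout" in cookies and cookies["optout"].value == "1":
        return {}

    # No tracking ID means the visitor either never had one or deleted it;
    # under the letter's first principle the site should not quietly assign
    # a new one without fresh consent.
    if "tracking_id" not in cookies:
        if not user_consented:
            return {}
        return {"Set-Cookie": f"tracking_id={uuid.uuid4()}"}

    # An existing tracking ID is simply left alone.
    return {}

print(tracking_headers("optout=1", user_consented=True))   # {} - opt-out wins
print(tracking_headers("", user_consented=False))          # {} - no silent re-tracking
print(tracking_headers("", user_consented=True))           # new Set-Cookie header
```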

The letter also proposed steps the government should take to ensure these principles are followed:

  1. Create a Do Not Track List similar to the Do-Not-Call Registry. Sites that conduct behavioral tracking would have to submit their domain names to the FTC; the FTC would have to educate the public about the Do Not Track List and make it possible to sign up on its website; and browsers would have to let users download and update the List and block tracking by listed sites, in accordance with the preferences users have expressed (a browser-side sketch of this check follows the list).
  2. Require companies that conduct behavioral tracking to provide users with access to the data held about them.
  3. Make it possible for the FTC to easily check up on companies to make sure they are complying with all regulations.
  4. Establish a national Online Consumer Protection Advisory Committee made up of state Attorneys General and representatives from various privacy and consumer organizations to investigate new methods of tracking and develop new laws as necessary to make sure privacy rights are protected. 
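
The mechanics behind the first of these steps are worth sketching. The following is a hypothetical, browser-side illustration only; the domain names and the local copy of the FTC’s list are invented for the example. Once a user has registered a do-not-track preference, the browser checks each outgoing request against the list of tracking domains and suppresses requests to them.

```python
from urllib.parse import urlparse

# Hypothetical local copy of the FTC's Do Not Track List: the domain
# names that sites conducting behavioral tracking would have to submit.
DO_NOT_TRACK_LIST = {"tracker.example", "ads.example"}

USER_OPTED_OUT = True  # the preference the user registers via the FTC site

def allow_request(url: str) -> bool:
    """Return False if the browser should suppress this request because
    its host (or a parent domain) appears on the list."""
    if not USER_OPTED_OUT:
        return True
    host = urlparse(url).hostname or ""
    return not any(
        host == domain or host.endswith("." + domain)
        for domain in DO_NOT_TRACK_LIST
    )

print(allow_request("https://news.example/article"))        # True
print(allow_request("https://ads.example/pixel.gif"))       # False
print(allow_request("https://cdn.tracker.example/tag.js"))  # False
```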

Source: 

1. Consumer Rights and Protections in the Behavioral Advertising Sector. 2007. <http://www.worldprivacyforum.org/pdf/ConsumerProtections_FTC_ConsensusDoc_Final_s.pdf>.

Self-regulation isn’t enough (part 2)


The report that I cited in my previous post made an interesting comparison between Internet marketing and telemarketing. Before the FTC created the Telemarketing Do-Not-Call Registry, the telemarketing industry made attempts at self-regulation. The system developed by the Direct Marketing Association (DMA), a group of marketing companies, was called the Telephone Preference Service (TPS).  In order to opt out of annoying telemarketing calls, people had to either write a letter or pay a fee online and share their credit card number. The TPS applied only to telemarketing companies that were members of the DMA. It is difficult to tell by its name exactly what the TPS is, and the DMA made virtually no efforts to publicize its existence.

On the other hand, the FTC’s solution, the Do-Not-Call Registry, has a descriptive, memorable name and URL (donotcall.gov) and was widely publicized. Enrollment is free by Internet, mail, or phone, and the Registry applies to far more telemarketers than the TPS did.

In 17 years, fewer than 5 million people signed up with the TPS. By comparison, 10 million people signed up with the Do-Not-Call Registry on its first day of operation. By 2005, 60 million people had registered. Clearly, government regulation is feasible and can be done in a way that does not violate the rights of those who value such marketing: people who want to receive telemarketing calls are free to do so, while the rest of us are free from unwanted intrusions during dinner. Just as the government has acted to protect consumers’ privacy from telemarketers, it could adopt similar measures for Internet privacy. Behavioral tracking has become so prevalent and extensive that government action may be the only solution that works.

Source:

1. Hoofnagle, Chris. “Privacy Self-Regulation: A Decade of Disappointment.” EPIC. 4 Mar. 2005. 4 Jan. 2008 <http://epic.org/reports/decadedisappoint.html>.

Self-regulation isn’t enough


Because of the lack of legal protections for online privacy, the only restrictions that govern companies’ behavioral tracking are ones that the companies create themselves. So far, self-regulation has not resulted in enough privacy protection for consumers.

Way back in 1997, the FTC considered consumer privacy in electronic payment systems and opted for a wait-and-see approach: the “federal government should wait and see whether private industry solutions adequately respond to consumer concerns about privacy … that arise with the growth of electronic payment systems, and then step in to regulate only if those efforts — be they market-created responses, voluntary self-regulation or technological fixes, or some combination of these — are inadequate.” (1) More than ten years later, there are still no common, easy-to-use online payment systems that preserve anonymity and privacy.

In 1999, the FTC and U.S. Department of Commerce announced the creation of the Network Advertising Initiative (NAI) in response to a federal investigation into DoubleClick’s plan to combine its online profiles with large quantities of personal data from commercial data broker Abacus Direct. About a year later, the NAI announced a set of principles for self-regulation, calling for notice of websites’ privacy practices, some ability to opt out, and “reasonable” security of data. (1) There was no real means of enforcement, however, and companies that were members of the NAI were allowed to transfer data among themselves without restriction, as long as the data was only used for advertising. Furthermore, the principles applied only to members of the NAI, and eventually membership dwindled to only two companies – DoubleClick and Atlas DMT. (1)

Another attempt at self-regulation was the Individual Reference Services Group (IRSG) principles, developed by a group of data brokers – companies that sell people’s personal information to advertisers, insurers, landlords, private investigators, and the government. (1) Members of the IRSG were allowed to sell almost any personal information to “qualified subscribers,” and consumers could only opt out of having their data sold to the “general public,” a category that did not include any of the member companies’ typical customers. (1)

The closest the FTC has come to backing privacy legislation was in 2000, when it voted 3 to 2 to recommend that commercial websites and ad companies be required to comply with five basic privacy principles: notice, choice, access, security, and accountability. However, a new FTC chairman was appointed in 2001, and the FTC decided to give self-regulation another chance. (1)

Since then, websites have only increased their ability and willingness to track people’s behavior. Cookie technology has become more powerful, and cookies are increasingly set by third-party advertising sites in addition to the sites a user actually visits. Web beacons are also used extensively so that ad networks can track people’s visits to third-party sites. Digital rights management (DRM) also poses a threat to privacy, as users are increasingly required to identify themselves in order to access content; every copy of Windows Media Player is equipped with a unique ID that makes it possible to track what content people view. (1)

Additionally, more and more news sites have begun requiring the disclosure of personal details in order to view their content. In surveys conducted by the Electronic Privacy Information Center (EPIC) in the 1990s, news sites customarily did not require registration of any sort. By 2005, however, EPIC reported that 7 of the top 25 news sites required the disclosure of personal information such as name, address, and email address, and 5 required the disclosure of non-personally identifiable information such as birth date, gender, and zip code. (1) These invasions of privacy lead users to create fake identities, which in turn prompts companies to demand information even more aggressively and to check it against commercial databases.
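
A web beacon is simple enough to sketch. The example below, using only Python’s standard library and invented domain names, shows the essence of the mechanism: the publisher embeds a one-pixel image hosted on the ad network’s domain, and every request for that pixel hands the ad network the page being read (via the Referer header) and the network’s own cookie, letting it stitch together a browsing history across every site that carries the pixel.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# What a publisher embeds in its pages (pointing at the ad network's
# domain, not its own):
#   <img src="https://adnetwork.example/pixel.gif" width="1" height="1">

# A 1x1 transparent GIF: the classic "web beacon" payload.
PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\x00\x00\x00"
         b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01"
         b"\x00\x00\x02\x02D\x01\x00;")

class BeaconHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The ad network learns which page the visitor was reading (Referer)
        # and which profile the visit belongs to (its own cookie), on every
        # site that embeds the pixel.
        print("visit:", self.headers.get("Referer"),
              "cookie:", self.headers.get("Cookie"))
        self.send_response(200)
        self.send_header("Content-Type", "image/gif")
        self.send_header("Set-Cookie", "ad_id=abc123; Path=/")
        self.end_headers()
        self.wfile.write(PIXEL)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), BeaconHandler).serve_forever()
```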

Another problem with self-regulation is that companies have not made efforts to inform the public about how their personal data are being collected and used. According to a 2003 Annenberg survey, 57% of Internet users believe that if a company has a privacy policy, it will not share information with other entities. (1) Additionally, a Pew survey found that 56% of Internet users could not identify what a cookie is. (1)

Source:

1. Hoofnagle, Chris. “Privacy Self-Regulation: A Decade of Disappointment.” EPIC. 4 Mar. 2005. 4 Jan. 2008 <http://epic.org/reports/decadedisappoint.html>.
