Archive for the 'freedom of expression' Category

Sec. 512: Chilling Effects? Chilling Effects!


BNA’s Internet Law News draws my attention to an interesting executive summary of a forthcoming report on takedown notices under sec. 512 of the DMCA, authored by Jennifer M. Urban and Laura Quilter of the Samuelson Law, Technology, and Public Policy Clinic at the University of California, Berkeley. Among the findings (analyzing 876 notices submitted to the Chilling Effects Project; read more on the methodology on p. 6 et seq.):

  • “Thirty percent of notices demanded takedown for claims that presented an obvious question for a court (a clear fair use argument, complaints about uncopyrightable material, and the like);
  • Notices to traditional ISPs included a substantial number of demands to remove files from peer-to-peer networks (which are not actually covered under the takedown statute, and which an OSP can only honor by terminating the target’s Internet access entirely); and
  • One out of 11 included significant statutory flaws that render the notice unusable (for example, failing to adequately identify infringing material).”

In addition, the researchers found that over half of notices sent to Google to demand removal of links in the index were sent by businesses targeting apparent competitors, and that over a third of the notices targeted sites outside the United States.

Join us at WSIS: Today 2PM, Expo 2


At WSIS, Berkman Center Fellows Rebecca MacKinnon and Ethan Zuckerman lead Expression Under Repression from 14:00 – 17:30, in Expo 2 (no. 3103). The event is hosted by Hivos; Citizen Lab Director of Technical Research Nart Villeneuve will present on “Internet Filtering: Realities and Myths,” and my colleagues will discuss the Filtering in Tunisia Report. Please join us; it will be a great session!

Blog Filtering


This, for good reasons, gets a lot of attention. See also the work of my fellow Fellow Rebecca MacKinnon. Earlier coverage and experiments (December 2004) here. I’m interested (to be sure: that doesn’t mean that I’m in favor of it!) in blog filtering from an information quality perspective. Viewed from that angle, MSN Spaces filtering in the US is of similar interest to me, although it is obvious that the effects are much different.

German Search Engines: Compliance With Own Code of Conduct?


Earlier this year, we reported that all major search engines in Germany (Google, Lycos Europe, MSN Deutschland, AOL Deutschland, Yahoo, T-Online, and t-info) had reached an agreement to filter harmful-to-minors content. Recently, Marcel Machill tested the search engines’ compliance with their own code of conduct. Find a summary of the results here.

Declaration on Human Rights and Internet


EDRI-gram provides an overview of the Declaration on Human Rights and the Rule of Law in the Information Society that has recently been adopted by the Council of Europe’s Committee of Ministers (see also the press release). The author of the EDRI-gram report concludes:

“… from a digital civil rights point of view, on close reading the declaration doesn’t offer any specific new rights to internet users when it comes to privacy, freedom of speech and access to knowledge. Though these rights and freedoms are all mentioned and reaffirmed repeatedly in the declaration, they are balanced against ‘challenges’ posed by the Internet, such as violation of intellectual property rights, access to illegal and harmful content and ‘circumstances that lead to the adoption of measures to curtail the exercise of human rights in the Information Society in the context of law enforcement or the fight against terrorism.’”

Live from Global Flow of Information Conference


James Grimmelmann and others are blogging live from the ISP’s terrific “The Global Flow of Information” conference at Yale Law School. Earlier today, Berkman’s executive director John Palfrey delivered a talk about information governance in general and Internet governance in particular. John pointed out, among other things, that the Internet governance discussion is incredibly amorphous. After outlining the current state and scope of the governance discussion, John made the argument that it might be helpful to identify — and focus on — specific governance problems and issues such as filtering, rather than to think and talk that much about “omnibus regulation,” as we call it in Europe. Such a problem-oriented approach will prompt a discussion of what we really value about the Internet and what the guiding principles for Internet regulation should be.
Congratulations and thanks to our friends at the ISP for putting together such a great conference program with a truly impressive line-up of speakers.

Today’s readings


I’m catching up (terribly delayed) with a couple of interesting articles, research papers, and news reports. Here’s a selection of today’s recommended readings:

* M. Davison and B. Hugenholtz’s piece “Football fixtures, horseraces and spinoffs: the ECJ domesticates the database right”

* Natalie Helberger’s Indicare article “Thou shalt not mislead thy customer! The pitfalls of labelling and transparency”


* Recent Pew report on Music and Video Downloading

* Report on Bertelsmann’s new P2P service

* Heise on “Google News: how far does freedom of speech go?”

* I almost forgot this one: Heise on the German Green Party talking about side effects of search engines

Happy Easter.

Human Rights, Internet & Culture


Interesting post and pointers by CyberBug on Human Rights and the Internet — more to come, stay tuned. Check it out.

Search Engine Filtering Agreement (Germany)


EDRI-gram, a bi-weekly newsletter about digital civil rights in Europe, draws our attention to an earlier report by German online newsletter Heise, which reported a couple of days ago that all major search engines in Germany (Google, Lycos Europe, MSN Deutschland, AOL Deutschland, Yahoo, T-Online, and t-info) have reached an agreement to filter harmful-to-minors content, which will make it much more difficult for German users to access such content. To this end, the search engines agreed to establish and run a self-regulatory organization that will block websites considered harmful based on a list of URLs provided by the government agency in charge of media content classification. According to the Heise report, the search engines are taking these steps because they fear that European legislators might become active if the harmful-to-minors problem isn’t addressed by the industry itself.
Among many interesting details: (1) The search engines are not allowed to make public which sites are filtered. (2) It seems unclear how content considered to be harmful to minors can be searched and accessed by adults under the regime. Again, clash of cultures. For a much earlier (2002) analysis of Google content filtering in Germany, see this report by Professor Jonathan Zittrain and former Berkmaniac Ben Edelman.
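Mechanically, the arrangement described in the Heise report amounts to suppressing search results whose URLs appear on an agency-supplied list before they reach the user. Here is a minimal, purely illustrative sketch of how such blocklist filtering might work; the list contents and all names are hypothetical, and the actual implementations used by the search engines are, as noted, not public.

```python
# Illustrative sketch of blocklist-based result filtering (hypothetical
# domains and function names; not the search engines' actual mechanism).

from urllib.parse import urlparse

# A hypothetical URL/domain list as it might be supplied to participating
# search engines by the classification agency.
BLOCKED_DOMAINS = {"harmful-example.de", "blocked-example.com"}

def is_blocked(url: str) -> bool:
    """Return True if the URL's host, or a parent domain, is on the list."""
    host = urlparse(url).netloc.lower()
    # Match exact hosts as well as subdomains, e.g. www.harmful-example.de.
    return any(host == d or host.endswith("." + d) for d in BLOCKED_DOMAINS)

def filter_results(results: list[str]) -> list[str]:
    """Drop blocked URLs from a result list before it is shown to the user."""
    return [url for url in results if not is_blocked(url)]
```

Note that in a scheme like this, a filtered result simply disappears from the list, which illustrates the transparency concern raised above: the user has no way of knowing that anything was removed.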

Live from Class on Harmful Speech


As some of you might know by now, I’m co-teaching with John Palfrey a course called Internet & Society: Technologies and Politics of Control at Harvard Extension School. On tonight’s menu is a rather indigestible topic: harmful speech on the Net. John has the lead, and he starts where last class ended: the shift from consumers to active users/creators — a shift many of us think is great. Tonight, however, John takes a different route and focuses on the down- and dark side of the new information environment.

The starting place is the fact that Internet speech is different. John makes three points:

* The Net creates potential for aggregation of data where none was possible or economically feasible before.
* The Internet has made it easier (although it might get much more difficult in the future) to speak anonymously.
* Access becomes possible over great distance at any time — speech that is posted here can be heard around the world.

John now describes the growth of online communities back in the days (i.e., the mid-’90s) when ISPs not only offered access to the Internet, but were in the business of creating online communities and providing content for their users. The communities were idea- and issue-focused and town-meeting-like, with a benevolent-dictator style of government (i.e., the ability to exclude users or terminate access).

Fade. John tells us the story of (Ken) Zeran v. America Online (in this context, we briefly discuss CDA sec. 230) and Jake Baker to illustrate how things, at some point, turned from good to tricky. The in-class discussion is now on how the results of the two cases can (if at all) be reconciled. Break.

Back to second half of class 6.

Oops, we get a cold call from JP. He asks his TAs and co-teacher how the First Amendment and its equivalents work in the U.S., Canada, and the EU. Tim gives us a great 1-minute overview of First Amendment law in the U.S. and makes clear that it is primarily a right that protects against governmental viewpoint-censorship. Courts, at the outset of a case, have to make a judgment about what standard of review applies to restrictions on free speech — strict scrutiny (e.g. political speech/content-based restriction) or lower standards of review.

Susie talks about Canadian law, which is similar re: the state-actor requirement. Presumption: free speech — everything is protected. Only exception: violent action (expression through action). But the government is allowed to restrict fundamental rights if the restriction seems legitimate in a democratic society. There’s a five-step balancing test that looks, inter alia, into individual rights and state interests. The approach is different in Canada, but the principles are similar.

I now talk a bit about European approaches. My point is, I guess, that the U.S. approach to regulating harmful speech on the Internet is, roughly, more speech, i.e. a pro-speech approach. Europe has taken an alternative approach, i.e. an anti-hate approach. Measures have been taken at the national level (e.g. the German Penal Code), but also at the level of international law (e.g. the European Convention on Human Rights and, Internet-specific, the Convention on Cybercrime with its Additional Protocol). How can we explain these different approaches? There are many elements, e.g. historical facts (e.g. Nazi propaganda in Europe vs. imprisonment of Americans during WWI for criticizing US participation in the war); the political/cultural system (i.e. a relativistic conception of democracy in the US); trust/distrust in courts; and different interpretations of individualism.

John now presents a couple of examples of what we can find on the Net — a slide entitled “The Good, The Bad, and the Ugly” (literally). The list includes the Nuremberg Files, Babes of the Web, free porn, free music, how to make a bomb, how to make/grow various drugs, etc. The point is certainly that there are downsides to the shift from passive receivers to active users and creators, respectively.

John now introduces a new set of themes, asking what we cannot avoid on the web: SPAM (potentially a form of protected commercial speech), pornography, viruses (and their effects), advertising. Possible solutions: the resurgence of online communities reformulated around social networks, the ability to exclude, a feeling of living room rather than information bazaar. Social software such as Friendster and The Facebook. Technological approaches such as filtering, pop-up blockers, SPAM blockers, family-friendly user agreements.

We end with a list of issues on the current agenda (“What are the problems?”):

* US: 1996 – today: Protecting children online
* Companies: trade secrets (Apple, Diebold)
* Protecting citizens from seeing harmful information: religious; moral (porn); politics; drugs/alcohol; women’s issues.

Finally, John presents not-yet-released Berkman research (sorry, can’t blog: censorship) and, in different context, circulates the Grokster amicus submitted by HLS faculty members.

Interesting class, thanks to all. Have a safe ride home.
