Archive for December 3rd, 2005

Google’s Alan Davidson on Areas of Special Concern


Alan Davidson, Washington Policy Counsel and head of Google’s new Washington, DC government affairs office, made several interesting remarks in his panel statement. Among them, he identified the following two areas of special concern to search engine providers:

(1) Conceptual shift in speech regulation

  • Old approach (offline media): focused on publishers and readers
  • New and emerging generation of speech regulation: focus on deliverers; intermediaries are supposed to police the networks. Examples where this approach is currently up for discussion in D.C.: access to pharmaceutical products, blocking of gaming websites
  • Assessment: it’s not a good idea to target intermediaries. There is a due-process, procedural problem: an intermediary, for example, can’t tell whether a particular site featuring copyrighted content is fair use; by going after the intermediary, you take the publisher out of the equation, so the publisher can’t go to court to argue the case
  • Misguided, because search engines are only in the business of indexing existing content; they’re not editors (and can’t be, given the scale)

(2) Government access to information

  • Increasing pressure to provide personalized information (search history, etc.) to third parties
  • The best privacy policy doesn’t help if the government wants information for national security reasons; the standards are really low. Plus, search engines are not allowed to inform users that their information has been passed on to third parties.

Ed Felten on the Search Space


Here are some keywords of Ed Felten’s presentation here at Yale’s search conference:

  • Talks about what search is
  • Search is broader than we think it is
  • Three steps, processes, or elements: (1) observe, (2) analyze/learn, (3) serve users
  • Observation of information, whether crawled from the web, a university library, or the real world
  • Put the information in a database so that it’s available in electronic/digital form. Analyze, index, learn from, and model that information; put some sort of value on top of it
  • The index, model, etc. built from it allow you to serve users: answering queries, answering questions
  • Broad definition: it’s not only search engines such as Yahoo; it also includes Google Print, fixture sites (e.g. baseball statistics), and attributes of P2P file-sharing systems; it also applies to consumer reporting organizations, e.g. ChoicePoint
  • Interesting issue among others: whether search is internal or external to the world it is studying. E.g., eBay has an internal search engine for its objects/auctions, a component that is part of the service it provides. Bidder’s Edge tried to build an external search; eBay wasn’t happy. Other interesting case: Grokster, for example, has/had an internal search engine. BitTorrent didn’t provide a search engine, provided only transfer, and got a significant legal advantage.
  • Other interesting aspect: analyzing and learning bring the value; e.g., Google’s PageRank is the value added. The analysis step is where the heavy thinking happens and value is created. Interesting, because legal challenges have not targeted the analysis element but the crawling (observation) part (e.g. eBay v. Bidder’s Edge).
  • Decentralization and P2P design: a complex issue. If analyzing and learning are key, but the “observation” element of the search process is the target of law, it’s likely that designers will try to decentralize the observation part.
  • In sum, search is broad, and we’re very early in the development of this technology.
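Felten’s three-step model (observe, analyze/learn, serve) can be made concrete with a toy sketch. The corpus, link graph, and function names below are illustrative assumptions, not anything from the talk; the “analyze” step builds a simple inverted index and runs a basic power-iteration PageRank, which is the classic example of value added at the analysis stage.

```python
# Toy sketch of the observe -> analyze -> serve pipeline.
# CORPUS and LINKS are hypothetical data standing in for the "observe" step.
CORPUS = {
    "a": "search engines index existing content",
    "b": "intermediaries police the network",
    "c": "search engines serve users by answering queries",
}
LINKS = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}  # toy link graph

def analyze(corpus, links, iters=20, d=0.85):
    """'Analyze/learn': build an inverted index plus toy PageRank scores."""
    index = {}
    for doc_id, text in corpus.items():
        for word in text.split():
            index.setdefault(word, set()).add(doc_id)
    # Simple power iteration for PageRank with damping factor d.
    n = len(corpus)
    rank = {doc: 1.0 / n for doc in corpus}
    for _ in range(iters):
        new = {doc: (1 - d) / n for doc in corpus}
        for src, outs in links.items():
            share = d * rank[src] / len(outs)
            for dst in outs:
                new[dst] += share
        rank = new
    return index, rank

def serve(query, index, rank):
    """'Serve': answer a query, ordering matching documents by PageRank."""
    hits = set.intersection(*(index.get(w, set()) for w in query.split()))
    return sorted(hits, key=lambda doc: -rank[doc])

index, rank = analyze(CORPUS, LINKS)
results = serve("search engines", index, rank)
```

Note how the legally contested step in the cases Felten mentions (crawling) corresponds only to gathering `CORPUS`, while the `analyze` function is where the distinctive value is created.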

Update: More on the other panels here.
