
Blogs and Spam:

One place that all bloggers and search engines look to find weblogs is weblogs.com and Blogger's changes.xml file. These are the present authorities that define what a weblog is. So what if spammers wage a spam war on these sites? How do we block them? There was an interesting discussion about this on Dave's demo site. The best solution that came up, I think, was a vouching system (suggested by Steven Nieker): every new blog has to be vouched for by someone before it gets on the weblogs list, and if a blog turns out to be spam, the voucher gets spanked.
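Here is a minimal sketch of how that vouching idea could work, assuming a simple in-memory registry. The class and method names are hypothetical, not part of weblogs.com or any real API:

```python
class VouchRegistry:
    def __init__(self):
        self.listed = {}     # blog_url -> the person who vouched for it
        self.penalties = {}  # voucher -> number of spam blogs they vouched for

    def vouch(self, voucher, blog_url):
        """A new blog only gets on the list once someone vouches for it."""
        self.listed[blog_url] = voucher

    def report_spam(self, blog_url):
        """If a listed blog turns out to be spam, delist it and 'spank' the voucher."""
        voucher = self.listed.pop(blog_url, None)
        if voucher is not None:
            self.penalties[voucher] = self.penalties.get(voucher, 0) + 1
        return voucher


# Usage: a blog gets listed via a voucher, then the voucher is penalized if it spams.
registry = VouchRegistry()
registry.vouch("steven", "http://example-blog.com")
registry.report_spam("http://example-blog.com")
print(registry.penalties)  # {'steven': 1}
```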

My solution to spam: let us start by looking at what spam is. Spam is basically unwanted email, right? So how do we get computers to know what we want and what we don't want? Write some AI code? No. Just let people sort the weblogs they visit into a directory structure and look at what they like. The Persian blogs then go into their own directory, and the bulk e-mailers (that's what spammers call themselves) go into their own directory. Everyone gets what they want. Weblogs.com visitors would put the sites they visit into the directory structure in another frame. Now weblogs.com has two things: the number of people that clicked on a blog, and the directory locations of that blog. Lots of interesting things could arise out of this metadata.
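A rough sketch of that metadata, assuming weblogs.com kept just two maps per blog: a click counter and a tally of the directories visitors filed it under. All the names and URLs here are made up for illustration:

```python
from collections import Counter, defaultdict

clicks = Counter()                  # blog_url -> number of visitors who clicked it
directories = defaultdict(Counter)  # blog_url -> how often visitors filed it in each directory

def visit(blog_url, filed_under=None):
    """Record a click, and optionally the directory the visitor sorted the blog into."""
    clicks[blog_url] += 1
    if filed_under:
        directories[blog_url][filed_under] += 1

# Two visitors file a Persian blog under "persian";
# one visitor files a spammer under "bulk-email".
visit("http://example-persian-blog.com", "persian")
visit("http://example-persian-blog.com", "persian")
visit("http://example-spammer.com", "bulk-email")

# The two things weblogs.com would now have: click counts and directory locations.
print(clicks["http://example-persian-blog.com"])                  # 2
print(directories["http://example-spammer.com"].most_common(1))   # [('bulk-email', 1)]
```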
