So I’m running a YaCy node – which is a pretty awesome project to create a search engine indexed “by the people, for the people.”
YaCy provides a Java servent (a combined server/client) that can index internal resources and external web pages. You have MANY controls over what it indexes, how it indexes it, and the resources allocated to it. There are tons of built-in analytics and logging for the stats geek in you.
It’s still rough, but seems damned promising. A bonus – it uses jQuery and Solr.
I really like the idea of indexing all the content you care about and also providing that index to the world at large to search, but I have concerns over the long-term impact of more ’bots crawling the web. I would like to see YaCy figure out a way to minimize its impact on a global level – if every YaCy node is indexing the same sites, it could easily escalate to a DDoS-level problem. Perhaps they’re already working on this issue.
I despise implementing image rollovers, but this makes it almost tolerable. Any <img> tag with a “hover” attribute referring to an image URL will have a rollover behavior attached to it. For bonus points, the rollover image will be pre-loaded, so there’s no momentary delay the first time an on-state image is loaded.
In your HTML source:
<img src="/images/button.gif" hover="/images/button_on.gif" />
<img src="/images/another_button.gif" hover="/images/another_button_on.gif" />
// For each <img> with a "hover" attribute, preload the rollover
// image and swap the src on mouseover/mouseout.
jQuery('img[hover]').each(function () {
    var regSrc = jQuery(this).attr('src');
    var hoverSrc = jQuery(this).attr('hover');
    // Preload the rollover image so the first swap is instant.
    jQuery('<img alt="" />').attr('src', hoverSrc);
    // Swap the image on hover, and swap it back on mouseout.
    jQuery(this).hover(
        function () { jQuery(this).attr('src', hoverSrc); },
        function () { jQuery(this).attr('src', regSrc); }
    );
});
Thanks jQuery for being so awesome.