
The never-ending robots.txt

While looking over the logs for one of my servers, I decided to write some code that would help me deter anyone sniffing the server for weaknesses. The first thing I wrote was a robots.txt file with a few particular qualities:
1) It would never end
2) It would not bog down the CPU
3) It would not repeat
4) It would be a valid robots.txt file

At the time I was using PHP for another project, so it was already configured on the server. I reused a password-generation function and stuck it in a time-delayed infinite loop, then changed the server's .txt handler to PHP so that requests for robots.txt would execute the script.
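How you map .txt onto the PHP handler depends on the web server. As a minimal sketch, assuming Apache with mod_php, an .htaccess rule like the following routes just robots.txt through the interpreter (scoping it with FilesMatch avoids accidentally executing every .txt file on the site as PHP):

# Minimal sketch, assuming Apache with mod_php.
# Run robots.txt through PHP; leave other .txt files alone.
<FilesMatch "^robots\.txt$">
    SetHandler application/x-httpd-php
</FilesMatch>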

@header("Content-Type: text/plain");
@header("Pragma: no-cache");
@header("Expires: 0");
$standardStatement = "User-agent: * \n";
print $standardStatement;

function randpass() {
$chars = "1234567890abcdefGHIJKLMNOPQRSTUVWxyzABCDEFghijklmnopqrstuvwXYZ1234567890";
$thepass = '';
for($i=0;$i<11;$i++)
{
$thepass .= $chars{rand() % 39};
}

return $thepass;
}

while (true):
$newpath = randpass();
print "Disallow: /$newpath\n";
usleep(6000);
endwhile;
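At 6000 microseconds per line the script emits roughly 166 Disallow rules a second, around four kilobytes per second of traffic, which is enough to keep a naive crawler busy indefinitely while costing the server almost nothing. A quick smoke test from another machine looks like this (the hostname and paths below are purely illustrative, since every run is random):

$ curl http://example.com/robots.txt
User-agent: *
Disallow: /k2Rv9qLm3xa
Disallow: /81bQzpW0sdf
Disallow: /mN4tc7yE1hj
...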
