Tuesday, October 11, 2011

10/06/2011

Using a robots.txt file to restrict crawlers can help keep them away from
files containing database passwords when your architecture forces you to
store those passwords in plaintext. The restriction is trust-based, though:
there isn't really any authority that will punish a crawler that scoops up
your passwords anyway. If a company got caught crawling and selling
passwords, it would be blasted by the media, and that reputational damage
is pretty much the only real incentive to honor this specification.
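
To see just how voluntary this is, here's a minimal Python sketch of what a well-behaved crawler does before fetching a page. The site example.com, the crawler name MyCrawler, and the /config/ path are all made up for illustration; nothing stops a bad actor from skipping this check entirely.

    # Sketch of a "polite" crawler honoring robots.txt.
    # Assume the site's robots.txt contains something like:
    #   User-agent: *
    #   Disallow: /config/
    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()  # fetch and parse the site's robots.txt

    url = "https://example.com/config/passwords.txt"
    if rp.can_fetch("MyCrawler", url):
        print("robots.txt allows fetching", url)
    else:
        print("robots.txt asks crawlers to stay away from", url)

The check happens entirely on the crawler's side, which is the whole point: robots.txt is a request, not an access control.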
Kalin Jonas