On Tuesday at the RSA 2006 conference, George Kurtz of McAfee/Foundstone spoke about the need for companies to audit the data they expose on the Internet. While companies may not leave their payroll.xls files visible, they may be broadcasting their robots.txt files, the files that tell Web crawlers what not to include in their search engine indexes. Why is that a problem? Type inurl:robots.txt into Google and you can read the contents of such files, including the names of subdirectories that were never meant to be public. As an example, Kurtz showed the robots.txt from whitehouse.gov, which listed the Iraq and 9/11 subdirectories the Bush administration didn't want surfacing on Google and other search engines. Kurtz recommends that companies use noarchive meta tags and, beyond that, password-protect all sensitive documents within restricted subdirectories. He also recommended Google-hacking your own systems before someone else does.
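To see why the file itself is the leak, here is a minimal sketch in Python. The robots.txt contents and directory names below are hypothetical, invented for illustration; the point is that every Disallow line a site publishes to keep crawlers out is also a signpost for anyone who reads the file by hand.

```python
# Hypothetical robots.txt contents -- not from any real site.
ROBOTS_TXT = """\
User-agent: *
Disallow: /internal-reports/
Disallow: /board-minutes/
Disallow: /payroll/
"""

def disallowed_paths(robots_txt: str) -> list[str]:
    """Return the paths a robots.txt asks crawlers to skip.

    Well-behaved crawlers honor these lines, but any person who
    downloads the file can read them too: the exclusion list
    itself reveals directories no public page links to.
    """
    paths = []
    for line in robots_txt.splitlines():
        key, _, value = line.partition(":")
        if key.strip().lower() == "disallow" and value.strip():
            paths.append(value.strip())
    return paths

print(disallowed_paths(ROBOTS_TXT))
# A reader now knows /payroll/ exists, even with no link pointing to it.
```

Running the same kind of check against your own site's robots.txt, as Kurtz suggests, is a quick way to find out what you are advertising to the curious.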