Google has a crawler that routinely scours the corners of the internet to build its search index. It collects all the information it runs into, which is a good thing in most cases. However, if you want to keep parts of your site out of search results, Google, as well as every other search bot I can think of, will respect the rules set in a robots.txt file placed at the root of your site. Just keep in mind it's only a request: well-behaved bots honor it, but it doesn't actually block anyone from accessing the pages.
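The basic syntax is pretty simple. Here's a minimal sketch (the `/private/` path is just a made-up example, substitute whatever directory you actually want hidden):

```text
# Applies to every crawler
User-agent: *
Disallow: /private/

# Applies only to Google's crawler: block the whole site
User-agent: Googlebot
Disallow: /
```

Each `User-agent` line starts a group of rules for the named bot (`*` matches all of them), and each `Disallow` line lists a path prefix that bot shouldn't crawl. An empty `Disallow:` value means nothing is off limits.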