Is there a way to prevent a spider from crawling and indexing a URL that you do not want on the search engines?
Yes there is. Create a new file called robots.txt, and put this in it:
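For example, to block every spider from the entire site, the file would contain:

```
User-agent: *
Disallow: /
```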
Here * matches the user-agent of any search engine or spider (you can replace it with a specific bot's name), and / is the URL path you want to keep out of the index.
You can read more about robots.txt here:
Thank you for the information. I am amazed at how aggressive spiders can be, and many website owners don't even know that a spider is working on their links. I will read the article at the link you indicated. This can truly be of great help to me, because I am still at the stage of gathering data on this topic.
Use your robots.txt file, or use .htaccess to actually block the spider (you can block by IP or by user-agent).
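As a sketch, blocking by user-agent in an Apache 2.4 .htaccess file might look like this ("BadBot" and the IP address are placeholder examples, not real values):

```
# Flag requests whose User-Agent contains "BadBot" (hypothetical bot name)
SetEnvIfNoCase User-Agent "BadBot" bad_bot

<RequireAll>
    Require all granted
    # Deny the flagged user-agent
    Require not env bad_bot
    # Deny a specific IP (203.0.113.5 is a documentation-range example)
    Require not ip 203.0.113.5
</RequireAll>
```

Unlike robots.txt, which well-behaved spiders merely choose to obey, an .htaccess rule is enforced by the server itself.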
Thanks for the information, guys.