I wanted to know if it's true that robots.txt files are no longer needed for Google SEO purposes.
Robots.txt is still in use today, just as it has been ever since it was first created. Its purpose is to tell spiders which pages on your site they can and can't crawl. For example, you can point crawlers to your sitemap by putting a link to it in the robots.txt file, and you can also tell them not to crawl certain pages you wouldn't want showing up in the SERPs, like admin login pages or directories where your site's files are kept. Blocking crawling of those directories helps keep them out of the SERPs. Robots.txt is just one small part of good website design and on-page SEO, and it's often overlooked by webmasters even today.
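As a rough illustration of both points above (the sitemap URL and directory paths here are just placeholders, not taken from any real site), a minimal robots.txt might look like this:

```
# Tell crawlers where the sitemap lives
Sitemap: https://www.example.com/sitemap.xml

# Rules below apply to all crawlers
User-agent: *
# Keep the admin area and internal file directories out of the crawl
Disallow: /admin/
Disallow: /files/
```

The file just needs to be plain text and sit at the root of the domain (e.g. example.com/robots.txt) for crawlers to find it.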
They are still valid if you don't want the search spiders to crawl certain pages; that's what the robots.txt file is for. It never really had much direct SEO purpose anyway, and most people don't bother with it.
I use them to block certain pages that I don't want indexed.
Same here, I used it to block certain pages.
To be honest, I don't know much about robots.txt because I'm not really well versed in SEO yet. I'm glad to have encountered this discussion to broaden my knowledge of search engine crawlers. My question now is: is robots.txt a necessity for a site? Will crawlers skip a site without one, the way dormant sites that aren't updated with content stop being checked by crawlers? Correct me if I'm wrong.