
I've seen tutorials and articles discussing the use of robots.txt. Is this still a necessary practice? Do we still need this technique?


5 Answers


A robots.txt file is not strictly necessary, but it is recommended if you want to block a few pages or folders on your website from being crawled by search engines.
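For illustration, a minimal robots.txt blocking two folders might look like this (the folder names are hypothetical examples):

User-agent: *
Disallow: /admin/
Disallow: /tmp/

The file must live at the root of the site, e.g. http://www.domain.com/robots.txt.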

Answered 2013-04-24T05:56:52.550

I agree with the above answer. The robots.txt file is used to block pages and folders from being crawled by search engines. For example, you can block search engines from crawling and indexing URLs that contain session IDs, which in rare cases could even become a security issue. Beyond that, I don't see much importance.
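As a sketch of that idea, if session IDs appear as a query parameter (the parameter name sessionid here is a hypothetical example), the relevant rule might look like this:

User-agent: *
Disallow: /*?sessionid=

Note that wildcard patterns like this are understood by the major crawlers such as Googlebot and Bingbot, but they are not part of the original robots.txt standard.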

Answered 2013-04-24T06:35:15.347

The way that many robots crawl your site and rank your pages has also changed recently.

I believe that for a short period of time robots.txt may have helped quite a bit, but nowadays most other SEO measures you take will have more of a positive impact than this little .txt file ever will.

The same goes for backlinks; they used to be far more important for ranking than they are now.

Answered 2013-04-24T11:06:58.947

Robots.txt is not for indexing; it's used to block the parts of your site that you don't want search engines to crawl.
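To spell out the distinction (the /private/ path is a hypothetical example): a rule like

User-agent: *
Disallow: /private/

stops crawling, but the blocked URL can still appear in search results if other sites link to it. Keeping a page out of the index is the job of a meta robots tag on the page itself:

<meta name="robots" content="noindex">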

Answered 2013-04-24T15:35:55.433

Robots.txt can help with indexation on large sites if you use it to point search engines to an XML sitemap file.

Like this:

Sitemap: http://www.domain.com/sitemap.xml

Within the XML file, you can list up to 50,000 URLs for search engines to index. There are plugins for many content management systems that can generate and update these files automatically.
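For reference, a minimal sitemap following the sitemaps.org protocol might look like this (the URL reuses the placeholder domain from above):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.domain.com/</loc>
    <lastmod>2013-04-27</lastmod>
  </url>
</urlset>

Sites with more than the 50,000-URL limit can split entries across multiple files and reference them from a sitemap index file.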

Answered 2013-04-27T12:50:11.197