
According to this: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156449

Disallow: /page1/

all /page1/ URLs will be disallowed, i.e. /page1/foo/bar will also be blocked.

Disallow: /page1

only /page1 will be blocked, and /page1/foo/bar will be allowed.

But this is not what is actually happening. How can I block only /page1 and still allow /page1/foo/bar to be crawled?
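One way to express this for Googlebot is with pattern matching; here is a sketch relying on the Allow directive and the $ end-of-URL anchor, which Google supports as extensions to the basic robots.txt rules:

User-agent: Googlebot
Disallow: /page1$

or, equivalently:

User-agent: Googlebot
Disallow: /page1
Allow: /page1/

The $ anchors the rule to the end of the URL, so only /page1 itself is blocked; in the second form the longer, more specific Allow rule wins for anything under /page1/.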

EDIT: The actual issue is that the same page is crawled twice under different paths, /page and /page/.
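Since the duplication is just /page versus /page/, another option (a sketch; http://example.com stands in for the real host) is to mark one form as canonical in the page's <head>, or to 301-redirect one form to the other, so only a single URL gets indexed:

<link rel="canonical" href="http://example.com/page/"/>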


1 Answer


Why don't you just add a robots meta tag?

<meta name="robots" content="noindex, nofollow, noarchive"/>
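If editing the HTML is not an option, a rough equivalent (a sketch, not specific to any server) is to send the same directives as an HTTP response header, which Google also honors:

X-Robots-Tag: noindex, nofollow, noarchive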
answered 2012-05-17T09:01:35.867