Should I block duplicate pages using robots.txt?

Filed Under (SEO, Tips, google) by admin on 18-03-2010


Halfdeck from Davis, CA asks: “If Google crawls 1,000 pages/day, Googlebot crawling many dupe content pages may slow down indexing of a large site. In that scenario, do you recommend blocking dupes using robots.txt or is using META ROBOTS NOINDEX,NOFOLLOW a better alternative?”

Short answer: no, don't block them with robots.txt. If Googlebot can't fetch the duplicate pages at all, it can't tell that they are duplicates, so it can't consolidate them onto one URL, and the blocked URLs can still show up in search results as uncrawled references. It's better to let Google crawl the pages and either mark the duplicates with a meta robots noindex tag or point them at the preferred version with rel="canonical". Learn more about duplicate content here: http://www.google.com/support/webmasters/bin/answer.py?answer=66359
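
To make the two options concrete, here is a minimal sketch of each; the /printable/ path and the example.com URL are hypothetical stand-ins for a real duplicate-content section. The robots.txt rule blocks crawling outright (the approach advised against above):

  User-agent: Googlebot
  Disallow: /printable/

The meta tag alternative goes in the <head> of each duplicate page; Google can still crawl the page, see the duplication, and keep it out of the index:

  <meta name="robots" content="noindex">

If one version of the page is the preferred one, a rel="canonical" link (supported by Google since early 2009) tells Google which URL should receive the consolidated signals:

  <link rel="canonical" href="http://www.example.com/preferred-page">

Note that noindex on its own is usually enough; adding nofollow as well throws away the value of the links on the page.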
