Robots.TXT file
-
I made my blog public over an hour ago, yet my robots.txt says “disallow” across the board, and “Fetch as Google” in Google Webmaster Tools says the site is blocked from crawling. Any ideas?
Blog url: http://wandererofthewaste.wordpress.com/
-
Hey there,
It can take a little time for this file to be updated. If you take a look at it now, everything looks to be in order.
Best,
P. -
Thanks for helping so quickly! But I just peeked (after clearing my cache), and this is what I see:
User-agent: IRLbot
Crawl-delay: 3600

User-agent: *
Disallow: /next/

User-agent: *
Disallow: /mshots/v1/

# har har
User-agent: *
Disallow: /activate/

User-agent: *
Disallow: /wp-login.php

User-agent: *
Disallow: /signup/

User-agent: *
Disallow: /related-tags.php

User-agent: *
Disallow: /public-api/

# MT refugees
User-agent: *
Disallow: /cgi-bin/

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/

Google fetch is still not working either.
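For what it’s worth, rules like these can be sanity-checked with Python’s standard-library `urllib.robotparser`. This is just a sketch (the post URL below is a made-up example path on the blog in question, and the rules are condensed into one `User-agent: *` group, which is equivalent for matching purposes): ordinary post URLs come back as allowed, while the listed paths are blocked.

```python
# Sanity-check the robots.txt rules with Python's built-in parser.
# Note: the post path used below is hypothetical, for illustration only.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: IRLbot
Crawl-delay: 3600

User-agent: *
Disallow: /next/
Disallow: /mshots/v1/
Disallow: /activate/
Disallow: /wp-login.php
Disallow: /signup/
Disallow: /related-tags.php
Disallow: /public-api/
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

site = "http://wandererofthewaste.wordpress.com"

# A regular blog post is crawlable (no rule matches, so the default is allow)...
print(parser.can_fetch("Googlebot", site + "/2012/01/01/hello-world/"))  # True

# ...while the explicitly disallowed administrative paths are not.
print(parser.can_fetch("Googlebot", site + "/wp-admin/"))  # False
```

In other words, the “Disallow” lines only block those specific paths; everything not listed stays crawlable.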
-
Hey there,
That robots.txt file looks correct: your content is not blocked, only a few select, non-content pages are.
Google’s Webmaster Tools may cache robots.txt, so try it again in a few hours and see if things are cooperating.
Best,
P. -
Ah, ok. I saw “disallow” and thought that meant it’s all blocked. Will try later. Thank you for helping me.
Peter
- The topic ‘Robots.TXT file’ is closed to new replies.