Google error with robots.txt file, can't seem to fix it

  • Hello,
    My website is http://www.paintingsunny.com

    I continue to get a crawl error from Google saying they can’t connect to my site due to my robots.txt file – the error says:

    “http://www.paintingsunny.com/: Googlebot can’t access your site
    Sep 8, 2013

    Over the last 24 hours, Googlebot encountered 3 errors while attempting to access your robots.txt. To ensure that we didn’t crawl any pages listed in that file, we postponed our crawl. Your site’s overall robots.txt error rate is 100.0%.
    You can see more details about these errors in Webmaster Tools.

    Recommended action
    If the site error rate is 100%:

    - Using a web browser, attempt to access http://www.paintingsunny.com/robots.txt. If you are able to access it from your browser, then your site may be configured to deny access to googlebot. Check the configuration of your firewall and site to ensure that you are not denying access to googlebot.
    - If your robots.txt is a static page, verify that your web service has proper permissions to access the file.
    - If your robots.txt is dynamically generated, verify that the scripts that generate the robots.txt are properly configured and have permission to run. Check the logs for your website to see if your scripts are failing, and if so attempt to diagnose the cause of the failure.”

    When I go to my robots.txt file (located at http://www.paintingsunny.com/robots.txt), it says:

    “User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/

    Sitemap: http://paintingsunny.com/sitemap.xml.gz”

    I've tried to fix it in various ways, but I don't have access to edit my robots.txt file, and I don't believe I've chosen to disallow anything in my settings. Any ideas on how to fix this? :( (The check Google recommends is sketched at the end of this post.)

    Thanks so much! Dani @ Painting Sunny

    The blog I need help with is: (visible only to logged in users)
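
    For reference, Google's recommended action above boils down to one question: does the server actually return robots.txt when a crawler asks for it? The following is a minimal sketch of that check, assuming Python 3 with only the standard library; the user-agent strings and labels are illustrative, not an official Google tool.

        # Minimal sketch: request robots.txt twice, once with a browser-style
        # User-Agent and once with Googlebot's, to see whether the server (or a
        # firewall in front of it) treats crawler requests differently.
        import urllib.request

        URL = "http://www.paintingsunny.com/robots.txt"

        AGENTS = {
            "browser": "Mozilla/5.0",
            "Googlebot": "Googlebot/2.1 (+http://www.google.com/bot.html)",
        }

        for label, user_agent in AGENTS.items():
            request = urllib.request.Request(URL, headers={"User-Agent": user_agent})
            try:
                with urllib.request.urlopen(request, timeout=10) as response:
                    body = response.read()
                    print(f"{label}: HTTP {response.status}, {len(body)} bytes")
            except Exception as error:  # DNS failure, timeout, 403/5xx, and so on
                print(f"{label}: request failed -> {error}")

    If the Googlebot-style request fails while the browser-style one succeeds, that points to the firewall or host configuration Google mentions rather than to the contents of robots.txt itself.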

  • Howdy,

    The robots.txt file is fine and is behaving as it should at the moment; the quick check sketched at the end of this reply shows that its rules don't block Googlebot.

    Looking at your site, though, it appears you're hosting it on your own server with self-hosted WordPress from WordPress.org (see the difference between WordPress.com and WordPress.org).

    This could have been caused by a number of things, such as an issue with your host or with Google. For further help, you should contact your host and/or post in the support forums at WordPress.org (http://wordpress.org/support).

    Thanks!
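
    To illustrate that point, here is a minimal sketch (assuming Python 3 and the standard library's urllib.robotparser; the sample paths are made up) that parses the same rules quoted above and asks which paths Googlebot may fetch.

        # Minimal sketch: parse the quoted robots.txt rules and check which paths
        # Googlebot may fetch. Only /wp-admin/ and /wp-includes/ are blocked;
        # the rest of the site stays crawlable.
        from urllib.robotparser import RobotFileParser

        RULES = [
            "User-agent: *",
            "Disallow: /wp-admin/",
            "Disallow: /wp-includes/",
            "",
            "Sitemap: http://paintingsunny.com/sitemap.xml.gz",
        ]

        parser = RobotFileParser()
        parser.parse(RULES)

        for path in ("/", "/some-post/", "/wp-admin/", "/wp-includes/"):
            allowed = parser.can_fetch("Googlebot", "http://www.paintingsunny.com" + path)
            print(f"{path}: {'allowed' if allowed else 'blocked'}")

    Every path except /wp-admin/ and /wp-includes/ comes back as allowed, so the rules themselves are not what stopped Googlebot; a file that can't be fetched at all (host, DNS, or firewall trouble) would explain the 100% error rate, which is why contacting your host is the right next step.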

  • The topic ‘Google error with robots.txt file, can't seem to fix it’ is closed to new replies.