Googlebot can’t access your site?

Recently, Googlebot couldn’t access one of our sites due to crawling issues, so I wanted to share the experience, since experience can be the best way to learn. For Google’s own explanation, see their documentation on access denied errors. Issues like this must be noted and resolved immediately, because they can have a serious effect on your website if you take them for granted. In the Crawl Errors section of Google Webmaster Tools, I received a notification (see the image below).

Googlebot can't access your site

You will also see below that the site has a problem based on its crawl stats, since the number of pages indexed this week is low.

Crawl Stat

What I did

• Read the notification and check the site’s overall error rate for DNS queries. Is it 100% or below? As you can see, this site has an overall error rate of 33%. So what’s next?

• Read the recommended action for your error rate. According to Google Webmaster Central’s post about crawl error alerts, the possible causes are the following:

 The DNS server is down or misconfigured
 A firewall on your web server is blocking Googlebot
 Your server is refusing connections from Googlebot
 The server is down or overloaded
 The site’s robots.txt file is inaccessible
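
The last cause in that list, an inaccessible or misconfigured robots.txt, is easy to sanity-check offline. As a minimal sketch (the rules and URLs below are made up for illustration, not taken from any real site), Python’s standard `urllib.robotparser` can parse a robots.txt and tell you whether Googlebot would be allowed to fetch a given page:

```python
from urllib import robotparser

# Hypothetical robots.txt content, purely for illustration.
robots_txt = """\
User-agent: Googlebot
Disallow: /private/

User-agent: *
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Would Googlebot be allowed to crawl these URLs?
print(rp.can_fetch("Googlebot", "https://example.com/index.html"))         # True
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
```

If `can_fetch` says a page you want indexed is blocked, the problem is in your robots.txt rules rather than your server or DNS.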

Since the error rate was less than 100%, I considered the possibility that the web server was down or overloaded. One of the recommended actions is to contact your hosting provider to discuss these configuration issues, so that you can also talk about allocating more resources to the DNS service.

• Double-check the real cause, or investigate further. In my case it was an overloading issue.
If the error rate is 100%, just follow the recommendations given by Google (see the image above), or simply do the following:

 Check your DNS settings
 Check whether your hosting account is verified and active
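
The decision logic above boils down to a simple branch on the error rate that Webmaster Tools reports. A minimal sketch (the function name and wording are mine, not Google’s; the thresholds mirror the advice in this post):

```python
def recommended_action(dns_error_rate: float) -> str:
    """Map the DNS error rate shown in Webmaster Tools to a next step."""
    if dns_error_rate >= 100.0:
        # The 100% case: follow Google's recommendations directly.
        return "check your DNS settings and confirm your hosting account is active"
    if dns_error_rate > 0.0:
        # Below 100%: suspect server load, then involve the host.
        return "check whether the server is down or overloaded, then contact your host"
    return "no DNS errors reported"

print(recommended_action(33.0))   # the 33% case from this post
print(recommended_action(100.0))
```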

• Take possible actions immediately. Before contacting the hosting service, I removed (or reduced) the insignificant plug-ins to test whether that would solve the issue; if not, I would have contacted the hosting provider right away.

• Check whether the fix worked by using Fetch as Google to verify that Googlebot can now access your site.

Fetch as Google

Luckily, just removing those insignificant plug-ins was enough: after checking Webmaster Tools and fetching as Google again, the website could be accessed by Googlebot. But if you think those plug-ins are still useful to you, all you need to do is contact your hosting provider so you can talk about allocating more resources for DNS.
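
If you want a rough, do-it-yourself version of that check, you can request a page with Googlebot’s user-agent string and confirm you get a 200 response. The sketch below spins up a throwaway local server to stand in for your site (an assumption for the demo; in practice you would point the request at your own domain):

```python
import http.server
import threading
import urllib.request

# A throwaway local server standing in for your real site.
class RobotsHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"User-agent: *\nAllow: /\n"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

server = http.server.HTTPServer(("127.0.0.1", 0), RobotsHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Request robots.txt with Googlebot's user-agent string, roughly
# what Fetch as Google does on your behalf.
req = urllib.request.Request(
    f"http://127.0.0.1:{server.server_port}/robots.txt",
    headers={
        "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                      "+http://www.google.com/bot.html)"
    },
)
with urllib.request.urlopen(req, timeout=5) as resp:
    status = resp.status
server.shutdown()

print(status)  # 200 means the file is reachable with Googlebot's user agent
```

Note this only approximates the real check: Fetch as Google crawls from Google’s own network, so a firewall that blocks Google’s IP ranges can still fail even when this test passes.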

An important reminder: if you receive this kind of notice, it’s always an effective option to contact your hosting provider, since they can tell whether the fault is on their side or yours. Before contacting them, make sure your site is truly down, and before checking, clear your cache and cookies.

Also take note: if you are continually getting a high rate of DNS errors, talk to your hosting provider immediately. It’s good if they can fix it right away, but if they can’t, it may be time to think: “maybe it’s time to change my hosting provider.”



  1. Hey, I have had this issue recently. It shows SUCCESS on Fetch as Google, but when I checked Crawl Errors it shows “Google can’t access your site.” I contacted my hosting provider, and they say there’s no problem on their end. Is there any way I can change the robots.txt file to solve this issue, or anything else?

  2. Hello Mani,

    Can you send me the URL of your site so I can look it over?

  3. I have a similar issue. Googlebot cannot crawl SOME of my pages. BlueHost says there is nothing wrong on their end, but I was able to recreate the error with a DNS lookup tool. I have no idea how to fix this problem. It all started back on Sept. 22, 2012, and I had no idea. Because of this my sites have been losing rank. Can you help me? Your help will be greatly appreciated.

  4. Hello Dayse,

    Please send me the URL of your site, and can you add me to your Google Webmaster account?

  5. Al Gomez, this is the third reply with the same issue.. :) I have the same issue, but I would like to elaborate: currently my website is running properly, but since Oct 20 its traffic has dropped. I checked its indexation (it shows the old tags over-indexed, whereas the crawl date is the latest).
    I even removed the robots.txt file; it didn’t have any wrong content, I had just blocked the gallery page.
    Most important, there are other sites on the same server and they don’t have any issue. Only this site does.. :(

    Please check the URL at

    Thank you..
    Cristi :)

  6. Hi Cristi,

    I just want to share an answer for that: it may be that what Google sees on your website differs from what a user sees. Try the Fetch as Googlebot feature in GWT to find out how Google sees your page. Fetch as Googlebot can sometimes reveal that your website has been hacked or that some kind of link has been injected into it, so check it using Google Webmaster Tools.

  7. Hi,
    When I log in to Webmaster Tools and click on Crawl Errors, it shows “Google can’t access your site.” I contacted my hosting provider, and they say there’s no problem on their end. Is there any way I can change the robots.txt file to solve this issue, or anything else?
    I have not added the Google verification tag to my index page; could that be the problem?

  8. Hi veeruk,
    Yes, you should try adding the Google verification tag. If that doesn’t work, it seems we should look for another solution, or share it here.

  9. Al Gomez, I have the same problem as many here, and I would like your opinion, as I am not able to fix it. Google tells me it cannot access my site; then after a week I have the error again. It’s driving me crazy, and my hosting says that everything is correct.

    Thanks ..

  10. I am facing this problem. When I visited my Google Webmaster page I saw a message that Google can’t access my website and my error rate is 42.7 percent. I checked everything on my end and found nothing wrong, then contacted the hosting provider (GoDaddy); they replied that nothing is wrong on their end. I tried Fetch as Google and it fetched successfully, but the next day I got this message in GWT again, and as I write this I have received it a third time. Kindly guide me to an immediate solution.

  11. @Kamal, can you add me (Algomez17) to your Google Webmaster account so I can take a look at your site’s indexing status? This is a broad question that needs more information from your site to determine why Googlebot can’t access it; there are a lot of factors to consider.

  12. I am also facing this problem. I checked everything possible, but I have no idea what’s wrong. If you (algomez) don’t mind, can you check my site? The notice reads: Googlebot can’t access your site
    April 5, 2013
    Over the last 24 hours, Googlebot encountered 6 errors while attempting to access your robots.txt. To ensure that we didn’t crawl any pages listed in that file, we postponed our crawl. Your site’s overall robots.txt error rate is 100.0%.

    Recommended action
    If the site error rate is 100%:

    Using a web browser, attempt to access your robots.txt file. If you are able to access it from your browser, then your site may be configured to deny access to googlebot. Check the configuration of your firewall and site to ensure that you are not denying access to googlebot.

  13. I’m having the same issue myself, and the hosting provider said everything is fine. Do you have any other recommendations?

  14. Check your Google Webmaster Tools, and also re-check the plug-ins you are using; sometimes they contain code that creates errors.
