
Website Content Invisibility to the Google Crawler

How Google crawls a website has been a long-standing issue. It becomes even more bothersome when the Google crawler can't see your robots.txt file even though other browsers can. Previously, I featured an article about determining whether Googlebot has been denied access to your site. With the help of Matt Cutts, I'll explain the root cause of becoming invisible to the Google crawler and how you can resolve the problem.


Matt Cutts Addressed the Issue

The issue: Googlebot fails to fetch a site's robots.txt file about 50% of the time, even though other browsers fetch it with a 100% success rate. The report came from a site running a plain nginx server on an mit.edu host that returned the file reliably to everything except Google. At first glance, you might conclude that Google has a problem with its crawlers. According to Matt Cutts, however, it really isn't about Google; it's about how the site has implemented cloaking, and the practice of cloaking or using redirect URLs must be executed properly.

Cloaking is an SEO technique in which crawlers are served different content from what browsers and users see. Not everybody does it, but when you try to trick the search engine, the result can bounce back adversely. The site may be viewable in other browsers yet invisible to Google because of unintentional reverse cloaking: you cloak the content of your website, the content is served to browsers and users, but when Googlebot arrives, the server hands it an empty page.

Another, rarer reason for being invisible to Google's crawlers is an improper configuration across different systems and hosts. If that is the case, contact your hosting company.
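To make that failure mode concrete, here is a minimal, hypothetical sketch of how user-agent cloaking can backfire (it is not code from the video). The handler means to serve special content to crawlers and real content to browsers, but a stale crawler check means today's Googlebot falls through to an empty response, which is exactly the reverse cloaking described above:

```python
# Hypothetical server-side cloaking logic (a sketch, not from the video).
# Intent: serve keyword-stuffed content to crawlers, real content to users.
# Bug: the crawler check matches "googlebot/1.0", while the real UA today is
# "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)".

REAL_CONTENT = "<html><body>Actual page users see</body></html>"
CRAWLER_CONTENT = "<html><body>Keyword-stuffed page for crawlers</body></html>"

def render_page(user_agent: str) -> str:
    ua = user_agent.lower()
    if "googlebot/1.0" in ua:      # stale check: Googlebot moved to 2.1 long ago
        return CRAWLER_CONTENT
    if "mozilla" in ua and "compatible" not in ua:
        return REAL_CONTENT        # treated as a "real" browser
    # Everything else, including today's Googlebot, gets an empty page:
    # unintentional reverse cloaking.
    return ""

if __name__ == "__main__":
    browser_ua = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Firefox/115.0"
    googlebot_ua = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
    print("Browser sees:  ", repr(render_page(browser_ua)))
    print("Googlebot sees:", repr(render_page(googlebot_ua)))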


The Google Webmaster’s Solution

It's a hard situation when Googlebot sees nothing on your website. With the Fetch as Googlebot feature in Google Webmaster Tools, you can see a page exactly as Google sees it. This is essential when you are troubleshooting a page's poor performance in search results. Here is what the Fetch as Google tool offers:

- Fetch any URL on your verified site exactly as Googlebot requests it
- View the HTTP status code and response headers returned to the crawler
- Inspect the raw HTML Googlebot receives, which exposes empty, cloaked, or injected content
- Submit successfully fetched pages to Google's index
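The authoritative test is Fetch as Google itself, since it crawls from Google's own infrastructure, but you can get a rough local approximation by requesting a page with Googlebot's user-agent string and comparing it to what a browser user-agent receives. A minimal sketch, assuming the cloaking is keyed on User-Agent (cloaking keyed to Google's IP ranges won't show up from your own machine); example.com is a placeholder for your own page:

```python
import urllib.request

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Firefox/115.0"

def fetch_as(url: str, user_agent: str) -> bytes:
    """Fetch a URL with the given User-Agent and return the response body."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read()

if __name__ == "__main__":
    url = "https://example.com/"  # replace with your own page
    bot_body = fetch_as(url, GOOGLEBOT_UA)
    browser_body = fetch_as(url, BROWSER_UA)
    if bot_body != browser_body:
        print("Googlebot and browsers see different content -- possible cloaking.")
    else:
        print("Both user-agents receive identical content.")
    print(f"Googlebot body: {len(bot_body)} bytes; browser body: {len(browser_body)} bytes")
```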

Along with Fetch as Google, you can also use the HTML Suggestions and Crawl Errors reports to check the crawlability of your website. They will help you improve the configuration of the elements that matter to Google's crawlers.
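Since a robots.txt problem is one of the first things these reports surface, it's also worth checking your own rules directly. A quick sketch using Python's standard-library robots.txt parser (the example.com URLs are placeholders for your own site):

```python
from urllib.robotparser import RobotFileParser

# Check whether your robots.txt rules allow Googlebot to crawl given URLs.
parser = RobotFileParser("https://example.com/robots.txt")  # replace with your site
parser.read()

for url in ["https://example.com/", "https://example.com/private/page.html"]:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{url}: {'allowed' if allowed else 'blocked'} for Googlebot")
```

Note that this only tests your published rules; it cannot tell you whether Googlebot is failing to retrieve the robots.txt file itself, which is the cloaking scenario discussed above.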


How to Use the Fetch as Google Tool

First, your site must be added and verified in Google Webmaster Tools; then follow these instructions:

1. On the Webmaster Tools home page, select your verified site.
2. Open the Fetch as Google page under the Crawl section (labeled Diagnostics in older versions).
3. Enter the path of the page you want to test and click Fetch.
4. Once the fetch completes, click its Success status to view the HTTP response and the HTML exactly as Googlebot received it.

More often than not, the issue of not being seen by Google is caused by websites being hacked. Using Google's tools is important in troubleshooting a website's performance in search results. They also make the website owner aware of the site's status, not just for users but also in terms of how Googlebot reads the site's content. Lastly, practice ethical SEO: tricking Google is not a good option. It might be effective at first, but it won't last, for sooner or later Googlebot will recognize a no-value website. Don't wait for Google to find it out; make it right and you'll stay on track with your rankings.

Here is the full video from Google's WebmasterHelp channel:


