Friday, September 2, 2011

Most Common Website Crawl Errors

Hello Everyone

Google is a gold mine for SEO professionals, as it provides a lot of tools for online marketing and search engine optimization. Today we are going to discuss Google Webmaster Tools, which is one of the most important tools for SEO after Google Analytics.

In Google Webmaster Tools, the Crawl Errors section shows the crawl errors for your indexed website. People mostly ignore this section because they feel it is not helpful for a website's ranking. However, crawling is one of the first steps a search engine performs, and this report shows the kinds of crawl errors that prevent a website from being crawled well, which ultimately hurts its presence in the search engines.

I would like to share some of the crawl errors that people most often find on their websites, along with solutions. Just take a look!
Most Common Website Crawl Errors

  • Errors with HTTP status codes - the status code tells Googlebot how the server responded to its request for a page. Below are the common codes people see (a small sketch for checking them yourself follows this list).

            200 - the requested page was returned successfully
            404 - the requested page was not found on the server
            503 - the website's server is temporarily unavailable

  • Unfollowed URL errors – indicate URL-related issues the search engine runs into while crawling the website. Common checks include:

            Test the website with a text browser to see it the way crawlers do
            Watch out for dynamic-page traps (endless URL parameters and session IDs)
            Add a 301 permanent redirect from the old URL to the new one where it is missing
    
  • URLs restricted by robots.txt - happens when the robots.txt file prohibits Googlebot from crawling the URL.

  • URL unreachable errors - happen when the server is down or busy, or when a DNS problem is encountered.
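
If you want to spot-check these status codes yourself, here is a minimal Python sketch. It is only an illustration, not part of Webmaster Tools, and the URLs are placeholders: it requests a couple of pages and prints the code the server returns, including the unreachable cases.

    # Spot-check the HTTP status codes a server returns (placeholder URLs)
    import urllib.request
    import urllib.error

    urls = [
        "https://www.example.com/",          # expect 200 (page returned successfully)
        "https://www.example.com/old-page",  # may answer 404 if the page is gone
    ]

    for url in urls:
        try:
            # urlopen follows redirects and raises HTTPError for 4xx/5xx codes
            with urllib.request.urlopen(url, timeout=10) as response:
                print(url, response.status)          # e.g. 200
        except urllib.error.HTTPError as err:
            print(url, err.code)                     # e.g. 404 or 503
        except urllib.error.URLError as err:
            # "URL unreachable" cases: DNS failure, refused connection, timeout
            print(url, "unreachable:", err.reason)
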
The 301 permanent redirect and the robots.txt file play a vital role in solving the crawl errors reported in Google Webmaster Tools.
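
As a rough sketch of those two checks (again with placeholder URLs and a hypothetical site), the Python snippet below asks whether an old URL answers with a 301 permanent redirect and whether robots.txt allows Googlebot to fetch a page:

    # Check a 301 redirect and a robots.txt rule (placeholder URLs)
    import http.client
    from urllib.parse import urlparse
    from urllib.robotparser import RobotFileParser

    def redirect_status(url):
        """Request the URL without following redirects, so a 301 stays visible."""
        parts = urlparse(url)
        conn = http.client.HTTPSConnection(parts.netloc, timeout=10)
        conn.request("HEAD", parts.path or "/")
        response = conn.getresponse()
        status, location = response.status, response.getheader("Location")
        conn.close()
        return status, location

    # 1. An old URL should answer 301 and point to its new location
    status, location = redirect_status("https://www.example.com/old-page")
    print("old page:", status, "->", location)

    # 2. robots.txt should not block Googlebot from pages you want indexed
    robots = RobotFileParser("https://www.example.com/robots.txt")
    robots.read()
    page = "https://www.example.com/important-page.html"
    print("Googlebot allowed:", robots.can_fetch("Googlebot", page))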

I hope you liked this session; we will meet tomorrow with a new topic.

