123 Internet Group - Blog: 06_07

Wednesday, July 19, 2006

Today's Google Bots and What They Do

Google currently indexes over 8 billion web pages. However, before these pages were placed in the index, they were each crawled by a special spider known as the GoogleBot. Unfortunately, many web masters do not know about the internal workings of this virtual robot.

In fact, Google actually uses a number of spiders to crawl the Web. You can catch these spiders by examining your log files.
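
A quick way to spot them is to scan your access log for Google's user-agent strings. Here is a minimal Python sketch; the log lines and format below are made up for illustration, since real access logs vary by server:

```python
# Scan web server log lines for visits from Google's crawlers.
# Note: plain substring matching is used, so "Googlebot" would also
# match inside "Googlebot-Image"; order the checks if that matters.
GOOGLE_BOTS = [
    "Mediapartners-Google",   # AdSense crawler (MediaBot)
    "Googlebot-Image",        # image search crawler (ImageBot)
    "AdsBot-Google",          # AdWords landing-page crawler (AdsBot)
    "Googlebot",              # main search crawler
]

def find_google_bots(log_lines):
    """Return a count of hits per Google bot found in the log lines."""
    hits = {}
    for line in log_lines:
        for agent in GOOGLE_BOTS:
            if agent in line:
                hits[agent] = hits.get(agent, 0) + 1
                break  # count each line once, for the first agent matched
    return hits

# Invented sample lines in a rough Apache-style format:
sample = [
    '66.249.66.1 - - [19/Jul/2006] "GET / HTTP/1.1" 200 "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.2 - - [19/Jul/2006] "GET /page.html HTTP/1.1" 200 "Mediapartners-Google/2.1"',
]
print(find_google_bots(sample))
# {'Googlebot': 1, 'Mediapartners-Google': 1}
```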

This article will attempt to reveal some of the most important Google spiders, their function, and how they affect you as a web master. We'll start with the well-known GoogleBot.


Googlebot, as you probably know, is the search bot used by Google to scour the web for new pages. Googlebot has two versions, deepbot and freshbot. Deepbot is a deep crawler that tries to follow every link on the web and download as many pages as it can for the Google index. It also examines the internal structure of a site, giving a complete picture for the index.

Freshbot, on the other hand, is a newer bot that crawls the web looking for fresh content. The Google freshbot was implemented to take some of the pressure off of the GoogleBot. The freshbot recalls pages already in the index and then crawls them for new, modified, or updated pages. In this way, Google is better equipped to keep up with the ever-changing Web.

This means that the more you update your web site with new, quality content, the more the Googlebot will come by to check you out.

If you'd like to see the Googlebot crawling around your web property more often, you need to obtain quality inbound links. However, there is also one more step that you should take. If you haven't already done so, you should create a Google Sitemap for your site.

Creating a Google sitemap allows you to communicate with Google, telling them about your most important pages, new pages, and updated pages. In return, Google will provide you with some valuable information as well. Google Sitemaps will tell you about pages it was unable to crawl and links it was unable to follow. This allows you to pinpoint problems and fix them so that you can gain increased exposure in the search results.
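
For reference, a minimal Sitemap file looks like the fragment below. The URL and dates are placeholders, and the namespace shown is the one Google documented at the time; check the current Sitemaps documentation for the exact schema to use:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-07-19</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```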

The next Google bot in our lineup is known as the MediaBot.

MediaBot - used to analyze Adsense pages
useragent: Mediapartners-Google

MediaBot is the Google crawler for Adsense publishers. Mediabot is used to determine which ads Google should display on Adsense pages.

Google recommends that webmasters specifically add a command in their robots.txt file that grants Mediabot access to their entire site. To do this, simply enter the following code into your robots.txt file:

User-agent: Mediapartners-Google*
Disallow:

This will ensure that the MediaBot is able to place relevant Adsense ads on your site.

Keep in mind that ads can still be shown on a page if the MediaBot has not yet visited. If that is the case, the ads chosen will be based on the overall theme of the other pages on the site. If no ads can be chosen, the dreaded public service announcements are displayed instead.

There is a strong debate over whether or not the MediaBot is giving websites with Adsense an advantage in the search engines.

Even Matt Cutts has confirmed that the Adsense Mediabot has indexed webpages for Google's main index.

He states, "Pages with AdSense will not be indexed more frequently. It's literally just a crawl cache, so if e.g. our news crawl fetched a page and then Googlebot wanted the same page, we'd retrieve the page from the crawl cache. But there's no boost at all in rankings if you're in AdSense or Google News. You don't get any more pages crawled either."

Matt Cutts claims that your website does not get any advantage by using Adsense. However, in my mind, having your pages refreshed in Google's crawl cache more quickly is an advantage in and of itself.

This is very similar to Google Analytics: those who run it on their site can also expect a slightly higher degree of spider activity.

However, you certainly shouldn't depend on any of these tools for getting your site indexed. The key to frequent spidering is having quality inbound links, quality content, and frequent updates.

Have images on your site? If so, you have likely been visited by our next Google spider, the ImageBot.

ImageBot - used to crawl for the Image Search
useragent: Googlebot-Image

The Imagebot prowls the Web for images to place in Google's image search. Images are ranked based upon their filename, surrounding text, alt text, and page title.

If you have a website that is primarily image based, then you would definitely want to optimize your images to receive some extra Google traffic.

On the other hand, some web sites may not benefit from Google image search. In most cases, the traffic from the Image search engine is very low quality and rarely converts into buyers. Many people are just looking for images that they can swipe. So, if you want to save some bandwidth, use your robots.txt file to block ImageBot from accessing your image directory.
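
If you do decide to block it, a rule like this in your robots.txt will do the job (assuming your images live in a directory called /images/; substitute your own path):

```
User-agent: Googlebot-Image
Disallow: /images/
```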

One of the few exceptions I would make is if you have a site dedicated to downloadable images.

Our final bot is completely dedicated to the Google Adwords program.

AdsBot - Checks Adwords landing pages for quality
useragent: AdsBot-Google

AdsBot is one of Google's newest spiders. This new crawler is being used to analyze the content of advertising landing pages, which helps determine the Quality score that Google assigns to your ads.

Google uses this Quality score in combination with the amount you are willing to bid to determine the position of your ads. Therefore, ads with a high quality score can rank higher even if other advertisers are paying more than you.
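
Google has never published the exact formula, but the interaction can be sketched with a toy model that simply multiplies bid by Quality score; the numbers below are invented purely for illustration:

```python
# Simplified, illustrative model of ad positioning: the real AdWords
# formula is not public, this just shows how a higher Quality score
# can outrank a higher bid.
def ad_rank(max_bid, quality_score):
    return max_bid * quality_score

# Advertiser A bids less but has a better landing page (higher Quality score).
a = ad_rank(0.50, 8)   # 4.0
b = ad_rank(0.75, 4)   # 3.0
print(a > b)  # True: A ranks above B despite the lower bid
```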

This is one of Google's many efforts to ensure that they are delivering the best results to their users.

Can you still block being spidered? Of course, but it will lower your overall Adwords quality score, which could end up lowering the positioning of your ads. If possible, it is best to give AdsBot complete access to your site.

Today's Google bots are becoming more advanced all the time. However, nothing beats relevant, quality, updated content. Deliver that and the search engines will eat it up.

About The Author
Kim Roach is a staff writer and editor for the SiteProNews and SEO-News newsletters. You can contact Kim at: kim @ seo-news.com

This article may be freely distributed without modification and provided that the copyright notice and author information remain intact.

posted by Unknown @ 7:14 am

Monday, July 17, 2006

How To Make Your Backlinks Count

I think that we all know now how important backlinks are for the search engine ranking success of our web site. I also think that we all know now that one way to get these backlinks is by doing link exchanges with other sites. (For the uninitiated, backlinks are links on other web sites that point to ours.)

But do we know that there are various things we need to check before agreeing to exchange links with another web site? After all, the whole point of exchanging links with others is to benefit from the ranking of their web sites. It is therefore imperative that we investigate ahead of time whether exchanging links with another site is to our advantage or not.

I have thought about this whole idea of link exchange preparation and came up with a 3-step process that involves an analysis of 3 pages of the web site we want to exchange links with. The 3 pages are:

the home page, where we will need to check 7 things about that page,

the directory page: this is the page that contains a list of categories that the site has put together in a bid to organize its link exchanges. We need to check 7 things here,

the backlink page: this is the page that will contain our backlink. We also need to check 7 things here.

Note that in some cases, the 'target' web site will not have a directory page. In that case, our analysis will be a 2-step one rather than a 3-step one.

In this first article, of maybe 2 or 3, we shall look at the 7 things (organized in 2 groups) we should check about the home page of the target web site.

1. We need to check what Google thinks of the site. This is done by looking at some things that Google is happy to report about a site. The idea here is to see if there are any problems with the site, from Google's point of view, that would cause us to decide not to exchange links with them.

I suggest that you look at:

a) how many pages of the site are indexed? You can find this by entering 'site:www.site.com' in the Google search box,

b) how many backlinks does Google report for the site? You can find this by entering 'link:www.site.com' in the Google search box,

c) is the site listed in Google's index? This can be done by using 'http://www.site.com' in the Google search box,

d) what is the PageRank (PR) of the site/home page? There are several ways of finding this. One way is to install the Google toolbar and visit the web site to see its PR. Another way is to use one of the many web sites on the Internet that report the PR of any web site. And still another way is to look for software that will tell you the PR of any web site.
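
The first three checks boil down to query strings you paste into the Google search box, so they are easy to generate for a whole list of candidate sites. A small helper (the function name is mine, and note that PageRank has no query operator, so it is not covered here):

```python
# Build the Google query strings for the first three home-page checks.
# PageRank itself requires the toolbar or an external tool, so it is
# deliberately left out of this helper.
def link_exchange_queries(domain):
    return {
        "pages_indexed": f"site:{domain}",
        "backlinks": f"link:{domain}",
        "listed_in_index": f"http://{domain}",
    }

print(link_exchange_queries("www.site.com"))
# {'pages_indexed': 'site:www.site.com', 'backlinks': 'link:www.site.com',
#  'listed_in_index': 'http://www.site.com'}
```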

Although you will generally need to look at these 4 things together in a sort of table in order to decide whether the target site passes the first step or not, there are definite results that would cause me to decide straight away not to exchange links with a site:

if the PR of the site is zero,

if the site is not listed in the Google index,

if the site does not have any pages indexed by Google.

You will note here that I am using Google as the first step in the preparation. In fact, we can use any search engine if we want to but given that Google is more fussy than others when it comes to backlinks, I would suggest using Google in the above first step.

2. I would then suggest that you look at 3 tags of the home page of the target site, in a bid to determine if the site has a theme compatible with yours:

a) Its title tag,

b) Its description tag,

c) Its keywords tag.

There are several ways of checking the content of a web page's tags. If your browser, such as MSIE, allows you to view the source of a web page, visit the home page and use the View Source menu command, then scroll to the top of the source to see the content of its tags. The alternative is to use one of the many web sites on the Internet that let you view the tags of any web page.
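
This check can also be scripted. Here is a minimal sketch using only Python's standard library; the sample HTML is invented for illustration, and a real script would fetch the page first:

```python
# Extract the title, description, and keywords tags from a page's HTML.
from html.parser import HTMLParser

class TagExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.tags = {"title": "", "description": "", "keywords": ""}

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            attrs = dict(attrs)
            name = (attrs.get("name") or "").lower()
            if name in ("description", "keywords"):
                self.tags[name] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.tags["title"] += data

# Made-up page source for demonstration:
sample = ('<html><head><title>Widget World</title>'
          '<meta name="description" content="All about widgets">'
          '<meta name="keywords" content="widgets, gadgets"></head></html>')
parser = TagExtractor()
parser.feed(sample)
print(parser.tags)
# {'title': 'Widget World', 'description': 'All about widgets',
#  'keywords': 'widgets, gadgets'}
```

Comparing the extracted keywords against your own gives a quick first read on whether the two sites share a theme.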

The reason for this step is that Google takes into account the theme of web sites when looking at backlinks. This means that backlinks from a site that has a theme compatible with yours will be of higher value than one that comes from a site that has nothing to do with what your site is about.

As mentioned above, this preparation is for the home page of the target web site and is only the first step. There are 2 more steps: one for the directory page and one for the backlink page.

It is only when the target web site passes all 3 steps that you can feel confident the link exchange will be to your advantage. Needless to say, the target web site will also want to conduct a similar 3-step analysis of your site before agreeing to exchange links with you.

About The Author
Serge M Botans is the CEO of http://www.seo-analysis.com online-tools where you will find 2 free custom SEO tools. One of these tools will enable you to conduct the link exchange preparation mentioned in this article.

posted by Unknown @ 7:43 am