Friday, April 3, 2009

How Does Google Index A Site

A good starting point for understanding this issue is the program Google uses to spider and crawl a site/blog, generally called a bot: the crawler or spider program that indexes pages for Google. How often it does so, daily or less, depends on the popularity of the blog/site.

Googlebot is divided into three bots:

1. AdSense bot
2. Freshbot
3. DeepCrawl

AdSense bot: the name speaks for itself. Google uses it to recognize what a publisher's pages are all about. It is triggered by the JavaScript of the AdSense code we add to our template, so Google can deliver ads relevant to the page content.
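To make the idea of matching ads to page content concrete, here is a minimal sketch. Everything in it is illustrative, not Google's actual algorithm: it simply scores each ad by how many of its keywords appear in the page text.

```python
import re

# Hypothetical sketch of contextual ad matching: score each ad by keyword
# overlap with the page text. Ad data and scoring are made up for illustration.

def match_ads(page_text, ads):
    """Return ad titles sorted by keyword overlap with the page text."""
    words = set(re.findall(r"[a-z]+", page_text.lower()))
    scored = []
    for ad in ads:
        overlap = len(words & set(k.lower() for k in ad["keywords"]))
        if overlap:
            scored.append((overlap, ad["title"]))
    return [title for _, title in sorted(scored, reverse=True)]

page = "Tips for training your dog: obedience, treats and patience"
ads = [
    {"title": "Dog Treats Online", "keywords": ["dog", "treats"]},
    {"title": "Cheap Flights", "keywords": ["flights", "travel"]},
]
print(match_ads(page, ads))  # only the relevant ad matches
```

The real system is far more sophisticated, but the principle is the same: the page content, as seen by the AdSense bot, decides which ads appear.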

Freshbot is the most active of the three bots; it regularly visits a blog to crawl its most popular pages, whether that is one page or thousands. The frequency of its visits is measured by the popularity of a site/blog: the more popular a site, the more frequently Freshbot will visit it. News sites like BBC and CNN, or online marketplaces like eBay.com, are crawled many times a day, sometimes minutes apart.

Every site/blog has deeper links, so what happens to them when Freshbot finds them during its visit? Freshbot places those links in a database and queues them until DeepCrawl takes action.

The DeepCrawl bot is scheduled to visit a blog monthly. Before its visit, it first checks the deeper links queued in the database, then grabs them all to use as references when crawling the blog. Because it is set to visit a blog monthly, it can take a month to get your entire content fully indexed in Google.
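The two-phase process above can be sketched as follows. The site graph, bot functions, and popularity set are all made up for illustration; the real crawlers are vastly more complex, but the queue-then-drain shape is the same.

```python
from collections import deque

# Hypothetical site: page -> links found on it
SITE = {
    "/": ["/popular-post", "/archive/2008"],
    "/popular-post": ["/archive/2007"],
    "/archive/2008": ["/archive/2008/old-post"],
    "/archive/2007": [],
    "/archive/2008/old-post": [],
}
POPULAR = {"/", "/popular-post"}  # pages Freshbot revisits often

deep_queue = deque()
indexed = set()

def fresh_crawl():
    """Frequent pass: crawl popular pages, queue deeper links for later."""
    for page in sorted(POPULAR):
        indexed.add(page)
        for link in SITE[page]:
            if link not in POPULAR:
                deep_queue.append(link)

def deep_crawl():
    """Monthly pass: drain the queue, following links found along the way."""
    while deep_queue:
        page = deep_queue.popleft()
        if page in indexed:
            continue
        indexed.add(page)
        deep_queue.extend(SITE[page])

fresh_crawl()  # only "/" and "/popular-post" are indexed so far
deep_crawl()   # the archive pages get indexed on the deep pass
print(sorted(indexed))
```

Notice that the archive pages are only reachable through the queue: until the deep pass runs, they exist in the database but not in the index, which is exactly why new deep content can take so long to appear in search results.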

Therefore, as bloggers/webmasters, our task in reaching the top positions is to update a blog regularly, or even frequently, with fresh, informative, or entertaining content, and to build a valuable link building campaign. The last, yet most important, requirement IS PATIENCE.


