What is Googlebot

Googlebot is responsible for adding web pages to Google's organic search index. He is but one piece of the Google organic search puzzle: Googlebot is the crawler; the index is the master database of all pages known to Google; a PageRank score is computed for each page in the index; and a search algorithm ranks pages for each specific user query. Googlebot is the Michael Jordan of web robots: he routinely visits billions of web pages in search of new content, giving his bosses, Larry and Sergey, a significant competitive advantage in running the world's most popular search engine. Google.com delivers approximately 70% of the unpaid, organic search traffic to websites.
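To make the PageRank step concrete, here is a minimal power-iteration sketch over a toy three-page link graph. The pagerank function, the toy graph, and the 0.85 damping factor are illustrative assumptions for this persona; Google's real calculation runs over billions of pages and its details are proprietary.

    def pagerank(links, damping=0.85, iterations=50):
        # links: dict mapping each page to the list of pages it links to.
        pages = list(links)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}
        for _ in range(iterations):
            new_rank = {}
            for p in pages:
                # Sum the rank flowing in from every page that links to p.
                inbound = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
                new_rank[p] = (1 - damping) / n + damping * inbound
            rank = new_rank
        return rank

    # Toy graph: a links to b and c, b links to c, c links back to a.
    print(pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]}))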

In general, Googlebot tries to help ensure that the pages he submits comply with Larry & Sergey's Webmaster Guidelines. In practice, many different parties make decisions about which pages go into The Index or become visible to consumers, so for the purposes of this persona, we'll attribute them all to Googlebot. As a crawler, Googlebot fetches the content of a site and interprets the site owner's robots.txt file (e.g., www.myhostplace.com/robots.txt) to learn which pages he is allowed to visit, as sketched below. He works by reading web pages and then making their content available to all Google services (via Google's caching proxy).
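Here is a minimal sketch of that robots.txt check, using Python's standard urllib.robotparser. The www.myhostplace.com address is the example host from above, and the page path is a hypothetical placeholder.

    from urllib.robotparser import RobotFileParser

    # Fetch and parse the site owner's robots.txt file.
    parser = RobotFileParser()
    parser.set_url("http://www.myhostplace.com/robots.txt")
    parser.read()

    # Ask whether the "Googlebot" user agent may crawl a given page.
    if parser.can_fetch("Googlebot", "http://www.myhostplace.com/some-page.html"):
        print("Allowed to crawl this page")
    else:
        print("Blocked by robots.txt")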

Note: Googlebot identifies himself to web servers with a user-agent string containing "Googlebot," and the host addresses his requests come from resolve to names ending in "googlebot.com."
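That host-name detail gives site owners a way to verify that a request claiming to be Googlebot is genuine: do a reverse DNS lookup on the requesting IP, check the name, then confirm with a forward lookup. Below is a minimal sketch using Python's socket module; the is_googlebot helper and the sample IP address are illustrative assumptions.

    import socket

    def is_googlebot(ip_address):
        # Verify a claimed Googlebot request with a reverse-then-forward DNS check.
        try:
            # Reverse lookup: the host name should end in googlebot.com.
            hostname, _, _ = socket.gethostbyaddr(ip_address)
            if not hostname.endswith(".googlebot.com"):
                return False
            # Forward lookup: the name must resolve back to the same IP.
            return ip_address in socket.gethostbyname_ex(hostname)[2]
        except (socket.herror, socket.gaierror):
            return False

    print(is_googlebot("66.249.66.1"))  # sample IP; real checks use the request's source IP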
