Googlebot

Contributor(s): Matthew Haughn

Googlebot is Google's web crawling bot (also known as a spider or web crawler) that gathers the web page information used to supply Google's search engine results pages (SERPs).

Googlebot collects documents from the web to build Google's search index. By constantly gathering documents, the software discovers new pages and updates to existing pages. Googlebot uses a distributed design spanning many computers so it can grow as the web does.

The crawler uses algorithms to determine which sites to browse, how often to crawl them and how many pages to fetch from each one. Googlebot begins with a list of URLs generated from previous crawl sessions, which is then augmented by the sitemaps that webmasters provide. The software follows the links in every page it crawls, noting new sites, updates to existing sites and dead links. The information gathered is used to update Google's index of the web.
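
A minimal sketch of that crawl loop is shown below, written in Python with only the standard library. The seed URL, page limit and data structures are illustrative assumptions for the example, not details Google has published about how Googlebot works.

```python
"""Minimal crawl-frontier sketch (illustrative only; not Google's actual design).

Seeds the frontier from a previous session, fetches each page, records dead
links, and adds newly discovered links back to the frontier -- the basic loop
the article describes.
"""
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen
from urllib.error import URLError


class LinkExtractor(HTMLParser):
    """Collect href targets from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_urls, max_pages=50):
    frontier = deque(seed_urls)            # list carried over from previous sessions
    seen, dead_links, index = set(seed_urls), [], {}

    while frontier and len(index) < max_pages:
        url = frontier.popleft()
        try:
            with urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except (URLError, ValueError):
            dead_links.append(url)         # note dead links, as the article describes
            continue

        index[url] = html                  # hand the document off to the indexer
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                frontier.append(absolute)  # newly discovered pages join the frontier

    return index, dead_links


if __name__ == "__main__":
    # Hypothetical seed; a real crawler would also merge in webmaster sitemaps.
    pages, dead = crawl(["https://example.com/"], max_pages=5)
    print(f"fetched {len(pages)} pages, {len(dead)} dead links")
```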

Googlebot creates its index within the limitations set forth by webmasters in their robots.txt files. Should a webmaster wish to keep pages hidden from Google search, for example, they can block Googlebot in a robots.txt file placed in the top-level folder of the site. To prevent Googlebot from following any links on a given page, they can include the nofollow meta tag; to prevent the bot from following individual links, they can add rel="nofollow" to the links themselves.
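
To illustrate how such a rule is evaluated, the snippet below uses Python's standard urllib.robotparser module to test whether a robots.txt directive blocks Googlebot from a path. The robots.txt contents, site name and directory names are hypothetical examples for the sketch, not rules from any real site.

```python
"""Check a robots.txt rule against the Googlebot user agent (illustrative example)."""
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt placed at the top-level folder of a site,
# blocking Googlebot from a /private/ directory.
robots_txt = """\
User-agent: Googlebot
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True
```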

A site’s webmaster might detect visits every few seconds from computers at google.com showing the user agent Googlebot. Generally, Google tries to index as much of a site as it can without overwhelming the site’s bandwidth. If a webmaster finds that Googlebot is using too much bandwidth, they can set a crawl rate limit on the Google Search Console homepage that will remain in effect for 90 days.
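
Because any client can claim the Googlebot user agent, webmasters often confirm that such visits really come from Google with a reverse-then-forward DNS check on the visiting IP address, which is Google's published way of verifying Googlebot. The sketch below assumes the address has been pulled from the server's access logs; the sample IP is hypothetical.

```python
"""Verify that a visit claiming to be Googlebot really comes from Google.

Illustrative sketch of the reverse-then-forward DNS check; the sample IP
address is a stand-in for one taken from a server's access logs.
"""
import socket


def is_genuine_googlebot(ip_address: str) -> bool:
    try:
        # Reverse lookup: the host name should end in googlebot.com or google.com.
        hostname, _, _ = socket.gethostbyaddr(ip_address)
        if not hostname.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward lookup: the name must resolve back to the original address,
        # which rules out spoofed reverse-DNS records.
        return socket.gethostbyname(hostname) == ip_address
    except socket.error:
        return False


if __name__ == "__main__":
    # Hypothetical address from a log entry whose user agent was Googlebot.
    print(is_genuine_googlebot("66.249.66.1"))
```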

Presenting at the 2011 SearchLove conference, Josh Giardino claimed that Googlebot is actually the Chrome browser. That would mean that Googlebot can not only read pages as text, as crawlers do, but also run scripts and render media as web browsers do. That capability could allow Googlebot to find hidden information and perform other tasks that Google has not acknowledged. Giardino went so far as to say that Googlebot may be the original reason the company created Chrome.

This was last updated in June 2017
