
crawl depth

Contributor(s): Stan Gibilisco

Crawl depth is the extent to which a search engine indexes pages within a website. Most sites contain multiple pages, which in turn can contain subpages. The pages and subpages grow deeper in a manner similar to the way folders and subfolders (or directories and subdirectories) grow deeper in computer storage.

In general, the deeper a page sits in a website's hierarchy, the lower its chance of ranking highly on a search engine results page (SERP). A website's home page has a crawl depth of 0 by default. Pages on the same site that are linked directly (one click) from the home page have a crawl depth of 1; pages linked directly from crawl-depth-1 pages have a crawl depth of 2, and so on.
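The numbering scheme above amounts to a breadth-first traversal of a site's link graph: each page's crawl depth is the minimum number of clicks from the home page. A minimal sketch (the site structure here is hypothetical, for illustration only):

```python
from collections import deque

def crawl_depths(link_graph, home):
    """Assign crawl depths by breadth-first traversal from the home page.

    link_graph maps each page to the pages it links to. The home page
    gets depth 0, pages one click away get depth 1, and so on; because
    the traversal is breadth-first, each page is assigned the depth of
    its shortest click path.
    """
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in link_graph.get(page, []):
            if target not in depths:  # first visit = shortest path
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical site structure
site = {
    "/": ["/about", "/products"],
    "/products": ["/products/widget"],
}
print(crawl_depths(site, "/"))
# {'/': 0, '/about': 1, '/products': 1, '/products/widget': 2}
```

Note that a page linked from both the home page and a deep page still gets depth 1 — crawl depth follows the shortest route, which is why linking important pages directly from the home page improves their visibility.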

A crawler -- also known as a spider or bot -- is a program that visits websites and reads their pages and other information in order to create entries for a search engine index.
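A crawler's core loop can be sketched as: fetch a page, record its contents in the index, then queue the pages it links to, typically stopping past some maximum depth. This is a schematic sketch only, not a production crawler — `fetch` is a hypothetical stand-in for an HTTP request, and the inverted index is deliberately minimal:

```python
from collections import deque

def crawl_and_index(fetch, start, max_depth=2):
    """Visit pages breadth-first up to max_depth and build an
    inverted index mapping each word to the set of pages containing it.

    fetch(url) is a stand-in for an HTTP fetch; it returns
    (page_text, outgoing_links).
    """
    index = {}
    seen = {start: 0}  # url -> crawl depth
    queue = deque([start])
    while queue:
        url = queue.popleft()
        depth = seen[url]
        text, links = fetch(url)
        for word in text.lower().split():
            index.setdefault(word, set()).add(url)
        if depth < max_depth:  # do not follow links past the depth limit
            for link in links:
                if link not in seen:
                    seen[link] = depth + 1
                    queue.append(link)
    return index

# Stand-in site: url -> (page text, outgoing links)
pages = {
    "/": ("welcome home", ["/blog"]),
    "/blog": ("crawl depth explained", []),
}
index = crawl_and_index(lambda url: pages[url], "/")
print(sorted(index["crawl"]))
# ['/blog']
```

The depth limit is what makes crawl depth matter in practice: pages beyond the crawler's cutoff never enter the index at all.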

This was last updated in October 2012


