Definition

crawl depth

Contributor(s): Stan Gibilisco

Crawl depth is the extent to which a search engine indexes pages within a website. Most sites contain multiple pages, which in turn can contain subpages. The pages and subpages grow deeper in a manner similar to the way folders and subfolders (or directories and subdirectories) grow deeper in computer storage.

In general, the deeper in a website's hierarchy a page sits, the smaller its chance of appearing with a high rank on a search engine results page (SERP). A website's home page has a crawl depth of 0 by default. Pages on the same site that are linked directly (with one click) from the home page have a crawl depth of 1; pages linked directly from crawl-depth-1 pages have a crawl depth of 2, and so on.
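To illustrate, here is a minimal Python sketch of how crawl depths could be assigned by breadth-first traversal of a site's internal links. The link graph, the page paths and the assign_crawl_depth function are hypothetical examples, not part of any particular search engine's implementation.

```python
from collections import deque

def assign_crawl_depth(link_graph, home_page):
    """Assign a crawl depth to every page reachable from the home page.

    link_graph maps each page to the pages it links to directly.
    The home page gets depth 0, pages linked from it get depth 1, and so on.
    """
    depths = {home_page: 0}
    queue = deque([home_page])
    while queue:
        page = queue.popleft()
        for linked_page in link_graph.get(page, []):
            if linked_page not in depths:      # first (shortest) path wins
                depths[linked_page] = depths[page] + 1
                queue.append(linked_page)
    return depths

# Hypothetical site structure: the home page links to two sections,
# and each section links to deeper pages.
site = {
    "/": ["/products", "/about"],
    "/products": ["/products/widgets", "/products/gadgets"],
    "/about": ["/about/team"],
}

print(assign_crawl_depth(site, "/"))
# {'/': 0, '/products': 1, '/about': 1, '/products/widgets': 2,
#  '/products/gadgets': 2, '/about/team': 2}
```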

A crawler -- also known as a spider or bot -- is a program that visits websites and reads their pages and other information in order to create entries for a search engine index.
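As a rough sketch of what such a program does, the following Python snippet fetches one page and collects the links it would follow next; a real search-engine crawler layers politeness rules (robots.txt), scheduling, deduplication and indexing on top of this. The start URL and the LinkExtractor class are illustrative only, and error handling is omitted.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collect the href targets of <a> tags found on a page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def crawl_one_page(url):
    """Fetch a single page and return the absolute links found on it."""
    with urlopen(url) as response:   # illustrative: no retries or robots.txt check
        html = response.read().decode("utf-8", errors="replace")
    extractor = LinkExtractor(url)
    extractor.feed(html)
    return extractor.links

# Hypothetical usage: read the home page (depth 0) and list depth-1 candidates.
for link in crawl_one_page("https://example.com/"):
    print(link)
```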

This was last updated in October 2012

Join the conversation

2 comments


Thanks for your answer. I need to know one thing: my site has multiple subdomains, so I need to calculate the depth of the requested URL within its subdomain and skip crawling that subdomain if the depth is not 1. I understand that response.meta["depth"] returns the depth of the URL within the entire site, but how do I calculate the depth relative to the subdomain of the requested URL?
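One possible approach, sketched with Scrapy, is to carry a separate per-subdomain counter in the request meta and reset it whenever a link crosses to a different host; Scrapy's built-in depth middleware still reports the site-wide depth in response.meta["depth"]. The spider name, start URL and the subdomain_depth key below are hypothetical, and the depth-not-greater-than-1 rule is just one reading of the question.

```python
from urllib.parse import urlparse

import scrapy

class SubdomainDepthSpider(scrapy.Spider):
    """Follow links, but stop descending once a subdomain's own depth exceeds 1."""
    name = "subdomain_depth"                   # hypothetical spider name
    start_urls = ["https://www.example.com/"]  # hypothetical site

    def parse(self, response):
        current_host = urlparse(response.url).hostname
        # Depth counted from the first page seen on this subdomain,
        # unlike response.meta["depth"], which is site-wide.
        subdomain_depth = response.meta.get("subdomain_depth", 0)

        yield {
            "url": response.url,
            "site_depth": response.meta.get("depth", 0),
            "subdomain_depth": subdomain_depth,
        }

        for href in response.css("a::attr(href)").getall():
            next_host = urlparse(response.urljoin(href)).hostname
            # Reset the counter when the link crosses into another subdomain,
            # otherwise increment it; skip links that would go deeper than 1.
            next_depth = 0 if next_host != current_host else subdomain_depth + 1
            if next_depth <= 1:
                yield response.follow(href, callback=self.parse,
                                      meta={"subdomain_depth": next_depth})
```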
Thank you for this informative topic.