
crawl depth

Contributor(s): Stan Gibilisco

Crawl depth is the extent to which a search engine indexes pages within a website. Most sites contain multiple pages, which in turn can contain subpages. Pages and subpages nest ever deeper, much as folders and subfolders (or directories and subdirectories) do in computer storage.

In general, the deeper a page sits in a website's hierarchy, the smaller its chance of appearing with a high rank on a search engine results page (SERP). A website's home page has a crawl depth of 0 by default. Pages on the same site that are linked directly (one click) from the home page have a crawl depth of 1; pages linked directly from crawl-depth-1 pages have a crawl depth of 2, and so on.
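The numbering described above is just a breadth-first traversal from the home page: each page's crawl depth is the length of the shortest link path from depth 0. A minimal sketch, using a hypothetical in-memory site map (the page paths here are made up for illustration):

```python
from collections import deque

# Hypothetical site map: each page lists the pages it links to.
site_links = {
    "/": ["/about", "/products"],
    "/about": ["/team"],
    "/products": ["/products/widgets"],
    "/products/widgets": [],
    "/team": [],
}

def crawl_depths(links, home="/"):
    """Assign each reachable page its crawl depth by breadth-first search:
    the home page is depth 0, pages linked from it are depth 1, and so on."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:          # first (shortest) path wins
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths
```

Because the search is breadth-first, a page reachable by several paths gets the depth of its shortest path, which matches how crawl depth is usually reported.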

A crawler -- also known as a spider or bot -- is a program that visits websites and reads their pages and other information in order to create entries for a search engine index.
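A crawler that honors a depth limit simply stops following links once a page's depth reaches the cutoff. A minimal sketch, with a stand-in `pages` table playing the role of fetching a URL and extracting its links (a real spider would make an HTTP request and parse the HTML; the paths and page text here are invented for illustration):

```python
from collections import deque

# Stand-in for fetching: maps each page to its text and outgoing links.
pages = {
    "/": ("Home page", ["/docs"]),
    "/docs": ("Documentation", ["/docs/api"]),
    "/docs/api": ("API reference", []),
}

def crawl(start="/", max_depth=1):
    """Visit pages breadth-first, recording an index entry for every page
    whose crawl depth does not exceed max_depth."""
    index = {}
    queue = deque([(start, 0)])
    seen = {start}
    while queue:
        url, depth = queue.popleft()
        text, links = pages[url]
        index[url] = text                     # create the search-index entry
        if depth < max_depth:                 # only follow links below the cutoff
            for link in links:
                if link not in seen:
                    seen.add(link)
                    queue.append((link, depth + 1))
    return index
```

With `max_depth=1` the crawler indexes the home page and the pages it links to, but never reaches `/docs/api` at depth 2; raising the limit to 2 brings it in.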

This was last updated in October 2012

