Definition

crawl depth

Contributor(s): Stan Gibilisco

Crawl depth is the extent to which a search engine indexes pages within a website. Most sites contain multiple pages, which in turn can contain subpages. The pages and subpages grow deeper in a manner similar to the way folders and subfolders (or directories and subdirectories) grow deeper in computer storage.

In general, the deeper in the website hierarchy a particular page sits, the smaller its chance of ranking highly on a search engine results page (SERP). A website's home page has a crawl depth of 0 by default. Pages on the same site that are linked directly (one click) from the home page have a crawl depth of 1; pages linked directly from crawl-depth-1 pages have a crawl depth of 2, and so on.
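The numbering described above amounts to a breadth-first traversal of the site's link graph: each page's crawl depth is the minimum number of clicks needed to reach it from the home page. A minimal sketch, assuming a toy in-memory link graph (the page paths are invented for illustration; a real crawler would build these edges by parsing each page's HTML):

```python
from collections import deque

# Hypothetical link graph: page -> pages it links to.
links = {
    "/": ["/about", "/products"],
    "/about": ["/team"],
    "/products": ["/products/widget"],
    "/team": [],
    "/products/widget": [],
}

def crawl_depths(start="/"):
    """Assign each reachable page its crawl depth: the minimum number
    of clicks from the home page (which has depth 0)."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # first visit = shortest path (BFS)
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

print(crawl_depths())
# "/" has depth 0, "/about" and "/products" depth 1,
# "/team" and "/products/widget" depth 2.
```

Because the traversal is breadth-first, a page reachable by several routes is assigned the depth of its shortest route, which matches the one-click, two-click counting in the definition.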

A crawler -- also known as a spider or bot -- is a program that visits websites and reads their pages and other information in order to create entries for a search engine index.

This was last updated in October 2012

Join the conversation

2 comments


Thanks for the explanation. One thing I still need to know: my site has multiple subdomains, and I want to calculate the depth of each requested URL within its subdomain so that I can skip crawling a subdomain once the depth exceeds 1. I know that response.meta["depth"] returns the URL's depth within the entire site, but how can I calculate the depth relative to the subdomain?
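One way to approach the question above (a framework-agnostic sketch, since the commenter's Scrapy setup is not shown here): keep a separate per-request depth counter that resets to 0 whenever a followed link crosses into a different host. The function name and arguments below are illustrative, not part of any crawler API:

```python
from urllib.parse import urlparse

def subdomain_depth(parent_depth, parent_url, target_url):
    """Crawl depth of target_url relative to its subdomain: 0 if the link
    crosses into a different host (i.e. this request enters the subdomain),
    otherwise one more than the parent's subdomain depth."""
    if urlparse(target_url).hostname != urlparse(parent_url).hostname:
        return 0
    return parent_depth + 1

# Following a link from the main site into a subdomain resets the counter:
d = subdomain_depth(3, "https://example.com/a", "https://docs.example.com/")
# d == 0; a further in-subdomain link would get depth 1, and the crawler
# could skip any request whose subdomain depth exceeds 1.
```

In a Scrapy-style crawler, this per-subdomain counter could be carried in the request's meta dict alongside the built-in site-wide depth, and requests filtered on it before being scheduled.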
Thank you for this informative article.
