robots.txt

Contributor(s): Matthew Haughn

Robots.txt is a plain text file on a website that tells search engine crawlers which parts of the site their bot programs should not access. It uses a small set of special commands and syntax written for web crawlers. Though never officially standardized, robots.txt is generally honored by all major search engines.
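The syntax pairs a user agent (a named crawler, or * for all crawlers) with the paths that crawler should not visit. A minimal sketch, using hypothetical directory names for illustration:

    # Applies to all crawlers; the paths below are illustrative
    User-agent: *
    Disallow: /admin/
    Disallow: /tmp/

Paths not listed under a Disallow rule remain open to crawling.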

Spider programs, such as Googlebot, index a website using the instructions set forth by the site's webmaster. Sometimes parts of a site have not been optimized for search engines, or some pages are prone to exploitation by spammers, for example, through link spam on a page that features user-generated content (UGC). Should webmasters wish to keep such pages hidden from Google search, they can block them with a robots.txt file placed in the site's top-level (root) directory. Robots.txt is also known as “the robots exclusion protocol.” Preventing crawlers from indexing spammy content means those pages will not be considered when determining PageRank and placement in search engine results pages (SERPs).
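As a sketch of that use case, a robots.txt record that keeps only Googlebot away from a UGC section might read as follows (the /comments/ path is an assumption for illustration):

    # Block Google's crawler from the user-generated-content pages
    User-agent: Googlebot
    Disallow: /comments/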

The nofollow tag is another way to control web crawler behavior. Nofollow stops crawlers from tallying the links within a page when determining PageRank, and webmasters can use it to avoid search engine optimization (SEO) penalties. To prevent Googlebot from following any links on a given page, the webmaster can include a nofollow meta tag in that page's HTML head (not in robots.txt itself); to prevent the bot from following individual links, they can add rel="nofollow" to those links.
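Both forms, sketched as they would appear in a page's HTML (the URL and anchor text here are hypothetical):

    <!-- In the page's <head>: tells crawlers not to follow any links on this page -->
    <meta name="robots" content="nofollow">

    <!-- On an individual link: tells crawlers not to follow or credit just this link -->
    <a href="http://example.com/untrusted-page" rel="nofollow">user-submitted link</a>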

This was last updated in June 2017
