Mavention Robots.txt for Sitecore

Generating robots.txt files with Mavention Robots.txt for Sitecore

The robots exclusion protocol (REP), better known as robots.txt, is a text file that webmasters create to instruct robots (typically search engine robots) how to crawl and index pages on their website. Although it is not an official standard, robots.txt is a common way for Internet-facing websites to exclude parts of the site from crawling. Find out how you can easily create and manage the robots.txt file with Mavention Robots.txt for Sitecore.
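
To give an idea of what such a file looks like, here is a minimal, illustrative robots.txt; the paths shown are examples only and not specific to Sitecore or to this product:

    # Applies to all crawlers
    User-agent: *
    # Keep crawlers out of a hypothetical admin area
    Disallow: /admin/
    # Point crawlers to the XML sitemap (example URL)
    Sitemap: https://www.example.com/sitemap.xml

A crawler that honors the protocol reads this file from the root of the site (e.g. https://www.example.com/robots.txt) before requesting other pages and skips the disallowed paths.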