The Robots Exclusion Protocol (REP), or robots.txt, is a text file that webmasters create to instruct robots (typically search engine crawlers) how to crawl and index pages on their website. Although it is not an official standard, robots.txt is a common way for Internet-facing websites to exclude parts of a site from crawling. Find out how you can easily create and manage the robots.txt file with Mavention Robots.txt for Sitecore.
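A minimal robots.txt, served from the root of the site, might look like the sketch below. The paths and sitemap URL are purely illustrative, not part of Mavention Robots.txt itself:

```
# Applies to all crawlers
User-agent: *
# Keep crawlers out of the Sitecore client and internal search pages
Disallow: /sitecore/
Disallow: /search/

# Point crawlers at the XML sitemap (hypothetical URL)
Sitemap: https://www.example.com/sitemap.xml
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access-control mechanism.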
Recently I followed the Sitecore XP 8 Website Development for .NET Developers (WND) training. I am happy to announce that I passed the exam and am now a Sitecore XP 8.0 Certified Professional Developer.
By default, SharePoint Search crawls the contents of all site columns so that they are available for search queries. While most would consider that a good thing, there are times when it can be a hindrance, especially on public-facing websites, where column values can surface in search even though they are not relevant to the content of the page or the metadata about the page.
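One way to keep a specific column out of the crawl is the `NoCrawl` property on `SPField` in the SharePoint server object model. The sketch below assumes on-premises SharePoint with farm access; the site URL and the "InternalNotes" column name are hypothetical placeholders:

```powershell
# Sketch: exclude a single site column from the search crawl.
# Requires the SharePoint Server object model (run on a farm server).
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$web = Get-SPWeb "http://intranet.contoso.com"   # placeholder URL
$field = $web.Fields["InternalNotes"]            # hypothetical site column

# SPField.NoCrawl tells the search indexer to skip this field's contents
$field.NoCrawl = $true
$field.Update()

$web.Dispose()
```

The change typically only takes effect in results after the content source is recrawled.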