Dynamically Create robots.txt for Multi-Site in Sitecore

Sitecore doesn’t generate a per-site robots.txt out of the box, so for multisite solutions we need to build this functionality ourselves. In this article, we will learn how to dynamically create robots.txt for multi-site in Sitecore.

We need to create a custom processor that serves the response in plain-text format, and then register it as an HttpRequestProcessor in Sitecore’s httpRequestBegin pipeline.

How to Dynamically Create robots.txt for Multi-Site in Sitecore

Step 1: Create a Robots data template and add a field named Site Robots (a Multi-Line Text field works well for this).


Step 2: Set default text in the Standard Values of the template so every robots.txt has a sensible fallback. For example, it is not good practice to let search engines crawl Sitecore pages such as ShowConfig.aspx, so we can add text to the Standard Values that blocks crawlers from the Sitecore client paths.
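As a sketch, a Standard Values entry along these lines would keep crawlers out of the Sitecore client (the exact paths you block are up to you and your solution layout):

```
User-agent: *
Disallow: /sitecore/
Disallow: /sitecore modules/
Disallow: /App_Config/
```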

Step 3: Based on your content architecture, create (or inherit) an item derived from the Robots data template and add your site-specific text in the editor.

Step 4: Create the custom HttpRequestProcessor

Create a class file and add the below code in the class file.

The code is pretty straightforward: if the request URL ends with robots.txt, the request is intercepted, the robots.txt content is fetched from the item, and it is returned in plain-text format. If the field is left blank, the default value we set in the Standard Values is served instead.
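The processor described above can be sketched roughly as follows. This is a minimal illustration, not the article’s exact code: the namespace, class name, and the assumption that the Robots item is the site’s start item (with the Site Robots field resolved through Standard Values when blank) are all mine.

```csharp
using System;
using System.Web;
using Sitecore.Data.Items;
using Sitecore.Pipelines.HttpRequest;

namespace MySite.Pipelines // hypothetical namespace
{
    public class RobotsTxtProcessor : HttpRequestProcessor
    {
        public override void Process(HttpRequestArgs args)
        {
            HttpContext context = HttpContext.Current;
            if (context == null || Sitecore.Context.Site == null || Sitecore.Context.Database == null)
                return;

            // Only intercept requests ending in robots.txt
            if (!context.Request.Url.AbsolutePath.EndsWith("robots.txt", StringComparison.OrdinalIgnoreCase))
                return;

            var robotsContent = string.Empty;

            // Assumption: the site's start item inherits the Robots template.
            // If the field is blank on the item, Sitecore falls back to the
            // value defined on the template's Standard Values.
            Item robotsItem = Sitecore.Context.Database.GetItem(Sitecore.Context.Site.StartPath);
            if (robotsItem != null && robotsItem.Fields["Site Robots"] != null)
                robotsContent = robotsItem.Fields["Site Robots"].Value;

            // Return the content as plain text and end the request
            context.Response.ContentType = "text/plain";
            context.Response.Write(robotsContent);
            context.Response.End();
        }
    }
}
```

Because the processor reads the start item of the resolved site context, each site in a multisite instance automatically serves its own robots.txt content.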

Step 5: The final step is to patch the configuration. The configuration must allow requests with the .txt extension to reach the pipeline, and we need to register the HttpRequestProcessor we created.

Browse the site and hit the robots.txt URL, and you should see the dynamically generated robots.txt text. This code also supports multisite with no changes needed: each site serves its own content. To localize, create a language version of the item and add the content to it.


