Hi everyone,
For a project I’m currently working on I needed a dynamic robots.txt. Our test environment is public-facing, so we want to keep it from being indexed by Google and other search engines. It took a bit of Googling to find a solution that worked, but in the end it was actually pretty simple.
Here’s the action in one of the API Controllers:
using System.Net;
using System.Net.Http;
using System.Text;
using System.Web.Http;

public class UtilitiesController : CustomBaseApiController
{
    [Route("Robots.txt")]
    [HttpGet]
    public HttpResponseMessage GetRobotsFile()
    {
        var resp = new HttpResponseMessage(HttpStatusCode.OK);
        var stringBuilder = new StringBuilder();

        if (Helpers.IsProduction())
        {
            // Allow bots in production (an empty Disallow permits everything)
            stringBuilder.AppendLine("User-agent: *");
            stringBuilder.AppendLine("Disallow:");
        }
        else
        {
            // Don't allow bots in non-production environments
            // (Disallow: / is the standard way to block the whole site)
            stringBuilder.AppendLine("User-agent: *");
            stringBuilder.AppendLine("Disallow: /");
        }

        // Serve as text/plain so crawlers parse it correctly
        resp.Content = new StringContent(stringBuilder.ToString(), Encoding.UTF8, "text/plain");
        return resp;
    }
}
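Helpers.IsProduction() is just a little utility from our project. For reference, a minimal sketch of it might read an appSetting from web.config; the "Environment" key here is hypothetical, so use whatever your project already has for detecting the environment:

using System;
using System.Configuration;

public static class Helpers
{
    // Minimal sketch: treat the app as production when the (hypothetical)
    // "Environment" appSetting in web.config is set to "Production".
    public static bool IsProduction()
    {
        var environment = ConfigurationManager.AppSettings["Environment"];
        return string.Equals(environment, "Production", StringComparison.OrdinalIgnoreCase);
    }
}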
You’ll also need to add the following to your web.config so that requests for robots.txt can be processed by the routing handler. Without this, IIS will attempt to serve it as a static file and will return a 404 when it’s not found.
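Something like this handler registration works (the name is arbitrary, and the preCondition assumes the integrated pipeline on .NET 4.x):

<system.webServer>
  <handlers>
    <add name="RobotsTxt"
         path="robots.txt"
         verb="GET"
         type="System.Web.Handlers.TransferRequestHandler"
         preCondition="integratedMode,runtimeVersionv4.0" />
  </handlers>
</system.webServer>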
In production you’ll end up with the following:
User-agent: *
Disallow:
And in any other environment:
User-agent: *
Disallow: /
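You can sanity-check it by hitting the endpoint directly, e.g. with curl (the port here is just a typical IIS Express SSL port, so adjust for your site):

curl -k https://localhost:44300/robots.txt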
Thanks to these answers on Stack Overflow for the info:
https://stackoverflow.com/a/52270877/522859
https://stackoverflow.com/a/17037935/522859