 

How can I manage robots.txt in Azure app service with a staging slot

I have an ASP.NET MVC app running in an Azure app service with one staging slot, and a build and release pipeline in VSTS.

I want the production slot's robots.txt to contain Allow: / and the staging slot's robots.txt to contain Disallow: / at all times.
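In other words, regardless of which code is deployed, production should serve:

User-agent: *
Allow: /

and staging should serve:

User-agent: *
Disallow: /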

Currently we change robots.txt manually every time we do a swap, but this is error-prone.

How can I automate this process?

To solve this problem I did consider generating the robots.txt file dynamically based on app settings set in the Azure portal (configured to stay with the slot); however, this won't work, since after the swap happens prod would end up with the staging Disallow rule.

Can anyone advise the best way to manage this?

Vinyl Warmth asked Dec 13 '17


People also ask

What is staging slot in Azure?

You can validate app changes in a staging deployment slot before swapping it with the production slot. Deploying an app to a slot first and swapping it into production makes sure that all instances of the slot are warmed up before being swapped into production. This eliminates downtime when you deploy your app.

How do deployment slots work in Azure?

Azure Functions deployment slots allow your function app to run different instances called "slots". Slots are different environments exposed via a publicly available endpoint. One app instance is always mapped to the production slot, and you can swap instances assigned to a slot on demand.

What does deployment slot setting mean?

A deployment slot setting is one that sticks to its slot. With this setting in place, database connection strings are not swapped when the slots are swapped, so the staging slot always points to the staging database and the production slot always points to the production database.
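As a quick illustration (not from the question; the connection string name "DefaultConnection" is just an example), code that reads its connection string through configuration automatically picks up whatever value App Service injects for the slot it is running in:

using System.Configuration;

public static class Db
{
    // App Service overrides this value per slot when the connection string
    // is configured in the portal and marked as a deployment slot setting,
    // so no code change is needed after a swap.
    public static string GetConnectionString()
    {
        return ConfigurationManager
            .ConnectionStrings["DefaultConnection"]
            .ConnectionString;
    }
}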


1 Answer

Robots.txt is mainly used by search engines to decide which pages of a public website to crawl. Staging and other deployment slots are not public (and should not be public, unless you have a good reason for that), so it does not make much sense to configure and manage robots.txt for them. Secondly, in most cases I would recommend redirecting any public request to your production slot and keeping staging offline and active for internal use only. That also keeps your analytics and logs limited to public traffic, instead of being polluted by internal and deployment-slot requests.

Anyway, if you are still inclined to do this, there is one way you can manage it: write your own route for the robots file and render a text/plain response whose content depends on whether the request hits the staging or the production slot. Something like this:

// Create the robots.txt response dynamically by handling the URL in an MVC action
// (requires the System, System.Text, System.Configuration and System.Web.Mvc namespaces)
[Route("robots.txt")]
public ContentResult DynamicRobotsFile()
{
    StringBuilder content = new StringBuilder();
    content.AppendLine("User-agent: *");

    // Check the condition by URL or environment variable; here it reads an
    // app setting ("RobotsAllow" is an example name) that is marked as a
    // deployment slot setting so it stays with the slot after a swap
    bool allow = string.Equals(
        ConfigurationManager.AppSettings["RobotsAllow"],
        "true",
        StringComparison.OrdinalIgnoreCase);

    if (allow)
    {
        content.AppendLine("Allow: /");
    }
    else
    {
        content.AppendLine("Disallow: /");
    }

    return this.Content(content.ToString(), "text/plain", Encoding.UTF8);
}

This way you control how robots.txt is generated, and you can decide per slot whether to allow or disallow the robots. You can put this action in a separate controller or just add it to the home controller of your app.
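One thing to keep in mind (not covered above, and based on the default ASP.NET MVC 5 project template rather than the asker's actual code): the [Route("robots.txt")] attribute only takes effect if attribute routing is enabled, which is usually done in RouteConfig:

using System.Web.Mvc;
using System.Web.Routing;

public class RouteConfig
{
    public static void RegisterRoutes(RouteCollection routes)
    {
        routes.IgnoreRoute("{resource}.axd/{*pathInfo}");

        // Enable attribute routing so [Route("robots.txt")] is picked up
        routes.MapMvcAttributeRoutes();

        routes.MapRoute(
            name: "Default",
            url: "{controller}/{action}/{id}",
            defaults: new { controller = "Home", action = "Index", id = UrlParameter.Optional }
        );
    }
}

Depending on how IIS is configured you may also need to make sure requests for robots.txt reach the managed pipeline instead of the static file handler (for example, remove any physical robots.txt from the site root and, if necessary, add a handler mapping or runAllManagedModulesForAllRequests in web.config).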

Now that you know how to do this, you can set up the app settings (environment variables) for the production and staging slots and use them to check any other requirements.
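If you would rather key off the slot itself than off a slot-sticky app setting, one option (a sketch only; the helper name is made up, and the exact values are worth verifying for your app) is to read the WEBSITE_SLOT_NAME environment variable that App Service sets for each slot:

using System;

public static class SlotInfo
{
    // WEBSITE_SLOT_NAME is set by Azure App Service; the production slot
    // typically reports "Production" and other slots report their own name.
    // Verify the values for your app before relying on this.
    public static bool IsProductionSlot()
    {
        string slot = Environment.GetEnvironmentVariable("WEBSITE_SLOT_NAME");
        return string.IsNullOrEmpty(slot)
            || string.Equals(slot, "Production", StringComparison.OrdinalIgnoreCase);
    }
}

The allow flag in the action above could then simply be bool allow = SlotInfo.IsProductionSlot();.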

Afzaal Ahmad Zeeshan answered Oct 02 '22