Installing the AWS Inspector Agent on a Windows EC2 Instance

Hi everyone,

Just a quick post on installing the AWS Inspector Agent on a Windows EC2 instance.

Open PowerShell and run the following command:

(new-object System.Net.WebClient).DownloadFile('https://inspector-agent.amazonaws.com/windows/installer/latest/AWSAgentInstall.exe','C:\Users\Administrator\Desktop\AWSAgentInstall.exe')
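
If you're on PowerShell 3.0 or later, Invoke-WebRequest will do the same job (the desktop path here is just an example, drop the file wherever you like):

    # Same download using Invoke-WebRequest (PowerShell 3.0+); the output path is just an example
    Invoke-WebRequest -Uri 'https://inspector-agent.amazonaws.com/windows/installer/latest/AWSAgentInstall.exe' -OutFile "$env:USERPROFILE\Desktop\AWSAgentInstall.exe"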

On your desktop, right-click AWSAgentInstall.exe and select Run as administrator, then follow the prompts.

Go to Run and execute services.msc; you should now see the Amazon SSM Agent service listed.
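
If you'd rather check from PowerShell than the Services console, something along these lines works (the display-name filter is just an example, adjust it to match what you see in services.msc):

    # List agent-related services and their current status (display-name filter is just an example)
    Get-Service -DisplayName "*Agent*" | Select-Object Name, DisplayName, Status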

If you go to the AWS console > Amazon Inspector > Assessment targets > click on your relevant target > Preview Target:

Your agent status should now be healthy.

Thanks to these links for the info:
https://superuser.com/a/330754/124014
https://docs.aws.amazon.com/inspector/latest/userguide/inspector_installing-uninstalling-agents.html#install-windows

Dynamic Robots.txt with Web API 2

Hi everyone,

For a project I’m currently working on, I needed a dynamic robots.txt. Our test environment is public facing, so we want to keep it from being indexed by Google and other search engines. It took a bit of Googling to find a solution that worked, but in the end it was actually pretty simple.

Here’s the action in one of the API Controllers:

    public class UtilitiesController : CustomBaseApiController
    {
        [Route("Robots.txt")]
        [HttpGet]
        public HttpResponseMessage GetRobotsFile()
        {
            var resp = new HttpResponseMessage(HttpStatusCode.OK);
            var stringBuilder = new StringBuilder();

            if (Helpers.IsProduction())
            {
                // Allow bots in production
                stringBuilder.AppendLine("user-agent: *");
                stringBuilder.AppendLine("disallow: ");
            }
            else
            {
                // Don't allow bots in non-production environments
                stringBuilder.AppendLine("user-agent: *");
                stringBuilder.AppendLine("disallow: *");
            }

            resp.Content = new StringContent(stringBuilder.ToString());

            return resp;
        }
    }

You’ll also need to add the following to your web.config so that requests for robots.txt are passed to the routing handler. Without this, IIS will attempt to serve it as a static file and will return a 404 when it’s not found. The mapping below is the usual approach (the handler name itself is arbitrary):

    <system.webServer>
        <handlers>
            <!-- Hand robots.txt to the managed routing handler instead of the static file handler -->
            <add name="Robots" path="Robots.txt" verb="GET" type="System.Web.Handlers.TransferRequestHandler" preCondition="integratedMode,runtimeVersionv4.0" />
        </handlers>
    </system.webServer>

In production you’ll end up with the following:

user-agent: *
disallow:

And in any other environment:

user-agent: *
disallow: *
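
Once it’s deployed you can sanity-check the endpoint with a quick request from PowerShell (swap in your own host name):

    # Fetch the generated robots.txt and print it (replace the host with your own site)
    (Invoke-WebRequest -Uri 'https://example.com/robots.txt' -UseBasicParsing).Content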

Thanks to these answers on Stack Overflow for the info:
https://stackoverflow.com/a/52270877/522859
https://stackoverflow.com/a/17037935/522859