Automating Scout Suite Scans for AWS

Airman
5 min read · Nov 13, 2023

This post is part 1 of a series.

Image source: NCC Group ScoutSuite GitHub repo

Background

Scout Suite is an open-source, multi-cloud security-auditing tool created by NCC Group that enables security posture assessments of cloud environments. It is written in Python and available here: https://github.com/nccgroup/ScoutSuite

This guide outlines how to automate Scout Suite scans of your AWS environment. We will build a Scout Suite Docker image and configure an AWS Lambda function to enable manually triggered scans of your account. In subsequent guides I will show how to run the scans on a schedule, and how to automate the whole process of deploying the scan resources using Infrastructure as Code.

Scout Suite Container Image

The Scout Suite documentation provides instructions on how to build and use a container image, but I decided to build a custom image, mainly for the following reasons:

  • I can make the container image smaller (in my tests 1.8 GB vs 4.2 GB) by removing unneeded dependencies (GCP, Azure, etc.).
  • I want the image to run a scan and upload the scan report to S3 once it completes.
  • I want to be able to run the scans as a Lambda function.

You can find the source for building the container image in my GitHub here: https://github.com/airman604/dockerfiles/tree/master/ScoutSuite

The Dockerfile is pretty simple. It uses the python:3-slim base image to reduce the resulting image size (compared to the full python base image), adds a non-root user to run the scan as, and adds a script to be used as the ENTRYPOINT.
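A minimal Dockerfile along these lines would look roughly as follows. This is an illustrative sketch, not the exact file from the repo: the package list, user name, and paths are assumptions.

```dockerfile
# Illustrative sketch only -- see the linked repo for the real Dockerfile.
FROM python:3-slim

# Scout Suite plus the AWS CLI for the S3 upload (assumed dependencies)
RUN pip install --no-cache-dir scoutsuite awscli

# Non-root user to run the scan as (the name "scout" is an assumption)
RUN useradd --create-home scout

COPY entrypoint.sh /entrypoint.sh
RUN chmod +x /entrypoint.sh

USER scout
WORKDIR /home/scout

ENTRYPOINT ["/entrypoint.sh"]
```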

entrypoint.sh has a couple of interesting features:

  • It expects the S3_BUCKET and S3_PREFIX environment variables to be set, and uploads the Scout Suite report to the specified S3 location as scout-<TIMESTAMP>.tar.gz.
  • It uses the AWS_LAMBDA_RUNTIME_API environment variable to detect whether the container is running as a Lambda function. If so, it uses the Lambda runtime API to “accept” the invocation and returns an invocation response at the end; without this logic, Lambda would always conclude the function terminated with an error. This is not a fully functioning Lambda runtime: it exits once a single scan completes, whereas normal Lambda runtimes are supposed to wait for the next invocation. I think implementing the full runtime is not worth the hassle, considering we’re unlikely to run multiple scans in rapid succession.
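The Lambda-runtime handling can be sketched like this. This is a simplified, hypothetical reconstruction, not the actual entrypoint.sh from the repo: the helper function names are invented, and the `scout aws` invocation and upload details are assumptions, though the `/2018-06-01/runtime/invocation/...` paths are the documented Lambda runtime API endpoints.

```shell
#!/bin/sh
# Simplified sketch of the entrypoint logic (hypothetical helper names).

# Report name: scout-<TIMESTAMP>.tar.gz
report_name() {
  echo "scout-$(date +%Y%m%d%H%M%S).tar.gz"
}

# Detect whether we are running inside Lambda: the runtime sets
# AWS_LAMBDA_RUNTIME_API to the host:port of the runtime API.
is_lambda() {
  [ -n "${AWS_LAMBDA_RUNTIME_API:-}" ]
}

# Run the scan and upload the archived report to S3 (assumed commands).
run_scan() {
  REPORT="$(report_name)"
  scout aws --no-browser
  tar -czf "/tmp/${REPORT}" scoutsuite-report
  aws s3 cp "/tmp/${REPORT}" "s3://${S3_BUCKET}/${S3_PREFIX}/${REPORT}"
}

handle_one_invocation() {
  # "Accept" the pending invocation via the Lambda runtime API ...
  API="http://${AWS_LAMBDA_RUNTIME_API}/2018-06-01/runtime/invocation"
  HEADERS="$(mktemp)"
  curl -sS -D "$HEADERS" -o /dev/null "${API}/next"
  REQUEST_ID="$(awk -F': ' 'tolower($1)=="lambda-runtime-aws-request-id" {print $2}' "$HEADERS" | tr -d '\r')"
  run_scan
  # ... and report success so Lambda does not flag the invocation as an error.
  curl -sS -X POST -d '{"status": "scan complete"}' "${API}/${REQUEST_ID}/response"
}

# The real script would end with a dispatch along the lines of:
#   if is_lambda; then handle_one_invocation; else run_scan; fi
```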

Building the Image

To build the image:

  • Clone the repository
  • Build the image
  • Push the image to a registry (needed for the next section)

Below are the commands with explanations. Before running these, ensure your AWS CLI is configured with credentials that have access to ECR. Also, replace the values of AWS_REGION and AWS_ACCOUNT_ID variables.

# clone the repository
git clone https://github.com/airman604/dockerfiles.git

# build the image
cd dockerfiles/ScoutSuite
docker build -t scoutsuite-aws .

# push the image to ECR:
# - before running these, ensure your AWS CLI is configured
# with credentials that have access to ECR
# - replace the following variables with your values
AWS_REGION=us-west-2
AWS_ACCOUNT_ID=123456789012
# create the repository if needed
aws ecr create-repository --repository-name scoutsuite-aws --region "$AWS_REGION"
# login to the repository
aws ecr get-login-password --region "$AWS_REGION" | docker login --username AWS --password-stdin "${AWS_ACCOUNT_ID}.dkr.ecr.${AWS_REGION}.amazonaws.com"
# tag the image
docker tag scoutsuite-aws:latest "${AWS_ACCOUNT_ID}.dkr.ecr.${AWS_REGION}.amazonaws.com/scoutsuite-aws:latest"
# push the image
docker push "${AWS_ACCOUNT_ID}.dkr.ecr.${AWS_REGION}.amazonaws.com/scoutsuite-aws:latest"

Running Manual Scans with Lambda

Now let’s create and run a Lambda function to execute the scan. Lambda is a reasonable option, as Lambda functions are easy to set up and invoke with no infrastructure management overhead. One caveat is that Lambda functions cannot run longer than 15 minutes. In my testing this hasn’t been an issue, with scans taking 3–4 minutes, but you might hit this limit in accounts with a lot of resources.

Here’s how to deploy our image as a Lambda function:

  • Log in to AWS, open the Lambda console and select Create Function.
  • Select Container Image as the type, set the function name, select the container image created in the previous section, and (important!) select the correct Architecture. If you built the container on a Mac with Apple silicon, your image will be built for the arm64 architecture. Note: you can change this by providing the --platform argument to the docker build command and specifying either linux/arm64 for ARM or linux/amd64 for x86_64.
  • Update the maximum execution time to 15 min (the maximum allowed by Lambda) and the memory allocation to 2 GB. To do this, go to the Configuration tab, then select General configuration and Edit.
  • Create an S3 bucket where you want the reports to be saved and note the bucket name.
  • Give the Lambda function IAM permissions to run Scout Suite scans and upload reports to S3. Open the function, select the Configuration tab, then select Permissions on the left, and under Execution Role click the role that Lambda automatically created. In the IAM console that opens, click Add Permissions, then Attach policies. Under Filter by Type select AWS managed - job function, then search for and add the ReadOnlyAccess and SecurityAudit policies (these are needed to run Scout Suite scans). Then select Add Permissions again and add the following inline policy (switch the Policy Editor to the JSON view, and replace the S3 bucket name and path with the location where reports should be uploaded):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "s3write",
      "Action": "s3:PutObject",
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::<YOUR_S3_BUCKET_NAME>/<YOUR>/<S3>/<PATH>/*"
    }
  ]
}
  • Open the Lambda function again, select the Configuration tab, then Environment variables on the left, then click Edit and add the S3_BUCKET and S3_PREFIX environment variables, pointing to your reports bucket and the path where reports should be uploaded (these should be consistent with the IAM policy you created in the previous step).
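To double-check that the environment variables and the inline policy agree, it helps to spell out the destination the entrypoint will write to. The bucket and prefix values below are hypothetical placeholders; substitute your own:

```shell
# Hypothetical example values; use your own bucket name and prefix.
S3_BUCKET=my-scoutsuite-reports
S3_PREFIX=scans/prod

# The entrypoint uploads to s3://$S3_BUCKET/$S3_PREFIX/scout-<TIMESTAMP>.tar.gz,
# so the inline policy's Resource must cover that path:
DEST_PREFIX="s3://${S3_BUCKET}/${S3_PREFIX}/"
POLICY_RESOURCE="arn:aws:s3:::${S3_BUCKET}/${S3_PREFIX}/*"
echo "$DEST_PREFIX"
echo "$POLICY_RESOURCE"
```

If the `Resource` ARN and the upload prefix drift apart, the scan itself will succeed but the final `s3:PutObject` call will fail with an access-denied error.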

Now you can run the Lambda function (go to the Test tab, then select Test) and, if everything is set up correctly, check the report in the S3 bucket.

What’s Next?

Stay tuned! In the upcoming guides I’ll describe how to:

  • Configure a schedule to automatically run the Lambda
  • Configure email notifications for new scan reports
  • Automate the whole deployment through Terraform or CDK


Airman

Random rumblings about #InfoSec. The opinions expressed here are my own and not necessarily those of my employer.