LocalStack’s guide to running an AWS serverless environment locally: discover the power of Lambda + Docker + SQS

Because local testing is a must-have for every developer, LocalStack provides an AWS-like environment without breaking the bank.

[UPDATE 15–03–2021] — A workaround for the latest Docker Engine is described in this issue: https://github.com/localstack/localstack/issues/3700

[UPDATE 10–03–2021] — It seems that the new Docker Engine version does not work with the latest LocalStack image (a problem with port 53 forwarding). Keep version 20.10.2 of Docker Engine. If you are on macOS, download Docker Desktop 3.1.0, which ships with the right engine version.

As AWS stands as one of the greatest cloud providers, one day you will probably have to deploy something on it: Lambda functions, S3 storage, Kinesis streams, SQS queues, DynamoDB… Even though DevOps is more accessible than ever with the AWS CDK (infrastructure as code using TypeScript, Python, and other languages), managing multiple environments can be time-consuming and quite expensive.

That’s what LocalStack is for: it provides a local environment with all the AWS resources available on your machine, exposed through the same AWS API.

Moreover, a serverless event-based production infrastructure is very hard to reproduce without LocalStack. How can you trigger a Lambda function when your SQS queue has just received a message pushed by one of your workers? We are evolving in a world where serverless is renewing the way we build our infrastructure.

What’s Localstack ?

LocalStack is an open-source project, initiated at Atlassian, which mocks AWS resources on your local machine. A large part of it is free, such as CloudFormation, DynamoDB, EC2, Kinesis, and S3, but the nice UI and some services, like EMR, Docker Lambdas, and Athena, require the «pro» version of LocalStack (everything is described here).

Anyway, you still get a 14-day free trial after registering, which is enough to complete this tutorial. The product is actively maintained and keeps evolving. By the way, you will automatically be added to their Slack community channel as soon as you subscribe!

Localstack UI

Lambda news

While Lambda was already an insane tool for managing performance and scalability in your infrastructure, it became even more powerful in autumn 2020, during AWS re:Invent, with support for dockerized functions. What an upgrade!


  • Docker images of up to 10GB are allowed, which should be sufficient for all your dependencies in any programming language.
  • And because great news never comes alone, Lambda functions are now billed per millisecond of processing time. (If optimized, this will save you a lot of money.)

One common use of Lambda is to bind functions to S3 storage or an SQS queue to perform specific actions. However, this can be a mess to set up from scratch without any third parties.

I will walk you through the complete road of how I approached this (with valuable tips at each step!).

What are we doing and why ?

In a serverless architecture, a powerful combo is to link an SQS queue and a Lambda function to handle asynchronous workflows. Something pushes messages to the queue, and when the number of events specified in the configuration is reached, a Lambda is automatically deployed to handle those messages.

Serverless architecture

This is a pay-on-demand serverless infrastructure which minimizes costs and allows scalability.

The biggest pain of this beautiful structure is reproducing this workflow on your machine.

That’s what LocalStack is for! Use the exact same configuration as your staging/production environment without any additional $$$ expenses.

And although Docker images are now supported by Lambda, almost nothing explains how to deal with them locally!



LocalStack can be installed using pip and launched with the localstack start command, but we will use the method that stays consistent regardless of your current working environment: Docker.

If you choose the first method, I can’t guarantee that the full guide will work as described. Don’t worry, only Docker basics are required.

We are going through the complete developer workflow to make sure no secret remains locked away. First of all, create a test folder where you will set up this whole project. (A GitHub repository is linked at the end, but follow this tutorial to understand everything.)

Localstack config

The docker-compose.yml file is our configuration entry point, where you will be able to make everything work perfectly. Create one with the following content.

docker-compose.yml file
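Since the original screenshot of the file is not reproduced here, a minimal sketch of such a docker-compose.yml could look like this (service layout, image version, and paths are assumptions based on the explanations below):

```yaml
version: "3.8"
services:
  localstack:
    image: localstack/localstack:0.12.6
    ports:
      # bind the edge port range of the host to the container
      - "4566-4620:4566-4620"
    environment:
      - DEBUG=1
      - SERVICES=lambda,s3,sqs,sns,cloudwatch,logs
      - DEFAULT_REGION=us-east-1
      # run each Lambda in its own Docker container
      - LAMBDA_EXECUTOR=docker
      - DATA_DIR=/tmp/localstack/data
      # read the API key from the .env file
      - LOCALSTACK_API_KEY=${LOCALSTACK_API_KEY}
    volumes:
      # persist LocalStack state across restarts
      - ./localstack-data:/tmp/localstack
      # required so LocalStack can spawn Lambda containers
      - /var/run/docker.sock:/var/run/docker.sock
```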

What the hell does that mean?

We just created a Docker service named localstack, based on the image provided by the LocalStack team. We use version 0.12.6 because everything seems to work with it right now.

As Docker creates its own environment, we need to specify which local ports are bound to which container ports. For instance, we bind the port range 4566 to 4620 of your machine to the same range in the container.

Some environment variables are needed to make this work:

  • DEBUG is used to provide more logs inside the container
  • SERVICES lists the services that you want to deploy
  • DEFAULT_REGION specifies the target region of your resources
  • LAMBDA_EXECUTOR=docker tells LocalStack to use a dedicated Docker container to run each of your Lambda functions (this seems to be the best way to reproduce a real AWS infrastructure)
  • a couple of additional variables provide extra configuration for the Lambda Docker execution
  • DATA_DIR is the dedicated folder path used by LocalStack to save its own data

Volumes are needed because Docker does not store any state. Using them allows data to persist and avoids rebuilding everything each time you launch your stack.

The .env file is specified in order to use the LOCALSTACK_API_KEY value. Please keep your credentials secret.

This config should be enough to make it work. Now, open a terminal in the same directory as your docker-compose file.

Test it!

For the first run, do not specify the -d option, so the stack does not run in the background:
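The command itself is not shown in the extracted text; it is presumably the standard Compose invocation:

```shell
# First run: stay in the foreground to watch the startup logs
docker-compose up

# Later, once everything works, run it in the background instead:
# docker-compose up -d
```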

Localstack start logs

You should now see logs like these, including:

  • INFO:botocore.credentials: Found credentials in environment variables
  • INFO:localstack_ext.bootstrap.licensing: Successfully activated API key
  • Waiting for all LocalStack services to be ready. Ready.

Check the health endpoint by going to this URL: http://localhost:4566/health

{"services": {"lambda": "running", "logs": "running", "s3": "running", "sns": "running", "sqs": "running", "cloudwatch": "running"}}
Localstack dashboard status

!! Tips !!

  • Be careful with the latest tag. It is the recommended one, but the team often pushes updates to it without creating a new tag. So if you delete your LocalStack Docker image, even if you’ve pinned latest, you might end up with an inconsistent environment. (You can see the last push date in the tag section of the Docker Hub repository.)
  • As explained before, we need the LocalStack pro version to make Docker Lambdas work. After signing in, you will have to fill in the billing information and create a subscription to be able to copy your own API_KEY from the dashboard tab.
  • It is really recommended to use a .env file to store your LOCALSTACK_API_KEY=XXXXX, and to list .env in both your .dockerignore and your .gitignore. Avoid adding your personal API_KEY anywhere else.
  • When your Docker stack runs, you will have to wait a bit (possibly up to a day) before the cloud UI shows the «running» state and information about your resources. But no worries, everything is still working well (the logs table on the dashboard stays up to date).
  • I had to add the sns service to the list, even though I do not use it, to make everything work (the logs are straightforward about it).
  • If you want to use a local dashboard, you can use the same image with the PORT_WEB_UI environment variable. It will be accessible from the http://localhost:8080 URL.
  • If you are using a Python Lambda function, do not set the EDGE_PORT=4566 variable. It was not working for me and led to many hours of debugging with the LocalStack team — on the Slack community channel — to find the problem (it might be solved by now, but don’t be fooled: the default value is 4566 anyway, so there is no need to add it explicitly).
  • Note that on macOS you may have to run docker-compose with TMPDIR=/private$TMPDIR if /tmp contains a symbolic link that cannot be mounted by Docker.

Now that your LocalStack env is set up, you can keep it running in the background.

Build your lambda image

Create a dedicated folder and add a Dockerfile as follows:

Python lambda dockerfile
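The screenshot is not reproduced here; a minimal sketch of such a Dockerfile, built on the official AWS Python Lambda base image, could look like this (the file name app.py and handler name are assumptions):

```dockerfile
FROM public.ecr.aws/lambda/python:3.8

# Install Python dependencies first to benefit from layer caching
COPY requirements.txt .
RUN pip install -r requirements.txt

# Copy the function code into the image
COPY app.py ${LAMBDA_TASK_ROOT}

# module.function that the Lambda runtime will invoke
CMD ["app.handler"]
```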

Our base image comes from AWS and provides everything we need to trigger our Lambda function. After that, we copy our requirements file and install it.

The CMD is composed of the handler reference, i.e. the module and function that Lambda will invoke.

It points to a file in our folder which contains the Lambda function code. Nothing impressive, just some logs to make sure everything works fine.

Lambda function
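The handler code itself is only shown as a screenshot in the original; a minimal sketch of such a function could look like this (the return value and field handling are assumptions):

```python
import json

def handler(event, context):
    # Log the raw event so it shows up in the Lambda container logs
    print("Received event:", json.dumps(event))

    # An SQS-triggered event carries its messages under "Records"
    for record in event.get("Records", []):
        print("Message body:", record.get("body"))

    return "Lambda executed successfully"
```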

We can now build it!
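The build command is not shown in the extracted text; it is presumably a plain docker build (the tag name test-lambda is an assumption, matching the function name used later):

```shell
# Build and tag the Lambda image from the Dockerfile folder
docker build -t test-lambda .
```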



We can now test our image to make sure everything works before deploying it with LocalStack. We have to run it with a port mapping in order to execute a curl against it.
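The exact commands are not in the extracted text; AWS Lambda base images ship with the Runtime Interface Emulator, so the local test presumably looks like this (the image tag is an assumption):

```shell
# Run the image locally, mapping the emulator port 8080 to 9000
docker run -p 9000:8080 test-lambda

# In another terminal, invoke the function through the emulator
curl -XPOST "http://localhost:9000/2015-03-31/functions/function/invocations" -d '{}'
```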

Docker lambda executed with curl

Congrats! Your image is well formed; we can now make it all work together.

Make it work all together

Update our docker-compose.yml to create a dedicated bridge network that allows your containers to see each other, and attach it both to the localstack service and to the Lambda containers. Rebuild everything now to avoid many creepy errors.

Full docker-compose.yml file
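The full file is only shown as a screenshot; the network-related additions presumably look something like this (the network name localstack-net is an assumption, and LAMBDA_DOCKER_NETWORK is LocalStack's variable for attaching Lambda containers to a network):

```yaml
services:
  localstack:
    # ...same configuration as before, plus:
    environment:
      # make the spawned Lambda containers join the same network
      - LAMBDA_DOCKER_NETWORK=localstack-net
    networks:
      - localstack-net

networks:
  localstack-net:
    driver: bridge
```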

Be sure your LocalStack is up and running as described before and that you have a well-formed Lambda function image. Here are all the commands we will use in this tutorial.

Create and invoke lambda

  • Create the Lambda function with the right image tag for your function. We need to specify a dummy role, even if it has not been created.
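The command is not shown in the extracted text; with awslocal it presumably looks like this (the function name, image tag, and dummy role ARN are assumptions):

```shell
awslocal lambda create-function \
  --function-name test-lambda \
  --package-type Image \
  --code ImageUri=test-lambda \
  --role arn:aws:iam::000000000000:role/lambda-dummy-role
```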

Create-function output

Don’t be afraid: even though the PackageType is reported as Zip, the Lambda function is still based on a Docker container.

Test it
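The invoke command is not in the extracted text; it is presumably the standard one (the payload is an assumption):

```shell
awslocal lambda invoke \
  --function-name test-lambda \
  --payload '{"test": "event"}' \
  response.json
```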

Invoke output

The result should appear in a response.json file, containing your handler’s return value.

Update lambda

You can update your Lambda, if you have made some changes to your image, by running this command:
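The command itself is missing from the extracted text; for an image-based function it is presumably:

```shell
# Rebuild the image first, then point the function at it again
awslocal lambda update-function-code \
  --function-name test-lambda \
  --image-uri test-lambda
```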

Create SQS queue and bind it

Create a dedicated SQS queue:
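The command is not shown; given the queue URL in the output below, it is presumably:

```shell
awslocal sqs create-queue --queue-name test-lambda-queue
```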

"QueueUrl": "http://localhost:4566/000000000000/test-lambda-queue"

You can now bind it! Specify the --batch-size 1 argument to trigger a Lambda function for each message received in your queue.
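The binding command is not in the extracted text; given the output below, it presumably creates an event source mapping like this:

```shell
awslocal lambda create-event-source-mapping \
  --function-name test-lambda \
  --batch-size 1 \
  --event-source-arn arn:aws:sqs:us-east-1:000000000000:test-lambda-queue
```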

{
  "UUID": "ebff1d61-d0d0-4513-89cd-26dd7aa78f13",
  "StartingPosition": "LATEST",
  "BatchSize": 1,
  "EventSourceArn": "arn:aws:sqs:us-east-1:000000000000:test-lambda-queue",
  "FunctionArn": "arn:aws:lambda:us-east-1:000000000000:function:test-lambda",
  "LastModified": "2021-02-24T23:32:51+01:00",
  "LastProcessingResult": "OK",
  "State": "Enabled",
  "StateTransitionReason": "User action"
}

In the LocalStack dashboard you can now see the details of the linked resources you’ve just created.

Localstack Dashboard

Send a message and check your Docker dashboard!
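The send command is not shown in the extracted text; it presumably looks like this (the message body is an assumption and will produce a different MD5 than the output below):

```shell
awslocal sqs send-message \
  --queue-url http://localhost:4566/000000000000/test-lambda-queue \
  --message-body "Hello from SQS"
```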

"MD5OfMessageBody": "ce114e4501d2f4e2dcea3e17b546f339",
"MessageId": "0ddc5ea2-feef-95e1-3861-7dc1fd213295"

Check your dashboard, a new container will be created!

Docker dashboard

Check the logs: everything works fine!

Lambda container logs

!! Tips !!

  • You need to define a bridge network to allow the containers to communicate
  • If you want to use awslocal in a script, we highly recommend adding cli_pager= to your ~/.aws/config file to disable the pager program (especially if you are using iTerm); this should prevent those problems. Otherwise, your script will open the result of some commands in a pager. (https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-files.html)
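Concretely, the config fragment would look like this (assuming the default profile):

```
# ~/.aws/config
[default]
cli_pager=
```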


You can download this example (with a node.js lambda image as well) in the following Github repository : https://github.com/theodorebourgeon/template-localstack-lambda-docker-sqs

Everything is moving fast, both on the AWS and the LocalStack side, but I have tried to provide a consistent tutorial to make your Docker Lambda function work with LocalStack.

I chose SQS as the event broker, but you can set up Kinesis as well to make your local architecture match your staging environment.

I hope you enjoyed it! Feel free to share, comment and give claps!

Made with ❤

Data Engineer at @folk
