Automate Python library deployment to an AWS Lambda layer from a Pipfile with Terraform

AWS Lambda, part of Amazon Web Services (AWS), is a serverless computing service that works as a FaaS (Function as a Service). A FaaS lets users develop and run application code without having to think about the underlying infrastructure.

Terraform is an open-source Infrastructure as Code (IaC) tool, developed by HashiCorp and written in Go, that manages resources on cloud services like AWS, Google Cloud, Azure, etc.

Zipping code and uploading it to AWS Lambda by hand at every deployment is tedious. The more complex part is packaging library code, e.g. Python libraries. At Craftsmen, we manage a lot of Lambdas for various development purposes, so a smart way to upload Lambda function code and libraries during deployment is a pressing need.

Our approach is to upload function code at the function level and libraries in a Lambda layer, for two reasons: 1. Python libraries can be shared between Lambdas. 2. The console code editor can only display up to 3 MB of code, so moving libraries into a layer keeps the function package small enough to view in the console editor.
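Lambda layers have a fixed directory convention for Python: libraries must sit under a top-level python/ directory inside the layer zip, which is why the build script later installs packages into a python subfolder. A minimal sketch of that layout (the package name mylib is purely illustrative):

```python
import os
import tempfile
import zipfile

# Build a toy layer zip with the layout Lambda expects:
#   layer.zip
#   └── python/                  <- Lambda adds this directory to sys.path
#       └── mylib/__init__.py    <- illustrative package
workdir = tempfile.mkdtemp()
pkg_dir = os.path.join(workdir, "python", "mylib")
os.makedirs(pkg_dir)
with open(os.path.join(pkg_dir, "__init__.py"), "w") as f:
    f.write("VERSION = '0.0.1'\n")

zip_path = os.path.join(workdir, "layer.zip")
with zipfile.ZipFile(zip_path, "w") as zf:
    for root, _dirs, files in os.walk(os.path.join(workdir, "python")):
        for name in files:
            full = os.path.join(root, name)
            # Archive paths must be relative so that "python/" is at the zip root
            zf.write(full, os.path.relpath(full, workdir))

with zipfile.ZipFile(zip_path) as zf:
    names = zf.namelist()
print(names)  # ['python/mylib/__init__.py']
```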

Setting the project

Let’s start by setting up the project skeleton. We are going to use pipenv because it is developer-friendly for maintaining dev and release packages separately. First, install pipenv, then install Terraform (follow the official installation instructions for each).
# Create a project directory
mkdir lambda-with-terraform
cd lambda-with-terraform
# Create lambda code directory
mkdir lambda
# Create Terraform directory
mkdir terraform
# Add handler file in lambda directory
touch lambda/handler.py
# Add Pipfile in project root directory
touch Pipfile
# So our skeleton will look like this
├── lambda
│   └── handler.py
├── Pipfile
└── terraform

Add python libraries

We will use a single library, requests, in [packages], and pipenv (plus pytest) in [dev-packages]. We are also going to use Python 3.8. Let’s add it all to the Pipfile.

# Pipfile
[[source]]
name = "pypi"
url = "https://pypi.org/simple"
verify_ssl = true

[dev-packages]
pytest = "==5.3.5"
pipenv = "==2020.8.13"

[packages]
requests = "==2.23.0"

[requires]
python_version = "3.8"
Initialize the Python virtual environment with pipenv:
pipenv install
This will create the Pipfile.lock file, which pins all Python packages.
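Pipfile.lock is plain JSON with the pinned versions under a "default" key (release packages) and a "develop" key (dev packages); a quick way to inspect it:

```python
import json

# A trimmed Pipfile.lock fragment; real lock files also carry an
# _meta section and package hashes
lock_text = """
{
  "default": {
    "requests": {"version": "==2.23.0"}
  },
  "develop": {
    "pytest": {"version": "==5.3.5"}
  }
}
"""

lock = json.loads(lock_text)
pinned = {name: meta["version"] for name, meta in lock["default"].items()}
print(pinned)  # {'requests': '==2.23.0'}
```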

Add function code

Let’s start with a simple Lambda function that just fetches a web page and logs it.
import logging

import requests

LOGGER = logging.getLogger()
LOGGER.setLevel(logging.INFO)

def lambda_handler(event, context):
    # Placeholder URL; replace with the page you actually want to fetch
    response = requests.get("https://example.com")
    LOGGER.info(response.text)

Add Terraform code

At first, we will add some files for Terraform:
# Create file for lambda terraform codes
touch terraform/lambda.tf
# Create file for lambda layer terraform codes
touch terraform/lambda_layer.tf
# Create python file to make a pip requirements file from Pipfile
touch terraform/make_requirements.py
# Add shell script to generate pip requirements and make a zip file of
# lambda libraries
touch terraform/build_lambda_layer.sh
chmod a+x terraform/build_lambda_layer.sh
In make_requirements.py, a Python argument parser will be added which takes the filename of the pip requirements file, e.g. requirements.txt.
# terraform/make_requirements.py
import argparse

from pipenv.project import Project
from pipenv.utils import convert_deps_to_pip

def _make_requirements_file(file_path):
    pipfile = Project(chdir=False).parsed_pipfile
    requirements = convert_deps_to_pip(pipfile['packages'], r=False)
    with open(file_path, 'w') as req_file:
        req_file.write('\n'.join(requirements))

def run():
    parser = argparse.ArgumentParser()
    parser.add_argument('--file_path', required=True,
                        help='Path of the pip requirements file to write')
    args = parser.parse_args()
    _make_requirements_file(args.file_path)

if __name__ == "__main__":
    run()
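Under the hood, convert_deps_to_pip simply turns Pipfile entries into pip requirement lines. A rough standalone sketch of that conversion for the simple pinned-version case (extras, environment markers, and VCS dependencies are ignored here):

```python
def deps_to_pip(deps):
    """Convert {'requests': '==2.23.0'} style entries to pip requirement lines."""
    lines = []
    for name, spec in deps.items():
        if spec == "*":  # '*' in a Pipfile means "any version"
            lines.append(name)
        else:
            lines.append(f"{name}{spec}")
    return lines

print(deps_to_pip({"requests": "==2.23.0", "boto3": "*"}))
# ['requests==2.23.0', 'boto3']
```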
Now let’s add the shell script that generates a zip file for the Lambda layer with the Python libraries:
#!/usr/bin/env bash
# terraform/build_lambda_layer.sh
set -e

echo "Module dir $MODULE_DIR"
echo "Destination dir $DESTINATION_DIR"

# Scratch build directory; pip installs into its python/ subfolder because
# Lambda layers expect libraries under a top-level python/ directory
TARGET_DIR="$DESTINATION_DIR"/layer_build
REQUIREMENTS_FILE_PATH="$DESTINATION_DIR"/requirements.txt

echo "Target dir $TARGET_DIR"
mkdir -p "$DESTINATION_DIR"
mkdir -p "$TARGET_DIR"
python3 "$MODULE_DIR"/make_requirements.py --file_path "$REQUIREMENTS_FILE_PATH"
pip install -r "$REQUIREMENTS_FILE_PATH" -t "$TARGET_DIR"/python
(cd "$TARGET_DIR" && zip -r "$DESTINATION_DIR"/"$ZIPFILE_NAME".zip ./* -x "*.dist-info*" -x "*__pycache__*" -x "*.egg-info*")
rm -r "$TARGET_DIR"
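The -x patterns keep bytecode caches and package metadata out of the layer, which shrinks the zip considerably. In Python terms, the filter behaves roughly like this:

```python
import fnmatch

# Same glob patterns as the zip -x flags in the build script
EXCLUDE_PATTERNS = ["*.dist-info*", "*__pycache__*", "*.egg-info*"]

def keep(path):
    """Return True when a path should stay in the layer zip."""
    return not any(fnmatch.fnmatch(path, p) for p in EXCLUDE_PATTERNS)

paths = [
    "python/requests/__init__.py",
    "python/requests/__pycache__/api.cpython-38.pyc",
    "python/requests-2.23.0.dist-info/METADATA",
]
print([p for p in paths if keep(p)])
# ['python/requests/__init__.py']
```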
Now add the Lambda layer Terraform code:
// terraform/lambda_layer.tf
locals {
  // All lambda code zips and the layer zip go in this directory
  lambda_artifact_dir       = "${path.module}/lambda_zip"
  lambda_layer_zipfile_name = "layer"
  python_version            = "python${data.external.python_version.result.version}"
}

// Grab python version from Pipfile. Default is 3.8 if not mentioned in Pipfile
data "external" "python_version" {
  program = [
    "python3",
    "-c",
    "from pipenv.project import Project as P; import json; _p = P(chdir=False); print(json.dumps({'version': _p.required_python_version or '3.8'}))",
  ]
}

// Generate zipfile for lambda layer
resource "null_resource" "build_lambda_layer" {
  provisioner "local-exec" {
    when    = create
    command = "${abspath(path.module)}/build_lambda_layer.sh"
    environment = {
      DESTINATION_DIR = abspath(local.lambda_artifact_dir)
      MODULE_DIR      = abspath(path.module)
      ZIPFILE_NAME    = local.lambda_layer_zipfile_name
    }
  }

  triggers = {
    // Trigger only when something changes in Pipfile
    run_on_pipfile_change = filemd5("${abspath(path.module)}/../Pipfile")
  }
}

resource "aws_lambda_layer_version" "lambda_layer" {
  filename            = "${local.lambda_artifact_dir}/${local.lambda_layer_zipfile_name}.zip"
  layer_name          = "lambda_layer"
  compatible_runtimes = [local.python_version]
  // It will run after the lambda layer zipfile build
  depends_on = [null_resource.build_lambda_layer]

  lifecycle {
    create_before_destroy = true
  }
}
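Terraform's external data source has a simple contract: it runs the program and expects a single JSON object of string keys and string values on stdout. A standalone sketch of what the Python one-liner emits, including its fallback when the Pipfile has no [requires] section:

```python
import json

def python_version_payload(required_python_version=None):
    """Emit the JSON object Terraform's external data source expects."""
    # Mirror the one-liner: fall back to "3.8" when no version is pinned
    return json.dumps({"version": required_python_version or "3.8"})

print(python_version_payload("3.8"))  # {"version": "3.8"}
print(python_version_payload(None))   # {"version": "3.8"} (the default)
```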
Finally, we are going to add the Lambda function Terraform code:
// terraform/lambda.tf
// Zip lambda function codes
data "archive_file" "lambda_zip_file" {
  output_path = "${local.lambda_artifact_dir}/lambda.zip"
  source_dir  = "${path.module}/../lambda"
  excludes    = ["__pycache__", "*.pyc"]
  type        = "zip"
}

data "aws_iam_policy_document" "lambda_assume_role" {
  version = "2012-10-17"
  statement {
    sid    = "LambdaAssumeRole"
    effect = "Allow"
    actions = [
      "sts:AssumeRole",
    ]
    principals {
      identifiers = [
        "lambda.amazonaws.com",
      ]
      type = "Service"
    }
  }
}

// Lambda IAM role
resource "aws_iam_role" "lambda_role" {
  name               = "test-lambda-role"
  assume_role_policy = data.aws_iam_policy_document.lambda_assume_role.json
  lifecycle {
    create_before_destroy = true
  }
}

// Lambda function terraform code
resource "aws_lambda_function" "lambda_function" {
  function_name    = "test-lambda-function"
  filename         = data.archive_file.lambda_zip_file.output_path
  source_code_hash = data.archive_file.lambda_zip_file.output_base64sha256
  handler          = "handler.lambda_handler"
  role             = aws_iam_role.lambda_role.arn
  runtime          = local.python_version
  layers           = [aws_lambda_layer_version.lambda_layer.arn]
  lifecycle {
    create_before_destroy = true
  }
}
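source_code_hash is the base64-encoded SHA-256 of the zip file (archive_file's output_base64sha256); Terraform redeploys the function whenever this value changes. The same value can be computed by hand:

```python
import base64
import hashlib

def source_code_hash(zip_bytes: bytes) -> str:
    """Base64-encoded SHA-256, the format aws_lambda_function compares."""
    return base64.b64encode(hashlib.sha256(zip_bytes).digest()).decode()

# Example with empty input, whose SHA-256 is well known
print(source_code_hash(b""))  # 47DEQpj8HBSa+/TImW+5JCeuQeRkm5NMpJWZG3hSuFU=
```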

Time to test

To test if everything works, we have to add a Terraform provider; in our case that is AWS. Let's add a provider file in the terraform directory:
# terraform/provider.tf
provider "aws" {
  region  = "eu-west-1"
  profile = "aws-profile-name-from-aws-config-file-at-your-machine"
}
The project will now look like:
├── lambda
│   └── handler.py
├── Pipfile
├── Pipfile.lock
└── terraform
Let’s build infrastructure 😹
# Activate pipenv virtual environment
pipenv shell
# Go to terraform directory
cd terraform
# Initialize terraform 
terraform init
# Check terraform infrastructure component to deploy
terraform plan
# And deploy with
terraform apply
# If you want to destroy all
terraform destroy

Mahabubur Rahaman Melon

Software Development Engineer
