CI/CD for Multiple Lambda Functions

Continuously deploying multiple Lambda functions from a single repo

Your team is still working on last week's web app that facilitates a peer-to-peer (P2P) book exchange. Users can list their books for exchange, browse available books, initiate trades, and manage their profiles.

You've built the following services, each its own Lambda function:

  • listBook: Allows users to list a book they'd like to exchange, storing the information in DynamoDB.

  • browseBooks: Retrieves the list of available books from DynamoDB.

  • initiateTrade: Processes trade requests between users.

  • manageProfile: Allows users to update their profile information, stored in Cognito.

With my previous article on managing multiple AWS Lambda functions, you solved codebase management, dependency management, monitoring and debugging, and performance. You also solved concurrency with Using SQS to Throttle Writes to DynamoDB.

Things look good on the AWS infrastructure side, but you're left with these questions about CI/CD:

  • Do I use a monorepo, or a separate repo per function?

  • How do I structure my CI/CD pipeline?

  • How do I deploy a change that impacts several functions?

We're going to use the following AWS services:

  • AWS CodeCommit: A fully-managed source control service that hosts Git repositories, allowing you to store and manage your app's source code. Think GitHub or Bitbucket, but done by AWS.

  • AWS CodeBuild: A fully-managed build service that compiles your app's source code, runs tests, and produces build artifacts.

  • AWS CodePipeline: A fully-managed continuous deployment service that helps you automate your release pipelines. You can orchestrate various stages, such as source code retrieval, build, and deployment, which are handled by other services like CodeCommit, CodeBuild, and CodeDeploy.

  • Lambda: Our serverless functions!

  • Serverless Application Model (SAM): An open-source framework for building serverless applications. It extends AWS CloudFormation to provide a simplified way of defining API Gateway APIs, Lambda functions, and DynamoDB tables.

Create a single git repository (monorepo) for all functions and set up a CI/CD pipeline for it

Step 0: Set up your SAM template

We'll use the same SAM template from the previous issue of Simple AWS, but we'll add an API Gateway, remove the DynamoDB table and the associated permissions (so we don't lose focus; in reality you'll still need them), and add some parameters. Here's the updated version:

AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: SAM Template with Lambda Layers

Parameters:
  LayerARN:
    Description: ARN of the Lambda Layer
    Type: String

Globals:
  Function:
    AutoPublishAlias: prod

Resources:
  listBookFunction:
    Type: AWS::Serverless::Function
    Properties:
      ProvisionedConcurrencyConfig:
        ProvisionedConcurrentExecutions: 5
      CodeUri: listBook/
      Handler: listBook.handler
      Runtime: nodejs14.x
      Layers:
        - !Ref LayerARN
      Events:
        HttpPost:
          Type: Api
          Properties:
            Path: /book
            Method: post
            RestApiId:
              Ref: BooksApi
      Policies:
        - AWSXRayDaemonWriteAccess
      Tracing: Active
  BooksApi:
    Type: AWS::Serverless::Api
    Properties:
      StageName: prod
      DefinitionBody:
        swagger: "2.0"
        info:
          title:
            Ref: AWS::StackName
        paths:
          /book:
            post:
              x-amazon-apigateway-integration:
                httpMethod: POST
                type: aws_proxy
                uri:
                  Fn::Sub: arn:aws:apigateway:${AWS::Region}:lambda:path/2015-03-31/functions/${listBookFunction.Alias}/invocations

# The same configuration applies to the other 3 functions...
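As a concrete example, a hypothetical browseBooksFunction would look nearly identical, changing only the directory, handler, path, and method (you'd also need the matching /books path in the BooksApi DefinitionBody):

  browseBooksFunction:
    Type: AWS::Serverless::Function
    Properties:
      ProvisionedConcurrencyConfig:
        ProvisionedConcurrentExecutions: 5
      CodeUri: browseBooks/
      Handler: browseBooks.handler
      Runtime: nodejs14.x
      Layers:
        - !Ref LayerARN
      Events:
        HttpGet:
          Type: Api
          Properties:
            Path: /books
            Method: get
            RestApiId:
              Ref: BooksApi
      Policies:
        - AWSXRayDaemonWriteAccess
      Tracing: Active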

Step 1: Create a Monorepo for All Your Lambda Functions

Set up a git repository and push it to your favorite tool (GitHub, GitLab, CodeCommit, etc.). In this case I'll use CodeCommit.
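If you prefer the CLI, something like this should work (the repo name is just an example, and the codecommit:: remote URL assumes you've installed the git-remote-codecommit helper):

aws codecommit create-repository --repository-name simple-aws-p2p-books
git init
git remote add origin codecommit::us-east-1://simple-aws-p2p-books
git add -A
git commit -m "Initial commit"
git push -u origin master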

Create a directory for each function. It should look like this:

/simple-aws-p2p-books
|-- /listBook
|   |-- listBook.js
|   |-- package.json
|-- /browseBooks
|   |-- browseBooks.js
|   |-- package.json
|-- /initiateTrade
|   |-- initiateTrade.js
|   |-- package.json
|-- /manageProfile
|   |-- manageProfile.js
|   |-- package.json
|-- template.yaml

Ensure that all the dependencies that are specific to each Lambda function are listed in their respective package.json files.
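For reference, a minimal package.json for listBook might look like this (the uuid dependency is just a placeholder for whatever your function actually needs):

{
  "name": "listbook",
  "version": "1.0.0",
  "main": "listBook.js",
  "dependencies": {
    "uuid": "^8.3.2"
  }
}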

Step 2: Create an S3 bucket for the build artifacts

Go to S3 and create a new bucket for the build artifacts. Bucket names must be globally unique and lowercase; I called mine simple-aws-p2p-books-artifacts, so you'll need to pick a different name (and replace it in the next step).
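If you'd rather create it from the CLI:

aws s3 mb s3://simple-aws-p2p-books-artifacts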

Step 3: Create a Build Specification File

The buildspec.yml file tells CodeBuild how to build your project. Put it in your repo's root directory. Here are the contents (replace simple-aws-p2p-books-artifacts with the name of the bucket you just created):

version: 0.2

phases:
  install:
    runtime-versions:
      nodejs: 14
    commands:
      - echo Installing source NPM dependencies...
      - cd listBook && npm install
      - cd ../browseBooks && npm install
      - cd ../initiateTrade && npm install
      - cd ../manageProfile && npm install
  build:
    commands:
      - echo Building SAM Application...
      # In buildspec version 0.2 the shell's working directory persists across commands,
      # so we need to get back to the repo root before packaging
      - cd "$CODEBUILD_SRC_DIR"
      - aws cloudformation package --template-file template.yaml --output-template-file packaged-template.yaml --s3-bucket simple-aws-p2p-books-artifacts
  post_build:
    commands:
      - echo Build completed on `date`
artifacts:
  files:
    - packaged-template.yaml
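Before you commit this, you can sanity-check the template and the packaging step locally, assuming you have the SAM CLI installed and AWS credentials configured:

sam validate --template template.yaml
aws cloudformation package --template-file template.yaml --output-template-file packaged-template.yaml --s3-bucket simple-aws-p2p-books-artifacts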

Step 4: Create a CodeBuild Project

  1. Open the CodeBuild dashboard in the AWS Management Console.

  2. Click on "Create build project" and configure the project settings:

    1. Project name: Enter a unique name for the build project, such as SimpleAWSBuilder.

    2. Source: Choose "CodeCommit" as the source provider. Then, select your git repo and the "master" branch.

  3. Configure the environment settings for your build project:

    1. Environment image: Select "Managed image."

    2. Operating system: Choose "Amazon Linux 2."

    3. Runtime(s): Select "Standard."

    4. Image: Choose the latest available image version.

    5. Service role: Choose "New service role" and enter "SimpleAWSCodeBuildServiceRole" as the role name.

  4. For Buildspec, just leave the default settings. CodeBuild will use the buildspec.yml file you created earlier.

  5. Click "Create build project" to create your new build project.

Step 5: Give CodeBuild the necessary IAM permissions

  1. Go to the IAM console and on the menu on the left click Roles.

  2. Search for the role you just created for CodeBuild (the name should be SimpleAWSCodeBuildServiceRole) and click on the name.

  3. Click "Add permissions", then "Attach policies", then "Create policy" (this opens a new tab).

  4. Click the JSON tab and replace the contents with the policy below.

  5. Click Next (Tags), Next (Review), give your policy a name such as "SimpleAWSCodeBuildPolicy" and click Create.

  6. Go back to the tab where you were attaching policies (you can close the current one), click the refresh button on the right, and type "SimpleAWSCodeBuildPolicy" in the search box.

  7. Click the checkbox on the left of the SimpleAWSCodeBuildPolicy policy and click Add permissions.

  8. Clear the search box and enter "AWSCodeBuildAdminAccess".

  9. Click the checkbox on the left of the AWSCodeBuildAdminAccess policy and click Add permissions.

  10. Save the changes.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "LambdaPermissions",
            "Effect": "Allow",
            "Action": [
                "lambda:UpdateFunctionCode",
                "lambda:PublishVersion"
            ],
            "Resource": "*"
        },
        {
            "Sid": "S3Permissions",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:GetObjectVersion",
                "s3:GetBucketVersioning",
                "s3:PutObject"
            ],
            "Resource": "*"
        },
        {
            "Sid": "IAMPassRole",
            "Effect": "Allow",
            "Action": "iam:PassRole",
            "Resource": "*",
            "Condition": {
                "StringEquals": {
                    "iam:PassedToService": "lambda.amazonaws.com"
                }
            }
        },
        {
            "Sid": "CloudFormationPermissions",
            "Effect": "Allow",
            "Action": [
                "cloudformation:CreateChangeSet",
                "cloudformation:DescribeChangeSet",
                "cloudformation:ExecuteChangeSet",
                "cloudformation:DeleteChangeSet"
            ],
            "Resource": "*"
        }
    ]
}
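If you prefer the CLI over all that console clicking, the equivalent is roughly this (assuming you saved the policy above as codebuild-policy.json; 123456789012 is a placeholder for your account ID):

aws iam create-policy --policy-name SimpleAWSCodeBuildPolicy --policy-document file://codebuild-policy.json
aws iam attach-role-policy --role-name SimpleAWSCodeBuildServiceRole --policy-arn arn:aws:iam::123456789012:policy/SimpleAWSCodeBuildPolicy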

Step 6: Create an IAM Role for CodePipeline and one for CodeDeploy

  1. Go to the IAM console and on the menu on the left click Roles. Then click the "Create role" button.

  2. In the "Create role" screen, select "AWS service", and then choose "CodePipeline" as the service that will use this role.

  3. Click on the "Next: Permissions" button.

  4. Click on the "Create Policy" button.

  5. In the JSON tab, paste the policy below, then click "Review policy".

  6. Give it a name and description and click "Create Policy".

  7. Go back to the tab where you were creating the IAM role and refresh the policies list.

  8. Search for the policy you just created by the name you gave it, select the checkbox next to it, and then click on "Next: Tags".

  9. Click on "Next: Review".

  10. Give your new role a name, something like SimpleAWSCodePipelineServiceRole, and then click on "Create role".

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "S3AccessForSAMAndCodePipeline",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:PutObject",
                "s3:ListBucket",
                "s3:GetBucketVersioning"
            ],
            "Resource": "*"
        },
        {
            "Sid": "CodeBuild",
            "Effect": "Allow",
            "Action": [
                "codebuild:StartBuild",
                "codebuild:BatchGetBuilds"
            ],
            "Resource": "*"
        },
        {
            "Sid": "CodeCommitAndS3",
            "Effect": "Allow",
            "Action": [
                "codecommit:GetBranch",
                "codecommit:GetCommit",
                "codecommit:GetRepository",
                "codecommit:ListRepositories",
                "s3:GetObjectVersion",
                "s3:GetBucketVersioning",
                "s3:GetObject",
                "s3:GetBucketLocation",
                "s3:ListAllMyBuckets"
            ],
            "Resource": "*"
        }
    ]
}

Repeat the same steps to create a role for CodeDeploy, but name it something like SimpleAWSCodeDeployServiceRole, and use the following permissions:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "CloudFormationAndSAM",
            "Effect": "Allow",
            "Action": [
                "cloudformation:CreateChangeSet",
                "cloudformation:DeleteStack",
                "cloudformation:DescribeStacks",
                "cloudformation:DescribeChangeSet",
                "cloudformation:ExecuteChangeSet",
                "cloudformation:UpdateStack"
            ],
            "Resource": "*"
        },
        {
            "Sid": "LambdaFunctionAndLayer",
            "Effect": "Allow",
            "Action": [
                "lambda:GetFunction",
                "lambda:CreateFunction",
                "lambda:DeleteFunction",
                "lambda:UpdateFunctionCode",
                "lambda:UpdateFunctionConfiguration",
                "lambda:PublishVersion",
                "lambda:UpdateAlias",
                "lambda:GetLayerVersion",
                "lambda:PublishLayerVersion"
            ],
            "Resource": "*"
        },
        {
            "Sid": "APIGateway",
            "Effect": "Allow",
            "Action": [
                "apigateway:GET",
                "apigateway:POST",
                "apigateway:PUT",
                "apigateway:PATCH",
                "apigateway:DELETE"
            ],
            "Resource": "*"
        },
        {
            "Sid": "IAMPassRole",
            "Effect": "Allow",
            "Action": [
                "iam:PassRole"
            ],
            "Resource": "*",
            "Condition": {
                "StringEquals": {
                    "iam:PassedToService": "lambda.amazonaws.com"
                }
            }
        },
        {
            "Sid": "PassToCloudFormation",
            "Effect": "Allow",
            "Action": "iam:PassRole",
            "Resource": "*",
            "Condition": {
                "StringEquals": {
                    "iam:PassedToService": "cloudformation.amazonaws.com"
                }
            }
        },
        {
            "Sid": "S3AccessForCloudFormation",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:PutObject"
            ],
            "Resource": "*"
        }
    ]
}

Step 7: Create a CodePipeline pipeline

  1. In the AWS Console go to the CodePipeline dashboard.

  2. Click on "Create pipeline."

  3. Configure the pipeline settings:

    1. Pipeline name: Give your pipeline a unique name, such as SimpleAWSPipeline.

    2. Service role: Choose the existing SimpleAWSCodePipelineServiceRole role you created in Step 6.

  4. Click "Next."

  5. Configure the source stage:

    1. Source provider: Choose "AWS CodeCommit."

    2. Repository name: Select the CodeCommit repository you created earlier.

    3. Branch name: Select the "master" branch.

    4. Change detection options: Choose "Amazon CloudWatch Events (recommended)" to automatically trigger the pipeline when there's a new commit.

  6. Click "Next."

  7. Configure the build stage:

    1. Build provider: Choose "AWS CodeBuild".

    2. Region: Select your region.

    3. Project name: Choose the CodeBuild project you created earlier.

  8. Click "Next."

  9. Configure the deploy stage:

    1. Deploy provider: Choose "AWS CloudFormation"

    2. Region: Select your region

    3. Action mode: Choose "Create or update a stack"

    4. Stack name: Enter a name for the stack, such as SimpleAwsBooks

    5. Template:

      1. Artifact name: Choose "BuildArtifact"

      2. File name: Enter "packaged-template.yaml"

      3. Template file path: Leave it as it is

    6. Capabilities: Select "CAPABILITY_IAM"

    7. Role name: Enter the ARN of the SimpleAWSCodeDeployServiceRole role you created in Step 6 (this is the role that gets passed to CloudFormation to create the resources).

  10. Click "Next."

  11. Review your pipeline settings and click "Create pipeline".
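Once created, the pipeline will run automatically on the next commit to master. You can also kick off a run manually and check its state from the CLI:

aws codepipeline start-pipeline-execution --name SimpleAWSPipeline
aws codepipeline get-pipeline-state --name SimpleAWSPipeline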

Explanation

Step 0: Set up your SAM template

I removed everything related to DynamoDB, so we could focus on the other parts. I also added an API Gateway pointing to a Lambda alias called "prod".

An extra thing is AutoPublishAlias: prod, which makes SAM automatically update the alias "prod" to point to the newest version of the function. You might want to do this with an alias called "staging" if your deployment involves some manual testing. The good news is that you can always point the "prod" alias to the old version to revert the changes.

Just to clarify, both the DynamoDB table and the API Gateway are part of the same solution. I didn't include the API Gateway in last week's issue and didn't include the DynamoDB table in this issue so we could focus on the problems at hand, and because adding everything would turn this issue into a book.

Step 1: Create a Monorepo for All Your Lambda Functions

There's a loooong discussion to be had on monorepos vs multi-repos. Someone could write a book on this, but not me: My experience and knowledge don't go that deep on this topic. Some discussion points though:

  • Monorepos with a single CI/CD pipeline allow you to update several services at the same time. This is great when you're using functional services, where features are an emergent behavior of combining multiple services, and updating a feature requires updating several services. If you don't know what I'm talking about, check the past issue on microservice design.

  • Monorepos are bad at only deploying what's changed. CloudFormation helps a lot with this for the Lambda functions, but it has its limits. So do other IaC tools like Terraform.

  • Monorepos can get HUGE. I created a single SAM template and a single Lambda layer shared by all functions, but a real app could have 50 functions, 20 layers shared across some but not all functions, and so many SAM templates that you don't even know where to start.

  • If you're using multi-repos and are separating your infrastructure per repo, you'll need to start looking into StackSets. Even then, managing infrastructure dependencies is going to be a pain, especially when something fails and you need to roll back.

  • Granting permissions at the git directory level instead of at the git repository level is not so easy, and requires the use of Submodules. It's not impossible, but it's not fun.

  • Once you commit to one strategy, it's entirely possible but quite difficult to change.

tl;dr: this is worth spending 8 hours reading about and asking the opinion of everyone you know.

Step 2: Create an S3 bucket for the build artifacts

CodeBuild packages our SAM template into a CloudFormation template, which CodeDeploy then deploys. Since they're not the same service and they don't run on the same instance, they need somewhere to store the build artifact. That's our S3 bucket.

Step 3: Create a Build Specification File

The file buildspec.yml tells CodeBuild what to do.

Step 4: Create a CodeBuild Project

CodeBuild just runs the build.

Step 5: Give CodeBuild the necessary IAM permissions

And it needs permissions to do what it does.

Step 6: Create an IAM Role for CodePipeline

CodePipeline needs permissions to do its pipeline stuff, like starting a CodeBuild project. That's what the first IAM Role is for.

CodeDeploy needs two sets of permissions: one to do its deploy stuff, such as updating a CloudFormation stack, and one that CloudFormation will need to do its job of actually creating the infrastructure. The reason is that everything in AWS that wants to perform an action needs explicit permissions for it, and that includes CloudFormation. When you create a stack from the console, you pass CloudFormation the permissions of the role you're currently using (that's the default; you can pick another role). The same applies to CodeDeploy: it passes its own role to CloudFormation.

Step 7: Create a CodePipeline pipeline

We've got all the pieces in place, time to link them all together using a pipeline.

Discussion

Let's see if I can anticipate a few questions. If you have any others, feel free to write back!

Will the pipeline trigger on any change to any function, and also on changes to the template.yaml file?

Yes. The pipeline doesn't actually trigger on changes per se; it triggers on commits to the "master" branch, as set up in the Source stage of the CodePipeline creation process (right at the beginning of Step 7). That trigger doesn't make any distinction about what actually changed.

Will the pipeline create a new version of all functions, including those which didn't have changes?

Nope. It will only create new versions of the Lambda functions that have changed. If the source code or dependencies of a function haven't changed, no new version of that function is created. This is handled by the AWS CloudFormation service, which only updates resources that have changed in the template.

Will the pipeline publish a new Lambda layer and update the ARN of the Layer that the functions are using?

No, it won't. It should! But this issue was getting really long, and I figured it would be too much to add that. Here's how you can do that in a monorepo:

  • Check whether the shared code has changed: You can hash the new code and compare it to the hash of the old code (which you can either have stored somewhere previously, or calculate by downloading the existing layer from S3)

  • If there were changes, publish a new version

  • Update the Layer ARN in all functions: The easiest way to do this is by making the Layer ARN a CloudFormation parameter, which I already did in Step 0. You could also do a text replace.

This is a bit complex but totally doable for one or two layers. If you had 20 layers, you'd be much better off with multiple repos.
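Here's a rough sketch of what that could look like as extra commands in the buildspec. Everything here is hypothetical: a layer/ directory holding the shared code, a layer named simple-aws-shared, and an SSM parameter used to remember the previous hash:

HASH=$(find layer/ -type f | sort | xargs cat | sha256sum | cut -d' ' -f1)
OLD_HASH=$(aws ssm get-parameter --name /simple-aws/layer-hash --query Parameter.Value --output text 2>/dev/null || echo none)
if [ "$HASH" != "$OLD_HASH" ]; then
  # The shared code changed: zip it, publish a new layer version, and remember the new hash
  (cd layer && zip -r ../layer.zip .)
  LAYER_ARN=$(aws lambda publish-layer-version --layer-name simple-aws-shared --zip-file fileb://layer.zip --compatible-runtimes nodejs14.x --query LayerVersionArn --output text)
  aws ssm put-parameter --name /simple-aws/layer-hash --value "$HASH" --type String --overwrite
fi

You'd then feed that ARN to the deploy stage as the LayerARN parameter override.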

Best Practices for a CI/CD Pipeline for AWS Lambda Functions

Operational Excellence

  • Add some tests: This newsletter is focused on the AWS side of things, not so much on the code side. That's why I didn't add any tests, or any step in CodeBuild to run tests. You should have tests though.

Security

  • Enable WAF on API Gateway: WAF is a web application firewall that helps protect your applications against common web exploits. You don't even need to change the code, just add it to your API Gateway.

Reliability

  • Roll back at the Lambda alias level: If you deploy everything as a single unit (like we're doing here), rolling back means changing the code of the Lambda function again, which is slower. You can leave that code there and just re-target the "prod" Lambda alias so it points to the old version. That way you're switching traffic faster. This is effectively a blue/green deployment, which we'll cover extensively in a future issue.
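With the template above, that re-targeting is a one-liner per function (both values are placeholders; CloudFormation generates the actual function names):

aws lambda update-alias --function-name <your-listBook-function-name> --name prod --function-version <previous-version>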

Performance Efficiency

  • Right-size your Lambdas: Find the sweet spot for the memory configuration, so your Lambdas have enough resources to work well. Remember that CPU is tied to memory.

  • Make your packages as small as possible: Don't include unnecessary libraries; they consume memory.

Cost Optimization

  • Delete old Lambda layers and function versions: Old versions and layer versions accumulate with every deployment and count against your regional Lambda code storage quota, so clean up the ones you no longer need (keep the last known-good version around so you can still roll back).
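For example (all identifiers are placeholders):

aws lambda delete-function --function-name <your-function-name> --qualifier <old-version-number>
aws lambda delete-layer-version --layer-name <your-layer-name> --version-number <old-version-number>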

Resources

Just one resource, but one of the best I've shared. There's this guy I've been following on LinkedIn for several years now, and I've always looked up to him, both for his knowledge and for the way he learns things by building them and sharing the process.

He and a friend of his are launching a mentoring program for people interested in Site Reliability Engineering (SRE), and the first cohort will be free. It starts this Saturday (tomorrow). You should definitely sign up! I did, and I'm really looking forward to it.
