Deploy Automatically to AWS Lambda via Azure Pipelines

Microsoft recently announced Azure DevOps, a collection of services for software development teams. The individual services are quite extensible and have a generous free tier for small teams.

Specifically, the Azure Pipelines service is intriguing, offering a full-featured continuous integration / continuous deployment platform with 1,800 free build minutes per month.

In a previous post, we performed a walkthrough of a basic continuous deployment pipeline from Bitbucket to AWS Lambda via Bitbucket Pipelines. In this post, we’ll demonstrate how to set up a similar configuration with Azure Repos and Azure Pipelines, and offer some ideas for next steps.

When complete, any changes you push to your AWS Lambda function code in your Azure Repo will automatically be deployed to AWS.


This walkthrough assumes that you have an AWS account, as well as a general familiarity with Git and AWS Lambda.

The walkthrough includes the initial creation of the Azure Repo and AWS Lambda function. If you are working with an existing Lambda and/or Azure Repo, you may be able to skip some steps below.

Create AWS Lambda Function

Login to the AWS Console, navigate to the Lambda section and click Create Function.

On the resulting Create Function page, select Author From Scratch and supply the following values:

  • Name: PipelineTest
  • Runtime: Node.js 6.10
  • Role: Create new role from template(s)
  • Role name: PipelineTestRole
  • Policy templates: Can be left blank
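If you prefer the command line, an equivalent function could be created with the AWS CLI. This is a hypothetical sketch, not part of the original walkthrough: the account ID in the role ARN is a placeholder, and it assumes the PipelineTestRole already exists and that a function.zip containing index.js has been built.

```shell
# Hypothetical AWS CLI equivalent of the console steps above.
# The account ID in the role ARN is a placeholder.
zip function.zip index.js

aws lambda create-function \
  --function-name PipelineTest \
  --runtime nodejs6.10 \
  --role arn:aws:iam::123456789012:role/PipelineTestRole \
  --handler index.handler \
  --zip-file fileb://function.zip
```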

Create AWS IAM User

We need to create an IAM user with permissions to create/update AWS Lambda functions.

  1. Navigate to the AWS IAM page, then to Users, and click Add User.
  2. Supply a username and select Programmatic Access for the access type, then click Next.
  3. On the permissions page, select Attach Existing Policies Directly and then select appropriate permissions for your user, then click Review. Note: For this demo, I used the AWSLambdaFullAccess policy, but you may want to consider different policies. As always, use caution when dealing with AWS security.
  4. Review the user details and then click Create User.
  5. On the resulting page, note the Access Key ID and Secret Access Key, as they will be needed in a following step.
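For reference, the same user could be created from the AWS CLI. This is a hypothetical sketch, not part of the original walkthrough: the username pipeline-deploy is arbitrary, and as noted above you may want a narrower policy than AWSLambdaFullAccess.

```shell
# Hypothetical CLI version of the console steps above.
aws iam create-user --user-name pipeline-deploy

aws iam attach-user-policy \
  --user-name pipeline-deploy \
  --policy-arn arn:aws:iam::aws:policy/AWSLambdaFullAccess

# Prints the AccessKeyId and SecretAccessKey needed in a following step.
aws iam create-access-key --user-name pipeline-deploy
```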

Create Azure Organization and Project

Next, we’ll start our Azure DevOps configuration by creating an organization, project, and source code repository.

To get started, head over to the Azure DevOps site and follow the prompts, which will walk you through authentication, terms of service agreements, and choosing a name for your new organization.


Next, we can choose a name for our project and then create it.

Our project is now created! This is the level in the hierarchy where we’ll configure the Pipelines and Repos services.

Create and Clone Azure Repo

In the project view from the previous step, navigate to Repos to get details on how to initialize the Git repository and clone it to your local system. You can choose any method; I used SSH (details here).

Add Files to Repo

You’ll need to add the following two files to the root of your locally cloned repository.

exports.handler = (event, context, callback) => {
    // TODO implement
    callback(null, 'Hello from Lambda via Azure Pipelines');
};

This is index.js, the JavaScript file for your Lambda function.
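Because the handler is plain Node, you can sanity-check it locally before involving the pipeline at all. Here is a minimal sketch that wires up a fake callback; the handler is inlined to keep the sketch self-contained, but in practice you would `require('./index')`.

```javascript
// Minimal local harness for the Lambda handler above.
const handler = (event, context, callback) => {
    callback(null, 'Hello from Lambda via Azure Pipelines');
};

// Invoke with an empty event and context, much as Lambda would.
handler({}, {}, (err, result) => {
    if (err) throw err;
    console.log(result); // Hello from Lambda via Azure Pipelines
});
```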


This file is used by the node-lambda utility to load some values during the execution of the Azure Pipeline we will configure shortly. Most values will be supplied directly to node-lambda via CLI flags at runtime, but this file is still required.
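The exact contents of this file aren't critical for our setup, since the pipeline supplies the real values as CLI flags. Assuming it is the .env file that `node-lambda setup` generates (an assumption on my part; consult the node-lambda README for the full list of supported keys), something minimal along these lines should suffice:

```ini
# .env -- hypothetical minimal node-lambda configuration.
# Credentials, region, role, and function name are passed
# as CLI flags in the pipeline, so they are omitted here.
AWS_ENVIRONMENT=
AWS_RUNTIME=nodejs6.10
AWS_HANDLER=index.handler
AWS_MEMORY_SIZE=128
AWS_TIMEOUT=10
```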

Once you’ve created these two files locally, commit and push to Azure and you’ll be able to view them on the web.

Create New Pipeline

Our code is now in the repository, so we’re ready to set up our build pipeline to deploy it to AWS.

To get started, from the project menu, navigate to Pipelines -> New Pipeline and select your repository. We’re given a choice of templates; choose Node.js as that will give us a head start on the pipeline we need to configure.

The default Node.js pipeline is as follows:

# Node.js
# Build a general Node.js project with npm.
# Add steps that analyze code, save build artifacts, deploy, and more:

trigger:
- master

pool:
  vmImage: 'Ubuntu 16.04'

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '8.x'
  displayName: 'Install Node.js'

- script: |
    npm install
    npm run build
  displayName: 'npm install and build'
This is a good start, but we also need to install the node-lambda utility, and then deploy to AWS. Update your azure-pipelines.yml with the following:

trigger:
- master

pool:
  vmImage: 'Ubuntu 16.04'

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '8.x'
  displayName: 'Install Node.js'

- script: |
    npm install
    npm install node-lambda -g
  displayName: 'npm install'

- script: |
    node-lambda deploy -a $(AWS_ACCESS_KEY) -s $(AWS_SECRET_KEY) -o $(AWS_ROLE) -r $(AWS_REGION) -n $(AWS_LAMBDA_FUNCTION_NAME) --excludeGlobs "azure-pipelines.yml"
  displayName: 'Deploy to AWS'

Note that we’ve added the node-lambda installation to the second step, and we’ve added a third step that performs the AWS deployment. (For more detail on the format of this YAML file, check out the excellent documentation.)

Once complete, click Save and run. The pipeline will execute, and it will fail!

Clicking on the failed build step brings up the logs, which make clear that we haven’t yet configured our variables for the various AWS settings. Let’s do that next and re-run the pipeline.

Configure Variables

From the main Pipeline view, click Edit, and then locate the Variables tab.

We’ll need to add the five variables from the node-lambda command in the pipeline.

  • AWS_ACCESS_KEY: For the IAM user created above.
  • AWS_SECRET_KEY: For the IAM user created above.
  • AWS_ROLE: The ARN for the role created above. This can be located in AWS by navigating to IAM -> Roles, selecting the newly created role, and copying the role ARN.
  • AWS_REGION: The name of the region where you created the Lambda.
  • AWS_LAMBDA_FUNCTION_NAME: The name of the Lambda.

Note that for the variables containing credentials, we’ve checked the lock icon indicating that the variable should be treated as a secret. This encrypts the data at rest and masks the data from any log output. (See the documentation page on pipeline variables for additional details.)
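If you prefer to script this setup, the same variables can be created with the Azure DevOps CLI extension. This is a hypothetical sketch: it assumes the azure-devops extension is installed, a pipeline ID of 1, and placeholder values.

```shell
# Hypothetical: create pipeline variables from the CLI instead of the web UI.
# Requires: az extension add --name azure-devops
az pipelines variable create --pipeline-id 1 --name AWS_REGION --value us-east-1
az pipelines variable create --pipeline-id 1 --name AWS_LAMBDA_FUNCTION_NAME --value PipelineTest

# --secret true masks the value in logs, like the lock icon in the UI.
az pipelines variable create --pipeline-id 1 --name AWS_ACCESS_KEY --secret true
```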

From here, we can click Save & queue and enter a commit message, and our pipeline should re-run. We can navigate back to the pipeline via the left menu to check the status.

Looks good! You can click on any of the steps to view the log output for that step. Note that in the Deploy to AWS step, the AWS credentials we marked as “secret” above are masked from the logs.


The pipeline should now be configured to deploy our Lambda to AWS on every commit to the master branch. To test it, modify the index.js file: you can edit your local clone of the repository and then commit and push, or you can use Azure’s web editor to make a basic edit.

Once the commit is made, navigate back to the Pipelines page and a new pipeline execution should be in progress (or already completed).
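As a final check, you can invoke the deployed function directly. A sketch, assuming the AWS CLI is configured with credentials permitted to invoke the function, and with the region as a placeholder:

```shell
# Invoke the freshly deployed function and print its response.
aws lambda invoke \
  --function-name PipelineTest \
  --region us-east-1 \
  out.json

cat out.json   # should contain the string returned by the handler
```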

Future Work

Of course, this is a simplistic example of an Azure Pipeline intended to show a basic build with AWS integration. Here are some possible next steps for further development:

  • Dig deeper into the Azure Pipelines configuration to add functionality to your pipeline (e.g. releases, approvals)
  • Check out Azure Test Plans to test your Lambda function as part of each build
  • Configure a variable group so you can re-use your AWS credentials in multiple pipelines without recreating the variables each time
  • Use node-lambda to perform local testing of your Lambda function