Keeping Functions Warm - How To Fix AWS Lambda Cold Start Issues

Written by Gonçalo Neves


Cold start in computing = the time it takes to boot a system

In this blog post we'll tackle this issue with AWS Lambda + Serverless.

#Cold Start in Serverless ❄️ - problem

The Function-as-a-Service (FaaS) paradigm lets developers do more than ever with fewer resources. Unfortunately, cold starts can be an issue.

What is a cold start in Serverless?

A cold start happens when you execute an inactive (cold) function for the first time: your cloud provider has to provision your selected runtime container before it can run your function, which increases the execution time considerably.

While your function is running it stays active (hot), meaning its container stays alive, ready and waiting for execution. But after a period of inactivity, your cloud provider drops the container and your function becomes cold again.

Where are the bottlenecks and when?

Knowing your service's performance bottlenecks is essential. Which functions are slowing down, and when? In services big and small, it's common to find one function that slows down your service logic because it doesn't run often enough to keep its container alive.

One of our cold functions was the reset email service during off-peak hours. Between 23:00 and 06:00 UTC+1 (London), the reset-password email took on average more than twice as long to arrive.
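To spot these bottlenecks, it helps to log how long each invocation takes and compare the figures across the day in CloudWatch. A minimal timing wrapper, sketched here as an illustration (not part of the plugin):

```javascript
// Wrap any callback-style handler so every invocation logs its duration.
function withTiming (name, handler) {
  return function (event, context, callback) {
    const start = Date.now()
    handler(event, context, function (err, result) {
      // Consistently slow off-peak durations here usually point at cold starts.
      console.log(name + ' took ' + (Date.now() - start) + 'ms')
      callback(err, result)
    })
  }
}

// Usage with a dummy handler standing in for real logic:
const timed = withTiming('resetEmail', function (event, context, callback) {
  callback(null, 'sent')
})
```

Filtering these log lines by hour quickly shows which functions go cold overnight.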

Understanding AWS cold starts:

When using AWS Lambda, provisioning your function's container can take more than 5 seconds. That makes it impossible to guarantee sub-second responses to events from sources such as API Gateway, DynamoDB, CloudWatch, or S3.

This analysis of AWS Lambda + private VPC container initialization times concluded:

  • Runtimes and memory size don't affect container initialization time
  • Lambda within a private VPC increases container initialization time
  • Containers are not reused after ~15 minutes of inactivity

#Make them warm ♨ - solution

To solve this problem for a couple of cold Lambdas at Fidel, I wrote a plugin called serverless-plugin-warmup that lets you keep all your Lambdas hot.

WarmUP does this by creating one scheduled-event Lambda that invokes all the Lambdas you select at a configured interval (default: every 5 minutes) or at a specific time, forcing your containers to stay alive.

#WarmUP Plugin - installation

Install via npm in the root of your Serverless service:

npm install serverless-plugin-warmup --save-dev
  • Add the plugin to the plugins array in your Serverless serverless.yml:

    plugins:
      - serverless-plugin-warmup

  • Add the warmup: true property to all functions you want to be warm:

    warmup: true

  • For WarmUP to be able to invoke your Lambdas, add the following policy statement to iamRoleStatements:

    iamRoleStatements:
      - Effect: 'Allow'
        Action:
          - 'lambda:InvokeFunction'
        Resource: "*"
  • Add an early callback call when the event source is serverless-plugin-warmup. Do this early exit before running your function logic; it saves execution duration and cost.
module.exports.lambdaToWarm = function (event, context, callback) {
  /** Immediate response for WarmUP plugin */
  if (event.source === 'serverless-plugin-warmup') {
    console.log('WarmUP - Lambda is warm!')
    return callback(null, 'Lambda is warm!')
  }

  // ... add your Lambda logic after this early exit
}

Perfect! Now all of your Lambdas are hot, and you have less to worry about.

You can find more info about options, the event source, and estimated cost in the plugin's documentation.
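For a rough sense of that cost, here is a back-of-the-envelope estimate assuming AWS's published pay-per-use rates of $0.20 per million requests and about $0.0000167 per GB-second, with a 128 MB warm-up invocation finishing in roughly 100 ms (check current pricing; these figures are assumptions, not quoted from AWS):

```javascript
// Back-of-the-envelope cost of warming one Lambda every 5 minutes for a month.
const invocationsPerMonth = (60 / 5) * 24 * 30             // 12 per hour, 30 days = 8640
const requestCost = invocationsPerMonth * (0.20 / 1e6)     // request charge
const gbSeconds = invocationsPerMonth * 0.1 * (128 / 1024) // ~100 ms at 128 MB
const computeCost = gbSeconds * 0.0000166667               // duration charge
console.log('~$' + (requestCost + computeCost).toFixed(4) + ' per month')
```

Well under a cent per month per function, and likely free in practice given the Lambda free tier.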

#Provider support - future

I only work with AWS, so adding support for other providers is a welcome contribution!

About Gonçalo Neves

Gonçalo Neves is based in London as the lead developer at Fidel Limited. On a mission to build the future of Loyalty. Loves to sail and dive.

Serverless Blog

The blog on serverless & event-driven compute
