# Nest application example
This example demonstrates how to set up a Nest application with the Serverless Framework.
## Use Cases

- Set up & deploy a Nest application starter
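To make the proxy wiring concrete, a `serverless.yml` for this kind of starter could look roughly like the sketch below. The service name, runtime, and handler path are assumptions for illustration, not the repository's exact file:

```yaml
# Hypothetical serverless.yml sketch -- values are illustrative.
service: serverless-nest-example

provider:
  name: aws
  runtime: nodejs8.10          # assumed runtime for this 2018-era starter
  region: us-east-1
  stage: dev

functions:
  main:
    handler: src/main.handler  # assumed entry point exporting the Lambda handler
    events:
      # Forward every method and path to the Nest application
      - http: ANY /
      - http: 'ANY {proxy+}'
```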
## Running the app locally

```bash
npm start
```
Which should result in:
```
$ sls offline start
Serverless: Compiling with Typescript...
Serverless: Using local tsconfig.json
Serverless: Typescript compiled.
Serverless: Watching typescript files...
Serverless: Starting Offline: dev/us-east-1.

Serverless: Routes for main:
Serverless: ANY /{proxy*}

Serverless: Offline listening on http://localhost:3000
```
Then browse to http://localhost:3000/hello
The logs should be:
```
Serverless: ANY /hello (λ: main)
[Nest] 7956 - 2018-12-13 10:34:22 [NestFactory] Starting Nest application... +6933ms
[Nest] 7956 - 2018-12-13 10:34:22 [InstanceLoader] AppModule dependencies initialized +4ms
[Nest] 7956 - 2018-12-13 10:34:22 [RoutesResolver] AppController {/}: +2ms
[Nest] 7956 - 2018-12-13 10:34:22 [RouterExplorer] Mapped {/hello, GET} route +1ms
[Nest] 7956 - 2018-12-13 10:34:22 [NestApplication] Nest application successfully started +1ms
Serverless: [200] {"statusCode":200,"body":"Hello World!","headers":{"x-powered-by":"Express","content-type":"text/html; charset=utf-8","content-length":"12","etag":"W/\"c-Lve95gjOVATpfV8EL5X4nxwjKHE\"","date":"Thu, 13 Dec 2018 09:34:22 GMT","connection":"keep-alive"},"isBase64Encoded":false}
```
### Skipping cache invalidation

Skipping cache invalidation reproduces the same behavior as a deployed function:

```bash
npm start -- --skipCacheInvalidation
```
## Deploy

In order to deploy the endpoint, simply run:

```bash
sls deploy
```
The expected result should be similar to:
```
$ sls deploy
Serverless: Compiling with Typescript...
Serverless: Using local tsconfig.json
Serverless: Typescript compiled.
Serverless: Packaging service...
Serverless: Excluding development dependencies...
Serverless: Creating Stack...
Serverless: Checking Stack create progress........
Serverless: Stack create finished...
Serverless: Uploading CloudFormation file to S3...
Serverless: Uploading artifacts...
Serverless: Uploading service .zip file to S3 (32.6 MB)...
Serverless: Validating template...
Serverless: Updating Stack...
Serverless: Checking Stack update progress.................................
Serverless: Stack update finished...
Service Information
service: serverless-nest-example
stage: dev
region: us-east-1
stack: serverless-nest-example-dev
api keys:
  None
endpoints:
  ANY - https://XXXXXXX.execute-api.us-east-1.amazonaws.com/dev/{proxy+}
functions:
  main: serverless-nest-example-dev-main
layers:
  None
```
## Usage

Send an HTTP request directly to the endpoint using a tool like `curl`:

```bash
curl https://XXXXXXX.execute-api.us-east-1.amazonaws.com/dev/hello
```
### Tail logs

```bash
sls logs --function main --tail
```
## Scaling

By default, AWS Lambda limits the total concurrent executions across all functions within a given region to 1,000. This default is a safety limit that protects you from costs due to potential runaway or recursive functions during initial development and testing. To increase this limit above the default, follow the steps in "To request a limit increase for concurrent executions" in the AWS Lambda documentation.
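If a single function needs a guaranteed (or tighter) slice of that pool, the Serverless Framework also supports a per-function cap. A minimal sketch, with an illustrative value and an assumed handler path:

```yaml
functions:
  main:
    handler: src/main.handler   # assumed handler path
    # Reserve 10 concurrent executions for this function; the rest of the
    # account-level pool stays available to other functions.
    reservedConcurrency: 10
```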
## Cold start

Cold starts may add latency to your application (see https://serverless.com/blog/keep-your-lambdas-warm/). This behavior can be mitigated with the `serverless-plugin-warmup` plugin:
- Install the plugin:

```bash
npm install serverless-plugin-warmup --save-dev
```
- Enable the plugin in `serverless.yml`:

```yaml
plugins:
  - '@hewmen/serverless-plugin-typescript'
  - serverless-plugin-optimize
  - serverless-offline
  - serverless-plugin-warmup

custom:
  # Enable warmup on all functions (only for production and staging)
  warmup:
    - production
    - staging
```
## Benchmark

A basic benchmark script can be run locally; it performs 1000 `GET` requests on `http://localhost:3000/hello`:

```bash
# /!\ The app must be running locally
npm start  # or: npm start -- --skipCacheInvalidation for better performance

# Run the benchmark
node bench.js
```
The expected result should be similar to:
```
$ node bench.js
1000 "GET" requests to "http://localhost:3000/hello"
total: 8809.733ms
Average: 8.794ms
```