Building Serverless Apps using the Serverless Stack Framework

Created: 17 June 2021

Updated: 03 September 2023

Prior to doing any of the below you will require your ~/.aws/credentials file to be configured with the credentials for your AWS account
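A minimal ~/.aws/credentials file looks like the following (the key values here are placeholders you'd replace with your own):

```ini
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
```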

Serverless Stack Framework

SST Framework is a framework built on top of CDK for working with Lambdas and other CDK constructs

It provides easy CDK setups and a streamlined debug and deploy process and even has integration with the VSCode debugger to debug stacks on AWS

Init Project

To init a new project use the following command:

```sh
npx create-serverless-stack@latest my-sst-app --language typescript
```

Which will create a Serverless Stack application using TypeScript

Run the App

You can run the created project using the config defined in the sst.json file:

```json
{
  "name": "my-sst-app",
  "stage": "dev",
  "region": "us-east-1",
  "lint": true,
  "typeCheck": true
}
```

Using the following command will build then deploy a dev stack and allow you to interact with it via AWS/browser/Postman/etc.

```sh
npm run start
```

Additionally, the above command starts the application with hot reloading enabled, so when you save files the corresponding AWS resources are redeployed and you can continue testing

The Files

The application is structured like a relatively normal Lambda/CDK app, with lib containing the following CDK code:

Stack

lib/index.ts

```ts
import MyStack from './MyStack'
import * as sst from '@serverless-stack/resources'

export default function main(app: sst.App): void {
  // Set default runtime for all functions
  app.setDefaultFunctionProps({
    runtime: 'nodejs12.x',
  })

  new MyStack(app, 'my-stack')

  // Add more stacks
}
```

lib/MyStack.ts

```ts
import * as sst from '@serverless-stack/resources'

export default class MyStack extends sst.Stack {
  constructor(scope: sst.App, id: string, props?: sst.StackProps) {
    super(scope, id, props)

    // Create the HTTP API
    const api = new sst.Api(this, 'Api', {
      routes: {
        'GET /': 'src/lambda.handler',
      },
    })

    // Show API endpoint in output
    this.addOutputs({
      ApiEndpoint: api.httpApi.apiEndpoint,
    })
  }
}
```

And src which contains the lambda code:

src/lambda.ts

```ts
import { APIGatewayProxyEventV2, APIGatewayProxyHandlerV2 } from 'aws-lambda'

export const handler: APIGatewayProxyHandlerV2 = async (
  event: APIGatewayProxyEventV2
) => {
  return {
    statusCode: 200,
    headers: { 'Content-Type': 'text/plain' },
    body: `Hello, World! Your request was received at ${event.requestContext.time}.`,
  }
}
```

Add a new Endpoint

Using the defined constructs it’s really easy for us to add an additional endpoint:

src/hello.ts

```ts
import { APIGatewayProxyEventV2, APIGatewayProxyHandlerV2 } from 'aws-lambda'

export const handler: APIGatewayProxyHandlerV2 = async (
  event: APIGatewayProxyEventV2
) => {
  const response = {
    data: 'Hello, World! This is another lambda but with JSON',
  }

  return {
    statusCode: 200,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(response),
  }
}
```

And then in the stack we just update the routes:

lib/MyStack.ts

```ts
const api = new sst.Api(this, 'Api', {
  routes: {
    'GET /': 'src/lambda.handler',
    'GET /hello': 'src/hello.handler', // new endpoint handler
  },
})
```

So that the full stack looks like this:

lib/MyStack.ts

```ts
import * as sst from '@serverless-stack/resources'

export default class MyStack extends sst.Stack {
  constructor(scope: sst.App, id: string, props?: sst.StackProps) {
    super(scope, id, props)

    // Create the HTTP API
    const api = new sst.Api(this, 'Api', {
      routes: {
        'GET /': 'src/lambda.handler',
        'GET /hello': 'src/hello.handler',
      },
    })

    // Show API endpoint in output
    this.addOutputs({
      ApiEndpoint: api.httpApi.apiEndpoint,
    })
  }
}
```

VSCode Debugging

SST supports VSCode debugging; all that's required is for you to create a .vscode/launch.json file with the following content:

.vscode/launch.json

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Debug SST Start",
      "type": "node",
      "request": "launch",
      "runtimeExecutable": "npm",
      "runtimeArgs": ["start"],
      "port": 9229,
      "skipFiles": ["<node_internals>/**"]
    },
    {
      "name": "Debug SST Tests",
      "type": "node",
      "request": "launch",
      "runtimeExecutable": "${workspaceRoot}/node_modules/.bin/sst",
      "args": ["test", "--runInBand", "--no-cache", "--watchAll=false"],
      "cwd": "${workspaceRoot}",
      "protocol": "inspector",
      "console": "integratedTerminal",
      "internalConsoleOptions": "neverOpen",
      "env": { "CI": "true" },
      "disableOptimisticBPs": true
    }
  ]
}
```

This will then allow you to run the Debug SST Start configuration, which deploys the AWS resources via the npm start command and connects the debugger so you can debug your functions locally, as well as make use of the automated function deployment

Add a DB

Following the SST docs, we can define our table using the sst.Table class:

```ts
const table = new sst.Table(this, 'Notes', {
  fields: {
    userId: sst.TableFieldType.STRING,
    noteId: sst.TableFieldType.STRING,
  },
  primaryIndex: {
    partitionKey: 'userId',
    sortKey: 'noteId',
  },
})
```

Next, we can add some endpoint definitions for the functions we’ll create as well as access to the table name via the environment:

```ts
const api = new sst.Api(this, 'Api', {
  defaultFunctionProps: {
    timeout: 60, // increase timeout so we can debug
    environment: {
      tableName: table.dynamodbTable.tableName,
    },
  },
  routes: {
    // .. other routes
    'GET /notes': 'src/notes/getAll.handler', // userId in query
    'GET /notes/{noteId}': 'src/notes/get.handler', // userId in query
    'POST /notes': 'src/notes/create.handler',
  },
})
```

And lastly we can grant our API permission to access the table:

```ts
api.attachPermissions([table])
```

Adding the above to the MyStack.ts file results in the following:

```ts
import * as sst from '@serverless-stack/resources'

export default class MyStack extends sst.Stack {
  constructor(scope: sst.App, id: string, props?: sst.StackProps) {
    super(scope, id, props)

    const table = new sst.Table(this, 'Notes', {
      fields: {
        userId: sst.TableFieldType.STRING,
        noteId: sst.TableFieldType.STRING,
      },
      primaryIndex: {
        partitionKey: 'userId',
        sortKey: 'noteId',
      },
    })

    // Create the HTTP API
    const api = new sst.Api(this, 'Api', {
      defaultFunctionProps: {
        timeout: 60, // increase timeout so we can debug
        environment: {
          tableName: table.dynamodbTable.tableName,
        },
      },
      routes: {
        // .. other routes
        'GET /notes': 'src/notes/getAll.handler', // userId in query
        'GET /notes/{noteId}': 'src/notes/get.handler', // userId in query
        'POST /notes': 'src/notes/create.handler',
      },
    })

    api.attachPermissions([table])

    // Show API endpoint in output
    this.addOutputs({
      ApiEndpoint: api.httpApi.apiEndpoint,
    })
  }
}
```

Before we go any further we need to install some dependencies, in particular uuid for generating unique IDs for notes. We can install them with:

```sh
npm install uuid
npm install aws-sdk
```

Define Common Structures

We'll also create some general helper functions for returning responses of different types. You can view their files below; they just wrap the response body with a status and headers, and stringify the body where needed

src/responses/successResponse.ts

```ts
const successResponse = <T>(item: T) => {
  return {
    statusCode: 200,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(item),
  }
}

export default successResponse
```

src/responses/badRequestResponse.ts

```ts
const badRequestResponse = (msg: string) => {
  return {
    statusCode: 400,
    headers: { 'Content-Type': 'text/plain' },
    body: msg,
  }
}

export default badRequestResponse
```

src/responses/internalErrorResponse.ts

```ts
const internalErrorResponse = (msg: string) => {
  console.error(msg)
  return {
    statusCode: 500,
    headers: { 'Content-Type': 'text/plain' },
    body: 'internal error',
  }
}

export default internalErrorResponse
```
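As a quick sanity check, these helpers just produce plain response objects; the snippet below inlines a copy of successResponse so it runs standalone:

```typescript
// Inlined copy of successResponse so this snippet is self-contained
const successResponse = <T>(item: T) => {
  return {
    statusCode: 200,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(item),
  }
}

const res = successResponse({ userId: 'USER_ID', content: 'Hello world' })
console.log(res.statusCode) // 200
console.log(res.body) // {"userId":"USER_ID","content":"Hello world"}
```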

And we've also got a Note type which will be the data that gets stored/retrieved:

src/notes/Note.ts

```ts
type Note = {
  userId: string
  noteId: string
  content?: string
  createdAt: number
}

export default Note
```

Access DB

Once we’ve got a DB table defined as above, we can then access the table to execute different queries

We would create a DB object instance using:

```ts
import { DynamoDB } from 'aws-sdk'

const db = new DynamoDB.DocumentClient()
```

Create

Create is the simplest of the database functions to implement; it uses db.put with the Item to save, which is of type Note:

```ts
const create = async (tableName: string, item: Note) => {
  await db.put({ TableName: tableName, Item: item }).promise()
}
```

Get

We can implement a getOne function by using db.get and providing the full Key consisting of the userId and noteId

```ts
const getOne = async (tableName: string, noteId: string, userId: string) => {
  const result = await db
    .get({
      TableName: tableName,
      Key: {
        userId: userId,
        noteId: noteId,
      },
    })
    .promise()

  return result.Item
}
```

GetAll

We can implement a getByUserId function which will make use of db.query and use the ExpressionAttributeValues to populate the KeyConditionExpression as seen below:

```ts
const getByUserId = async (tableName: string, userId: string) => {
  const result = await db
    .query({
      TableName: tableName,
      KeyConditionExpression: 'userId = :userId',
      ExpressionAttributeValues: {
        ':userId': userId,
      },
    })
    .promise()

  return result.Items
}
```

Define Lambdas

Now that we know how to write data to Dynamo, we can implement the following files for the endpoints we defined above:

Create

src/notes/create.ts

```ts
import { APIGatewayProxyEventV2, APIGatewayProxyHandlerV2 } from 'aws-lambda'
import { DynamoDB } from 'aws-sdk'
import { v1 } from 'uuid'
import internalErrorResponse from '../responses/internalErrorResponse'
import successResponse from '../responses/successResponse'
import badRequestResponse from '../responses/badRequestResponse'
import Note from './Note'

const db = new DynamoDB.DocumentClient()

const toItem = (data: string, content: string): Note => {
  return {
    userId: data,
    noteId: v1(),
    content: content,
    createdAt: Date.now(),
  }
}

const parseBody = (event: APIGatewayProxyEventV2) => {
  const data = JSON.parse(event.body || '{}')

  return {
    userId: data.userId,
    content: data.content,
  }
}

const isValid = (data: Partial<Note>) =>
  typeof data.userId !== 'undefined' && typeof data.content !== 'undefined'

const create = async (tableName: string, item: Note) => {
  await db.put({ TableName: tableName, Item: item }).promise()
}

export const handler: APIGatewayProxyHandlerV2 = async (
  event: APIGatewayProxyEventV2
) => {
  if (typeof process.env.tableName === 'undefined')
    return internalErrorResponse('tableName is undefined')

  const tableName = process.env.tableName
  const data = parseBody(event)

  if (!isValid(data))
    return badRequestResponse('userId and content are required')

  const item = toItem(data.userId, data.content)
  await create(tableName, item)

  return successResponse(item)
}
```
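Note that the isValid guard above only checks that the fields are present, so an empty string would still pass; a standalone sketch of its behaviour (with the input type inlined for illustration):

```typescript
// Inlined copy of the isValid guard from create.ts;
// NoteInput is a hypothetical name for the parsed body shape
type NoteInput = { userId?: string; content?: string }

const isValid = (data: NoteInput) =>
  typeof data.userId !== 'undefined' && typeof data.content !== 'undefined'

console.log(isValid({ userId: 'USER_ID', content: 'Hello world' })) // true
console.log(isValid({ userId: 'USER_ID' })) // false
console.log(isValid({ userId: '', content: '' })) // true - presence only
```

If empty values should be rejected, a stricter check on string length would be worth adding.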

Get

src/notes/get.ts

```ts
import { APIGatewayProxyEventV2, APIGatewayProxyHandlerV2 } from 'aws-lambda'
import { DynamoDB } from 'aws-sdk'
import badRequestResponse from '../responses/badRequestResponse'
import internalErrorResponse from '../responses/internalErrorResponse'
import successResponse from '../responses/successResponse'

type RequestParams = {
  noteId?: string
  userId?: string
}

const db = new DynamoDB.DocumentClient()

const parseBody = (event: APIGatewayProxyEventV2): RequestParams => {
  const pathData = event.pathParameters
  const queryData = event.queryStringParameters

  return {
    noteId: pathData?.noteId,
    userId: queryData?.userId,
  }
}

const isValid = (data: RequestParams) =>
  typeof data.noteId !== 'undefined' && typeof data.userId !== 'undefined'

const getOne = async (tableName: string, noteId: string, userId: string) => {
  const result = await db
    .get({
      TableName: tableName,
      Key: {
        userId: userId,
        noteId: noteId,
      },
    })
    .promise()

  return result.Item
}

export const handler: APIGatewayProxyHandlerV2 = async (
  event: APIGatewayProxyEventV2
) => {
  const data = parseBody(event)

  if (typeof process.env.tableName === 'undefined')
    return internalErrorResponse('tableName is undefined')

  const tableName = process.env.tableName

  if (!isValid(data))
    return badRequestResponse(
      'noteId is required in path, userId is required in query'
    )

  const items = await getOne(
    tableName,
    data.noteId as string,
    data.userId as string
  )

  return successResponse(items)
}
```

GetAll

src/notes/getAll.ts

```ts
import { APIGatewayProxyEventV2, APIGatewayProxyHandlerV2 } from 'aws-lambda'
import { DynamoDB } from 'aws-sdk'
import badRequestResponse from '../responses/badRequestResponse'
import internalErrorResponse from '../responses/internalErrorResponse'
import successResponse from '../responses/successResponse'

type PathParams = {
  userId?: string
}

const db = new DynamoDB.DocumentClient()

const parseBody = (event: APIGatewayProxyEventV2): PathParams => {
  const data = event.queryStringParameters

  return {
    userId: data?.userId,
  }
}

const isValid = (data: PathParams) => typeof data.userId !== 'undefined'

const getByUserId = async (tableName: string, userId: string) => {
  const result = await db
    .query({
      TableName: tableName,
      KeyConditionExpression: 'userId = :userId',
      ExpressionAttributeValues: {
        ':userId': userId,
      },
    })
    .promise()

  return result.Items
}

export const handler: APIGatewayProxyHandlerV2 = async (
  event: APIGatewayProxyEventV2
) => {
  const data = parseBody(event)

  if (typeof process.env.tableName === 'undefined')
    return internalErrorResponse('tableName is undefined')

  const tableName = process.env.tableName

  if (!isValid(data)) return badRequestResponse('userId is required in query')

  const items = await getByUserId(tableName, data.userId as string)

  return successResponse(items)
}
```

Testing

Once we've got all the above completed, we can test our endpoints by creating and reading back data

create:

```
POST https://AWS_ENDPOINT_HERE/notes

{
  "userId": "USER_ID",
  "content": "Hello world"
}
```

Which responds with:

```
200

{
  "content": "Hello world",
  "createdAt": 1619177078298,
  "noteId": "NOTE_ID_UUID",
  "userId": "USER_ID"
}
```

get:

```
GET https://AWS_ENDPOINT_HERE/notes/NOTE_ID_UUID?userId=USER_ID
```

Which responds with:

```
200

{
  "content": "Hello world",
  "createdAt": 1619177078298,
  "noteId": "NOTE_ID_UUID",
  "userId": "USER_ID"
}
```

getAll:

```
GET https://AWS_ENDPOINT_HERE/notes?userId=USER_ID
```

Which responds with:

```
200

[
  {
    "content": "Hello world",
    "createdAt": 1619177078298,
    "noteId": "NOTE_ID_UUID",
    "userId": "USER_ID"
  }
]
```

Creating Notes Using a Queue

When working with microservices, a common pattern is to use a message queue for any operations that can happen asynchronously. We can create an SQS queue to stage messages and then save them separately, at a rate we're able to process them

To implement this we're going to break up our create data flow. At the moment it looks like this:

```
lambda -> dynamo
return <-
```

We’re going to turn it into this:

```
lambda1 -> sqs
return <-

sqs -> lambda2 -> dynamo
```

This pattern becomes especially useful when we're doing more with the data than a single DB operation, and it also lets us retry operations like saving to the DB if we hit errors

A more complex data flow could look something like this (not what we’re implementing):

```
lambda1 -> sqs
return <-

sqs -> lambda2 -> dynamo // save to db
    -> lambda3 -> s3     // generate a report
sqs <-

sqs -> lambda4 // send an email
```

Create Queue

SST provides us with the sst.Queue class that we can use for this purpose

To create a Queue you can use the following in the stack:

```ts
const queue = new sst.Queue(this, 'NotesQueue', {
  consumer: 'src/consumers/createNote.handler',
})

queue.attachPermissions([table])
queue.consumerFunction?.addEnvironment(
  'tableName',
  table.dynamodbTable.tableName
)
```

The above code does the following:

  1. Create a queue
  2. Give the queue permission to access the table
  3. Add the tableName environment variable to the queue’s consumerFunction

We will also need to grant permissions to the API to access the queue so that our create handler is able to add messages to the queue

```ts
api.attachPermissions([table, queue])
```

Which means our Stack now looks like this:

lib/MyStack.ts

```ts
import * as sst from '@serverless-stack/resources'

export default class MyStack extends sst.Stack {
  constructor(scope: sst.App, id: string, props?: sst.StackProps) {
    super(scope, id, props)

    const table = new sst.Table(this, 'Notes', {
      fields: {
        userId: sst.TableFieldType.STRING,
        noteId: sst.TableFieldType.STRING,
      },
      primaryIndex: {
        partitionKey: 'userId',
        sortKey: 'noteId',
      },
    })

    const queue = new sst.Queue(this, 'NotesQueue', {
      consumer: 'src/consumers/createNote.handler',
    })

    queue.attachPermissions([table])
    queue.consumerFunction?.addEnvironment(
      'tableName',
      table.dynamodbTable.tableName
    )

    // Create the HTTP API
    const api = new sst.Api(this, 'Api', {
      defaultFunctionProps: {
        timeout: 60, // increase timeout so we can debug
        environment: {
          tableName: table.dynamodbTable.tableName,
          queueUrl: queue.sqsQueue.queueUrl,
        },
      },
      routes: {
        'GET /': 'src/lambda.handler',
        'GET /hello': 'src/hello.handler',
        'GET /notes': 'src/notes/getAll.handler',
        'POST /notes': 'src/notes/create.handler',
        'GET /notes/{noteId}': 'src/notes/get.handler',
      },
    })

    api.attachPermissions([table, queue])

    // Show API endpoint in output
    this.addOutputs({
      ApiEndpoint: api.httpApi.apiEndpoint,
    })
  }
}
```

Update the Create Handler

Since we plan to create notes via a queue, we will update the create function in our handler to send a new message to the queue. This is done using the SQS class from aws-sdk:

src/notes/create.ts

```ts
import { SQS } from 'aws-sdk'

const queue = new SQS()
```

Once we've got our instance, the create function uses queue.sendMessage:

src/notes/create.ts

```ts
const create = async (queueUrl: string, item: Note) => {
  return await queue
    .sendMessage({
      QueueUrl: queueUrl,
      DelaySeconds: 0,
      MessageBody: JSON.stringify(item),
    })
    .promise()
}
```

Lastly, our handler remains mostly the same with the exception of some additional validation to check that we have the queue connection information in the environment:

src/notes/create.ts

```ts
export const handler: APIGatewayProxyHandlerV2 = async (
  event: APIGatewayProxyEventV2
) => {
  // pre-save validation
  if (typeof process.env.queueUrl === 'undefined')
    return internalErrorResponse('queueUrl is undefined')

  const queueUrl = process.env.queueUrl

  const data = parseBody(event)

  if (!isValid(data))
    return badRequestResponse('userId and content are required')

  // save process
  const item = toItem(data.userId, data.content)
  const createResult = await create(queueUrl, item)

  if (!createResult.MessageId)
    return internalErrorResponse('MessageId is undefined')

  return successResponse(item)
}
```

Implementing the above into the create handler means that our create.ts file now looks like this:

src/notes/create.ts

```ts
import { APIGatewayProxyEventV2, APIGatewayProxyHandlerV2 } from 'aws-lambda'
import { v1 } from 'uuid'
import internalErrorResponse from '../responses/internalErrorResponse'
import successResponse from '../responses/successResponse'
import badRequestResponse from '../responses/badRequestResponse'
import Note from './Note'
import { SQS } from 'aws-sdk'

const queue = new SQS()

// helper functions start

const toItem = (data: string, content: string): Note => {
  return {
    userId: data,
    noteId: v1(),
    content: content,
    createdAt: Date.now(),
  }
}

const parseBody = (event: APIGatewayProxyEventV2) => {
  const data = JSON.parse(event.body || '{}')

  return {
    userId: data.userId,
    content: data.content,
  }
}

const isValid = (data: Partial<Note>) =>
  typeof data.userId !== 'undefined' && typeof data.content !== 'undefined'

// helper functions end

const create = async (queueUrl: string, item: Note) => {
  return await queue
    .sendMessage({
      QueueUrl: queueUrl,
      DelaySeconds: 0,
      MessageBody: JSON.stringify(item),
    })
    .promise()
}

export const handler: APIGatewayProxyHandlerV2 = async (
  event: APIGatewayProxyEventV2
) => {
  // pre-save validation
  if (typeof process.env.queueUrl === 'undefined')
    return internalErrorResponse('queueUrl is undefined')

  const queueUrl = process.env.queueUrl

  const data = parseBody(event)

  if (!isValid(data))
    return badRequestResponse('userId and content are required')

  // save process
  const item = toItem(data.userId, data.content)
  const createResult = await create(queueUrl, item)

  if (!createResult.MessageId)
    return internalErrorResponse('MessageId is undefined')

  return successResponse(item)
}
```

Add Queue-Based Create Handler

Now that we've updated our logic to save the notes into the queue, we need to add the logic for the src/consumers/createNote.handler consumer function we specified above. This handler will be sent an SQSEvent and will make use of the DynamoDB table we gave it permission to use

First, we take the create function that was previously on the create.ts file for saving to the DB:

src/consumers/createNote.ts

```ts
import { DynamoDB } from 'aws-sdk'

const db = new DynamoDB.DocumentClient()

const create = async (tableName: string, item: Note) => {
  const createResult = await db
    .put({ TableName: tableName, Item: item })
    .promise()
  if (!createResult) throw new Error('create failed')

  return createResult
}
```

We’ll also need a function for parsing the SQSRecord object into a Note:

src/consumers/createNote.ts

```ts
const parseBody = (record: SQSRecord): Note => {
  const { noteId, userId, content, createdAt } = JSON.parse(record.body) as Note

  // do this to ensure we only extract information we need
  return {
    noteId,
    userId,
    content,
    createdAt,
  }
}
```

And finally we consume the above through the handler. In the code below we iterate over the event.Records array: the SQSEvent collects each incoming message into this array because the queue can be configured to batch messages, triggering the handler only after n events instead of every time. Though we aren't batching in our case, our handler should still account for this:

src/consumers/createNote.ts

```ts
export const handler: SQSHandler = async (event) => {
  // pre-save environment check
  if (typeof process.env.tableName === 'undefined')
    throw new Error('tableName is undefined')

  const tableName = process.env.tableName

  for (let i = 0; i < event.Records.length; i++) {
    const r = event.Records[i]
    const item = parseBody(r)
    console.log(item)

    const result = await create(tableName, item)
    console.log(result)
  }
}
```

Putting all the above together our createNote.ts file now has the following code:

```ts
import { SQSHandler, SQSRecord } from 'aws-lambda'
import Note from '../notes/Note'
import { DynamoDB } from 'aws-sdk'

const db = new DynamoDB.DocumentClient()

const create = async (tableName: string, item: Note) => {
  const createResult = await db
    .put({ TableName: tableName, Item: item })
    .promise()
  if (!createResult) throw new Error('create failed')

  return createResult
}

const parseBody = (record: SQSRecord): Note => {
  const { noteId, userId, content, createdAt } = JSON.parse(record.body) as Note

  // do this to ensure we only extract information we need
  return {
    noteId,
    userId,
    content,
    createdAt,
  }
}

export const handler: SQSHandler = async (event) => {
  if (typeof process.env.tableName === 'undefined')
    throw new Error('tableName is undefined')

  const tableName = process.env.tableName

  for (let i = 0; i < event.Records.length; i++) {
    const r = event.Records[i]
    const item = parseBody(r)
    console.log(item)

    const result = await create(tableName, item)
    console.log(result)
  }
}
```

This completes the implementation of the asynchronous saving mechanism for notes. As far as a consumer of our API is concerned, nothing has changed and they will still be able to use the API exactly as we had in the Testing section above

Deploy

Thus far, we've just been running our API in debug mode via the npm run start command. While useful for testing, this adds a lot of code to make debugging possible, which isn't something we'd want in our final deployed code

Deploying using sst is still very easy: all we need to do is run the npm run deploy command, which will update our Lambdas to use a production build of the code instead:

```sh
npm run deploy
```

Teardown

Lastly, the sst CLI also provides a command to tear down the resources created by start/deploy. Once you're done playing around you can use it to remove all your deployed services:

```sh
npm run remove
```

Note that running the remove command will not delete the DB tables; you will need to do this manually
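For example, the table can be removed with the AWS CLI or via the DynamoDB console; the table name below is a placeholder, substitute the generated name from your own stack:

```shell
# Hypothetical table name - look up the real one with: aws dynamodb list-tables
aws dynamodb delete-table --table-name dev-my-sst-app-my-stack-Notes
```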