If you’ve been keeping up with my content, you’ll remember that I wrote an article titled, Use AWS Lambda and API Gateway with Node.js and Couchbase NoSQL. In that article we explored using Amazon’s serverless services to create Lambda functions that interact with Couchbase, our NoSQL database.
However, Lambda isn’t the only serverless, otherwise known as functions as a service (FaaS), technology on the block. Take Apache OpenWhisk for example. With OpenWhisk you can create functions similarly to how you would with Lambda, but deploy them to a more diverse set of locations, the most popular being IBM Bluemix.
We’re going to see how to create serverless functions using OpenWhisk to communicate with our Couchbase Server database.
Going forward, there are a few things to note. You’ll need to be hosting Couchbase Server somewhere accessible from the outside world, which means that your local computer won’t work. You’re going to need Docker so we can compile our dependencies to work with OpenWhisk. Finally, you’re going to need a Bluemix account, for this example at least.
Installing the Bluemix CLI Tools for OpenWhisk
As I previously mentioned, OpenWhisk is an Apache Foundation project. However, for convenience we’re going to be using it on IBM’s Bluemix.
Create an account for the IBM Cloud if you haven’t already.
Instead of using a framework tool like Serverless, we’re going to be using the Bluemix CLI. Download the IBM Cloud Functions CLI so we can interact with OpenWhisk on IBM.
Before you can start working with your IBM Cloud account, you need to sign in via the CLI. From the command line, execute the following:
bx login -a api.ng.bluemix.net -o your_email@example.com -s dev
When downloading the CLI, you’ll be given the exact command, but it should look similar to what I’ve presented above.
Now we can start creating our project.
Understanding the Project Structure and OpenWhisk Package Creation Process
If you’ve never worked with FaaS before, things are done a little differently than when building a stand-alone application, which is much harder to scale.
For example, each endpoint in our FaaS project will be a separate function. Combined, these functions create what is called a package. These functions scale as necessary to meet the changing demand of your application.
With that said, create the following:
create
--- package.json
--- create.js
retrieve
--- package.json
--- retrieve.js
update
--- package.json
--- update.js
delete
--- package.json
--- delete.js
The project should have a directory for each function that we wish to create. Each function will have its own package.json file. Each package.json file can be created by executing the following within each of the directories:
npm init -y
Within each of the package.json files, you’ll also need to define which file is your function code. For example, open create/package.json and add or change the following line:
"main": "create.js",
By setting the main file, we are stating which JavaScript file contains our function.
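To make that concrete, a minimal create/package.json might end up looking something like the following at this point. The name and version values are just whatever npm init generated for you, and the dependencies section will be filled in by the install step in the next section.

{
    "name": "create",
    "version": "1.0.0",
    "main": "create.js"
}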
When we start deploying our functions, we’ll be deploying them so that they are all part of the same package.
Designing a Function for Creating Data
Let’s start development with creating data in our database. Navigate to the create directory and execute the following command from your command line:
npm install couchbase uuid joi --save
The above command will install our function dependencies. We’ll be using the Couchbase SDK for Node.js, the UUID library for generating unique keys, and the Joi library for validating input.
We will be revisiting the dependency installation later, but this will keep us going for now.
Now open the project’s create/create.js file and include the following:
const Couchbase = require("couchbase");
const UUID = require("uuid");
const Joi = require("joi");

var bucket = null;

function main(params) {
    if(bucket == null) {
        var cluster = new Couchbase.Cluster("couchbase://" + params.host);
        cluster.authenticate(params.username, params.password);
        bucket = cluster.openBucket(params.bucketName);
    }
    var schema = Joi.object().keys({
        firstname: Joi.string().required(),
        lastname: Joi.string().required(),
        type: Joi.string().forbidden().default("person")
    });
    var data = params;
    var response = {};
    return new Promise((resolve, reject) => {
        var validation = Joi.validate(data, schema, { stripUnknown: true });
        if(validation.error) {
            response = {
                statusCode: 500,
                body: JSON.stringify(validation.error.details)
            };
            reject(response);
        }
        var id = UUID.v4();
        bucket.insert(id, validation.value, (error, result) => {
            if(error) {
                response = {
                    body: JSON.stringify({ code: error.code, message: error.message })
                };
                reject(response);
            }
            data.id = id;
            response = {
                body: JSON.stringify(validation.value)
            };
            resolve(response);
        });
    });
}

exports.main = main;
The above code is a lot to take in, so let’s figure out what is going on. Let’s start with the variable that exists outside of our function:
var bucket = null;
It isn’t the best idea to establish a new connection every time the function is called. Instead, we can keep a global instance of the open Couchbase Bucket and use it for as long as it exists. Just note that it won’t always exist, because OpenWhisk will destroy functions after a period of inactivity.
if(bucket == null) {
    var cluster = new Couchbase.Cluster("couchbase://" + params.host);
    cluster.authenticate(params.username, params.password);
    bucket = cluster.openBucket(params.bucketName);
}
Inside our function we check to see if the Bucket is already open. If the Bucket is not open, we establish a connection using parameters passed into the function. When the time comes, we’ll be defining default parameters which contain this connection information.
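To give an idea of what that looks like, here is a rough sketch of the params object the create function might receive once default parameters are in place. The host value is a placeholder, and the remaining values mirror the ones used later in this tutorial.

{
    "host": "ec2-45-236-32-140.compute-1.amazonaws.com",
    "username": "demo",
    "password": "bluemix",
    "bucketName": "example",
    "firstname": "Nic",
    "lastname": "Raboy"
}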
Since we’re creating data, we need to validate that the input is correct.
var schema = Joi.object().keys({
    firstname: Joi.string().required(),
    lastname: Joi.string().required(),
    type: Joi.string().forbidden().default("person")
});
We’re expecting a firstname and lastname value to be present. We’re also expecting a type to not be present. We can validate this with the following:
var validation = Joi.validate(data, schema, { stripUnknown: true });
if(validation.error) {
    response = {
        statusCode: 500,
        body: JSON.stringify(validation.error.details)
    };
    reject(response);
}
The stripUnknown option will remove data not defined in the schema. We need to strip data because our input and connection information will exist in the same payload. We don’t want the connection information to be saved in our documents. If there is a validation error, it will be returned. If there was no validation error, we can proceed to inserting the data.
var id = UUID.v4();
bucket.insert(id, validation.value, (error, result) => {
    if(error) {
        response = {
            body: JSON.stringify({ code: error.code, message: error.message })
        };
        reject(response);
    }
    data.id = id;
    response = {
        body: JSON.stringify(validation.value)
    };
    resolve(response);
});
We can generate a new unique key and save the validated data as a document. The data itself will be returned as a response.
The other functions will follow this same strategy, more or less.
Designing a Function for Retrieving Data with N1QL
Now that we have data, let’s try to retrieve it from the database with a function invocation. Navigate to your retrieve directory and execute the following from the command line:
npm install couchbase --save
Because we won’t be creating data, we don’t need to generate unique values or validate any user data. For this reason, we only need the Couchbase SDK for this function.
Open the project’s retrieve/retrieve.js file and include the following:
const Couchbase = require("couchbase");

var bucket = null;

function main(params) {
    if(bucket == null) {
        var cluster = new Couchbase.Cluster("couchbase://" + params.host);
        cluster.authenticate(params.username, params.password);
        bucket = cluster.openBucket(params.bucketName);
    }
    var response = {};
    var statement = "SELECT META().id, `" + bucket._name + "`.* FROM `" + bucket._name + "` WHERE type = 'person'";
    var query = Couchbase.N1qlQuery.fromString(statement);
    return new Promise((resolve, reject) => {
        bucket.query(query, (error, result) => {
            if(error) {
                response = {
                    body: JSON.stringify({ code: error.code, message: error.message })
                };
                reject(response);
            }
            response = {
                body: JSON.stringify(result)
            };
            resolve(response);
        });
    });
}

exports.main = main;
Let’s skip over what we’ve already seen in the previous function and jump to what’s new. Once we’re connected to an open Bucket, we can create an N1QL query.
var statement = "SELECT META().id, `" + bucket._name + "`.* FROM `" + bucket._name + "` WHERE type = 'person'";
var query = Couchbase.N1qlQuery.fromString(statement);
return new Promise((resolve, reject) => {
    bucket.query(query, (error, result) => {
        if(error) {
            response = {
                body: JSON.stringify({ code: error.code, message: error.message })
            };
            reject(response);
        }
        response = {
            body: JSON.stringify(result)
        };
        resolve(response);
    });
});
This N1QL query is SQL-like, and it will allow us to retrieve all documents that match certain criteria. If there are any errors, they are returned as a response; otherwise, the result set is returned.
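For example, if the Bucket was named example, like the one used later in this tutorial, the statement built by the code above would come out looking like the following. Note that actually running it assumes an index, such as a primary index, already exists on the bucket.

SELECT META().id, `example`.* FROM `example` WHERE type = 'person'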
Because we’re not validating anything, this function for retrieving data was much simpler.
Designing a Function for Updating Data with Subdocument Mutations
Now let’s say that we want to update documents within the database. Instead of retrieving documents, making changes, then saving those changes, we’re going to submit changes directly to the database and let the database figure things out.
Navigate into the project’s update directory and execute the following from the command line:
npm install couchbase joi --save
Because we’re accepting user data, we want to validate that data. We’re not creating data so we don’t need to generate any unique keys.
Open the project’s update/update.js file and include the following:
const Couchbase = require("couchbase");
const Joi = require("joi");

var bucket = null;

function main(params) {
    if(bucket == null) {
        var cluster = new Couchbase.Cluster("couchbase://" + params.host);
        cluster.authenticate(params.username, params.password);
        bucket = cluster.openBucket(params.bucketName);
    }
    var schema = Joi.object().keys({
        id: Joi.string().required(),
        firstname: Joi.string().optional(),
        lastname: Joi.string().optional()
    });
    var data = params;
    var response = {};
    return new Promise((resolve, reject) => {
        var validation = Joi.validate(data, schema, { stripUnknown: true });
        if(validation.error) {
            response = {
                statusCode: 500,
                body: JSON.stringify(validation.error.details)
            };
            reject(response);
        }
        var builder = bucket.mutateIn(validation.value.id);
        if(validation.value.firstname) {
            builder.replace("firstname", validation.value.firstname);
        }
        if(validation.value.lastname) {
            builder.replace("lastname", validation.value.lastname);
        }
        builder.execute((error, result) => {
            if(error) {
                response = {
                    statusCode: 500,
                    body: JSON.stringify({ code: error.code, message: error.message })
                };
                reject(response);
            }
            response = {
                statusCode: 200,
                body: JSON.stringify(validation.value)
            };
            resolve(response);
        });
    });
}

exports.main = main;
Does the above code look familiar? It should, because we’re following the same strategy.
Our validation logic is slightly different in this example:
var schema = Joi.object().keys({
    id: Joi.string().required(),
    firstname: Joi.string().optional(),
    lastname: Joi.string().optional()
});
We want to edit a particular document so a key is required. We don’t know what the user wants to update so we set the properties as optional.
To perform updates, we’re going to be doing subdocument operations on our documents. To do this, we can use a mutation builder.
var builder = bucket.mutateIn(validation.value.id);
if(validation.value.firstname) {
    builder.replace("firstname", validation.value.firstname);
}
if(validation.value.lastname) {
    builder.replace("lastname", validation.value.lastname);
}
We provide the document to alter along with the paths at which the properties exist. The paths could be much more complex than the examples used here.
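As a hypothetical sketch of a deeper path, if our person documents also contained a nested address object, the same builder could target a property inside of it. The address field isn’t part of this tutorial’s schema; it is only here to illustrate what a longer path might look like.

// Assumes the document already has an address object with a city property
if(validation.value.address && validation.value.address.city) {
    builder.replace("address.city", validation.value.address.city);
}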
With the set of mutations defined, we can execute them against the database.
builder.execute((error, result) => {
    if(error) {
        response = {
            statusCode: 500,
            body: JSON.stringify({ code: error.code, message: error.message })
        };
        reject(response);
    }
    response = {
        statusCode: 200,
        body: JSON.stringify(validation.value)
    };
    resolve(response);
});
Depending on the result, a response will be returned from the invocation of the function.
Designing a Function for Removing Data
We’re at our final function in a package of CRUD operations. The time has come to delete data from the database.
Navigate to the delete directory and execute the following command:
npm install couchbase joi --save
We’ll be accepting document keys to be deleted so we’ll need to validate the input. Likewise we also need the Couchbase SDK to work with the database.
Open the project’s delete/delete.js file and include the following JavaScript code:
const Couchbase = require("couchbase");
const Joi = require("joi");

var bucket = null;

function main(params) {
    if(bucket == null) {
        var cluster = new Couchbase.Cluster("couchbase://" + params.host);
        cluster.authenticate(params.username, params.password);
        bucket = cluster.openBucket(params.bucketName);
    }
    var schema = Joi.object().keys({
        id: Joi.string().required()
    });
    var data = params;
    var response = {};
    return new Promise((resolve, reject) => {
        var validation = Joi.validate(data, schema, { stripUnknown: true });
        if(validation.error) {
            response = {
                statusCode: 500,
                body: JSON.stringify(validation.error.details)
            };
            reject(response);
        }
        bucket.remove(validation.value.id, (error, result) => {
            if(error) {
                response = {
                    body: JSON.stringify({ code: error.code, message: error.message })
                };
                reject(response);
            }
            response = {
                body: JSON.stringify(validation.value)
            };
            resolve(response);
        });
    });
}

exports.main = main;
You’re probably seeing the bigger picture now in regards to function creation with OpenWhisk and Couchbase, so we’re not going to walk through the above function for deleting documents.
Packaging and Deploying the Functions to OpenWhisk with Docker
We have a set of functions ready to go, but we can’t just package and deploy them to Bluemix. If we did that, we’d get a bunch of errors. Bluemix uses a special flavor of Linux with a certain architecture, and I downloaded the dependencies on my Mac, which isn’t a match.
Remember that article I wrote a while back titled, Deploying Native Node.js Dependencies On AWS Lambda? We need to do something similar for OpenWhisk with Docker.
With Docker installed and ready to go, execute the following from the CLI:
docker pull openwhisk/nodejs6action
docker run -it -v /Users/nraboy/Desktop/couchbase-openwhisk:/project openwhisk/nodejs6action /bin/bash
The above commands will download an appropriate OpenWhisk Docker image for Node.js. Then we deploy a container with that image in interactive terminal mode. This container will also have a mapped volume. I am mapping my local project directory to a directory within the container.
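The volume path in my command is specific to my machine. Assuming your shell supports it, a variation that maps whatever directory you’re currently in would look something like this:

docker run -it -v $(pwd):/project openwhisk/nodejs6action /bin/bash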
After the command has executed and the container is deployed, you should be in the shell within the container.
For each function, execute the following:
cd /project/create
npm install
Remember, installing dependencies from our host machine isn’t good enough. We need to compile the dependencies for Bluemix. Docker will compile these dependencies and since the directory is mapped, we can use them from the host machine.
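If you’d rather not change into each directory by hand, a quick loop inside the container, assuming the four directory names used in this project, could handle all of them in one pass:

for dir in create retrieve update delete; do
    cd /project/$dir && npm install
done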
After each function’s packages are installed, we can bundle and deploy them.
From the host machine, create a ZIP archive of each of the functions. The archive should contain the package.json file, the JavaScript file, and the node_modules directory.
If you’re on a Mac or computer with a ZIP CLI, execute the following:
cd create
zip -r create.zip *
When you have a ZIP of each function, they can be deployed by executing the following:
bx wsk action create couchbase/delete --kind nodejs:default delete.zip -p host ec2-45-236-32-140.compute-1.amazonaws.com -p username demo -p password bluemix -p bucketName example
I introduced some new things in the above command.
First, we’re creating a package called couchbase, and in this package we have a delete function that is based off of the delete.zip file. I’m also passing some default parameters. These parameters will be our connection information. Since this information is sensitive, we are not passing it when invoking the function, but rather when creating the function.
Execute a variation of the above command for each of your functions.
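For example, the retrieve function could be deployed with the same default parameters like so:

bx wsk action create couchbase/retrieve --kind nodejs:default retrieve.zip -p host ec2-45-236-32-140.compute-1.amazonaws.com -p username demo -p password bluemix -p bucketName example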
To execute your function, try running something like the following:
bx wsk action invoke couchbase/create -p firstname Nic -p lastname Raboy --blocking
The above command passes in a few parameters so that the input will pass validation. The function is invoked in a blocking manner, and if it succeeds, our data will be saved in the database and returned in the response.
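The other functions can be invoked in the same fashion. For example, the following commands would query for every person document and then update one of them, where DOCUMENT-ID-HERE stands in for an id returned by the create function:

bx wsk action invoke couchbase/retrieve --blocking
bx wsk action invoke couchbase/update -p id DOCUMENT-ID-HERE -p firstname Nicolas --blocking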
Conclusion
You just saw how to create a package of serverless functions for OpenWhisk that communicate with the NoSQL database, Couchbase. OpenWhisk can be used as an alternative to AWS Lambda, but the two are certainly not the only options available. Regardless of what you choose, functions as a service (FaaS) is a very scalable solution for massive applications.
Want to see another OpenWhisk example? Check out a tutorial I wrote titled, Convert a Node.js RESTful API to Serverless with OpenWhisk.
Hi 👋 Nic, great post!
I learned something new today about the joi package 👍🏼
Some tips:
You can make your actions web actions and get a public URL when you create them with --web true.
Also, you can skip JSON.stringify and set body to an object; it will be stringified for you.
To get the public URL for the web action, you can run wsk action get couchbase/delete --url.
You can read more about using web actions in the docs here: https://console.bluemix.net/docs/openwhisk/openwhisk_webactions.html
Thanks for the comments!