IBM Cloud

This article shows you how to use a Data Connector to integrate with IBM Cloud by forwarding events to a Cloud Function.

Before you begin

  • You need to have at least project administrator access to a project in Studio.

  • You need to have an IBM Cloud account, which is available with a free tier.

Create a Cloud Function

  1. Go to the IBM Cloud Console.

  2. Open the dropdown menu in the top left corner and navigate to Functions.

  3. Click Start Creating.

    No Cloud Foundry Space

    If prompted with “No Cloud Foundry Space”, click the X in the top-right corner and then select your REGION.

  4. After creating a namespace, click Create Action.

  5. Give it an Action Name, like disruptive.

    Leave the Enclosing Package as is, but make sure the Runtime is Node.js 10.

  6. Click Create.

Set as public HTTP function

In the new Cloud Function view, we will configure it to listen on a public URL and reply with an HTTP 200 OK status.

  1. In Endpoint on the left, check Enable as Web Action and Raw HTTP handling.


  2. Press Save.

  3. Copy the URL of the Web Action below for later.

The above configuration will enable public access to the function over HTTP without authentication. The input (params) will look something like this:

    {
      "__ow_method": "post",
      "__ow_body": "<Body as base64 encoded string>",
      "__ow_headers": {
        "x-dt-signature": "<JSON web token>"
      },
      "__ow_path": ""
    }

Update Cloud Function code

Go back to Code via the menu and replace the code in the editor with the following:

function main(params) {

  /* Decode and parse the JSON body */
  var bodyDecoded = Buffer.from(params["__ow_body"], 'base64');
  var body = JSON.parse(bodyDecoded);
  console.log("%j", body);

  /* Add further integration code here */

  return { statusCode: 200 };
}
Save the change.
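As a sketch of where integration code might go, the dispatcher below branches on the event type. The field names (`event.eventType`, `event.data.*`) are assumptions based on typical Data Connector payloads; log the raw body from your own project to confirm them before relying on this structure.

```javascript
/* Hypothetical dispatcher for a decoded Data Connector event body.
   Field names are assumptions — verify them against the logged body. */
function describeEvent(body) {
  const event = body.event || {};
  switch (event.eventType) {
    case 'touch':
      return 'sensor was touched';
    case 'temperature':
      return 'temperature is ' + event.data.temperature.value;
    case 'objectPresent':
      return 'object is ' + event.data.objectPresent.state;
    default:
      return 'unhandled event type: ' + event.eventType;
  }
}
```

You could call it from `main` right after `JSON.parse`, for example `console.log(describeEvent(body))`, and extend each branch with your own forwarding logic.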

Create a Data Connector

  1. Go to Studio.

  2. Navigate to the project you want to integrate with.

  3. Open the Data Connector view for the project via the menu on the left.

  4. Click Create new.

  5. Set the Name to “IBM Cloud” or something recognizable.

  6. Take the Web Action URL copied earlier and enter it into Endpoint URL.

  7. Set the Data Connector to only send the Touch, Temperature, and ObjectPresent events.


  8. Leave everything else at its default and click SAVE NEW DATA CONNECTOR.

Verify secret

For production code, it is recommended to use the Data Connector's secret to sign each request and to verify the origin in the receiving code. See Signing events.

Test the integration

To confirm that integration is up and running, we will both look at the Data Connector metrics in Studio and the IBM Cloud Log.

Before proceeding, make sure at least one of your sensors has sent some data by touching it a few times.

Data Connector metrics in Studio

Navigate back to the Data Connector you created earlier in Studio. At the top of the page, you will find how many times your new Data Connector has run.

The Success count shows the number of times the Cloud Function has been called and returned HTTP status 200 OK over the last 24 hours.

The Error count shows how many times it has returned an error code or timed out.

Cloud Function log in IBM Cloud

Back in the IBM Cloud Function tab, click the Logs button and click OK in the pop-up. Here, create a logging instance: verify the region and choose the Lite option, which is free.

After initialization is complete, click View LogDNA to see every invocation of the Cloud Function. Please be patient, as there is some delay before the IBM log is updated.



This simple integration is easy to set up and maintain, as IBM Cloud takes care of all the details of running the function. It also scales incredibly well, as a new Cloud Function instance is run for each event sent by the Data Connector.

Next steps

The next step is to replace the "Add further integration code here" comment with code that forwards the events into any of IBM's vast selection of databases, event buses, stream-processing, or machine-learning tools.