Using Sensor Emulator API to Apply Custom Data Manipulation Methods in Studio

Introduction

Designed to be a simple yet powerful interface for our customers to interact with and organize their sensors and related data, Disruptive Technologies (DT) Studio aims to get you up and running quickly when buying our products. With its customizable dashboard, multiple sensors can be monitored and compared in real time, and notifications can be configured to trigger on certain events.

We continuously improve and implement new methods for our users to interact with and explore their data. Many requested features are, however, quite user-specific and might take some time before receiving official support. Fortunately, DT Studio provides users with tools to tailor their own custom solutions, allowing quick prototyping and experimentation with data.

In this application note, custom functionality is added to DT Studio to model the core temperature of food within a fridge using nothing but the measured air temperature. By utilizing tools already built into DT Studio, the aim is to inspire users to experiment with their own data to extract even more information. An example code repository has also been provided to get you started.

studio_dashboard.png

Figure 1: DT Studio Dashboard showing data from multiple sources in real-time.

 

Modeling Temperature Inertia

Those interested in the details surrounding the temperature model applied in this application note may refer to our article on Modeling Fridge Content Temperatures. In summary, Newton's Law of Cooling, a well-established mathematical model, can be applied to the ambient air temperature inside a fridge to estimate the contents' core temperature. By configuring notifications to be triggered by the modeled temperature instead of the ambient temperature measured directly by a sensor, alarm fatigue caused by sharp temperature spikes from opening and closing fridge doors can be reduced.
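
In discrete form, the model amounts to a simple exponential smoothing of the measured air temperature. Below is a minimal sketch of such a discretization in Python; the function and variable names are illustrative and not taken from the example repository introduced later.

import math

def model_core_temperature(air_temps, timestamps, k):
    """Estimate content core temperature from ambient air temperature
    using a discretized Newton's Law of Cooling: dT/dt = k * (T_air - T).
    """
    core = [air_temps[0]]  # assume equilibrium at the first sample
    for i in range(1, len(air_temps)):
        dt = timestamps[i] - timestamps[i - 1]  # seconds between samples
        # exact solution of the linear ODE over one constant-input step
        alpha = 1 - math.exp(-k * dt)
        core.append(core[-1] + alpha * (air_temps[i] - core[-1]))
    return core

A small heat transfer coefficient k damps the sharp spikes caused by door openings while still tracking sustained changes in temperature.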

emulated_project_result.png
Figure 2: A single DT Studio Dashboard card modeling fridge temperature inertia.

 

Implementation Overview

When visualizing sensor data in DT Studio, the information is fetched directly from DT Cloud. Customers cannot influence or modify the chain of events in this transfer. Therefore, to apply our custom function, we will use a Data Connector to forward each new sensor event to a cloud function, where the data can be processed freely. For sensors given a certain label and value, an emulated twin device will be created to contain the modeled data.

Each event is sent as an HTTP POST request which, on arrival, spawns a new compute instance containing an implementation of our model in Python. Once the modeled temperature value has been calculated, the data is sent back to the emulated twin in DT Studio using the REST API, where it can be plotted or used for notifications. Figure 3 shows a simplified overview of the implementation flow.

DT_Studio_Inertia_Transform.png
Figure 3: A simplified overview of the implementation.

This flow can, in practice, be reused for any number of different solutions. Everything except the temperature-model step can essentially remain unchanged, whether implementing a simple moving average, an enveloping function, or another model. This allows new features to be added through simple copy-and-paste expansion should that be necessary.
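
To make this reusable structure concrete, below is a minimal sketch of such a function in Python, assuming the Flask-style request object that Google Cloud Functions passes to Python HTTP handlers. The three helpers are placeholders for the steps described above, not functions from the example repository.

# main.py

def verify_signature(request) -> bool:
    # Placeholder: validate the Data Connector signature (sketched later).
    return True

def apply_model(event: dict) -> dict:
    # Placeholder: the only solution-specific step (inertia model,
    # moving average, enveloping function, ...).
    return event

def publish_to_twin(event: dict) -> None:
    # Placeholder: forward the modeled event to the emulated twin.
    pass

def main(request):
    # Reject requests that were not signed by our Data Connector.
    if not verify_signature(request):
        return "Signature verification failed", 401

    event = request.get_json(silent=True)
    if event is None:
        return "Expected a JSON body", 400

    publish_to_twin(apply_model(event))
    return "OK", 200

Swapping in a different model only means replacing apply_model, which is what makes the flow easy to reuse.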

 

Serverless Data Processing

Google Cloud has been chosen as the cloud platform on which our function is hosted. Any of the large cloud service providers can be used, and it is also possible to host it locally. However, Google Cloud offers a free trial and provides a very easy-to-use CLI for deployment, which saves a lot of hassle. The service also scales dynamically depending on load, allowing for any number of sensors to be modeled simultaneously.

Example Code Repository

An example implementation is provided on the official DT GitHub page for free under the MIT license and can be accessed by following this link. It contains the Python code necessary to receive, authenticate, and model the temperature data in real time. You are encouraged to understand and modify this code to suit your needs.

Deploying the Function

The function is deployed from the repository root, but before deploying, the environment variables needed to authenticate requests must be set. Create a file .env.yaml in which you set the following variables.

SERVICE_ACCOUNT_KEY_ID: ___
SERVICE_ACCOUNT_SECRET: ___
SERVICE_ACCOUNT_EMAIL: ___
DT_SIGNATURE_SECRET: ___
AUTH_ENDPOINT: https://identity.disruptive-technologies.com/oauth2/token
API_URL_BASE: https://api.disruptive-technologies.com/v2
EMU_URL_BASE: https://emulator.disruptive-technologies.com/v2

The values for the Service Account key ID, secret, and email are found when creating your Service Account and are explained in the next section. The signature secret should be a strong and unique password of your choice, used to validate the Data Connector's request content. After the variables are set, the function can be deployed to Google Cloud.

gcloud functions deploy function-name \
    --entry-point main \
    --runtime python37 \
    --trigger-http \
    --allow-unauthenticated \
    --timeout 30s \
    --ignore-file .gcloudignore \
    --project your-project \
    --region your-region \
    --env-vars-file .env.yaml
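
Once deployed, gcloud reports the function's HTTPS endpoint. As a quick smoke test, a plain unsigned request can be posted to it, which should be rejected once signature verification is in place. The URL below is a hypothetical placeholder.

import requests

# Hypothetical placeholder; use the Trigger URL reported for your function.
TRIGGER_URL = "https://your-region-your-project.cloudfunctions.net/function-name"

# An unsigned request should be rejected by the signature check.
response = requests.post(TRIGGER_URL, json={"event": {}})
print(response.status_code, response.text)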

 

DT Studio Setup

To securely forward data to a cloud service, a Service Account and Data Connector should be used. Both can be configured from within DT Studio, assuming you have developer privileges or higher.

Service Account

For our cloud service to interface with the Disruptive Technologies API, an authenticated Service Account is needed. Without it, the authentication step in our cloud function will fail and execution will be terminated. You can read more about Service Accounts here.

In DT Studio, under your project, you'll find Service Accounts under the API Integrations tab. You can either create a new account or use an existing one, but be sure to give it Project Developer access or higher. At least one active key should be created. The key ID, email, and secret are the same as in the function environment variables set before deployment.
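
For reference, the token exchange behind this authentication step is an OAuth2 JWT bearer grant against the AUTH_ENDPOINT set earlier. The sketch below shows the general shape of such an exchange; the exact claim set is an assumption on our part, so consult the Service Account documentation and the example repository for specifics.

import time

import jwt       # pip install pyjwt
import requests  # pip install requests

def get_access_token(key_id, secret, email, auth_endpoint):
    # Build a short-lived JWT signed with the Service Account secret.
    # These claims are an assumption; see the official docs.
    now = int(time.time())
    assertion = jwt.encode(
        {"iat": now, "exp": now + 3600, "aud": auth_endpoint, "iss": email},
        secret,
        algorithm="HS256",
        headers={"kid": key_id},
    )
    # Exchange the JWT for an access token at the identity endpoint.
    response = requests.post(auth_endpoint, data={
        "assertion": assertion,
        "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",
    })
    response.raise_for_status()
    return response.json()["access_token"]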

service_account.png

Data Connector

Also located under API Integrations in DT Studio, a new Data Connector should be created. For the Data Connector to know where to push its requests, the Endpoint URL should be set to the URL of wherever our function is hosted. In Google Cloud Functions, this is called a Trigger URL and is found under the TRIGGER tab when examining your function details.

The Data Connector Signature Secret must be the same as the environment variable DT_SIGNATURE_SECRET set for the cloud function. If not, the content will not be verified due to a signature mismatch.
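
As an illustration, the signature check in the cloud function might look like the sketch below. It assumes the Data Connector delivers its signature as an HS256-signed JWT in an x-dt-signature header carrying a checksum of the raw request body; verify the exact header and claim names against the Data Connector documentation.

import hashlib

import jwt  # pip install pyjwt

def verify_signature(request, secret):
    # Header and claim names are assumptions; check the official docs.
    token = request.headers.get("x-dt-signature", "")
    try:
        payload = jwt.decode(token, secret, algorithms=["HS256"])
    except jwt.InvalidTokenError:
        return False  # bad signature, expired token, or wrong secret
    # Compare the signed checksum against one computed from the body.
    body_checksum = hashlib.sha1(request.get_data()).hexdigest()
    return payload.get("checksum") == body_checksum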

In our implementation, a sensor label is used to control the model's heat transfer coefficient. Therefore, when creating the Data Connector, the same label must be added to the list of labels to be forwarded; by default, only the sensor name is included. Otherwise, the default values will suffice, though you can choose to exclude a few event types to reduce load.

 

Applying the Model

With a Cloud Function running in the background, waiting for POST requests from our Data Connector, modeling the temperature data of a new sensor is as simple as adding a label. In our implementation, providing a sensor with the label "inertia-model" and a value equal to the heat transfer coefficient will immediately spawn an emulated device twin. Each new temperature event for this sensor will then be accompanied by an update to the emulated twin with a modeled version of said event.
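
For illustration, creating the twin and pushing a modeled event through the Emulator API might look like the sketch below. The endpoints and payload shapes are assumptions based on the Emulator API reference and should be verified against the current documentation; the example repository contains the implementation actually used.

import requests  # pip install requests

EMU_URL_BASE = "https://emulator.disruptive-technologies.com/v2"

def create_twin(token, project, display_name):
    # Create an emulated temperature device to act as the twin.
    response = requests.post(
        f"{EMU_URL_BASE}/projects/{project}/devices",
        headers={"Authorization": f"Bearer {token}"},
        json={"type": "temperature", "labels": {"name": display_name}},
    )
    response.raise_for_status()
    # The 'name' field holds the full resource name of the new twin.
    return response.json()["name"]

def publish_modeled_event(token, project, twin_id, celsius):
    # Publish a modeled temperature event to the emulated twin.
    response = requests.post(
        f"{EMU_URL_BASE}/projects/{project}/devices/{twin_id}:publish",
        headers={"Authorization": f"Bearer {token}"},
        json={"temperature": {"value": celsius}},
    )
    response.raise_for_status()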

double_inertia.png