Sensor Data Insight with Power BI Integration

Introduction

Disruptive Technologies (DT) Wireless Sensors make it simple to collect high-resolution data from almost any source. Thanks to their long-lasting batteries, the sensors can continuously monitor valuable information about environments and assets in real time. Over time, analyzing and modeling the aggregated data allows changes in trends and behavior to be evaluated and reacted to. Combined, this information can be invaluable for any company looking to optimize its processes, be it reducing production line downtime or increasing employee satisfaction.

Developed by Microsoft, Power BI aims to connect and visualize data in one unified and scalable platform. Due to its powerful analytics and data exploration toolbox, many companies have adopted the platform for performance monitoring and decision-making support. Whether your company is already using Power BI or wants to start exploring it now, the DT Sensor Solution makes it easy to integrate your sensor data to gain additional insight into daily operations.

In this application note, we will look at how Data Connectors can be used to forward data from DT Cloud to your own external database in real time, from which Power BI can continuously query the data into your analytics solution. The integration steps will be explained in detail, without the need for prior programming knowledge. After reading, the only remaining step is to explore the many benefits continuous monitoring can have for you and your business.

powerbi_header.png

Integration Overview

Sensor data can be accessed through either the available REST API or the Data Connectors available in DT Studio. For customers who want to forward the data to an external storage or processing platform, these Data Connectors provide a simple but secure method of doing so. They are implemented as webhooks with additional redundancy and security measures to ensure your data reaches its destination as intended. In this integration, a Data Connector will forward data in real time to an SQL Database hosted on Microsoft Azure.

Any of the popular cloud hosting services, such as AWS, Google Cloud, IBM Cloud, or Azure, could be used in this integration. However, Azure provides the simplest setup due to the shared user account between it and Power BI. By setting up an HTTP trigger Azure Function that listens for messages from the Data Connector, the data can be forwarded to and stored in an SQL Database on the same platform in real time. Then, querying the database from Power BI only requires a few clicks of the mouse.

chain.png

 

Before you begin

This integration requires no previous knowledge regarding programming or cloud integration, and we will deploy using only the GUI. However, it does require some copy-and-pasting of pre-written code, and for this, all the necessary files have been provided in a Supporting File Repository. Additionally, all software and platforms used do, at the time of writing, provide trial periods. Assuming you own DT sensors, the integration can therefore be recreated for free.

Requirements

Supporting File Repository

To avoid endless code preview sections or appendices, a repository containing all the code and files used in this project is provided. By following this link, you can download the code necessary for later steps or view it in your browser.

When the following integration steps mention a path in the form of ./example/path/to/file.ext, this refers to a file in the provided repository. You are then expected to copy, upload, or apply that file's contents, depending on usage.

Note

  • If you struggle to read the images because they are too small, right-click and Open Image in New Tab will let you view the original image in full size.

 

Forward Data to Microsoft Azure

Before Power BI can query data, it needs to be forwarded to an external storage location, here an Azure SQL Database. In the following steps, we will initialize a new empty database, the HTTP trigger function, and a new DT Studio Data Connector. Then we will link all three elements and verify successful data transfer.

1 - Azure SQL Database

1.1 - If you do not already have one set up, follow Microsoft's own guide on creating an Azure SQL Database. Be sure to note the Server SQL admin username and password, as these will be used several times later.

1.2 - After the database has been initialized, enter it and locate the Query Editor. Log in with the Server SQL admin username and password from step 1.1.

1.3 - Paste the content of ./azure/query_editor.sql and click on Run. This will generate the database tables needed to store the data from all of DT's sensors. Verify that the tables were generated by viewing the Tables tab.

azure_query_editor.png
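The exact schema is defined in the provided ./azure/query_editor.sql, but as an illustration, a table for temperature events will typically hold at least a device identifier, a timestamp, and the measured value. The following sketch shows the idea, using Python's built-in sqlite3 as a stand-in for Azure SQL; the table and column names here are assumptions, not the ones created by the provided file:

```python
import sqlite3

# Illustrative schema only; the real tables are created by the
# ./azure/query_editor.sql file from the supporting repository.
conn = sqlite3.connect(":memory:")
conn.execute(
    """
    CREATE TABLE temperature_events (
        device_id TEXT NOT NULL,   -- which sensor produced the event
        timestamp TEXT NOT NULL,   -- ISO-8601 event time
        value     REAL NOT NULL    -- measured temperature
    )
    """
)

# One event forwarded by the Data Connector becomes one row.
conn.execute(
    "INSERT INTO temperature_events VALUES (?, ?, ?)",
    ("emulated-device-1", "2021-01-01T12:00:00Z", 22.5),
)
rows = conn.execute("SELECT COUNT(*) FROM temperature_events").fetchone()
print(rows[0])  # 1
```

Each DT event type gets its own table, which is why the provided SQL file creates several tables at once.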

1.4 - Under the Connection Strings tab, copy the contents of ADO.NET into a text editor of choice. Paste the text, and replace {your_password} with the SQL admin password found in step 1.1. Do not include the brackets. Save this for later.

sql_connection_string.png

1.5 - Under Overview, copy the Server Name text and save it for later. This will be used by Power BI to query the database for data.

sql_server_name.png

2 - HTTP Trigger Function

2.1 - Create a new Function App by using Microsoft's own guide as a reference. The following parameters should be specifically set:
Runtime Stack: .NET Core
Version: Latest
The remaining parameters can be left as default or set as desired.

2.2 - After the Function App has been initialized, click Go To Resource.

2.3 - Under the Configurations tab -> Applications, click "+New Application Setting" to add the following environment variables.
DT_SIGNATURE_SECRET: A secret signature used to validate the requests coming from the Data Connector. Use a strong password/secret here, and save it for later, as we will use the same secret when creating the Data Connector.
SQLDB_CONNECTION: Paste the SQL Connection String found in step 1.4.

env_vars.png
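The DT_SIGNATURE_SECRET is a shared secret: the Data Connector includes a signature derived from it with every request, and the function rejects any request that does not match. The actual validation is implemented in the provided .csx file; purely as an illustration of the idea, a shared-secret check in Python might look like this (the function name and comparison scheme are assumptions for this sketch):

```python
import hmac
import os

# In Azure, this is set under Application Settings; here we set it
# directly for the sake of a self-contained example.
os.environ["DT_SIGNATURE_SECRET"] = "example-secret"

def is_authorized(received_secret: str) -> bool:
    """Compare a received token against the configured secret.

    hmac.compare_digest performs a constant-time comparison, which
    avoids leaking information through timing differences.
    """
    expected = os.environ["DT_SIGNATURE_SECRET"]
    return hmac.compare_digest(received_secret, expected)

print(is_authorized("example-secret"))  # True
print(is_authorized("wrong-secret"))    # False
```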

2.4 - Under the Configurations tab -> Function Runtime Settings, set the Runtime Version as ~2. Remember to click Save.

runtime.png

2.5 - Under the Functions tab, create a new function by clicking +Add. Choose the HTTP trigger template, and give it a name, like HttpTrigger1.

azure_function_create.png

2.6 - Enter the function. Under the Code + Test tab, paste the code located in ./azure/httptrigger/httptrigger.csx. This will extract the different data fields from the request body JSON received by the trigger, then, based on the event type, insert the data into the SQL database. Remember to click Save.
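In essence, the provided C# code unpacks the event JSON and dispatches on the event type before inserting into the matching table. Sketched in Python for readability (the payload below mirrors the general shape of a forwarded DT temperature event, though field details vary between event types):

```python
import json

# Example request body, mirroring the general shape of a forwarded
# DT event; consult DT's documentation for the exact fields of each
# event type.
body = json.loads("""
{
  "event": {
    "eventType": "temperature",
    "targetName": "projects/p1/devices/d1",
    "data": {
      "temperature": {
        "value": 22.5,
        "updateTime": "2021-01-01T12:00:00.000000Z"
      }
    }
  }
}
""")

event = body["event"]
event_type = event["eventType"]

# Dispatch on event type, as the provided .csx does before inserting
# the fields into the corresponding SQL table.
if event_type == "temperature":
    data = event["data"]["temperature"]
    row = (event["targetName"], data["updateTime"], data["value"])
    print(row)
elif event_type == "touch":
    pass  # would insert into the touch table instead, and so on
```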

2.7 - In the same window, click the small icon to the right of Test/Run and copy the Function URL. Save this for later.

function_code.png

 

3 - DT Studio Data Connector

3.1 - In DT Studio, under API Integrations, you will find the Data Connectors tab. Here, create a new connector.

studio_dcon.png

3.2 - Set the following parameters:
Data Connector Display Name: Something unique.
Endpoint URL: Paste the Function URL of the HTTP trigger found in step 2.7.
Signature Secret: Must be the same value as DT_SIGNATURE_SECRET in step 2.3.

By default, the Data Connector will forward all event types. You can, however, choose to only forward one or a few types if you so wish. Remember to Save.

4 - Verify Connection

4.1 - If all the previous steps work as intended, the Success counter should be incremented for each new event. You can either wait for one of your sensors to send a new value, tap a sensor to send a touch event, or use the sensor emulator.

You can, and should, also verify that the data appears in your database. To do this, in your Azure SQL Database, right-click one of the tables created in step 1.3, then click Select Top 1000 Rows. Your data should be listed below.

DCON_success.png

4.2 - If the Data Connector shows errors or the data does not show up in the SQL Database, check the function logs. In the same window as in step 2.7, click Logs at the bottom of the screen. This will, for each new event, print information that can help you locate the problem.

 

Power BI Setup

Now that the data is safely being forwarded to our SQL database, Power BI can query it for further processing and analysis. Assuming successful data forwarding, these remaining steps are less involved and are probably familiar to users already experienced with the platform.

5 - Connect Power BI to your Azure SQL Database

5.1 - In your Power BI project, add a new data source by clicking Get Data.

get_data.png

5.2 - Choose SQL Server Database, and click Connect.


choose_sql.png

5.3 - In the Server field, paste the Server Name string found in step 1.5, then click OK. If you are logged in to Power BI with the same Microsoft account used in Azure, authentication should not be necessary. If you are prompted, however, use the SQL admin username and password from step 1.1.

5.4 - In the following Navigator, all your Azure databases are listed. Choose one or several of the tables we created earlier; here, we will focus only on temperature data. Click Transform Data.

navigator.png

5.5 - For Power BI to correctly read the timestamps, we need to convert their type from Text to Date/Time. For each imported table, right-click the timestamp column, then click Change Type and Date/Time. Click Close & Apply to transform and import the data.

transform.png
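The same conversion, sketched in Python: the timestamps arrive as plain text and must be parsed into a real date/time type before they can be sorted or plotted chronologically. The exact format string below is an assumption based on the ISO-8601 style timestamps shown earlier:

```python
from datetime import datetime, timezone

# The database stores timestamps as text; parsing them into datetime
# objects is the equivalent of Power BI's Change Type -> Date/Time.
raw = "2021-01-01T12:00:00.000000Z"
parsed = datetime.strptime(raw, "%Y-%m-%dT%H:%M:%S.%fZ").replace(tzinfo=timezone.utc)
print(parsed.year, parsed.hour)  # 2021 12
```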

 

Example Application

To test our integration, we have recreated the implementations discussed in two previously written application notes, Modeling Fridge Content Temperatures and Simple Temperature Forecasting for Substation Transformers, in a Power BI dashboard. Some additional statistics have also been added, mostly for exploratory purposes.

Temperature forecasting can be applied simply by utilizing the built-in analytics found in Power BI. The inertia model, on the other hand, is implemented using the Python Visual element, which lets you transform the data through a custom Python script. It is available in ./powerbi/temperature_inertia.py should you wish to model the inertia of your own temperature data, assuming you have a working Python environment on your machine.
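The actual model lives in ./powerbi/temperature_inertia.py; as a rough illustration of the concept, thermal inertia is often modeled as a first-order low-pass filter, where the modeled temperature moves toward each new measurement at a rate set by a smoothing factor (the factor value here is an arbitrary example, not taken from the provided script):

```python
def inertia_filter(measurements, alpha=0.1):
    """First-order low-pass filter: a simple way to model thermal inertia.

    alpha in (0, 1] controls how quickly the modeled temperature
    follows the measured one; 0.1 is an arbitrary example value.
    """
    smoothed = []
    state = measurements[0]
    for value in measurements:
        # Move a fraction alpha of the way toward the new measurement.
        state = state + alpha * (value - state)
        smoothed.append(state)
    return smoothed

# A sudden jump in measured temperature is followed only gradually:
print(inertia_filter([20.0, 30.0, 30.0, 30.0], alpha=0.5))
# [20.0, 25.0, 27.5, 28.75]
```

Inside a Power BI Python Visual, the same transformation would be applied to the queried temperature column before plotting.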

powerbi_gif.gif