Forwarding data with Data Connectors

Using a Data Connector is the easiest and most reliable way to get your sensor data out of DT Cloud and into an external service or database for further storage, processing, and analysis.

In this article we will cover:

  1. How Data Connectors work
  2. Why you should use Data Connectors
  3. How you can start forwarding your sensor data

How Data Connectors work

All sensors send their data to the DT Cloud, where the data is stored for 30 days. The DT Studio web application can then be used for fleet management and to get a basic view of the raw sensor data.

Many use cases require functionality (often use-case specific) beyond what DT Studio supports. Data Connectors provide an easy and reliable way to get your data into external services that offer exactly the functionality you need.

A Data Connector forwards data in three steps:


  1. A sensor sends a data event to DT Cloud via a Cloud Connector
  2. DT Cloud uses the Data Connector to forward the event to your cloud
  3. Your cloud acknowledges that the event was received

Each event is sent as an HTTPS POST request with a JSON payload.

A temperature sensor data payload being sent over a Data Connector looks like this on the receiving side:

  "event": {
    "eventId": "bjeho5nlafj3bdrehgsg",
    "targetName": "projects/bhmh0143iktucae701vg/devices/bchonod7rihjtvdmd2vg",
    "eventType": "temperature",
    "data": {
      "temperature": {
        "value": 24.9,
        "updateTime": "2019-05-16T08:15:18.318751Z"
    "timestamp": "2019-05-16T08:15:18.318751Z"
  "labels": {}

See Events for more details on all payloads.
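On the receiving side, your endpoint parses this JSON body and responds with a 2xx status to acknowledge the event. A minimal, framework-agnostic sketch is shown below; the function would be wired into whatever HTTPS server you run, and the field names are taken from the example payload above (the function name and the 400 fallback are illustrative choices, not part of the DT API):

```python
import json

def handle_request(body: bytes) -> int:
    """Parse one forwarded event and return the HTTP status to respond with."""
    try:
        payload = json.loads(body)
        event = payload["event"]
    except (ValueError, KeyError):
        # Malformed body; a non-2xx response tells DT Cloud the event
        # was not acknowledged.
        return 400
    if event["eventType"] == "temperature":
        value = event["data"]["temperature"]["value"]
        print(f"{event['targetName']}: {value}")
    # A 2xx response acknowledges the event, removing it from the
    # Data Connector's queue.
    return 200
```

Other event types carry different keys under "data", so a real receiver would typically dispatch on "eventType" the same way.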

Why use Data Connectors

There are three properties of Data Connectors that make them particularly well suited for integrating your sensor data from DT Cloud into an external service.

At-Least-Once Guarantee

To keep your code lean, we promise an at-least-once delivery guarantee.


The Data Connector is similar to a webhook, but with an added delivery guarantee. Every event received by DT Cloud is put in a dedicated per-Data Connector queue. Messages are removed from this queue once acknowledged, or once they are older than 12 hours. This means that if your endpoint goes offline for a while, you will still receive the data when it comes back up again, because the Data Connector will retransmit each message for up to 12 hours.

An important side effect of this delivery guarantee is that, under certain conditions, you may receive duplicates of the same event. This will be rare, but you should make sure that your receiving code can handle this.
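Since every event carries a unique eventId, duplicate deliveries can be suppressed by remembering which IDs have already been processed. A minimal sketch (an in-memory set is used for illustration; production code would use a persistent store, e.g. a database column with a unique-key constraint):

```python
# Tracks eventIds that have already been processed.
seen_event_ids: set[str] = set()

def handle_event(payload: dict) -> bool:
    """Process an event once; return False if it was a duplicate."""
    event_id = payload["event"]["eventId"]
    if event_id in seen_event_ids:
        # Duplicate delivery: acknowledge it, but skip processing.
        return False
    seen_event_ids.add(event_id)
    # ... process the event here ...
    return True
```

Note that the duplicate should still be acknowledged with a 2xx response; otherwise it will be retransmitted again.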

Low Latency

Data Connectors give you the lowest possible end-to-end latency from the sensor to your cloud.


Our cloud will create one HTTPS POST request for every event we receive from the sensor. Because we do not introduce artificial wait states to aggregate events, latency is kept as low as possible no matter how many sensors you have.

High Scalability

Data Connectors scale. To combine low latency with scale, our cloud will never wait for one HTTPS POST request to complete before sending the next. A new event will always instantly be sent to your cloud, running as many HTTPS POST requests in parallel as there are events at any given moment.
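Because deliveries arrive in parallel and unacknowledged events are retransmitted, a receiver that does slow work before responding can fall behind. One common pattern (a sketch under that assumption, not a DT requirement) is to enqueue the payload and acknowledge immediately, letting a background worker do the heavy processing:

```python
import queue
import threading

# Hand-off queue between the HTTP handler and the background worker.
work_queue: "queue.Queue[dict]" = queue.Queue()

def worker() -> None:
    """Drain the queue; slow processing (database writes, analysis) goes here."""
    while True:
        payload = work_queue.get()
        # ... process payload ...
        work_queue.task_done()

threading.Thread(target=worker, daemon=True).start()

def receive(payload: dict) -> int:
    work_queue.put(payload)  # hand off to the worker
    return 200               # acknowledge immediately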
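Because deliveries arrive in parallel and unacknowledged events are retransmitted, a receiver that does slow work before responding can fall behind. One common pattern (a sketch, not a DT requirement; the names here are illustrative) is to enqueue the payload and acknowledge immediately, letting a background worker do the heavy processing:

```python
import queue
import threading

# Hand-off queue between the HTTP handler and the background worker.
work_queue: "queue.Queue[dict]" = queue.Queue()

def worker() -> None:
    """Drain the queue; slow processing (database writes, analysis) goes here."""
    while True:
        payload = work_queue.get()
        # ... process payload ...
        work_queue.task_done()

threading.Thread(target=worker, daemon=True).start()

def receive(payload: dict) -> int:
    work_queue.put(payload)  # hand off to the worker
    return 200               # acknowledge immediately
```

This keeps acknowledgement latency constant even when many events arrive at once, at the cost of needing your own durability story for anything still in the queue if the process dies.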

Start forwarding data

To start forwarding your data, you can set up a Data Connector via DT Studio.

For a step-by-step guide, see the Data Connector reference article.