MONITOR YOUR DATA STORAGE SIMPLY AND TRANSPARENTLY

Othman Moumni Abdou - Senior Data Engineer

Cloud costs can be complex to forecast and monitor

The world before: one cloud, but many parameters

If you’ve started implementing an IoT data project, you may have faced the difficulty of precisely estimating cloud costs when the project is deployed on 1, 10 or 100 machines. Indeed, such a project involves at least three capabilities: data transmission, storage and processing. On established cloud providers such as Azure or AWS, the pricing of these products is complex: not only is it based on several parameters (for example CPU, RAM and gigabytes), but some of those metrics cannot be forecast and you have no choice but to accept the final price on the bill. As a result, it may slow down the start of the project or jeopardize its financial sustainability.

Simplicity at the core of our data pricing policy

At InUse, we believe that the pricing model of our solution should follow two rules:

  • Simple. It should be easy to understand and forecast so that you can focus on the core of the project with peace of mind.
  • Adapted. If you’re an OEM interested in providing a digital solution on top of your machines, there’s a good chance you will sell it per machine. Our pricing follows the same logic so that the total cost of ownership stays under control.

That’s why our standard offer consists of:

  • A first package of 100 gigabytes. This package should be enough to start the project with a dozen machines.
  • An additional price per machine when total consumption goes above this threshold.

In this offer, a “gigabyte” covers not only the storage cost but also the transmission and processing costs. Having a single integrated metric makes the model much simpler to understand. Naturally, we provide you with all the metrics to monitor your consumption in real time on the platform.
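A quick way to see how this plays out is to sketch the billing logic in a few lines of Python. The 100-gigabyte package is the real starting point described above, but the price values below are hypothetical placeholders, not InUse’s actual rates.

```python
# Sketch of the pricing logic: a first package of 100 GB, then an additional
# price per machine once total consumption goes above that threshold.
# PACKAGE_PRICE and PRICE_PER_EXTRA_MACHINE are made-up numbers.

PACKAGE_GB = 100                  # gigabytes included in the first package
PACKAGE_PRICE = 1_000.0           # hypothetical package price
PRICE_PER_EXTRA_MACHINE = 50.0    # hypothetical price per additional machine

def monthly_cost(total_gb: float, extra_machines: int) -> float:
    """Estimated monthly cost for a given integrated consumption."""
    cost = PACKAGE_PRICE
    if total_gb > PACKAGE_GB:
        # Above the included 100 GB, billing grows per machine, not per gigabyte.
        cost += extra_machines * PRICE_PER_EXTRA_MACHINE
    return cost

print(monthly_cost(total_gb=80, extra_machines=0))    # still within the package
print(monthly_cost(total_gb=140, extra_machines=4))   # above the threshold
```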

Keep your data consumption under control

A brand new interface for storage monitoring

In our latest version, we’ve released a new storage monitoring module so that you can have a more detailed view of your current consumption.

Our solution architecture may be summarized as follows:

  • A raw DBMS storage where the data sent through IoT gateways (e.g. Ewon) or directly through our MQTT broker is stored without being processed or altered
  • A search engine (Elasticsearch) which indexes the data once it has been processed with the model definitions and makes it accessible for quick aggregations
  • A file storage for all the media attached to the various contents of the application (alerts, dashboards, reports, documentation, etc.)

Data consumption is the sum of the three. You can see below a screenshot of the new interface with the three categories mentioned above: Models stands for the analytics database, Databases for the raw storage and Files for the file storage.

The service computes statistics on the data storage every hour and historizes them. The aim is not only to give better visibility into storage usage and its evolution over time, but also to provide smart recommendations to fine-tune models and data acquisition. Unlike the big cloud providers, our goal is for you to consume only what you need to build digital services in the solution. No less. No more.
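To make the hourly historization concrete, here is a minimal sketch of what such a job could look like. The three collector functions are hypothetical stand-ins for the real services measuring the raw database, the Elasticsearch indices and the file storage; only the “total is the sum of the three” logic comes from the description above.

```python
# Hourly snapshot of the three storage categories; total consumption is their sum.
import datetime as dt

def collect_databases_gb() -> float:
    return 12.4   # placeholder: the real service queries the raw DBMS

def collect_models_gb() -> float:
    return 7.9    # placeholder: the real service queries Elasticsearch

def collect_files_gb() -> float:
    return 1.3    # placeholder: the real service scans the file storage

history: list[dict] = []   # in the platform, snapshots are persisted, not kept in memory

def snapshot() -> dict:
    """Take one snapshot and append it to the history (run every hour)."""
    databases, models, files = collect_databases_gb(), collect_models_gb(), collect_files_gb()
    point = {
        "timestamp": dt.datetime.now(dt.timezone.utc),
        "databases_gb": databases,
        "models_gb": models,
        "files_gb": files,
        "total_gb": databases + models + files,
    }
    history.append(point)
    return point

print(snapshot())
```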

Raw storage

Regarding the raw storage, we mainly monitor the database size. As mentioned above, we store the data you push without transforming it.

Thus, the total usage depends only on:

  • The frequency at which events are sent
  • The size of the payload sent
  • The retention period
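Because these three factors are the only drivers, a back-of-the-envelope estimate is easy to compute. The sketch below simply multiplies them together; the figures are purely illustrative, not a quote.

```python
# Estimated raw storage = event frequency x payload size x retention period.

def raw_storage_gb(events_per_hour: float, payload_bytes: float, retention_days: float) -> float:
    events = events_per_hour * 24 * retention_days
    return events * payload_bytes / 1e9

# One machine sending a 2 kB payload every minute, kept for one year:
print(round(raw_storage_gb(events_per_hour=60, payload_bytes=2_000, retention_days=365), 2))
# -> roughly 1.05 GB per machine
```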

More analytics with Ewon gateways

As we provide a native connector for Ewon Flexy gateways, we also offer specific analytics for those integrations: for each Ewon tag and gateway, we monitor the data acquisition frequency.

As a reminder, Ewon provides IoT gateways which connect to PLCs and send the acquired data to the cloud. There are several parameters for this acquisition: for example, a tag can be historized by taking a value every X seconds and/or whenever the value changes by more than a given threshold.

The acquisition parameters must be adjusted per tag to avoid both undersampling (which distorts the signal) and oversampling (which wastes storage). For instance, capturing the internal temperature of your factory every second is usually unnecessary. The interface highlights tags with a high sampling rate and a low variation rate to encourage optimized data consumption.
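The check behind this highlighting can be summarized as follows. The Tag structure and the thresholds are hypothetical; the real analysis runs on the acquisition statistics collected from the Ewon gateways.

```python
# Flag tags that are sampled often but whose value rarely changes.
from dataclasses import dataclass

@dataclass
class Tag:
    name: str
    samples_per_hour: float   # how often the tag is historized
    variation_rate: float     # share of samples whose value actually changed (0..1)

def oversampled(tags: list[Tag], max_rate: float = 360.0, min_variation: float = 0.05) -> list[Tag]:
    """Return tags with a high sampling rate and a low variation rate."""
    return [t for t in tags if t.samples_per_hour > max_rate and t.variation_rate < min_variation]

tags = [
    Tag("factory_temperature", samples_per_hour=3600, variation_rate=0.01),  # every second, barely moves
    Tag("spindle_speed", samples_per_hour=3600, variation_rate=0.60),        # every second, changes a lot
]
for tag in oversampled(tags):
    print(f"{tag.name}: consider lowering the acquisition frequency")
```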

The long-term objective is to provide this kind of insight for other standard integrations such as MQTT.

Models

Track important metrics

For each model, we monitor the size of the Elasticsearch index, the number of machines and properties used by the model, the first and last timestamps, the number of periods (Elasticsearch documents) and the average size per machine and per period. These statistics are historized and can be displayed over several time ranges.
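For readers curious about how such figures can be derived, here is a hedged sketch using the Elasticsearch index statistics API through the official Python client. The index name and the machine_id field are hypothetical, and the platform’s actual implementation may differ.

```python
# Derive index size, period count and average sizes from Elasticsearch stats.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")
INDEX = "model-packaging-line"   # hypothetical model index name

stats = es.indices.stats(index=INDEX)["indices"][INDEX]["primaries"]
size_bytes = stats["store"]["size_in_bytes"]
period_count = stats["docs"]["count"]        # one Elasticsearch document per period

# Count distinct machines with a cardinality aggregation on a hypothetical field.
resp = es.search(index=INDEX, size=0,
                 aggs={"machines": {"cardinality": {"field": "machine_id"}}})
machine_count = resp["aggregations"]["machines"]["value"]

print(f"index size:               {size_bytes / 1e9:.2f} GB")
print(f"average size per period:  {size_bytes / max(period_count, 1):.0f} B")
print(f"average size per machine: {size_bytes / max(machine_count, 1) / 1e6:.1f} MB")
```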

Optimize your consumption with our smart recommendations

The service also analyzes how the properties are used in the application and provides recommendations to optimize the model size:

  • If a property is not used anywhere in the application, it can be deleted
  • If a property is used only as an intermediate value to compute a final property (using the complex event processing library), it does not need to be historized and stored
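The two rules above boil down to a simple check on how each property is referenced. In this sketch the property sets are hard-coded for illustration; in the platform they come from the model definitions and the application contents.

```python
# Recommend deleting unused properties and skipping historization of intermediate ones.
properties = {"temperature", "pressure", "raw_counter", "parts_per_hour"}
used_in_app = {"temperature", "parts_per_hour"}   # referenced by dashboards, alerts, reports...
intermediate_only = {"raw_counter"}               # only feeds the CEP computation of a final property

for prop in sorted(properties):
    if prop in intermediate_only:
        print(f"{prop}: intermediate value only, no need to historize and store it")
    elif prop not in used_in_app:
        print(f"{prop}: not used anywhere in the application, it can be deleted")
```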

These statistics and insights can help you:

  • Define a retention policy to control the size of stored data. For example, create a highly detailed model with a 6-month retention policy for daily analysis and a second, less detailed model with a 3-year retention policy for long-term analysis
  • Adjust how properties are modeled and historized to decrease the storage size (compression)
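As an illustration of the first recommendation, the two-model retention strategy could be expressed with a configuration along these lines. Field names and values are hypothetical, not the platform’s actual schema.

```python
# Two models over the same machines: detailed with short retention,
# aggregated with long retention.
models = [
    {"name": "production-detailed", "period": "1min", "retention_days": 180},   # ~6 months, daily analysis
    {"name": "production-longterm", "period": "1h",   "retention_days": 1095},  # ~3 years, long-term trends
]

for m in models:
    print(f"{m['name']}: one period every {m['period']}, kept for {m['retention_days']} days")
```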

Files

The solution provides a complete Document Management System.

Thus, you can:

  • Attach some documentation to a machine or a machine model
  • Use this documentation in an automatic recommendation
  • Upload user media (audio, videos)

This interface will display the total size used by those files.

Going further

As you can see, this module lets you easily monitor and forecast your data consumption. It also provides smart recommendations to optimize it, and our customer success team will be more than happy to help you implement them.

Interested? Want to know more about our pricing model or this module? We’d be more than happy to give you a deeper explanation!
