Othman Moumni Abdou - Senior Data Engineer
Cloud costs can be complex to forecast and monitor
The world before: one cloud, but many parameters
If you’ve started an IoT data project, you may have faced the difficulty of precisely estimating cloud costs once the project is deployed on 1, 10, or 100 machines. Indeed, such a project involves several capabilities: data transmission, storage, and processing, at the very least. On established cloud providers such as Azure or AWS, the pricing of these products is complex: not only is it based on several parameters (for example CPU, RAM, and gigabytes), but some of those metrics cannot be forecast, leaving you no choice but to accept the final price on the bill. As a result, this can slow down the start of the project or jeopardize its financial sustainability.
At InUse, we believe that the pricing model of our solution should follow two rules:
That’s why our standard offer consists of:
In this offer, a “gigabyte” covers not only the storage cost but also the transmission and processing costs. Having a single integrated metric makes the model much simpler to understand. Naturally, we provide you with all the metrics to monitor your consumption in real time on the platform.
In our latest version, we’ve released a new storage monitoring module so that you can have a more detailed view of your current consumption.
Our solution architecture may be summarized as follows:
The data consumption is the sum of the three. You can see below a screenshot of the new interface and the three categories mentioned before: Models stands for the analytics database, Databases for the raw storage and Files for the file storage.
The service computes statistics on the data storages every hour and historizes them. The aim of this service is not only to give better visibility on storage usage and its evolution over time, but also to provide smart recommendations to fine-tune models and data acquisition. Contrary to the big cloud providers, our goal is that you consume only what you need to build digital services in the solution. Not less. Not more.
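To make the idea concrete, here is a minimal sketch of what an hourly historization step could look like. The class, function names, and figures are illustrative assumptions, not the platform's actual implementation; the only facts taken from above are the three storage categories and that the total consumption is their sum.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class StorageSnapshot:
    """One hourly measurement of the three storage categories (hypothetical schema)."""
    taken_at: datetime
    models_bytes: int      # analytics database ("Models")
    databases_bytes: int   # raw storage ("Databases")
    files_bytes: int       # file storage ("Files")

    @property
    def total_bytes(self) -> int:
        # Total data consumption is the sum of the three categories.
        return self.models_bytes + self.databases_bytes + self.files_bytes

# In-memory history; a real service would persist this to a time-series store.
history: list[StorageSnapshot] = []

def historize(models: int, databases: int, files: int) -> StorageSnapshot:
    """Append an hourly snapshot to the history and return it."""
    snap = StorageSnapshot(datetime.now(timezone.utc), models, databases, files)
    history.append(snap)
    return snap

# Example: one hourly run with made-up sizes.
snap = historize(models=120_000_000, databases=450_000_000, files=30_000_000)
print(snap.total_bytes)  # 600000000
```

Keeping every hourly snapshot is what makes it possible to plot the evolution of each category over time, not just the current total.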
Regarding the raw storage, we mainly monitor the database size. As mentioned before, we store the data you push without transforming it.
Thus, the total usage depends only on:
As a reminder, Ewon provides IoT gateways that connect to PLCs and perform data acquisition to the cloud. This acquisition has several parameters: for example, a tag can be historized by taking a value every X seconds and/or whenever the value changes by more than a given threshold.
The acquisition parameters must be adjusted per tag to avoid both undersampling (a poor signal approximation) and oversampling (wasted storage). For instance, capturing the internal temperature of your factory every second is usually unnecessary. The interface highlights tags with a high sampling rate and a low variation rate to encourage optimized data consumption.
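The kind of check behind such a highlight can be sketched as follows. The function names and thresholds (60 s, 5 %) are assumptions chosen for the example, not the values the interface actually uses.

```python
def variation_rate(values: list[float], threshold: float = 0.0) -> float:
    """Fraction of consecutive samples whose change exceeds the threshold."""
    if len(values) < 2:
        return 0.0
    changes = sum(
        1 for prev, cur in zip(values, values[1:]) if abs(cur - prev) > threshold
    )
    return changes / (len(values) - 1)

def is_oversampled(
    sampling_period_s: float,
    values: list[float],
    max_period_s: float = 60.0,   # "high sampling rate" cutoff (assumed)
    min_variation: float = 0.05,  # "low variation rate" cutoff (assumed)
) -> bool:
    """Flag a tag sampled often (short period) whose values rarely change."""
    return sampling_period_s < max_period_s and variation_rate(values) < min_variation

# A factory temperature captured every second that barely moves:
temps = [21.0] * 99 + [21.1]
print(is_oversampled(1.0, temps))    # True  -> candidate for a longer interval
print(is_oversampled(300.0, temps))  # False -> already sampled sparsely
```

In practice, a flagged tag would be fixed by lengthening its acquisition interval or by adding a deadband threshold so that only meaningful changes are historized.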
The long-term objective is to provide these kinds of insights for other standard integrations such as MQTT.
For models, we monitor the size of the Elasticsearch index, the number of machines and properties used by the model, the first and last timestamps, the number of periods (Elasticsearch documents), and the average size per machine and per period. These statistics are historized and can be displayed over several time ranges.
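The last of those statistics can be derived from the others; a possible computation is sketched below. The function name and the example figures are assumptions for illustration, not the platform's real schema.

```python
def average_size_per_machine_per_period(
    index_size_bytes: int, machine_count: int, period_count: int
) -> float:
    """Average bytes stored for one machine over one period."""
    if machine_count == 0 or period_count == 0:
        return 0.0  # avoid division by zero for an empty model
    return index_size_bytes / (machine_count * period_count)

# Example: a 2 GB index covering 10 machines and 365 daily periods.
avg = average_size_per_machine_per_period(2_000_000_000, 10, 365)
print(round(avg))  # 547945 bytes per machine per day
```

Tracking this ratio over time is useful because a sudden increase points at a model change (new properties, finer periods) rather than at more machines being connected.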
The service also analyzes how the properties are used in the application and provides recommendations to optimize the model size:
The solution provides a complete Document Management System.
Thus, you can:
This interface will display the total size used by those files.
As you can see, this module lets you easily monitor and forecast your data consumption. It also provides smart recommendations to optimize it, and our customer success team will be more than happy to help you implement them.
Interested? Want to know more about our pricing model or this module? We’d be more than happy to give you a deeper explanation!