
How the Internet of Things shifts analytics to the edge

In the era of the Internet of Things (IoT), it is increasingly important to process, filter and analyze data as close to its source as possible: this lets you act locally instead of shipping everything to data processing centers or clouds for filtering and analysis.



Another reason to deploy analytics tools at the edge of the network is that new IoT applications keep appearing, and in many cases the volume of data generated at the edge places demands on bandwidth and computing power that the available resources simply cannot meet. In other words, data centers designed for more traditional corporate workloads cannot cope with the flood of data from smart devices, sensors and the like.



For example, if the temperature readings of a wind turbine's engine lie within the operating range, there is no need to store them at one-second intervals, since a huge amount of data would accumulate very quickly. A better approach is for readings that fall outside the operating range, or that indicate certain trends (for example, an imminent component failure), to generate a warning, and perhaps to be stored centrally only after this first anomaly is detected, for subsequent analysis.
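The filtering logic described above can be sketched in a few lines. This is a minimal illustration with hypothetical names and thresholds: in-range readings are dropped at the edge, and once the first anomaly appears, it and everything after it is forwarded for central analysis.

```python
OPERATING_RANGE = (20.0, 90.0)  # assumed safe temperature band, degrees C

def filter_readings(readings):
    """Yield only the readings worth sending to the data center."""
    anomaly_seen = False
    low, high = OPERATING_RANGE
    for value in readings:
        # drop in-range readings until the first anomaly is detected
        if anomaly_seen or not (low <= value <= high):
            anomaly_seen = True
            yield value

# one reading per second; only the anomaly and what follows is kept
sent = list(filter_readings([45.2, 46.0, 95.3, 47.1, 44.9]))
print(sent)  # [95.3, 47.1, 44.9]
```

With a reading every second, a turbine that stays in range sends nothing at all; the moment it drifts out of range, the full stream is preserved for diagnosis.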


There are too many vendors in this market to list exhaustively. It is worth mentioning, however, that last year the company formerly known as JustOne Database carried out a comprehensive rebranding: not only were its products renamed, but the company itself, which is now called Edge Intelligence. A company representative told me that its database, which can run on relatively compact servers at the edge of the network, in the data center or in the cloud, became so popular that the company decided to rebrand after six years in business.



So what do you need to know about edge analytics if you are trying to decentralize at least some of your analytical workloads?



Standards and protocol translation



Some of the standards in this area will probably disappear; even so, technologies that support standards are likely to be easier to integrate in the future, and there are plenty of specialized standards and APIs to choose from. Relevant standards and protocols include the POSIX and HDFS APIs for file access, SQL for queries, the Kafka API for event streams, HBase, and possibly the OJAI API (an open JSON API) for compatibility with NoSQL databases. Support for legacy proprietary telemetry protocols is also needed, so that legacy hardware (often in service for decades) can be incorporated into more modern IoT infrastructures. This is particularly important in industries where the Internet of Things delivers particular value by improving the effectiveness of preventive maintenance.
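Protocol translation for legacy telemetry often boils down to unpacking a binary frame and re-emitting it as a modern, self-describing event. The frame layout below is purely hypothetical (a 4-byte sensor id, a big-endian 32-bit float, a 32-bit timestamp); real legacy protocols vary widely, but the translation step looks much the same.

```python
import json
import struct

def translate_frame(frame: bytes) -> str:
    """Convert a hypothetical 12-byte legacy telemetry frame to a JSON event."""
    sensor_id, value, ts = struct.unpack(">4sfI", frame)
    event = {
        "sensor": sensor_id.decode("ascii"),
        "value": round(value, 2),
        "timestamp": ts,
    }
    return json.dumps(event, sort_keys=True)

# simulate a frame arriving from a decades-old device
frame = struct.pack(">4sfI", b"T042", 87.5, 1700000000)
print(translate_frame(frame))
```

Once telemetry is in JSON, it can flow into standards-based pipelines (a Kafka topic, a NoSQL store via OJAI, and so on) without the downstream systems needing to know the original wire format.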



Aggregation of distributed data



In a sense, this is the foundation of edge analytics, since it means providing high-speed local data processing, which is especially important when there are geographic constraints or confidentiality requirements (for example, when handling personal data). Aggregation can also be used to consolidate data from IoT devices located at the edge.
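A minimal sketch of such local aggregation, with an assumed record shape of (sensor id, value) pairs: raw readings are consolidated per sensor into one compact summary, so only the summaries cross the network while the raw values stay on the edge node.

```python
from collections import defaultdict
from statistics import mean

def aggregate(readings):
    """Consolidate (sensor_id, value) pairs into one summary per sensor."""
    buckets = defaultdict(list)
    for sensor_id, value in readings:
        buckets[sensor_id].append(value)
    return {
        sensor_id: {
            "count": len(values),
            "min": min(values),
            "max": max(values),
            "mean": round(mean(values), 2),
        }
        for sensor_id, values in buckets.items()
    }

summary = aggregate([("a", 1.0), ("a", 3.0), ("b", 5.0)])
print(summary)  # {'a': {'count': 2, 'min': 1.0, 'max': 3.0, 'mean': 2.0}, ...}
```

Keeping raw values local in this way also helps with the confidentiality constraints mentioned above: personal or sensitive readings never leave the edge node, only aggregates do.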



Bandwidth management



This covers technologies that manage network bandwidth between edge devices and the cloud and/or data center, even when sensors or devices are connected only intermittently.
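For intermittently connected devices, the usual pattern is store-and-forward: readings queue up locally while the uplink is down and are drained in bounded batches when it returns. The class below is an illustrative sketch (the names and batch sizes are assumptions, not a real product API).

```python
from collections import deque

class UplinkBuffer:
    """Buffer readings while offline; flush in bounded batches when online."""

    def __init__(self, batch_size=3, max_items=1000):
        self.queue = deque(maxlen=max_items)  # oldest data drops if full
        self.batch_size = batch_size

    def record(self, reading):
        self.queue.append(reading)

    def flush(self, send):
        """Drain the queue in batches through the send() callable."""
        sent = 0
        while self.queue:
            batch = [self.queue.popleft()
                     for _ in range(min(self.batch_size, len(self.queue)))]
            send(batch)
            sent += len(batch)
        return sent

buf = UplinkBuffer(batch_size=2)
for r in (1, 2, 3, 4, 5):
    buf.record(r)          # link down: readings accumulate locally
batches = []
buf.flush(batches.append)  # link restored: drain in batches of 2
print(batches)  # [[1, 2], [3, 4], [5]]
```

The bounded queue and batch size are the two knobs that cap local memory use and smooth the load on the uplink when a backlog is released.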



Convergent analytics



Making operational decisions and analyzing data in real time at the edge.



Security and Identity Management



Comprehensive Internet of Things protection means authentication, authorization and access control both at the edge and at the level of the central clusters. In some circumstances it is also desirable to encrypt the data channel between edge objects and the main data center. Identity management is a hard problem in its own right: it requires tools to manage "things", including authentication, authorization, and the assignment of privileges both within and beyond the boundaries of the system and of the enterprise itself.
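One small piece of that puzzle, sketched with the Python standard library: each edge device signs its messages with a shared secret, and the central cluster verifies the signature before accepting the data. The key name and message format here are illustrative only; a real deployment would also use TLS on the channel and proper key provisioning and rotation.

```python
import hashlib
import hmac

DEVICE_KEY = b"per-device-secret"  # assumed provisioned out of band

def sign(message: bytes) -> str:
    """Compute an HMAC-SHA256 tag for a telemetry message."""
    return hmac.new(DEVICE_KEY, message, hashlib.sha256).hexdigest()

def verify(message: bytes, signature: str) -> bool:
    """Constant-time check that the message was signed with the device key."""
    return hmac.compare_digest(sign(message), signature)

msg = b'{"sensor": "T042", "value": 87.5}'
tag = sign(msg)
print(verify(msg, tag))                # True: authentic message accepted
print(verify(b'{"value": 999}', tag))  # False: tampered payload rejected
```

Message-level signatures like this complement channel encryption: even if a payload is replayed or altered somewhere between the edge and the data center, the central cluster can reject it.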



Enterprise-grade reliability



The aim is a robust computing environment that can withstand the numerous hardware failures that occur in remote, isolated systems.



Cloud Integration



If not today, then in the future you may need reliable integration between an edge analytics node and the cloud. This means that warnings, and even "basic" data points, can be stored in the cloud rather than in the company's own data center, so integration with your cloud service provider (if you have one) is a very forward-looking move. Even if you barely use clouds for data processing and storage today, you may later be interested in the Amazon Web Services, Google Cloud Platform or Microsoft Azure cloud services; it is also worth finding out whether the open-source OpenStack infrastructure is supported under the Infrastructure as a Service (IaaS) model.



Over the past few years, edge analytics has spread steadily as new applications of the Internet of Things have emerged. It is worth at least evaluating the possible role of edge computing in the IoT projects you plan to implement.

Source: https://habr.com/ru/post/353436/
