The massive scale of IoT networks is driving new architectures; Cisco has predicted that by 2020 there will be more than 50 billion devices connected to some form of IP network. Not only are traditional IT networks unprepared for this magnitude of devices, but the data those devices generate is also a serious concern.
The data generated by IoT sensors is one of the single biggest challenges in building an IoT system. In modern IT networks, the data sourced by a computer or server is typically generated by the client/server communications model, and it serves the needs of the application. However, in sensor networks, the vast majority of data generated is unstructured and of very little use on its own.
An advantage of this model is simplicity, as objects just need to connect to a central cloud application. This application has visibility over all the IoT nodes and can process all the analytics needed today and in the future. However, as data volume, the variety of objects connecting to the network, and the need for more efficiency increase, new requirements appear, and those requirements tend to bring the need for data analysis closer to the IoT system. These new requirements include minimizing latency, conserving network bandwidth, and increasing local efficiency.
Designing an IoT network to manage this volume of data efficiently is critical to deriving business benefits from it. The volume of data generated by IoT devices can easily overrun the capabilities of the headend system in the data center or the cloud. As this massive amount of data begins to funnel into the data center, questions about bandwidth management arise. This is sometimes referred to as the "impedance mismatch" between the data generated by the IoT system and the management application's ability to deal with that data.
Fog Computing
The solution to these challenges is to distribute data management throughout the IoT system, as close to the edge of the IP network as possible. Fog computing is the best-known embodiment of edge services in IoT: any device with computing, storage, and network connectivity can act as a fog node.
This structure minimizes latency, offloads network traffic from the core network, and keeps sensitive data inside the local network. Fog nodes enable intelligence gathering and control from the closest possible point, allowing better performance over constrained networks.
Fog services are typically deployed very close to the edge, sitting as near the IoT endpoints as possible. This proximity gives fog nodes contextual awareness of the sensors they manage, so they can reduce the volume of data sent upstream and make what is sent more useful to application and analytics servers in the cloud.
The fog layer provides a distributed edge control loop capability, allowing devices to be monitored, controlled, and analyzed in real time without the need to wait for communication from central analytics and application servers in the cloud.
For example, a fog node on a large truck can measure tire pressure and combine this data with information from other sensors, sending alert data upstream only if an actual problem is beginning to occur that affects operational efficiency.
IoT fog computing enables data to be preprocessed and correlated with other inputs to produce relevant information, which can then be used as real-time, actionable knowledge by IoT-enabled applications. In the long term, this data can be used to gain a deeper understanding of network behavior and systems for developing proactive policies, processes, and responses.
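The distributed control loop described above can be sketched as a simple filter on a fog node. This is a minimal illustration, not any vendor's implementation; the field names and the 90 psi alert threshold are assumptions chosen for the tire-pressure example.

```python
# Sketch of a fog-node control loop: preprocess local sensor readings
# and send data upstream only when an actionable condition appears.
# Threshold and field names are illustrative assumptions.

LOW_PRESSURE_PSI = 90.0  # hypothetical alert threshold for truck tires

def evaluate_readings(readings):
    """Filter raw tire-pressure readings down to upstream alerts.

    readings: list of dicts like {"tire": "front-left", "psi": 102.5}
    Returns only the readings worth forwarding to the cloud application.
    """
    alerts = []
    for r in readings:
        if r["psi"] < LOW_PRESSURE_PSI:
            alerts.append({"tire": r["tire"], "psi": r["psi"],
                           "event": "low-pressure"})
    return alerts

readings = [
    {"tire": "front-left", "psi": 102.5},
    {"tire": "front-right", "psi": 84.0},  # below threshold
    {"tire": "rear-left", "psi": 101.0},
]
alerts = evaluate_readings(readings)
# Normal readings stay local; only the anomaly travels upstream.
```

The point of the sketch is the data reduction: three raw samples enter the node, but only the single anomalous one consumes backhaul bandwidth.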
Edge Computing
Fog computing solutions are being adopted across many industries, and efforts to develop distributed applications and analytics tools are accelerating. Fog nodes are typically found in the network devices closest to IoT endpoints, but compute capability can also reside in the sensors and devices at the edge themselves.
As compute capabilities increase, some new classes of IoT endpoints have enough compute capabilities to perform low-level analytics and filtering for basic decisions.
For example, an edge compute-capable water sensor on a fire hydrant can quickly generate an alert for a localized issue, while a fog node on an electrical pole can correlate inputs from multiple sensors to provide a wider view of a problem. Edge compute-capable meters can likewise monitor localized power quality and consumption, helping ensure the highest quality of power delivery to customers.
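The low-level analytics an edge-capable meter performs might look like the following sketch: keep a short rolling window of voltage samples on the device and flag deviations, rather than streaming every sample upstream. The nominal voltage, the ±5% band, and the window size are illustrative assumptions, not a metering standard.

```python
# Sketch of on-device power-quality monitoring at an edge endpoint.
# Nominal value, tolerance band, and window size are assumptions.
from collections import deque

NOMINAL_V = 120.0
TOLERANCE = 0.05            # hypothetical +/-5% power-quality band

window = deque(maxlen=60)   # last 60 samples stay on the device

def sample(voltage):
    """Record one reading; return an alert dict only when the rolling
    average drifts outside the tolerance band, else None."""
    window.append(voltage)
    avg = sum(window) / len(window)
    if abs(avg - NOMINAL_V) > NOMINAL_V * TOLERANCE:
        return {"event": "power-quality", "avg_voltage": round(avg, 1)}
    return None
```

Because the decision is made on the meter itself, only out-of-band events ever leave the device.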
Cloud Computing
Edge and fog computing do not replace the cloud; rather, they complement it, acting as a first line of defense for filtering, analyzing, and managing data close to the endpoints. Together they suggest a hierarchical organization of network, compute, and data storage resources, with data collected, analyzed, and acted on at each stage according to the capabilities of that layer.
The advantage of this hierarchy is that a fast response to events from resources close to the end device is possible while still having deeper compute resources available in the cloud when necessary.
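The hierarchy described above can be sketched as a simple routing rule: each event is handled at the lowest layer capable of acting on it, and everything else escalates toward the cloud. The layer names and the scope-based rule are illustrative assumptions, not a defined protocol.

```python
# Sketch of hierarchical event handling across edge, fog, and cloud.
# Scopes and routing rule are illustrative assumptions.

def handle_event(event):
    """Route an event to the lowest layer able to act on it."""
    if event["scope"] == "device":   # e.g., a single sensor fault
        return "edge"                # handled on the endpoint itself
    if event["scope"] == "local":    # e.g., one neighborhood or segment
        return "fog"                 # handled by the nearby fog node
    return "cloud"                   # fleet-wide analytics upstream
```

Device- and local-scope events get the fast response close to the end device; only broader analysis pays the round-trip cost to the deeper compute resources in the cloud.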
The heterogeneity of IoT devices also means a heterogeneity of edge and fog computing resources. Edge and fog require an abstraction layer that allows applications to communicate with one another, exposing a common set of APIs for monitoring, provisioning, and controlling physical resources in a standardized way. The abstraction layer also requires a mechanism to support virtualization, allowing multiple operating systems or service containers on physical devices to support multitenancy and application consistency across the IoT system.
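One way to picture the abstraction layer is a common interface that every physical resource type implements, so the management application monitors and provisions heterogeneous nodes uniformly. The class and method names below are hypothetical illustrations, not any standardized fog API.

```python
# Sketch of a common API over heterogeneous fog/edge resources.
# Names (FogResource, PoleTopNode, etc.) are illustrative assumptions.
from abc import ABC, abstractmethod

class FogResource(ABC):
    """Common interface each physical resource type implements."""

    @abstractmethod
    def status(self) -> dict:
        """Report monitoring data in a standardized shape."""

    @abstractmethod
    def provision(self, config: dict) -> None:
        """Apply configuration in a standardized way."""

class PoleTopNode(FogResource):
    def __init__(self):
        self.config = {}

    def status(self):
        return {"type": "pole-top", "online": True, **self.config}

    def provision(self, config):
        self.config.update(config)

# Management code works against FogResource, not device-specific
# classes, so new node types can be added without changing it.
node: FogResource = PoleTopNode()
node.provision({"sampling_hz": 1})
```

Virtualization or service containers would then let several such resource abstractions run as tenants on one physical device, as the text describes.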
When architecting an IoT network, consider the amount of data to be analyzed and the time sensitivity of this data. Fog computing accelerates awareness and response to events by eliminating a round trip to the cloud for analysis, avoids costly bandwidth additions by offloading gigabytes of network traffic from the core network, and protects sensitive IoT data by analyzing it inside company walls.