by Jelani Harper


2019 will be a crucial year for the Internet of Things for two reasons. Firstly, many of the initial predictions for this application of big data forecast a future in which, at the start of the next decade, there would be billions of connected devices all simultaneously producing sensor data. The IoT is just a year away from making good on those claims.

Secondly, and perhaps even more importantly, 2019 will be the year in which the Intelligent Internet of Things (IIoT) takes tangible shape, supplying various manifestations of Artificial Intelligence with the data necessary for both timely analytics and informed action predicated on this continual data transmission.

The melding of the IoT with cognitive computing for the IIoT is a natural evolution that will likely have horizontal consequences for business practices in the near future. According to American Showa Senior Manager of IT Sean Henry, whether data-centric companies “think about it this way or not, you’re a technology company. Suspension systems are a technology, driving systems are a technology and you have to treat it that way. Just as people expect other technologies to rapidly evolve, you’ve got to rapidly evolve with it.”

The IIoT is the evolution of the IoT that will give it meaning and help it actualize the number of connected devices forecast for the start of the next decade. The IIoT will encompass smart cities, edge devices, wearables, deep learning and classic machine learning alongside lesser acknowledged elements of AI in a basic paradigm in which, according to Franz CEO Jans Aasman, “you can look at the past and learn from certain situations what’s likely going to happen. You feed it in your [IoT] system and it does better… then you look at what actually happened and it goes back in your machine learning system. That will be your feedback loop.”
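The feedback loop Aasman describes, learn from the past, predict, observe what actually happened, and feed the outcome back into the system, can be illustrated in a few lines. This is a deliberately toy sketch: the moving-average “model” and the sensor values are invented stand-ins for a real machine learning pipeline.

```python
from collections import deque

class FeedbackLoop:
    """Toy predict-observe-learn cycle over a stream of sensor values."""

    def __init__(self, window=5):
        self.history = deque(maxlen=window)  # recent observations only

    def predict(self):
        # Learn from the past: forecast the mean of recent readings
        if not self.history:
            return 0.0
        return sum(self.history) / len(self.history)

    def observe(self, actual):
        # Close the loop: what actually happened goes back into the model
        self.history.append(actual)

loop = FeedbackLoop()
for value in [10.0, 10.2, 10.1, 10.4, 12.0]:  # hypothetical sensor stream
    guess = loop.predict()   # act on the prediction here
    loop.observe(value)      # then fold reality back in
```

A production system would swap the moving average for a trained model and retrain on the accumulated observations, but the predict-observe-update shape of the loop is the same.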


Related: AI Is Nothing Without The IIoT


Deep Learning, Machine Learning

Most applications of the IIoT adhere to the model outlined by Aasman, in which enormous quantities of sensor or streaming data are “feeding the engines that allow people to do really top level Artificial Intelligence and machine learning,” NetApp Business Development Manager Gregory Gardner revealed. These data volumes help satisfy the huge training data requirements of both machine learning and deep learning. Processing that training data is largely feasible today because of Graphics Processing Units (GPUs) which, according to Gardner, are “400 times better than a normal server and useful for automated neural networks, which has given us an incredible capability to do deep learning.”

Although deep learning relies on many of the same concepts as traditional machine learning, with “deep learning it’s just that you do it with more computers and more intermediate layers,” Aasman said, which results in higher accuracy levels. The primary outputs of these advanced cognitive analytics on IIoT data are greater performance and competitive advantage. Ducati CTO Konstantin Kostenarov referenced a sophisticated use case in which sensor data initially processed at the cloud’s edge on Ducati racing bikes in competitions are eventually analyzed in a centralized data center in Bologna with “Artificial Intelligence and deep machine learning algorithms to create another [perspective] that can give us additional information in another way that we can obtain only from the physical sensors on the bikes.”

Image Recognition and Video

Ducati derives competitive advantage from this data by analyzing how to improve the performance of its racing teams in future competitions. It also uses the data to devise different approaches for enhancing sales of its road bikes, which Kostenarov mentioned is Ducati’s principal line of business. Deep learning is also a foundational technology for modern image recognition systems, which are being deployed more frequently in the IIoT; deep learning models provide the basis for analyzing and accurately identifying the images those systems process.

One of the largest rollouts of this IIoT capability is found in China, where facial recognition systems have “already become the default way that people authenticate themselves onto a network,” Tomorrow CEO and self-proclaimed futurist Mike Walsh said. “If you want to get an Uber in China…the driver has to authenticate himself visually before the ride can start.” Additional facial recognition use cases in China include law enforcement. The tandem of IIoT data and optical characteristics is also consequential for maximizing organizational efficiency, particularly in the Industrial Internet of Things.

The feedback mechanism described by Aasman has such a tremendous capacity to reform data-driven businesses because of the speed of the iterations enabled by low-latency IIoT data. Henry discussed the degree of precision IoT sensor data brings to manufacturing and assembly-line efficiency, in which “you’re leveraging data that we were already creating and you’re turning it into immediately actionable information: this part, with this serial number, isn’t good.” The downstream effects of such information are enriched with image data like video feeds. Henry recalled a use case in which “a magnet dropped and it hit a part. While the sensor certainly said the part was bad, they wanted to understand why did this happen because you don’t want something like that repeating.”
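Henry’s point about turning existing sensor data into “immediately actionable information” amounts to tagging each measurement with the part’s serial number and flagging anything outside tolerance. A minimal sketch, with invented serial numbers, units, and tolerances:

```python
def flag_bad_parts(readings, nominal, tolerance):
    """Return serial numbers whose measurement falls outside tolerance.

    readings: iterable of (serial_number, measured_value) pairs.
    nominal/tolerance: target value and allowed deviation (same units).
    """
    return [serial for serial, value in readings
            if abs(value - nominal) > tolerance]

# Hypothetical assembly-line measurements (mm)
readings = [("SN-1001", 25.02), ("SN-1002", 25.31), ("SN-1003", 24.99)]
bad = flag_bad_parts(readings, nominal=25.0, tolerance=0.1)
# SN-1002 deviates by 0.31 mm, beyond the 0.1 mm tolerance
```

The video-feed enrichment Henry describes would layer image data on top of flags like these to answer the follow-up question of *why* a part failed.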


Related: Urban IoT and AI – How Cities Can Leverage This Synergy


Knowledge-Based and Statistical AI

Smart cities are one of the more prominent demonstrations of the IIoT’s potential. Like most applications of this sensor data, they become much more efficacious when statistical AI (machine learning) is paired with knowledge-based AI such as symbolic reasoning. One of the critical capabilities the latter provides is optimization, such as determining the best way to schedule deliveries across a host of factors governed by dedicated rules. “There’s no way in [Hades] that a machine learning system would be able to do the complex scheduling of 6,000 people,” Aasman declared. “That’s a really complicated thing where you have to think of every factor for every person.”

However, constraint systems utilizing multi-step reasoning can regularly complete such tasks and the optimization activities for smart cities. Aasman commented that for smart cities, semantic inferencing systems can incorporate data from traffic patterns and stop lights, weather predictions, the time of year, and data about specific businesses and their customers to devise rules for optimal event scheduling. Once the events actually take place, their results—as determined by KPIs—can be analyzed with machine learning to issue future predictions about how to better those results in what Aasman called “a beautiful feedback loop between a machine learning system and a rules-based system.”
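Aasman’s “beautiful feedback loop” between a rules-based system and a machine learning system can be caricatured in a few lines: rules propose a schedule, a measured KPI scores the outcome, and the score picks the rule parameters for the next round. Everything here, the packing rule, the KPI, the event list, is an invented simplification of the multi-step constraint reasoning he describes.

```python
def rule_based_schedule(events, max_per_slot):
    """Rule: pack at most max_per_slot events into each time slot."""
    return [events[i:i + max_per_slot]
            for i in range(0, len(events), max_per_slot)]

def kpi(slots):
    """Hypothetical KPI (lower is better): penalize both the number of
    slots used and crowding within any one slot."""
    return 3 * len(slots) + sum(len(slot) ** 2 for slot in slots)

events = list(range(12))  # twelve events to place
# Feedback loop: the KPI measured on past outcomes selects the rule
# parameter the rules engine will use next time
best_capacity = min(range(1, 5),
                    key=lambda cap: kpi(rule_based_schedule(events, cap)))
```

In a real deployment the rules engine would be a semantic inferencing or constraint system and the KPI model would be learned from observed outcomes, but the division of labor, rules decide, learning evaluates, is the one Aasman outlines.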

Edge Processing

In the preceding smart city use case and in numerous others involving the IIoT, different cognitive computing applications are used to improve the performance and efficiency of the devices generating data. This virtuous cycle is particularly important for edge computing deployments, in which processing occurs at the edge of a network instead of in centralized locations. Kostenarov noted that the initial data processing for Ducati’s racing bikes occurs at the edge because “the racing teams move locations every week.” When processing data at the edge, organizations must account not only for security but also for compute and storage, in order to “abstract the storage operating system from our hardware [to] now make it available to run on small form factor devices,” Gardner indicated. Hyper-converged infrastructure options can make such small form factors viable for processing data quickly at the edge before sending it to centralized locations for further analysis, as Ducati does with its motorbike use case.
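The edge-then-center pattern Kostenarov and Gardner describe often reduces to aggregating raw readings locally and shipping only compact summaries to the data center. A minimal sketch, with invented field names and telemetry values:

```python
def summarize_at_edge(samples):
    """Reduce a burst of raw sensor samples to a compact summary
    suitable for sending from an edge device to a central data center."""
    return {
        "count": len(samples),
        "min": min(samples),
        "max": max(samples),
        "mean": sum(samples) / len(samples),
    }

raw = [98.1, 98.4, 97.9, 101.2, 98.0]   # hypothetical on-site telemetry
payload = summarize_at_edge(raw)        # far smaller than the raw stream
# Central analysis then runs across many such payloads from every location
```

The edge node keeps the raw stream local, where latency and bandwidth are constrained, while the central systems receive enough signal for the deeper analysis Ducati performs in Bologna.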

Shorter Development Cycles and Personalization

In almost all of the examples discussed above, the IIoT incorporates cognitive computing “so humans can take action for better business results,” Aasman acknowledged. The means by which these advantages are created are practically limitless. By using cognitive analytics on actual consumer data such as that produced by connected cars, for example, manufacturers “get feedback so that your development cycle is not just based on development trends, but also on real data coming from real cars in the world, allowing not just for development of the next model, but rapid changes on your current model,” Henry said. These boons are available across industries leveraging the IIoT.

Still, with both connected cars and autonomous vehicles, the automotive industry has always been a forerunner in the IoT. It appears to be playing the same role in the IoT’s evolution into the IIoT. “You’re not going to go to a dealer anymore,” Henry predicted. “You might go test drive, but you’re going to say I want this, this and this and your car’s going to roll down the assembly line and be delivered, personalized, to you.”


Jelani Harper is an editorial consultant servicing the information technology market, specializing in data-driven applications focused on semantic technologies, data governance and analytics.