Making AI Work with Edge Computing: The Necessity of Storage

July 3, 2019

by Jelani Harper

SAN FRANCISCO - Artificial intelligence’s resurgence is widely credited to the significant advancements in compute power that have typified the past several years of IT. With Graphics Processing Units (GPUs) and the elasticity of cloud computing resources, the computational demands of AI workloads such as machine learning and Natural Language Processing have become far more manageable for the enterprise.

Nonetheless, there’s another less lauded, yet equally vital element of AI’s revival and application to contemporary developments such as the Internet of Things and edge computing. According to StorCentric CEO Mihir Shah, for these expressions of big data to work “the speed needs to be there, and the robustness needs to be there.”

Those necessities are met by storage, which alongside computational power forms the other half of the equation for AI’s preeminence across the data ecosystem. Storage is indispensable for AI because its immense computations require data to be quickly accessible at scale, particularly when these technologies are deployed for edge computing and practical necessities like backups.

When equipped with the proper storage capacity, AI can reach the computational speeds needed to power a number of vanguard edge computing use cases contributing to an Intelligent Internet of Things (IIoT).

Facial recognition

The storage requirements for the IIoT are perhaps greatest for edge computing applications. The Department of Defense has utilized AI technologies like facial recognition in remote locations to verify personnel entering and exiting facilities. Facial recognition techniques involving advanced machine learning, Convolutional Neural Networks, and other expressions of statistical cognitive computing have particular storage requirements to make them feasible in such settings. “They tend to use direct attached products just for the speed of it,” Shah noted about DoD facial recognition deployments. “It’s portability, it’s speed, and it’s ease of use.”

With this specific use case, facial recognition implementations rely considerably on storage to support edge computing. These facial recognition systems at the edge “sit next to a server,” Shah mentioned. “That server has a direct attachment to a … direct attached device. And, as [a] person walks in they obviously scan their face and whatever biometric measures they have. That server processes it and tries to pull it from the storage device.”
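
As a rough illustration of that lookup step, the minimal sketch below matches a freshly captured face embedding against records held on a locally mounted, direct-attached volume. The mount point, file layout, similarity threshold, and the assumption of precomputed embeddings are all illustrative; they are not details of the systems Shah describes.

```python
# Minimal sketch, under assumed details: match a probe face embedding
# against enrolled embeddings stored on a direct-attached volume.
from pathlib import Path
import numpy as np

STORAGE_MOUNT = Path("/mnt/das")   # hypothetical direct-attached storage mount
MATCH_THRESHOLD = 0.8              # hypothetical cosine-similarity cutoff


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def lookup(probe_embedding: np.ndarray) -> str | None:
    """Return the ID of the best-matching enrolled face, or None if no match."""
    best_id, best_score = None, MATCH_THRESHOLD
    # Assumed layout: each enrolled person is stored as <person_id>.npy
    for record in STORAGE_MOUNT.glob("*.npy"):
        enrolled = np.load(record)
        score = cosine_similarity(probe_embedding, enrolled)
        if score > best_score:
            best_id, best_score = record.stem, score
    return best_id
```

Because every comparison reads directly from the attached device, the round-trip speed of that storage, rather than network latency, bounds how quickly a person at the door can be verified.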

AI at the edge

In the preceding example and in other AI deployments of edge computing, there are a number of specific requirements for storage units. In general, a reduced form factor is critical to IoT implementations, especially those at the edge. Size, then, is a crucial consideration for attached storage devices at the edge—as is the capability to still handle the scale of data required for AI. StorCentric CTO Rod Harrison observed that some of the smaller storage units supporting edge computing use cases hold approximately 70 terabytes of data. Additionally, such storage devices must be user friendly to accommodate the needs of non-technical users in remote settings. In such environments “because there’s not too many IT professionals there, ease of use and speed is big,” Shah reflected. “We do have the Thunderbolt connectivity on our devices.”

Mobile edge computing

Portability is also becoming increasingly important, not just for storage at the edge, but for edge computing itself. The best example of this fact is the plethora of smartphones generating sensor data today. Although these devices might not have any considerable storage needs compared to those of IT assets in the IIoT, they still illustrate the importance of mobility at the edge. An even better use case is furnished by the utilization of storage units for military combat vehicles deployed in remote locations. “It’s local storage and they bring it back to the base for it to download into their central servers,” Shah explained. Moreover, in the event of failure, such storage units are readily replaced to maintain business continuity. “Because [it’s] so easy to use, if you don’t have an IT person in a small battalion, if a drive fails or something like that happens, any soldier out there can eject the bad drive and stick in a new drive,” Shah said.

An intelligent edge

Storage is essential to reinforcing the IIoT because it enables devices to offload data as needed, access data on demand, and support the computational requirements for deploying AI at the cloud’s extremities. In this respect, its utility extends not only to cognitive computing but also to the IoT. Furthermore, expedient, dependable storage is equally imperative to AI deployments in centralized locations, and is directly responsible for the pervasiveness of its various expressions today. “The way that I view it, it’s going to be a spectrum of large enterprises that’s going to be [doing] this whole AI and IoT first, but that’s going to trickle down to the SMB over time,” Shah said.

Jelani Harper is an editorial consultant serving the information technology market, specializing in data-driven applications focused on semantic technologies, data governance and analytics.
