Storage expert Cloudian launches AI infrastructure spin-off


Max Smolaks

September 17, 2019

5 Min Read

Edgematrix will offer a distributed hardware platform for sensor data, starting with Japan


American storage software vendor Cloudian, known for its HyperStore platform, has launched a Japanese subsidiary focused on distributed infrastructure for AI workloads.

The newly minted startup, called Edgematrix, will deploy small GPU-powered processing nodes running Cloudian’s software at the edge of the network, where sensor data originates.

As a result, customers won’t have to haul datasets back to core data centers, reducing network congestion, latency and bandwidth costs.

Edgematrix has received $9 million in Series A funding from a trio of Japanese investors: Japan Post Capital, mobile network giant NTT Docomo, and Shimizu Corporation, one of the country’s largest construction firms. Docomo plans to use Edgematrix products to create a new AI service for its customers, while Shimizu will deploy the nodes in its smart building projects.

Anatomy of a start-up

Cloudian was established in 2011 to solve the issues around large-scale data storage and has received $173.1 million in venture capital to date.

“We observed that the amount of data was growing annually at about 40-50 percent, but the Internet aggregated bandwidth was only growing at around 22 percent,” Michael Tso, co-founder and CEO of Cloudian, told AI Business. “As the amount of data continues to grow, not all data is going to end up hosted with centralized cloud infrastructure.”
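A quick back-of-the-envelope sketch, using the growth rates Tso cites and assuming they compound annually, shows how quickly that gap widens:

```python
# Illustrative only: compound the quoted annual growth rates
# (roughly 45% for data volume vs. 22% for aggregate bandwidth).
DATA_GROWTH = 1.45
BANDWIDTH_GROWTH = 1.22

for year in range(1, 6):
    gap = (DATA_GROWTH ** year) / (BANDWIDTH_GROWTH ** year)
    print(f"Year {year}: data has outgrown bandwidth by {gap:.1f}x")
```

At those rates, data volume outpaces available bandwidth by roughly 2.4x after five years, which is the mismatch Tso argues will keep some data out of centralized clouds.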

The answer to this conundrum was running clever software on commodity hardware – cheap, basic servers with no proprietary features but broad compatibility. This should, at least theoretically, help achieve cloud storage prices in any data center.

Edgematrix is majority-owned by Cloudian, led by another co-founder, Hiroshi Ohta, and is initially targeting the Japanese AI market. According to IDC, spending on AI systems in Japan is expected to increase at a compound annual growth rate of 45.3 percent between 2018 and 2023, the strongest spending growth of any geographic region.

The start-up was born out of the work Cloudian did in Japan: over the past two years, Ohta and his team created several projects that dealt with AI data at the edge.

“A couple of the guys went out and started playing with deep learning, machine learning, and pretty soon started to produce some interesting solutions that we were actually able to commercialize,” Tso said.

“We built a smart roadside billboard that can target ads based on the year, make and model of the vehicle that’s driving towards it. Training on thousands and thousands of pictures of cars, we’re able to get it to over 99 percent accuracy. And that allows them to sell advertising for about ten times more money.

“We also worked with factories that use our AI to route machine parts that come in, smart buildings, and got into traffic analytics and congestion tracking in Singapore. We just threw a couple of engineers at it and within a year, we had a bunch of customers. If a couple of our guys without any specific background in AI can produce world-leading solutions in a few months, then barriers to entry are not very high,” he laughed.

This AI work was interesting, but didn’t quite fit with the core Cloudian mission. The leadership was then presented with a dilemma: stop messing with AI, or look for additional funding and set it up as a separate, complementary business. It chose the latter.

The Edgematrix appliances will be used for managing large-scale data sets, data analytics and machine learning use cases. Data will be processed locally, and just the results will be sent back to the cloud. Cloudian is an expert in object storage, and its HyperStore is fully compatible with Amazon’s Simple Storage Service (S3), a popular choice for storing data in bulk.
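As a rough illustration of that pattern (not Edgematrix’s actual API), the sketch below assumes a Python application on the edge node that runs inference locally and pushes only the compact results to an S3-compatible store such as HyperStore; the endpoint, bucket and credential names are hypothetical.

```python
import json
import boto3  # standard AWS SDK; works with any S3-compatible endpoint

# Hypothetical endpoint and credentials for an S3-compatible object store.
# HyperStore speaks the S3 API, so a stock client can be pointed at it.
s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.example.internal",  # assumed address
    aws_access_key_id="EDGE_NODE_KEY",
    aws_secret_access_key="EDGE_NODE_SECRET",
)

def publish_results(node_id: str, frame_ts: str, detections: list) -> None:
    """Upload only the compact inference results, not the raw sensor data."""
    payload = json.dumps({
        "node": node_id,
        "timestamp": frame_ts,
        "detections": detections,  # e.g. [{"label": "sedan", "confidence": 0.97}]
    })
    s3.put_object(
        Bucket="edge-analytics-results",      # hypothetical bucket
        Key=f"{node_id}/{frame_ts}.json",
        Body=payload.encode("utf-8"),
        ContentType="application/json",
    )
```

Keeping the upstream payload to a few kilobytes of JSON per event, rather than shipping raw video or sensor streams, is what delivers the bandwidth and latency savings described above.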

What’s inside the box

Small but mighty, the first-generation Edgematrix Edge AI Box is based on an Nvidia Jetson TX module: these low-power embedded computing platforms include an Nvidia Pascal GPU with 256 CUDA cores and a quad-core ARM Cortex-A57 CPU clocked at 2GHz, plus two Nvidia Denver 2 cores. This is accompanied by 8GB of LPDDR4 memory and 128GB of NVMe storage.

Each Jetson TX module can deliver more than a teraflop of performance while consuming just 7.5W of power.

The Edge AI Box offers a wealth of networking options – the appliance will be deployed in remote locations, after all. There’s a choice of LTE, Wi-Fi and Ethernet, and it can supply power over Ethernet cables too, useful when you need to connect additional devices like CCTV cameras or sensors. It comes in indoor and outdoor varieties, and the latter is completely waterproof. “Just plug in the power and it can do everything else,” Tso said.
