Storage expert Cloudian launches AI infrastructure spin-off

Max Smolaks

September 17, 2019

5 Min Read

Edgematrix will offer a distributed hardware platform for sensor data, starting with Japan

American storage software vendor Cloudian, known
for its HyperStore platform, has launched a Japanese subsidiary focused on distributed
infrastructure for AI workloads.

The newly minted startup, called Edgematrix, will
deploy small GPU-equipped processing nodes running Cloudian’s software at the
edge of the network, where sensor data originates.

As a result, customers won’t have to haul datasets
back to core data centers, reducing network congestion, latency and bandwidth
costs.

Edgematrix has received $9 million in Series A
funding from a trio of Japanese investors: Japan Post Capital,
mobile network giant NTT Docomo, and Shimizu Corporation, one of the country’s
largest construction firms. Docomo plans to use Edgematrix products to create a
new AI service for its customers, while Shimizu will deploy the nodes in its smart
building projects.

Anatomy of a start-up

Cloudian was established in 2011 to solve the
issues around large-scale data storage and has received $173.1 million in
venture capital to date.

“We observed that the amount of data was
growing annually at about 40-50 percent, but the Internet aggregated bandwidth
was only growing at around 22 percent,” Michael Tso, co-founder and CEO of
Cloudian, told AI Business. “As the amount of data continues to grow, not all
data is going to end up hosted with centralized cloud infrastructure.”
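
Compounding those two rates shows how quickly the gap widens. As a rough, back-of-the-envelope sketch, assuming a 45 percent midpoint for data growth and an arbitrary five-year horizon (both illustrative choices, not figures from the article):

```python
# Back-of-the-envelope only: compound the growth rates Tso cites to show how
# the data/bandwidth gap widens. The 45 percent midpoint and the five-year
# horizon are illustrative assumptions, not figures from the article.
data_growth = 1.45       # data volume grows ~40-50 percent per year
bandwidth_growth = 1.22  # aggregate bandwidth grows ~22 percent per year

for year in range(1, 6):
    data = data_growth ** year
    bandwidth = bandwidth_growth ** year
    print(f"Year {year}: data x{data:.1f}, bandwidth x{bandwidth:.1f}, "
          f"gap x{data / bandwidth:.1f}")
```

On those assumptions, after five years data has grown roughly 6.4x against bandwidth’s 2.7x, which is the mismatch Tso argues makes hauling everything to centralized clouds untenable.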

The answer to this conundrum was running
clever software on commodity hardware: cheap, basic servers with no
proprietary features but broad compatibility. In theory, this approach should
bring cloud-level storage prices to any data center.

Edgematrix is majority-owned by Cloudian and led
by another of its co-founders, Hiroshi Ohta, and will initially target the Japanese AI market.
According to IDC, spending on AI systems in Japan is expected to increase at a
compound annual growth rate of 45.3 percent between 2018 and 2023, the
strongest spending growth of any geographic region.

The start-up was born out of the work Cloudian
did in Japan: over the past two years, Ohta and his team created several projects
that dealt with AI data at the edge.

“A couple of the guys went out and started
playing with deep learning, machine learning, and pretty soon started to
produce some interesting solutions that we were actually able to commercialize,”
Tso said.

“We built a smart roadside billboard that can
target ads based on the year, make and model of the vehicle that’s driving towards
it. Training on thousands and thousands of pictures of cars, we’re able to get
it to over 99 percent accuracy. And that allows them to sell advertising for
about ten times more money.

“We also worked with factories that use our AI
to route incoming machine parts, with smart buildings, and we got into traffic analytics
and congestion tracking in Singapore. We just threw a couple of engineers at it
and within a year, we had a bunch of customers. If a couple of our guys without
any specific background in AI can produce world-leading solutions in a few
months, then barriers to entry are not very high,” he laughed.

This AI work was
interesting but didn’t quite fit with the core Cloudian mission. The leadership
was then presented with a dilemma: stop messing with AI, or look for additional
funding to set it up as a separate, complementary business. It chose the latter.

The Edgematrix appliances will be used for
large-scale data set management, data analytics and machine learning use cases.
Data will be processed locally, and only the results will be sent back to the
cloud. Cloudian is an expert in object storage, and its HyperStore is fully
compatible with Amazon’s Simple Storage Service (S3), a popular choice for
storing data in bulk.
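
Because HyperStore speaks the S3 API, the “process locally, send only the results” pattern maps onto standard object-storage tooling. The following is a minimal, purely illustrative sketch, assuming a Python client (boto3) pointed at a hypothetical S3-compatible endpoint; the endpoint URL, bucket name, credentials and payload are placeholders rather than anything Cloudian or Edgematrix documents:

```python
# Illustrative sketch only: push a small inference summary from an edge node to
# an S3-compatible object store. Endpoint, credentials, bucket and payload are
# placeholders, not real Edgematrix or HyperStore values.
import json

import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://hyperstore.example.com",  # hypothetical S3-compatible endpoint
    aws_access_key_id="ACCESS_KEY",                 # placeholder credentials
    aws_secret_access_key="SECRET_KEY",
)

# A summarized result from local inference; the raw sensor data stays on the edge node.
result = {
    "node": "edge-node-01",
    "vehicles_detected": 42,
    "window": "2019-09-17T10:00/10:05",
}

s3.put_object(
    Bucket="edge-results",                          # placeholder bucket
    Key="edge-node-01/2019-09-17T1005.json",
    Body=json.dumps(result).encode("utf-8"),
    ContentType="application/json",
)
```

Only the few hundred bytes of JSON cross the network; the video or sensor stream that produced it never leaves the box.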

What’s inside the box

Small but mighty, the first-generation Edgematrix
Edge AI Box is built around an Nvidia Jetson TX module, a low-power embedded
computing platform that combines a Pascal GPU with 256 CUDA cores, a quad-core
ARM Cortex-A57 CPU clocked at 2GHz, and two Nvidia Denver 2 cores. This is accompanied
by 8GB of LPDDR4 memory and 128GB of NVMe storage.

Each Jetson TX module can deliver more than a teraflop
of performance while consuming just 7.5W of power.

The Edge AI Box offers a wealth of networking
options – the appliance will be deployed in remote locations, after all. There’s
a choice of LTE, Wi-Fi and Ethernet, and it can supply power over Ethernet cables
too, useful when you need to connect additional devices like CCTV cameras or
sensors. It comes in indoor and outdoor varieties, and the latter is completely
waterproof. “Just plug in the power and it can do everything else,” Tso said.
