The approach could enable advanced AI systems that don't require a connection to the cloud
DeepCube, a startup that accelerates deep learning on edge devices through software, has raised $7 million in a Series A funding round.
The cash injection follows the launch of the company’s inference accelerator.
“DeepCube is proud to be at the helm of innovation in AI and deep learning, allowing for efficient and cost-effective implementation of the most advanced neural networks on edge devices,” said Dr. Eli David, co-founder of DeepCube.
“With the new funding, we can deliver on the promise of deep learning to customers in new markets, having an impact not only on their businesses, but also on the deep learning industry at large, far beyond what’s previously been possible.”
DeepCube says its framework can be used on existing hardware, both on edge devices and in data centers, promising "10X speed improvement and memory reduction."
Examples of applications that could leverage this technology include surveillance cameras that conduct image analysis in real time without an Internet connection, autonomous drones that make decisions without any connectivity, and autonomous cars that operate without relying on the cloud (see the sketch below).
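DeepCube's framework is proprietary and its internals are not described in this article. As a rough illustration of the general idea of shrinking a neural network so inference can run entirely on-device, the hedged sketch below uses standard PyTorch pruning and dynamic quantization; the model, layer sizes, and 50% sparsity target are hypothetical and are not DeepCube's method or numbers.

```python
# Generic on-device model-compression sketch (NOT DeepCube's technology).
# Prunes and quantizes a small hypothetical model so inference can run
# locally, with no cloud connection.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# A small hypothetical classifier standing in for an edge model.
model = nn.Sequential(
    nn.Linear(512, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# Prune 50% of the smallest-magnitude weights in each Linear layer,
# then make the pruning permanent (drops the reparameterization).
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.5)
        prune.remove(module, "weight")

# Dynamically quantize the remaining weights to int8 to cut memory further.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# Inference runs entirely on the device: no network call, no cloud dependency.
with torch.no_grad():
    scores = quantized(torch.randn(1, 512))
print(scores.shape)  # torch.Size([1, 10])
```

The same pattern, applied to a real vision or decision model, is what makes the camera, drone, and car examples above plausible: the compressed network fits in the device's memory and answers locally, so connectivity is optional rather than required.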
The funding round was led by Canadian VC Awz Ventures, with participation from Koch Disruptive Technologies (KDT) and Nima Capital. It brings the total raised by DeepCube to $12 million.
“Deep learning has accelerated in recent years,” said Yaron Ashkenazi, founder and managing partner, Awz Ventures. “However, the ability to deploy and scale deep learning on edge devices, with a light footprint and efficient memory and processing power, is a significant challenge that has hindered adoption.
“DeepCube’s technology has the power to unlock truly autonomous decision making in semiconductors, data centers, and on edge devices, while improving speed and memory reductions. This is absolutely critical to the future of deep learning.”