AI Business is part of the Informa Tech Division of Informa PLC



Reference design from Blaize and eYs3D looks to outmatch Lidar in industrial applications

by Ben Wodecki
American chip design startup Blaize and eYs3D Microelectronics have unveiled a reference design for advanced depth perception in robotics, security, and autonomous vehicles that they say offers a cheaper alternative to Lidar-based systems.

The design integrates the Blaize Pathfinder P1600 System-on-Module for edge AI applications with a ‘stereo’ vision sensor developed by eYs3D, which promises “millimeter-level accuracy of depth at optimal range”.
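Stereo sensors like eYs3D's estimate depth by triangulating between two horizontally offset cameras: the farther a point is, the smaller the pixel shift (disparity) between the two views. As a general illustration of that principle only (not the eYs3D implementation, whose internals are not public), a minimal pinhole-camera sketch:

```python
def stereo_depth(disparity_px: float, focal_length_px: float, baseline_m: float) -> float:
    """Depth in meters from stereo disparity, using the standard
    pinhole model: depth = f * B / d, where f is the focal length
    in pixels, B the distance between the two cameras in meters,
    and d the horizontal pixel disparity between the two views.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of both cameras")
    return focal_length_px * baseline_m / disparity_px

# Illustrative numbers: f = 800 px, 6 cm baseline, 12 px disparity
print(stereo_depth(12.0, 800.0, 0.06))  # 4.0 meters
```

Because depth varies inversely with disparity, accuracy is best at close to moderate range, which is consistent with the "optimal range" qualifier in eYs3D's claim.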

“The Blaize and eYs3D integration enables faster time-to-market for systems incorporating visual simultaneous location and mapping (VSLAM), facial feature depth recognition, and gesture-based commands,” Rajesh Anantharaman, Blaize’s senior products director, said.

Goodbye Lidar, hello Pathfinder?

Blaize, known as Thinci until 2019, was founded in California in 2010. The company recently expanded into China, Taiwan, and Southeast Asia, with Weikeng Group set to distribute its products in new markets.

eYs3D is a provider of computer vision platforms based in Taiwan. The company’s chips are currently embedded in Valve’s Index VR headsets and some consumer cleaning robots.

Integration between the two products enables the P1600 to convert the depth camera’s USB output to high-speed Ethernet connectivity, for enhanced video processing. Software development kits for the reference design will accommodate a wide range of operating systems, programming languages, and development tools, the companies said.

"Depth-sensing technology has been widely adopted commercially in consumer and industrial applications in the last few years,” James Wang, eYs3D’s chief strategy officer, said.

“We are now seeing growing applications in robotics, 3D scene learning, drones, smart retail, and other markets.”

The reliance on Lidar for device autonomy is already being questioned by at least one prominent CEO. Tesla's Elon Musk is a vocal advocate of a vision-only approach to autonomous vehicles, arguing that cameras are faster than either Lidar or radar.

That belief is being put into practice in Tesla's cars: newly built North American Model Y and Model 3 vehicles ship without radar, relying on cameras and machine learning for their Autopilot and advanced driver assistance systems.

Tesla’s latest self-driving system collates video at 36 frames per second from the eight cameras surrounding the vehicle, building a picture of the car’s surroundings.
