AI Business is part of the Informa Tech Division of Informa PLC



Lightmatter announces Envise, a photonic accelerator for AI

by Louis Stone

Claims it can beat Nvidia in major deep learning workloads

A startup spun out of MIT plans to launch an artificial intelligence chip that uses light, rather than electricity, for processing.

Lightmatter plans to start shipping the photonic Envise systems later this year, with the company claiming they could run up to 10 times faster than servers based on Nvidia A100 GPUs, at least in specific inferencing workloads.

Do you see the light?

Envise will be made available as a 4U server blade, equipped with two AMD Epyc 7002 CPUs, 1TB of DDR4 DRAM, and 3TB of solid-state storage. Each system will contain 16 Envise chips.

Rather than relying on electrons like the vast majority of chips, Envise runs calculations by splitting and mixing beams of light. It still requires a traditional silicon chip working in tandem to control the photonic components and provide temporary memory.

Envise 4U server blade © Lightmatter

Billed by Lightmatter as the first commercial, general-purpose photonic AI accelerator, Envise is expected to handle a variety of workloads, including inference on GPT-3, Megatron, BERT-Large, DLRM, and ResNet-50. Lightmatter claims Envise is five times faster than Nvidia's A100 on BERT, but that has not been independently verified.

By shifting from increasingly small transistors to light and fiber optics, the system could significantly reduce the power and cooling demands of AI hardware. Lightmatter claims a seven-fold energy efficiency improvement over the A100.

But as an analog chip (rather than a digital one), it is also less accurate. For this reason, the company told Wired that it is marketing the chip for inference workloads (that is, running pre-trained models), rather than the more demanding task of model training. CEO Nick Harris said that Envise could run training workloads, just not as well.

To help developers make the most of Envise, Lightmatter plans to roll out a new software stack called Idiom.

Alongside the new chip, the company also announced Passage, a wafer-scale programmable photonic interconnect that allows different chips to communicate optically over photonic links.

"We think Passage will enable a world where configurable-topology supercomputing systems are built on a single platform that takes advantage of the benefits of optics, without the packaging cost and complexity," Harris said in a blog post.

Lightmatter has raised $22 million from GV, Spark Capital, and Matrix Partners. It competes with a number of photonic chip pioneers, all vying for the post-Moore's Law computing space. Among them are Fathom Computing, Lightelligence, LightOn, Luminous, and Optalysys.
