August 12, 2022
Inspired by the fiddler crab, scientists have created a new artificial eye that works both on land and in water
Artificial vision systems are a key part of sensor technologies, used in everything from autonomous vehicles to monitoring devices and robotic assistants. Creating an artificial eye that captures the right quality and depth of information has, however, proven challenging: many existing systems fail to capture a full 360-degree field of view, and most are suited to either land or water rather than both.
A group of researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), the Gwangju Institute of Science and Technology (GIST) and Seoul National University in Korea is looking to overcome these limitations, taking inspiration from the fiddler crab to create a novel vision system that is both amphibious and panoramic.
Fiddler crabs are unique in having omnidirectional eyes that provide 360-degree vision both underwater and on land. The small creature’s eyes are raised above its head like periscopes and feature flat corneas, two traits that enable this multidirectional vision and let the crab see all around itself without having to move its body.
To mimic the crustacean’s eye, the team combined an array of flat microlenses with a flexible photodiode array in a single spherical structure. This design means that light from a source converges at the same spot on the image sensor regardless of changes in the external refractive index between air and water.
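To see why a flat outer surface matters, consider standard thin-lens optics (this is a generic illustration, not the paper's actual optical model, and the lens index and radius below are assumed values): a curved refracting surface has power proportional to the index difference between the lens and the surrounding medium, so its focal length shifts dramatically between air and water, while a flat interface contributes no refractive power in either medium.

```python
# Illustrative sketch: thin plano-convex lens immersed in a medium.
# Lensmaker's relation for one curved surface: 1/f = (n_lens - n_medium) / (n_medium * R).
# A flat surface (R -> infinity) contributes zero power in any medium,
# which is the property the crab-inspired flat cornea exploits.

def plano_convex_focal_length(n_lens: float, n_medium: float, radius: float) -> float:
    """Focal length of a thin plano-convex lens of curvature radius `radius`."""
    power = (n_lens - n_medium) / (n_medium * radius)
    return 1.0 / power

N_LENS = 1.5     # assumed glass-like lens index (illustrative)
R = 1.0e-3       # assumed 1 mm radius of curvature (illustrative)

f_air = plano_convex_focal_length(N_LENS, 1.000, R)    # eye in air
f_water = plano_convex_focal_length(N_LENS, 1.333, R)  # eye underwater

print(f"focal length in air:   {f_air * 1e3:.2f} mm")    # ~2 mm
print(f"focal length in water: {f_water * 1e3:.2f} mm")  # ~8 mm
```

The roughly fourfold focal shift is why a conventional curved cornea cannot stay in focus across both media; by keeping the outer surfaces flat and doing the focusing inside the lens, the artificial eye sidesteps this dependence on the external medium.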
In tests, the artificial eye demonstrated consistent image quality and an almost 360-degree field of view in both terrestrial and aquatic environments, with the eye correctly identifying five objects projected from different angles.
Study author Young Min Song, professor of electrical engineering and computer science at GIST, said the system has a wide range of possible applications, including “panoramic motion detection and obstacle avoidance in continuously changing environments,” as well as augmented and virtual reality and all-weather vision for autonomous vehicles.
“Currently, the size of a semiconductor optical unit, commonly used in smartphones, automobiles, and surveillance/monitoring cameras, is restricted at the laboratory level,” he said. “However, with the technology of image sensor manufacturers such as Samsung and SK Hynix, the technological limitation is surmountable to develop a camera that is much smaller and has better imaging performance than the ones currently manufactured.”
“This is a spectacular piece of optical engineering and non-planar imaging,” said John A. Rogers, professor of materials science and engineering at Northwestern University, “combining aspects of bio-inspired design and advanced flexible electronics to achieve unique capabilities unavailable in conventional cameras. Potential uses span from population surveillance to environmental monitoring.”
The findings were published in Nature Electronics in July, and the team is set to begin trials of the new system on several kinds of robots (both amphibious and terrestrial), as well as to investigate novel camera types based on other animal vision systems.