Robo-optics, Organ Clocks, Blood Tech + Reforesting

A looping video of a small, six-legged robot with sensors walking steadily forward along a narrow gravel path in a lush garden setting.

As this robot scuttles down a path, it uses a sensor, chip and tiny AI model inspired by biological eyes and brains to determine its location. The system uses a tenth of the energy of a conventional camera-based system and could work on any robot.

Adam Hines

👁 Low-Energy, High-Efficiency Remote Sensors

A new seeing robot has entered the chat, and its vision draws only a fraction of the energy that conventional location systems demand. Kathryn Hulick reports for Science News on an eyelike sensor paired with a chip and a tiny AI model that could reshape remote visual sensing.

🤖 Low-Power Superpower

Robots that gather visual information from their surroundings need an eye of some sort. A camera paired with an AI that determines location can serve that role, but it is rarely the most efficient option: the camera keeps capturing information even when the environment stays static, burning energy needlessly. To save that energy, researchers have developed a system called LENS, comprising a sensor, a chip and a tiny AI model.
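To make the energy argument concrete, here is a minimal Python sketch of the general idea (not LENS's actual pipeline; the frame source, `cheap_change_score`, and the localization model are hypothetical stand-ins). Gating the expensive inference step on scene change means a mostly static environment triggers almost no heavy computation:

```python
import numpy as np

def cheap_change_score(prev: np.ndarray, curr: np.ndarray) -> float:
    """Mean absolute pixel difference between consecutive frames (a cheap change detector)."""
    return float(np.mean(np.abs(curr.astype(np.int16) - prev.astype(np.int16))))

def localize(frame: np.ndarray) -> tuple[float, float]:
    """Placeholder for an expensive camera-based localization model."""
    # Imagine a neural-network inference here; this is the costly step we want to skip.
    return float(frame.mean()), float(frame.std())

def run(frames, change_threshold: float = 2.0) -> int:
    """Run the expensive localization step only when the scene has actually changed."""
    prev, inferences = None, 0
    for frame in frames:
        if prev is None or cheap_change_score(prev, frame) > change_threshold:
            _pose = localize(frame)  # expensive step, now gated on change
            inferences += 1
        prev = frame
    return inferences

# A mostly static scene: the gated loop runs inference far less often than once per frame.
static_scene = [np.zeros((64, 64), dtype=np.uint8)] * 100
print(run(static_scene))  # 1 — only the first frame triggers inference
```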

LENS uses an existing, commercially available sensor-chip combination called Speck, made by the company SynSense. Like a human eye, Speck operates more efficiently than a camera: through an approach known as neuromorphic computing, it attends to how the environment changes rather than recording everything continuously, more closely resembling how our eyes and brain take in visual cues.
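As a rough illustration of the event-driven idea behind neuromorphic sensing (a simplification, not SynSense's Speck hardware or API), the sketch below emits per-pixel "events" only where brightness changes, so a static scene produces almost no data to process downstream:

```python
import numpy as np

def brightness_change_events(prev: np.ndarray, curr: np.ndarray, threshold: int = 10):
    """Return (row, col, polarity) events for pixels whose brightness changed,
    loosely mimicking how an event-based sensor reports changes instead of full frames."""
    diff = curr.astype(np.int16) - prev.astype(np.int16)
    rows, cols = np.nonzero(np.abs(diff) > threshold)
    polarities = np.sign(diff[rows, cols])  # +1 got brighter, -1 got darker
    return list(zip(rows.tolist(), cols.tolist(), polarities.tolist()))

prev = np.zeros((4, 4), dtype=np.uint8)
curr = prev.copy()
curr[1, 2] = 200  # only one pixel changes between "frames"

print(brightness_change_events(prev, curr))
# [(1, 2, 1)] — a single event; the unchanged pixels generate no work at all
```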
