" MicromOne: What If Eyes Evolved Differently? Inside MIT’s New AI Vision Evolution Project

Pagine

What If Eyes Evolved Differently? Inside MIT’s New AI Vision Evolution Project


What if we could replay evolution and explore how eyes might have developed under different environmental pressures? That’s exactly what a team of researchers from MIT, Rice University, and Lund University set out to do with their groundbreaking “What if Eye…?” project — a computational framework that recreates vision evolution inside a virtual world. (eyes.mit.edu)

A Digital Sandbox for Evolution

Instead of waiting millions of years, this project puts embodied AI agents, digital creatures living inside simulated physics environments, through artificial evolution. Each agent starts with a single light-sensing cell and, over many generations, its visual system and behavior evolve in response to survival challenges. The key idea is to let vision emerge naturally from interaction with the environment rather than being manually designed around fixed datasets or shaped by human bias. (eyes.mit.edu)
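The project's actual simulation stack isn't detailed here, but the overall loop is easy to picture. The sketch below is a minimal, hypothetical mutate-evaluate-select cycle in Python; the `EyeGenome` fields, the `simulate_lifetime` placeholder, and all parameter values are illustrative assumptions, not the researchers' implementation.

```python
import random
from dataclasses import dataclass, replace

@dataclass
class EyeGenome:
    # Hypothetical genome: how many photoreceptors the eye has and how widely they are spread.
    num_sensors: int = 1            # each lineage starts from a single light-sensing cell
    field_of_view_deg: float = 10.0

def mutate(g: EyeGenome) -> EyeGenome:
    # Small random changes to the visual system each generation.
    return replace(
        g,
        num_sensors=max(1, g.num_sensors + random.choice([-1, 0, 1])),
        field_of_view_deg=min(360.0, max(1.0, g.field_of_view_deg + random.gauss(0, 5))),
    )

def simulate_lifetime(g: EyeGenome) -> float:
    # Placeholder for an embodied physics simulation; returns a fitness score.
    # Toy proxy: reward wide coverage, charge a small metabolic cost per sensor.
    return g.field_of_view_deg * 0.1 - g.num_sensors * 0.05

population = [EyeGenome() for _ in range(50)]
for generation in range(200):
    scored = sorted(population, key=simulate_lifetime, reverse=True)
    parents = scored[: len(scored) // 4]                       # keep the fittest quarter
    population = [mutate(random.choice(parents)) for _ in range(len(population))]

print(max(simulate_lifetime(g) for g in population))
```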

Different Tasks Lead to Different Eyes

The researchers posed “what-if” scenarios by assigning specific tasks to these agents:

  • Navigation tasks, like moving through a maze, favor eyes that cover a wide field with many simple sensors — similar to compound eyes seen in insects.

  • Detection tasks, such as distinguishing food from poison, drive the evolution of high-resolution, camera-like eyes with a focused forward gaze. (eyes.mit.edu)

This shows that the function of vision directly influences the type of visual system that evolves. (eyes.mit.edu)
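One way to see why the task matters is to look at the fitness pressure each scenario implies. The toy functions below contrast a navigation-style reward for wide coverage with a detection-style reward for angular resolution; both formulas are my own simplifications for illustration, not terms from the project.

```python
def navigation_fitness(num_sensors: int, field_of_view_deg: float) -> float:
    # Navigation rewards seeing as much of the surroundings as possible,
    # even if each sensor is crude: coverage matters more than sharpness.
    coverage = min(field_of_view_deg, 360.0) / 360.0
    return coverage * (1.0 - 1.0 / (1.0 + num_sensors))

def detection_fitness(num_sensors: int, field_of_view_deg: float) -> float:
    # Detection rewards packing many sensors into a narrow forward cone,
    # i.e. high angular resolution (sensors per degree of view).
    return num_sensors / max(field_of_view_deg, 1.0)

# The same sensor budget scores very differently under the two tasks:
wide_eye = dict(num_sensors=100, field_of_view_deg=300.0)    # compound-like
narrow_eye = dict(num_sensors=100, field_of_view_deg=20.0)   # camera-like
print(navigation_fitness(**wide_eye), detection_fitness(**wide_eye))
print(navigation_fitness(**narrow_eye), detection_fitness(**narrow_eye))
```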

Optics Emerge Naturally

One of the most fascinating outcomes is how optical features like lenses emerge in the simulations. When the artificial evolutionary system was allowed to evolve optical genes, it developed lens-like structures — not because they were programmed in, but because they offered a functional advantage. Lenses helped agents overcome the basic trade-off between collecting enough light and maintaining detailed spatial vision. (eyes.mit.edu)
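A toy model makes that trade-off concrete: widening the eye's opening gathers more light but, without a lens, blurs the image, whereas a focusing element keeps the image sharp at any aperture. The formulas and numbers below are assumed purely for illustration and do not come from the simulations.

```python
import math

def image_quality(aperture_mm: float, has_lens: bool) -> float:
    # Toy model of the light/acuity trade-off (illustrative, not from the paper).
    light_gathered = aperture_mm ** 2          # a wider opening admits more photons
    if has_lens:
        # A lens focuses light from the whole aperture onto a small spot,
        # so sharpness no longer degrades as the opening widens.
        sharpness = 1.0
    else:
        # A bare, pinhole-style eye blurs as the opening widens.
        sharpness = 1.0 / (1.0 + aperture_mm)
    return math.sqrt(light_gathered) * sharpness

for aperture in (0.5, 2.0, 5.0):
    print(aperture, image_quality(aperture, has_lens=False), image_quality(aperture, has_lens=True))
```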

Scaling Brains and Vision Together

The research also found that simply increasing the size of an agent’s “brain” didn’t always make it better at visual tasks. Improvements came only when neural processing and visual acuity scaled together — revealing a close link between sensory input quality and cognitive processing capacity. (eyes.mit.edu)
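A natural way to probe such a finding is an ablation grid that scales visual acuity and network capacity separately and together. The sketch below only enumerates such a hypothetical grid and counts policy parameters; the sizes and the two-layer policy shape are assumptions, not the study's actual setup.

```python
import itertools

# Hypothetical ablation grid for the brain/eye scaling question: vary visual
# acuity (photoreceptor count) and network capacity (hidden units) independently,
# then check whether task performance improves only when both grow together.
acuity_levels = [8, 32, 128]       # photoreceptors feeding the agent
hidden_sizes = [16, 64, 256]       # units in the agent's policy network

def count_parameters(num_photoreceptors: int, hidden_units: int, num_actions: int = 4) -> int:
    # Two-layer policy: photoreceptors -> hidden -> actions (weights plus biases).
    return (num_photoreceptors * hidden_units + hidden_units) + \
           (hidden_units * num_actions + num_actions)

for acuity, hidden in itertools.product(acuity_levels, hidden_sizes):
    print(f"acuity={acuity:4d}  hidden={hidden:4d}  params={count_parameters(acuity, hidden)}")
```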

Beyond Biology — Designing Better Vision Systems

This computational approach doesn’t just help us understand how natural vision might have evolved differently; it also points toward new ways to design artificial vision systems. By treating embodied AI as a “hypothesis-testing machine,” the project lays the foundation for creating bio-inspired sensors and cameras that are optimized for specific tasks, from robotics and drones to wearable devices. (eyes.mit.edu)