

Researchers develop formulas to replicate optimal vision tracking strategies



AUSTIN, Texas—Researchers at The University of Texas at Austin have developed mathematical formulas for optimal eye movements, with significant implications for designing robotic visual systems and for improving visual performance in people who are losing their eyesight.

Their findings were published in the journal Nature this week.

“Effective visual searching is crucial to survival,” said Dr. Wilson Geisler, lead researcher and director of the Center for Perceptual Systems at the university. “Whether people are searching for food, predators, shelter or a potential mate, their eyes are continuously scanning their environment.”

On average, the human eye makes between 10,000 and 15,000 movements every hour, adding up to hundreds of thousands of movements each day. For years, scientists thought these movements were mostly random, but Geisler’s findings suggest they are the result of a highly evolved and effective strategy.

Geisler and graduate student Jiri Najemnik, supported by a grant from the National Eye Institute of the National Institutes of Health, developed mathematical formulas that describe the best possible strategy for finding objects in a cluttered environment. When they measured human eye movements, the patterns were nearly identical to those the formulas predicted.

“This research shows that humans are as effective as possible at finding items in a cluttered environment, such as a golf ball in the grass or a key on a messy desk,” Geisler said.

Although the retina’s high-acuity area (the fovea) covers only a small part of the visual field, the brain can pick up subtle features in the lower-resolution periphery and use them to direct the eyes appropriately.
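The article does not reproduce Geisler and Najemnik’s formulas, but the idea they describe can be sketched in a few lines of code: model how well each location in a scene can be seen from a given fixation point (best at the fovea, worse in the periphery), then repeatedly choose the next fixation that is expected to reveal the most about where the target is. The grid size, the detectability falloff, and the greedy gain measure below are simplified, hypothetical stand-ins, not the published model.

```python
import numpy as np

# Illustrative sketch only: detectability falls off with distance from the
# current fixation, and the searcher greedily picks the fixation that gives
# the largest expected "visibility-weighted" coverage of its current belief
# about where the target might be.

GRID = 15        # hypothetical search area: GRID x GRID candidate locations
FALLOFF = 4.0    # hypothetical rate at which detectability drops with eccentricity

def detectability(fixation, cells):
    """How well each cell can be seen when the eye is at `fixation` (highest near the fovea)."""
    dist = np.linalg.norm(cells - fixation, axis=1)
    return np.exp(-dist / FALLOFF)

def expected_gain(posterior, fixation, cells):
    """Crude stand-in for an information measure: weight each cell's probability
    of containing the target by how well that cell would be seen from `fixation`."""
    return np.sum(posterior * detectability(fixation, cells))

def choose_next_fixation(posterior, cells):
    """Greedy step: evaluate every candidate fixation and pick the most informative one."""
    gains = [expected_gain(posterior, f, cells) for f in cells]
    return cells[int(np.argmax(gains))]

# Example: start with a uniform belief over locations and plan one saccade.
cells = np.array([(x, y) for x in range(GRID) for y in range(GRID)], dtype=float)
posterior = np.full(len(cells), 1.0 / len(cells))
print("next fixation:", choose_next_fixation(posterior, cells))
```

A fuller model would also update the belief over locations after each fixation, using the noisy information gathered there, before planning the next saccade; this sketch shows only the fixation-selection step.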

Applying Geisler’s mathematical formulas to robotics could produce a camera that senses stimuli in its periphery and reacts automatically, shifting its gaze immediately, much as the human eye does. The formulas could also help doctors train people who have lost portions of their sight to use their remaining vision more efficiently.
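As a rough illustration of the robotics idea, the loop below shows how such a fixation-selection rule might drive a pan/tilt camera: capture a frame, find the most promising location away from the current gaze direction, and re-center the camera on it. The PanTiltCamera class, the field-of-view numbers, and the use of the brightest pixel as a stand-in for a real target or saliency model are all hypothetical, not part of the research described here.

```python
import numpy as np

# Hypothetical gaze-control loop for a pan/tilt camera: sense the periphery,
# pick the most conspicuous location, and "saccade" toward it.

class PanTiltCamera:
    """Placeholder for a real camera driver with pan/tilt control."""
    def capture(self):
        return np.random.rand(120, 160)          # fake low-resolution frame
    def move_to(self, pan_deg, tilt_deg):
        print(f"saccade to pan={pan_deg:.1f}, tilt={tilt_deg:.1f} degrees")

def pixel_to_angles(row, col, shape, fov_deg=(40.0, 60.0)):
    """Convert an image location to pan/tilt offsets, assuming a simple linear mapping."""
    tilt = (row / shape[0] - 0.5) * fov_deg[0]
    pan = (col / shape[1] - 0.5) * fov_deg[1]
    return pan, tilt

def gaze_step(camera):
    """One step: look at the frame, find the most conspicuous point, re-center on it."""
    frame = camera.capture()
    row, col = np.unravel_index(np.argmax(frame), frame.shape)  # stand-in for a real target model
    camera.move_to(*pixel_to_angles(row, col, frame.shape))

gaze_step(PanTiltCamera())
```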

For more information, contact Wilson (Bill) Geisler, Ph.D., at 512-471-5380.