
A simpler approach to eye tracking for virtual reality


By Andrew Myers

For decades, scientists in many fields have sought better ways to track eye movements. Knowing where a subject is looking, in what sequence information is taken in, and how long the eye dwells on specific elements in view are all sources of great insight into the viewer’s mindset.

Eye tracking has now become commonplace in applications from virtual reality to market research. But there’s a catch: current eye-tracking technologies require several cameras and highly complex mathematics to triangulate the user’s gaze. That makes them costly in the weight and bulk of all those cameras, and computationally demanding in the calculations needed to pinpoint the exact position of the pupil.

Against that backdrop, researchers at Stanford University say they have developed a simplified eye-tracking technology that uses a lone, off-the-shelf camera, an infrared light source like those found in TV remote controls in households the world over, and an invisible, ultrathin layer of patterned silicon on the inside of the lens of a pair of standard eyeglasses.

“In virtual reality, the design challenge is high,” says lead researcher, Mark Brongersma, a professor of materials science at Stanford University. “You want something fashionable and lightweight, yet very functional.”

[Image: an eye looking through a glass lens]

The user must be able to peer out through the glasses to the outside world with a view as clear as with everyday eyeglasses, or look at a VR screen without distortion or color aberrations that would rob virtual reality of its realism. At the same time, the computational system needs a tool that can image the eye and calculate where it is looking.

Enter nanoscale optics—lightweight, transparent, and powerful.

Seeing the Light

The secret is etched into an ultrathin layer of silicon on the inside of the lens. The functional layer is just 3 nanometers thick and “grated” with a series of parallel strips of silicon—picket fence-style—that produce a remarkable, and extremely useful, optical effect.

“One would think: what can you do with a three-nanometer-thick layer of silicon? And mostly it doesn’t do anything, and that’s the secret,” Brongersma says. “It does just enough to allow the camera to pick up an image of the eye and nothing more. It’s totally unobtrusive to the user.”

The grating is a nanolens of sorts. It doesn’t so much reflect the near-infrared light as collect, intensify, and redirect it back to the camera. The light is imperceptible to the person wearing the glasses, just as the light from a TV remote is imperceptible when changing channels at home, and the view through the lens remains unperturbed. An algorithm then decodes the digital image of the eye, demarcating pupil from iris and iris from sclera, to calculate where the eye is looking.
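The article doesn’t detail the group’s software, but the basic idea of decoding an eye image can be sketched in a few lines. The snippet below is a hypothetical illustration, not the published method: it segments the dark pupil in a single infrared eye image and returns its center, which a calibrated system would then map to a gaze direction. The image path and threshold value are assumptions.

```python
# Hypothetical sketch of pupil-center detection from one near-infrared eye image.
# Not the Stanford group's algorithm; the threshold and file path are illustrative only.
import cv2

def pupil_center(eye_image_path, threshold=40):
    """Return the (x, y) pixel center of the darkest blob, assumed to be the pupil."""
    gray = cv2.imread(eye_image_path, cv2.IMREAD_GRAYSCALE)
    blurred = cv2.GaussianBlur(gray, (7, 7), 0)
    # Under infrared illumination the pupil is the darkest region, so a low
    # fixed threshold separates it from the iris and sclera.
    _, mask = cv2.threshold(blurred, threshold, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)  # keep the largest dark blob
    m = cv2.moments(pupil)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```

A real tracker would add a calibration step that maps this pupil center to a point in the scene or on the VR display, as sketched further below.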

The layer is so thin and flexible it can be patterned across large, irregular or curved surfaces, like the lens of a pair of glasses. The strips of silicon have an effect like dust particles on the lens. They scatter light, but they do it in a very controlled and tunable way.

[Image: the patterned lens]

“We’ve engineered the pattern specifically for this near-infrared light of 870 nanometers wavelength coming from the LED,” says Jung-Hwan Song, a research scientist in Brongersma’s lab and first author of the paper, which appeared in the journal Nature Nanotechnology.
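To get a rough sense of how a periodic pattern steers a chosen wavelength toward an off-axis camera, consider the textbook grating equation. This is a simplification: the actual device is a resonant, non-local metasurface engineered so that only light near 870 nm is strongly redirected, leaving the visible view untouched. For a grating of period \(\Lambda\) illuminated at normal incidence, light of wavelength \(\lambda\) is sent into diffraction order \(m\) at the angle \(\theta_m\) satisfying

\[ \Lambda \sin\theta_m = m\lambda. \]

With \(\lambda = 870\ \mathrm{nm}\) and an assumed period of \(\Lambda = 1\ \mu\mathrm{m}\), the first order (\(m = 1\)) leaves the lens at \(\theta_1 = \arcsin(0.87) \approx 60^{\circ}\), steep enough to reach a small camera mounted near the edge of an eyeglass frame. The period here is purely illustrative and is not taken from the paper.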

With a single camera and light source, the mathematics becomes much less computationally intense. It all adds up to a simple, lightweight, highly effective eye-tracking system.
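One way to see why a single camera keeps the computation light: once the pupil center has been located in the image, mapping it to a gaze point can be as simple as a small least-squares regression fit during a brief calibration, rather than a multi-camera triangulation. The function names and quadratic feature choice below are illustrative assumptions, not the authors’ implementation.

```python
# Hypothetical single-camera gaze calibration: fit a quadratic map from
# pupil-center pixels to gaze points using a handful of calibration samples.
# Illustrative only; not the published method.
import numpy as np

def fit_gaze_map(pupil_xy, gaze_xy):
    """Least-squares fit of a quadratic polynomial map (pupil pixels -> gaze coords)."""
    x, y = pupil_xy[:, 0], pupil_xy[:, 1]
    # Design matrix with constant, linear, and quadratic terms.
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(A, gaze_xy, rcond=None)
    return coeffs  # shape (6, 2): one column each for gaze x and gaze y

def predict_gaze(coeffs, pupil_point):
    x, y = pupil_point
    features = np.array([1.0, x, y, x * y, x**2, y**2])
    return features @ coeffs
```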

Wider Implications

The researchers believe their technique could become a new approach to eye tracking in virtual reality, or in any field where understanding what a user is looking at is important, from advertising to psychology.

[Image: an eye behind a glass lens]

Brongersma is quick to point out that the nanopatterned layer can be fine-tuned to the wavelength of the incoming light, giving engineers a great degree of control and allowing other desirable optical functions at different wavelengths. The layers could even be stacked to achieve several functions simultaneously, he emphasizes.

The technology could have applications beyond virtual reality, in fields such as LIDAR for crash-avoidance systems in autonomous vehicles, new forms of optical imaging, sensing, communication, new displays, and more.

“We’re just beginning to explore the possibilities,” Brongersma says.

Additional authors include postdoctoral scholar Jorik van de Groep, now an assistant professor at the University of Amsterdam, and postdoctoral scholar Soo Jin Kim, now an assistant professor at Korea University. Images are courtesy of the Brongersma Group at Stanford University.

Source Article: Song, JH., van de Groep, J., Kim, S.J. et al. Non-local metasurfaces for spectrally decoupled wavefront manipulation and eye tracking. Nat. Nanotechnol. 16, 1224–1230 (2021).

The eWEAR-TCCI awards for science writing are a project commissioned by the Wearable Electronics Initiative (eWEAR) at Stanford University and made possible by funding through eWEAR industrial affiliates program member Shanda Group and the Tianqiao and Chrissy Chen Institute (TCCI®).