A game changer for visual fidelity, interface design, and interactivity in XR

OVERVIEW

Our eyes and gaze carry critical information. From the way our eyes glide from one focus point to another, and how long they linger, we can read degrees of power, engagement, confidence, and control. Eye gaze is a central channel of nonverbal communication and a key element of social interaction.

Here at FotoNation, we look at the environment. We look at humans. We look at objects. We’ve put to the test more than two decades of experience in observing human faces to prepare for the next technological leap – from smartphones to head-mounted extended reality (XR – augmented, mixed and virtual reality) devices.

The way we interact with mobile devices will change, and eye gaze is at the core of that change. Imagine controlling your XR device with your eye movements and with where your eyes converge in space (the focal plane).

HIGHLIGHTS

Works with NIR images; does not require a glint for gaze tracking
Works with up to 50% pupil occlusion
Outputs the pupil centre position, gaze position, depth of fixation, and IPD (inter-pupillary distance), as sketched below
Automatic gaze re-calibration
Execution time under five milliseconds
Accuracy better than one degree under any illumination condition
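For concreteness, here is a minimal C++ sketch of how the per-frame output listed above might be represented. All type names, field names, units, and values are illustrative assumptions, not the actual FotoNation API.

    // Minimal sketch of a per-frame gaze-tracker output record.
    // All names, units, and values here are assumptions for illustration only.
    #include <cstdio>

    struct Vec2 { float x, y; };
    struct Vec3 { float x, y, z; };

    struct GazeResult {
        Vec2  pupil_center_px[2];   // pupil centre per eye, in NIR image pixels
        Vec3  gaze_point;           // gaze position in the device coordinate frame
        float depth_of_fixation_m;  // distance to the plane of fixation, in metres
        float ipd_mm;               // inter-pupillary distance, in millimetres
    };

    int main() {
        // Hypothetical values for a single frame.
        GazeResult r{{{312.4f, 240.1f}, {355.9f, 238.7f}},
                     {0.02f, -0.01f, 1.50f}, 1.50f, 63.0f};
        std::printf("fixation depth %.2f m, IPD %.1f mm\n",
                    r.depth_of_fixation_m, r.ipd_mm);
        return 0;
    }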

Enhanced performance and foveated rendering (rendering full detail only where the user is looking, which reduces the rendering workload) are just around the corner.
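To illustrate the idea, the sketch below selects a coarser shading rate the further a screen tile is from the current gaze point. The three-level scheme and the eccentricity thresholds are assumptions chosen for illustration, not values from the product.

    // Sketch of foveated-rendering logic: coarser shading away from the gaze point.
    // The thresholds and the 1/1, 1/2, 1/4 levels are illustrative assumptions.
    #include <cstdio>

    // Returns a shading-rate divisor: 1 = full detail, 2 = half, 4 = quarter.
    int shading_rate(float eccentricity_deg) {
        if (eccentricity_deg < 5.0f)  return 1;  // foveal region: full resolution
        if (eccentricity_deg < 15.0f) return 2;  // near periphery: reduced detail
        return 4;                                // far periphery: coarse shading
    }

    int main() {
        // Angular distance of a few screen tiles from the current gaze point.
        const float tiles_deg[] = {1.0f, 8.0f, 25.0f};
        for (float e : tiles_deg)
            std::printf("eccentricity %5.1f deg -> 1/%d shading rate\n",
                        e, shading_rate(e));
        return 0;
    }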

By concentrating rendering resources exactly where the user is looking, developers will save power and bandwidth. Additionally, by combining focal-plane detection with gaze tracking, they can simulate realistic sharpness and bokeh effects.
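As a sketch of how the reported depth of fixation could drive a synthetic depth-of-field (bokeh) effect, the snippet below computes a thin-lens circle of confusion for objects nearer or farther than the focal plane. The virtual lens parameters are assumptions for illustration only.

    // Sketch: blur size from the thin-lens circle-of-confusion formula,
    // with the virtual lens focused at the tracker-reported depth of fixation.
    // Focal length and f-number are illustrative assumptions.
    #include <cmath>
    #include <cstdio>

    // Circle of confusion (mm on the sensor) for an object at object_d metres
    // when the virtual lens is focused at focus_d metres.
    float circle_of_confusion_mm(float object_d, float focus_d,
                                 float focal_len_mm, float f_number) {
        const float f = focal_len_mm / 1000.0f;      // focal length in metres
        const float aperture_d = f / f_number;       // aperture diameter in metres
        const float coc = aperture_d * f * std::fabs(object_d - focus_d) /
                          (object_d * (focus_d - f));
        return coc * 1000.0f;                        // back to millimetres
    }

    int main() {
        const float fixation_depth_m = 1.5f;         // from the gaze tracker
        const float depths_m[] = {0.5f, 1.5f, 5.0f};
        for (float d : depths_m)
            std::printf("object at %.1f m -> blur %.3f mm\n",
                        d, circle_of_confusion_mm(d, fixation_depth_m, 24.0f, 2.0f));
        return 0;
    }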

Eye gaze tracking enables immersive and discreet device interaction. Users will be able to engage with content simply by looking at it. This, in turn, will give rise to a new class of user interfaces, greatly improving the user experience and producing highly valuable analytics data about how users interact with applications.
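One common pattern for this kind of interaction is dwell-time selection: an element activates once the gaze has rested on it long enough. The sketch below illustrates the idea; the dwell threshold, sample rate, and hit-test are assumptions, not part of the product specification.

    // Sketch of "look to select": activate an element after a gaze dwell.
    // The 0.6 s threshold and 60 Hz sample rate are illustrative assumptions.
    #include <cstdio>

    struct Rect { float x, y, w, h; };

    bool contains(const Rect& r, float gx, float gy) {
        return gx >= r.x && gx <= r.x + r.w && gy >= r.y && gy <= r.y + r.h;
    }

    int main() {
        const Rect  button{0.4f, 0.4f, 0.2f, 0.2f};  // normalized screen coords
        const float dwell_to_activate_s = 0.6f;
        const float frame_dt_s = 1.0f / 60.0f;

        float dwell_s = 0.0f;
        // Simulate 60 frames in which the gaze sample sits inside the button.
        for (int frame = 0; frame < 60; ++frame) {
            const float gx = 0.5f, gy = 0.5f;        // gaze sample for this frame
            dwell_s = contains(button, gx, gy) ? dwell_s + frame_dt_s : 0.0f;
            if (dwell_s >= dwell_to_activate_s) {
                std::printf("activated after %d frames of dwell\n", frame + 1);
                break;
            }
        }
        return 0;
    }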
