New headset technology adjusts brightness by tracking pupil dilation

Background: Headsets with heads-up displays have been billed as the next big thing in consumer technology for years, and more recently even as smartphone replacements. But it’s 2024, and they still haven’t made a dent in the market. Setting aside factors like comfort, battery life, and looks, eye strain from these microdisplays is a less obvious but significant factor holding things back.

Kopin has developed a new display technology called Neural Display, which your eyes will silently thank you for during extended sessions with spatial computing headsets. It essentially builds eye-tracking sensors into the microdisplay itself, using custom software to follow eye movements, pupil dilation, gaze direction, and more. Within half a millisecond, it dynamically adjusts the display’s brightness and contrast to match your eyes, so the image never looks too bright or too dim.
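Kopin hasn’t published how its pipeline works, but the basic idea can be sketched as a small feedback loop: sample the pupil diameter, estimate how much light is reaching the retina, and nudge the display brightness toward a comfortable target. The sketch below is purely illustrative; the sensor readings, constants, and function names are hypothetical, not Kopin’s actual implementation.

```python
from dataclasses import dataclass

# Hypothetical sketch of a pupil-driven brightness loop. The constants and
# the EyeSample structure are invented for illustration only.
BASELINE_PUPIL_MM = 4.0   # assumed "comfortable" pupil diameter
BASE_BRIGHTNESS = 0.6     # nominal brightness at that baseline, on a 0..1 scale
SMOOTHING = 0.2           # low-pass factor so rapid updates don't flicker

@dataclass
class EyeSample:
    pupil_diameter_mm: float  # reported by the display's integrated eye sensor
    gaze_x: float             # normalized gaze coordinates, 0..1
    gaze_y: float

def next_brightness(current: float, sample: EyeSample) -> float:
    """Return an updated display brightness in [0.05, 1.0].

    A dilated pupil admits more light, so the display dims; a constricted
    pupil admits less, so it brightens. Scaling by pupil *area* (diameter
    squared) roughly approximates the change in light reaching the retina.
    """
    scale = (BASELINE_PUPIL_MM / sample.pupil_diameter_mm) ** 2
    target = min(1.0, max(0.05, BASE_BRIGHTNESS * scale))
    # Exponential smoothing keeps sub-millisecond updates visually stable.
    return current + SMOOTHING * (target - current)

brightness = BASE_BRIGHTNESS
for sample in (EyeSample(4.0, 0.5, 0.5), EyeSample(6.0, 0.4, 0.5), EyeSample(3.0, 0.6, 0.4)):
    brightness = next_brightness(brightness, sample)
    print(f"pupil {sample.pupil_diameter_mm} mm -> brightness {brightness:.2f}")
```

In a real system the loop would also weigh gaze direction and scene content, but the core trade-off is the same: dim when the pupil opens up, brighten when it contracts.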

HUD headsets used in the military, medicine, and other industries typically have microdisplays with fixed brightness and contrast levels. But your pupils are constantly dilating and contracting based on your emotional state, what you’re looking at, and other factors.

For example, if something startles you and your pupils dilate, the scene may suddenly appear brighter. That’s not ideal, especially when you’re a fighter pilot in a life-or-death situation.
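To get a rough sense of scale (a back-of-the-envelope approximation, not a figure from Kopin): the light reaching the retina grows roughly with pupil area, so a startle response that dilates the pupil from an assumed 3 mm to 6 mm lets in about four times as much light from a display whose output hasn’t changed at all.

```python
import math

# Back-of-the-envelope illustration: retinal illuminance scales roughly
# with pupil area, i.e. with the square of the pupil diameter.
def pupil_area_mm2(diameter_mm: float) -> float:
    return math.pi * (diameter_mm / 2) ** 2

relaxed_mm, startled_mm = 3.0, 6.0  # assumed pupil diameters
ratio = pupil_area_mm2(startled_mm) / pupil_area_mm2(relaxed_mm)
print(ratio)  # -> 4.0: roughly four times as much light hits the retina
```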

Kopin CEO Michael Murray told Tom’s Hardware that the idea to develop Neural Display was born out of exactly this situation. He visited the Air Force and found that some pilots had trouble keeping their headsets on during combat because of these brightness changes.

The company shared an initial demo video showing the technology in action. An engineer plays a simple asteroid game while an overlay shows how Neural Display tracks their pupil size (green circle) and gaze point (red circle). As their eyes react, the display adjusts seamlessly without any manual input.


The demo is rudimentary, but Kopin believes the potential implications are huge. Neural Display eliminates the need for dedicated eye-tracking cameras in many spatial computing systems, reducing size, weight, power consumption, and cost, the company says.

The most advanced headset of its kind on the market, the Apple Vision Pro, uses infrared lights and cameras to track the user’s eyes, but there is no mention of real-time brightness adjustment based on pupil dilation. If Kopin’s technology takes off, headsets like the Vision Pro could certainly benefit, shaving a few grams off the Vision Pro’s nearly 650-gram weight.

For now, the Neural Display platform has entered alpha testing, and the company expects to finalize headset demos in a few months.

Image credit: Kopin
