May 27, 2025
Most XR headsets today come with some form of tracking technology: sensors and cameras for eye tracking, hand tracking, head tracking, motion tracking, and spatial mapping. Eye tracking in particular is essential to most extended reality applications and is advancing rapidly with AI. Eye tracking isn't new, however; it has roots in marketing, medical, and other research, as well as in gaming.
With recent advances, eye tracking data is becoming easier to collect, and as a result it's becoming easier to train algorithmic models on that data. Eye (or gaze) tracking measures the direction of the user's gaze, and for how long it rests where it does. Combined with hand tracking, head tracking, and potentially additional biometric sensors like EEG, eye tracking can reveal all kinds of things about the user.
Input & Foveated Rendering
In VR, eye tracking has user interface and performance implications. As an input method, it makes virtual experiences more natural and comfortable. Gaze is a natural interface, and grabbing a virtual object by looking at it is more intuitive than using a controller. When you add physical sensations through hand tracking and haptics, it can really feel like you're touching a virtual object and it's responding to your touch. The combination of gaze and gesture controls can make it seem like you're genuinely moving and interacting in a virtual space, which makes for more realistic and engaging XR experiences.
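To make gaze-plus-gesture input concrete, here's a minimal sketch of gaze-based selection confirmed by a pinch. This isn't any particular headset's API: the scene objects, the runtime inputs, and the 2-degree tolerance are all illustrative assumptions.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    center: np.ndarray  # world-space position of the object

def gaze_pick(gaze_origin, gaze_dir, objects, max_angle_deg=2.0):
    """Return the object closest to the gaze ray, within an angular
    tolerance (~2 degrees, roughly the size of the fovea)."""
    best, best_angle = None, np.radians(max_angle_deg)
    for obj in objects:
        to_obj = obj.center - gaze_origin
        to_obj = to_obj / np.linalg.norm(to_obj)
        angle = np.arccos(np.clip(np.dot(gaze_dir, to_obj), -1.0, 1.0))
        if angle < best_angle:
            best, best_angle = obj, angle
    return best

# Per frame: gaze selects, a pinch confirms the grab. In practice,
# gaze_origin, gaze_dir, and the pinch state come from the runtime.
target = gaze_pick(np.zeros(3), np.array([0.0, 0.0, 1.0]),
                   [SceneObject("cube", np.array([0.05, 0.0, 2.0]))])
```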
Input aside, eye tracking also improves headset performance by reducing computational load. First and foremost, eye tracking enables foveated rendering, whereby only the portion of the image the user is looking at is rendered at full resolution. The rest of the scene - what's in your peripheral vision - is rendered at reduced detail, saving processing power. Essentially, foveated rendering imitates the way we actually see: what's in the center of your vision is in sharpest focus, while what's on the periphery is blurred (rendered with fewer pixels).
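As a rough sketch of the idea - assuming we can compute each screen region's angular distance (eccentricity) from the gaze direction - a foveated renderer might pick a shading rate per region like this. The thresholds and rates below are illustrative; real headsets tune them per display and lens:

```python
import numpy as np

def shading_rate(region_dir, gaze_dir, foveal_deg=5.0, mid_deg=15.0):
    """Map a region's angular distance from the gaze point to a
    shading rate. Both inputs are unit view-space direction vectors."""
    cos_angle = np.clip(np.dot(region_dir, gaze_dir), -1.0, 1.0)
    eccentricity = np.degrees(np.arccos(cos_angle))
    if eccentricity < foveal_deg:
        return 1   # full resolution: shade every pixel
    elif eccentricity < mid_deg:
        return 2   # half resolution: one shade per 2x2 block
    else:
        return 4   # quarter resolution: one shade per 4x4 block
```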
Eye tracking also allows for automatic IPD adjustment, meaning headsets can calculate interpupillary distance (IPD), a measurement unique to each person. With auto adjustment, the lenses move into optimal position for the user, resulting in reduced eye strain, reduced risk of VR sickness, and a more comfortable experience overall.
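Conceptually, the adjustment is straightforward. The sketch below assumes the eye cameras report pupil-center positions in millimeters in a shared headset frame and that the lens motors move in fixed increments; the names and the step size are hypothetical:

```python
def auto_ipd_target(left_pupil_mm, right_pupil_mm, lens_step_mm=0.5):
    """Estimate IPD as the horizontal distance between pupil centers
    and snap it to the nearest lens-motor step. Each lens would then
    move to +/- target/2 from the headset's center line."""
    ipd = abs(right_pupil_mm[0] - left_pupil_mm[0])
    return round(ipd / lens_step_mm) * lens_step_mm

print(auto_ipd_target((-31.7, 0.0), (31.9, 0.0)))  # -> 63.5
```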
Applications of Eye Tracking/XR Analytics
Eye tracking in XR captures valuable user information that can take enterprise applications to the next level and help businesses unlock new insights.
Training & Evaluation
Eye tracking not only makes for more immersive and realistic VR training simulations; it also enables gaze-based triggers and real-time performance feedback. For high-precision training - think pilot, surgical, and complex industrial training - the natural interaction and visual fidelity that eye tracking makes possible are key, alongside the inherent safety benefits and the ability to repeat a simulation as many times as needed.
Consider the nuclear power industry: a nuclear power reactor is operated according to strict procedures, training is difficult, and physical simulators are expensive. This is especially true for practicing emergency scenarios, tasks performed infrequently, and procedures in areas of the plant that are inaccessible much of the time due to factors like radiation. VR offers a solution.
Finnish energy company Fortum adopted VR training for control room operators: in the simulation, trainees read virtual procedure manuals and control room displays as they would in the real world. By following the user's gaze, trainers know whether a trainee is referencing the right manual, checking the right value on a panel, and so on, and can assess their performance accordingly.
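One way such gaze-based assessment might work, as a sketch: compare the trainee's logged fixations against the procedure's expected reference points, in order. The data format and dwell thresholds are assumptions for illustration, not Fortum's actual system:

```python
from dataclasses import dataclass

@dataclass
class ProcedureStep:
    expected_target: str   # e.g. "manual_page_12", "pressure_gauge_3"
    min_dwell_s: float     # how long the trainee should fixate on it

def assess_trainee(gaze_log, steps):
    """gaze_log: ordered (target_id, dwell_seconds) fixations.
    Returns, per step, whether the trainee verified the right target."""
    results, i = [], 0
    for step in steps:
        passed = False
        while i < len(gaze_log):
            target, dwell = gaze_log[i]
            i += 1
            if target == step.expected_target and dwell >= step.min_dwell_s:
                passed = True
                break
        results.append((step.expected_target, passed))
    return results
```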
With attention tracking (blinking, pupil dilation, and other eye movements) and additional biosensors (skin conductance, sweat, heart rate, etc.), proficiency testing becomes possible. Instructors can assess alertness, focus, engagement, and performance under stress, as well as gain insight into the design and effectiveness of the virtual learning experience itself. Adding haptic interaction data could even reveal things like confusion or hesitation on the user's part.
In another real-world example, Flint Systems worked with HTC VIVE to develop a welding simulator for training welders in a compact space. The technology uses high-precision tracking to measure every movement of the user’s hands (and determine proficiency), while “eyes-on” data ensures the learner is focused.
Consumer Research & Marketing
When traditional advertising metrics like impressions and clicks are no longer reliable, how do you measure a campaign’s effectiveness? With attention tracking. Marketers are interested in attention metrics like gaze point and duration of view correlated with other information like timing and demographics.
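The most basic of these metrics - dwell time per area of interest (AOI) - falls straight out of the gaze stream. A minimal sketch, assuming gaze samples have already been mapped to AOI labels at a fixed sampling rate:

```python
from collections import defaultdict

def aoi_dwell_times(samples, sample_dt_s=1 / 60):
    """Aggregate per-frame AOI labels (60 Hz assumed) into total
    dwell time per area of interest; None means gaze hit no AOI."""
    dwell = defaultdict(float)
    for aoi in samples:
        if aoi is not None:
            dwell[aoi] += sample_dt_s
    return dict(dwell)

# Four frames of samples -> roughly {"banner": 0.05, "logo": 0.017}:
print(aoi_dwell_times(["banner", "banner", "banner", "logo"]))
```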
A lot of research goes into product placement. Traditional methods like surveys and setting up mock stores are geographically and logistically limiting, and it’s hard to get unprompted behavioral data from something like a survey. With XR, however, you can research faster, cheaper and at a larger scale, quickly testing different scenarios and collecting rich data you would otherwise miss.
Eye tracking in XR is key to turning consumer behavior into quantifiable, objective data, and it’s already transforming how brands do research. Accenture and Kellogg’s provide a classic example: In 2019, the two companies worked with Qualcomm and eye tracking vendor Tobii to determine the placement of a new Kellogg’s product on supermarket shelves. Eye tracking in VR allowed researchers to essentially look through shoppers’ eyes as they perused and selected items, and map their behavior to specific products. The results were surprising: It turned out that placing new products on lower shelves (not higher) directed shoppers’ attention to surrounding products, translating into an 18% increase in total sales for the brand.
Eye tracking can reveal attention but also preference and even intent. XR provides a controlled yet immersive environment where it’s possible to observe behaviors that are difficult to replicate in the real world. The combination could enhance or even replace all kinds of user experience studies, focus groups, and large-scale A/B testing of store layouts and promotional displays.
Remote Collaboration & Social VR
Eye tracking enables nonverbal communication in social XR experiences like virtual collaboration and AI-powered customer service with avatars. By some estimates, around 80% of human communication is about what we don't say rather than what we say: facial expressions, body language, gestures, and eye contact speak louder than words, indicating confidence, openness, and more. The lack of nonverbal cues in VR is why it's still more effective and meaningful to meet face-to-face than avatar-to-avatar in a virtual meeting room.
Eye, hand, and body tracking bring lifelike nonverbal communication into virtual collaboration and other immersive scenarios by enabling avatars to maintain direct eye contact, mirror our facial expressions, and replicate our actions. Avatars become more expressive, making virtual encounters more realistic and reducing misunderstandings between workers.
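At its simplest, driving an avatar's eye contact from tracked gaze is a matter of converting the gaze direction into clamped rotations for the avatar's eye joints. A sketch under that assumption (the axis conventions and joint limits here are illustrative):

```python
import numpy as np

def avatar_eye_angles(gaze_dir, max_yaw_deg=35.0, max_pitch_deg=25.0):
    """Convert a unit gaze direction in head space (x right, y up,
    z forward) into clamped yaw/pitch angles for the eye joints."""
    yaw = np.degrees(np.arctan2(gaze_dir[0], gaze_dir[2]))
    pitch = np.degrees(np.arcsin(np.clip(gaze_dir[1], -1.0, 1.0)))
    return (float(np.clip(yaw, -max_yaw_deg, max_yaw_deg)),
            float(np.clip(pitch, -max_pitch_deg, max_pitch_deg)))
```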
Eye and hand tracking make interaction in virtual workspaces more organic and precise. Remote workers feel more genuinely "in the room," and employees can "touch" or work on the same virtual model from wherever they are. Workers in Accenture's Nth Floor experiment reported a stronger sense of presence, while - in another real-world example - Siemens workers are able to rotate, zoom, and annotate digital twins as they follow 3D instructions developed by BILT for Vision Pro.
In sum, eye, hand, and other forms of tracking embedded in XR headsets are unlocking new capabilities in the enterprise, elevating existing use cases and producing the data needed to improve experiences and validate new ones. Artificial intelligence is helping to maximize the impact by integrating data streams from multiple sensors and processing large amounts of data. AI can, for instance, identify patterns in eye tracking data to personalize immersive experiences for individual users (e.g., adjusting the difficulty of a VR training module based on how quickly a user's eyes move between tasks).
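As a toy illustration of that kind of adaptation - the saccade-rate heuristic and the thresholds below are invented for this sketch, not a validated model:

```python
def adjust_difficulty(fixations_s, saccade_count, window_s, level):
    """Raise difficulty when scanning is fast (high saccade rate,
    short fixations, i.e. spare capacity); lower it when fixations
    run long (a rough proxy for struggling)."""
    if not fixations_s:
        return level
    saccades_per_s = saccade_count / window_s
    mean_fixation_s = sum(fixations_s) / len(fixations_s)
    if saccades_per_s > 3.0 and mean_fixation_s < 0.25:
        return min(level + 1, 10)
    if mean_fixation_s > 0.6:
        return max(level - 1, 1)
    return level
```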
Think of an XR headset as a vehicle for data collection and the user as its subject. We're learning that it's not just about where you look, but for how long and in what order, the associated behaviors you exhibit, and the additional biometric data that can potentially reveal even deeper insights into customers and employees alike.
