Adolfo Ruiz

Eye Tracking Technology

Eye tracking is a sensor technology that makes it possible for a computer or other device to know where a person is looking. An eye tracker can detect the presence, attention and focus of the user. It allows for unique insights into human behavior and facilitates natural user interfaces in a broad range of devices. The ability to control a computer using the eyes is also vital for people who are unable to speak or use their hands.


Parts of a high-performing eye-tracking system

  • Custom-designed sensors — The hardware is designed to be a high-performance sensor, not a camera for taking nice pictures. It consists of custom-designed projectors, customized image sensors and optics, and custom processing with embedded algorithms.

  • Advanced algorithms — The algorithms are the brain of the system; they interpret the image stream generated by the sensors.

  • User-oriented applications — An intelligent application layer is added to enable the various ways the technology can be used; a minimal sketch of how these layers fit together follows after this list.
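
To make the three-layer split above concrete, here is a minimal Python sketch of how sensors, algorithms and applications could be wired together. Every class and function name in it (EyeImage, GazeSample, algorithm_layer, application_layer, run_pipeline) is invented for illustration and is not part of any real product's API.

# Minimal sketch of the three-layer split described above: sensors produce
# images, algorithms turn images into gaze estimates, applications consume
# gaze. All names here are hypothetical, for illustration only.
from dataclasses import dataclass
from typing import Callable

@dataclass
class EyeImage:
    """Raw frame from the near-infrared-illuminated eye camera."""
    pixels: bytes
    timestamp_ms: int

@dataclass
class GazeSample:
    """Output of the algorithm layer: where on the screen the user looks."""
    x: float                      # normalized screen coordinate, 0..1
    y: float
    timestamp_ms: int

def algorithm_layer(frame: EyeImage) -> GazeSample:
    """Placeholder for the image-processing / machine-learning stage."""
    # A real implementation would locate the pupil and the projector
    # reflections in the frame and map them to a point on the screen.
    return GazeSample(x=0.5, y=0.5, timestamp_ms=frame.timestamp_ms)

def application_layer(sample: GazeSample) -> None:
    """Placeholder for a user-oriented application (e.g. a gaze-aware UI)."""
    print(f"user is looking at ({sample.x:.2f}, {sample.y:.2f})")

def run_pipeline(capture_frame: Callable[[], EyeImage]) -> None:
    """Sensor layer -> algorithm layer -> application layer, frame by frame."""
    frame = capture_frame()                   # sensor layer delivers a frame
    application_layer(algorithm_layer(frame))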

How eye tracking works


  1. An eye tracker consists of cameras, projectors and algorithms.

  2. The projectors create a pattern of near-infrared light on the eyes.

  3. The cameras take high-resolution images of the user's eyes and the pattern.

  4. Machine learning, image processing and mathematical algorithms are used to determine the eye's position and gaze point (a simplified sketch of this mapping step follows below).
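
A widely used way of implementing step 4 is to work with the vector from the pupil centre to the corneal reflection (glint) of the near-infrared projectors, and map that vector to screen coordinates using a short per-user calibration. The sketch below fits such a mapping with a simple least-squares polynomial; the function names and the nine-point calibration data are assumptions made for illustration, not a specific vendor's algorithm.

# Illustrative sketch of the mapping step: given the vector from pupil
# centre to corneal reflection (found by image processing), fit a
# second-order polynomial that maps it to screen coordinates using a few
# calibration points, then use the fit to estimate a gaze point.
import numpy as np

def _features(dx: np.ndarray, dy: np.ndarray) -> np.ndarray:
    # Second-order polynomial terms of the pupil-glint vector.
    return np.column_stack([np.ones_like(dx), dx, dy, dx * dy, dx**2, dy**2])

def fit_calibration(pupil_glint: np.ndarray, screen: np.ndarray) -> np.ndarray:
    """Least-squares fit: pupil_glint is (N, 2), screen is (N, 2) targets."""
    X = _features(pupil_glint[:, 0], pupil_glint[:, 1])
    coeffs, *_ = np.linalg.lstsq(X, screen, rcond=None)
    return coeffs                              # shape (6, 2)

def estimate_gaze(pupil_glint_vec, coeffs) -> np.ndarray:
    """Map one pupil-glint vector to an (x, y) gaze point on the screen."""
    dx, dy = pupil_glint_vec
    return _features(np.array([dx]), np.array([dy])) @ coeffs   # shape (1, 2)

# Example: nine-point calibration grid (all values are made up).
rng = np.random.default_rng(0)
pg = rng.uniform(-1, 1, size=(9, 2))                 # pupil-glint vectors
true_map = lambda v: np.array([0.5 + 0.4 * v[0], 0.5 + 0.3 * v[1]])
targets = np.array([true_map(v) for v in pg])        # known on-screen dots
c = fit_calibration(pg, targets)
print(estimate_gaze((0.1, -0.2), c))                 # approx. (0.54, 0.44)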

Applications


A wide variety of disciplines use eye-tracking techniques, including cognitive science, psychology (notably psycholinguistics and the visual world paradigm), human-computer interaction (HCI), human factors and ergonomics, marketing research and medical research (neurological diagnosis). Specific applications include tracking eye movements in language reading, music reading, human activity recognition, the perception of advertising and the playing of sports, as well as distraction detection and cognitive-load estimation for drivers and pilots, and operating computers for people with severe motor impairment.


In recent years, the increased sophistication and accessibility of eye-tracking technologies have generated a great deal of interest in the commercial sector. Applications include web usability, advertising, sponsorship, package design and automotive engineering. In general, commercial eye-tracking studies function by presenting a target stimulus to a sample of consumers while an eye tracker is used to record the activity of the eye. Examples of target stimuli may include websites; television programs; sporting events; films and commercials; magazines and newspapers; packages; shelf displays; consumer systems (ATMs, checkout systems, kiosks); and software. The resulting data can be statistically analyzed and graphically rendered to provide evidence of specific visual patterns. By examining fixations, saccades, pupil dilation, blinks and a variety of other behaviors, researchers can determine a great deal about the effectiveness of a given medium or product. While some companies complete this type of research internally, there are many private companies that offer eye-tracking services and analysis.
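
The fixations and saccades mentioned above are usually not reported directly by the hardware; they are computed from the raw gaze stream. One common approach is a dispersion-threshold algorithm (often called I-DT): a fixation is a run of samples that stays within a small spatial window for a minimum duration. The sketch below is a simplified version of that idea, with threshold values chosen purely for illustration.

# Simplified dispersion-threshold (I-DT style) fixation detection:
# a fixation is a run of gaze samples that stays within a small spatial
# window for at least a minimum duration. Thresholds are illustrative.
from typing import List, Tuple

Sample = Tuple[float, float, float]   # (x_px, y_px, t_ms)

def detect_fixations(samples: List[Sample],
                     max_dispersion_px: float = 35.0,
                     min_duration_ms: float = 100.0) -> List[dict]:
    fixations = []
    start = 0
    while start < len(samples):
        end = start
        window = [samples[start]]
        while end + 1 < len(samples):
            candidate = window + [samples[end + 1]]
            xs = [s[0] for s in candidate]
            ys = [s[1] for s in candidate]
            dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys))
            if dispersion > max_dispersion_px:
                break
            window = candidate
            end += 1
        duration = window[-1][2] - window[0][2]
        if duration >= min_duration_ms:
            fixations.append({
                "x": sum(s[0] for s in window) / len(window),
                "y": sum(s[1] for s in window) / len(window),
                "start_ms": window[0][2],
                "duration_ms": duration,
            })
            start = end + 1        # skip past the detected fixation
        else:
            start += 1             # too short: slide the window forward
    return fixations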

One field of commercial eye-tracking research is web usability. While traditional usability techniques are often quite powerful in providing information on clicking and scrolling patterns, eye tracking offers the ability to analyze user interaction between the clicks and how much time a user spends between them, providing valuable insight into which features are the most eye-catching, which cause confusion and which are ignored altogether. Specifically, eye tracking can be used to assess search efficiency, branding, online advertisements, navigation usability, overall design and many other site components. Analyses may target a prototype or competitor site in addition to the main client site.
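
One way to turn such a study into numbers is to map detected fixations onto rectangular areas of interest (AOIs) on the page, for example the navigation bar, the search box or an ad slot, and total the dwell time and fixation count per area. The AOI names and coordinates below are invented for this sketch.

# Illustrative area-of-interest (AOI) analysis for web usability: total
# dwell time and fixation count per page element. AOI rectangles and the
# example fixations are made up.
from collections import defaultdict

# AOI name -> (left, top, right, bottom) in page pixels (hypothetical).
AOIS = {
    "navigation":   (0, 0, 1280, 50),
    "search_box":   (400, 60, 880, 110),
    "banner_ad":    (900, 120, 1180, 400),
    "main_content": (100, 120, 880, 900),
}

def aoi_dwell(fixations):
    """fixations: list of dicts with x, y, duration_ms (see detector above)."""
    dwell = defaultdict(float)
    hits = defaultdict(int)
    for f in fixations:
        for name, (l, t, r, b) in AOIS.items():
            if l <= f["x"] <= r and t <= f["y"] <= b:
                dwell[name] += f["duration_ms"]
                hits[name] += 1
    return dict(dwell), dict(hits)

# Example with three made-up fixations.
fixes = [{"x": 500, "y": 80, "duration_ms": 220},
         {"x": 950, "y": 200, "duration_ms": 640},
         {"x": 300, "y": 400, "duration_ms": 310}]
print(aoi_dwell(fixes))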

Eye-tracking is commonly used in a variety of different advertising media. Commercials, print ads, online ads and sponsored programs are all conducive to analysis with current eye-tracking technology. One example is the analysis of eye movements over advertisements in the Yellow Pages. One study focused on what particular features caused people to notice an ad, whether they viewed ads in a particular order and how viewing times varied. The study revealed that ad size, graphics, color, and copy all influence attention to advertisements. Knowing this allows researchers to assess in great detail how often a sample of consumers fixates on the target logo, product or ad. As a result, an advertiser can quantify the success of a given campaign in terms of actual visual attention. Another example of this is a study that found that in a search engine results page, authorship snippets received more attention than the paid ads or even the first organic result.
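
Two metrics that studies of this kind typically report are the share of viewers who fixated the target element at all and how quickly they did so (time to first fixation). The sketch below computes both from per-viewer fixation lists; the participant data and logo coordinates are made up for illustration.

# Minimal sketch of two common advertising metrics: the share of viewers
# who fixated the target (e.g. a logo AOI) and their average time to
# first fixation. All data below is invented.

def ad_attention_summary(per_viewer_fixations, logo_aoi):
    """per_viewer_fixations: {viewer_id: [ {x, y, start_ms, ...}, ... ]}"""
    l, t, r, b = logo_aoi
    first_times = []
    for fixations in per_viewer_fixations.values():
        hit_times = [f["start_ms"] for f in fixations
                     if l <= f["x"] <= r and t <= f["y"] <= b]
        if hit_times:
            first_times.append(min(hit_times))
    n = len(per_viewer_fixations)
    noticed = len(first_times)
    return {
        "pct_viewers_fixating_logo": 100.0 * noticed / n if n else 0.0,
        "mean_time_to_first_fixation_ms":
            sum(first_times) / noticed if noticed else None,
    }

viewers = {
    "p1": [{"x": 120, "y": 90, "start_ms": 850}],
    "p2": [{"x": 600, "y": 400, "start_ms": 300}],   # never looks at the logo
    "p3": [{"x": 130, "y": 95, "start_ms": 1900}],
}
print(ad_attention_summary(viewers, logo_aoi=(100, 80, 200, 140)))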

Safety applications

Scientists in 2017 constructed a Deep Integrated Neural Network (DINN) out of a deep neural network and a convolutional neural network. The goal was to use deep learning to examine images of drivers and determine their level of drowsiness by "classify[ing] eye states". With enough images, the proposed DINN could ideally determine when drivers blink, how often they blink, and for how long. From there, it could judge how tired a given driver appears to be, effectively conducting an eye-tracking exercise. The DINN was trained on data from over 2,400 subjects and correctly diagnosed their states between 96% and 99.5% of the time. Most other artificial intelligence models performed at rates above 90%. This technology could ideally provide another avenue for driver drowsiness detection.
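
The drowsiness step that follows such a classifier can be illustrated independently of the network itself: given a per-frame open/closed label stream, one can count blinks, measure their durations and compute the fraction of time the eyes were closed. The frame rate, threshold and example data below are illustrative assumptions, not values from the study.

# Sketch of the step after eye-state classification: given per-frame
# open(0)/closed(1) labels from a model such as the DINN described above,
# derive blink count, blink durations and the fraction of time the eyes
# were closed. Frame rate and threshold are illustrative assumptions.

def drowsiness_metrics(eye_closed, fps=30, closed_frac_alert=0.15):
    """eye_closed: list of 0/1 per video frame (1 = eye classified closed)."""
    blinks = []                        # blink durations in seconds
    run = 0
    for closed in eye_closed + [0]:    # sentinel flushes the final run
        if closed:
            run += 1
        elif run:
            blinks.append(run / fps)
            run = 0
    total_s = len(eye_closed) / fps
    closed_frac = sum(eye_closed) / len(eye_closed) if eye_closed else 0.0
    return {
        "blink_count": len(blinks),
        "blinks_per_minute": 60.0 * len(blinks) / total_s if total_s else 0.0,
        "mean_blink_duration_s": sum(blinks) / len(blinks) if blinks else 0.0,
        "fraction_eyes_closed": closed_frac,
        "possibly_drowsy": closed_frac > closed_frac_alert,   # crude flag
    }

# 2 seconds of video at 30 fps with two blinks (made-up data).
frames = [0]*20 + [1]*4 + [0]*25 + [1]*6 + [0]*5
print(drowsiness_metrics(frames))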

Game theory applications


In a 2019 study, a convolutional neural network (CNN) was constructed with the ability to identify individual chess pieces the same way other CNNs can identify facial features. It was then fed eye-tracking data from thirty chess players of various skill levels. With this data, the CNN used gaze estimation to determine the parts of the chessboard to which a player was paying close attention and generated a saliency map to illustrate them. Ultimately, the CNN combined its knowledge of the board and pieces with its saliency map to predict the players' next move. Regardless of the dataset it was trained on, it predicted the next move more accurately than random selection from the possible moves, and the saliency maps drawn for any given player and situation were more than 54% similar.
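
The final combination step can be sketched without the network: given a saliency value per board square and a list of candidate moves, each move can be scored by how much attention fell on its source and destination squares. The weights and example data below are assumptions for illustration and do not reproduce the study's actual model.

# Illustrative sketch of combining a gaze-derived saliency map with
# candidate moves: score each (from_square, to_square) move by the
# attention that fell on its two squares. Values and weights are assumed.
import numpy as np

def rank_moves_by_saliency(saliency, candidate_moves):
    """saliency: 8x8 array in [0, 1]; moves: list of ((r1, c1), (r2, c2))."""
    scored = []
    for (r1, c1), (r2, c2) in candidate_moves:
        score = 0.4 * saliency[r1, c1] + 0.6 * saliency[r2, c2]  # assumed weights
        scored.append((score, ((r1, c1), (r2, c2))))
    return sorted(scored, reverse=True)

rng = np.random.default_rng(1)
saliency_map = rng.random((8, 8))            # stand-in for the CNN's output
moves = [((6, 4), (4, 4)),                   # e.g. a pawn push
         ((7, 6), (5, 5)),                   # e.g. a knight move
         ((7, 3), (3, 7))]                   # e.g. a long queen move
for score, move in rank_moves_by_saliency(saliency_map, moves):
    print(f"{move}: {score:.2f}")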

Assistive technology

People with severe motor impairment can use eye tracking to interact with computers, as it is faster than single-switch scanning techniques and intuitive to operate. Motor impairment caused by cerebral palsy or amyotrophic lateral sclerosis often affects speech as well, and users with severe speech and motor impairment (SSMI) rely on a type of software known as an augmentative and alternative communication (AAC) aid, which displays icons, words and letters on screen and uses text-to-speech software to generate spoken output. More recently, researchers have also explored eye tracking to control robotic arms and powered wheelchairs. Eye tracking is also helpful for analysing visual search patterns, detecting the presence of nystagmus, and detecting early signs of learning disability by analysing eye-gaze movement during reading.
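
The basic interaction pattern behind many gaze-controlled AAC interfaces is dwell selection: an on-screen key is triggered once the gaze has rested inside its rectangle for a set time. The key layout and dwell threshold in the sketch below are invented for illustration; real systems typically let users adjust them.

# Minimal sketch of dwell-based selection, the interaction pattern used by
# many gaze-controlled AAC keyboards: a key fires when gaze remains inside
# its rectangle for a dwell threshold. Layout and timing are assumptions.

DWELL_MS = 800                       # illustrative dwell threshold

KEYS = {                             # key -> (left, top, right, bottom)
    "YES":  (50, 50, 250, 200),
    "NO":   (300, 50, 500, 200),
    "HELP": (550, 50, 750, 200),
}

def dwell_select(gaze_samples):
    """gaze_samples: list of (x, y, t_ms); returns selected keys in order."""
    selections = []
    current_key, enter_t = None, None
    for x, y, t in gaze_samples:
        key = next((k for k, (l, tp, r, b) in KEYS.items()
                    if l <= x <= r and tp <= y <= b), None)
        if key != current_key:
            current_key, enter_t = key, t        # gaze entered a new region
        elif key is not None and t - enter_t >= DWELL_MS:
            selections.append(key)               # would trigger speech output
            current_key, enter_t = None, None    # require re-entry to repeat
    return selections

# Gaze rests on "YES" for about 1 s, then glances at "NO" briefly (made up).
samples = [(150, 120, t) for t in range(0, 1000, 50)] + \
          [(400, 120, t) for t in range(1000, 1300, 50)]
print(dwell_select(samples))         # ['YES']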

Aviation applications


Eye tracking has already been studied for flight safety: comparing scan paths and fixation durations to evaluate the progress of pilot trainees, estimating pilots' skills, and analyzing crews' joint attention and shared situational awareness. Eye-tracking technology has also been explored for interacting with helmet-mounted display systems (HMDS) and multi-functional displays in military aircraft. Studies investigated the utility of eye trackers for head-up target locking and target acquisition in HMDS. Pilots' feedback suggested that, even though the technology is promising, its hardware and software components have yet to mature. Research on interacting with multi-functional displays in a simulator environment showed that eye tracking can significantly improve response times and reduce perceived cognitive load compared with existing systems. Further research has investigated using measurements of fixation and pupillary responses to estimate a pilot's cognitive load. Estimating cognitive load can help in designing next-generation adaptive cockpits with improved flight safety. Eye tracking is also useful for detecting pilot fatigue.
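
Cognitive-load estimates of the kind mentioned above are typically built from baseline-corrected pupil diameter together with fixation behaviour. The sketch below combines the two into one simple index; the weights, the 250 ms reference fixation and the example numbers are assumptions for illustration, not values from the cited studies.

# Illustrative cognitive-load index from ocular measures: baseline-corrected
# pupil dilation plus mean fixation duration, combined with assumed weights.
# This is a simplified sketch, not a validated measure from the literature.
import statistics

def cognitive_load_index(pupil_mm, baseline_pupil_mm, fixation_durations_ms,
                         w_pupil=0.7, w_fix=0.3):
    """pupil_mm: pupil-diameter samples during the task, in millimetres."""
    # Relative pupil dilation vs. a resting baseline (task-evoked response).
    dilation = (statistics.mean(pupil_mm) - baseline_pupil_mm) / baseline_pupil_mm
    # Longer average fixations are treated here (by assumption) as a sign of
    # higher load, normalised against a nominal 250 ms fixation.
    fix_ratio = statistics.mean(fixation_durations_ms) / 250.0 - 1.0
    return w_pupil * dilation + w_fix * fix_ratio

# Made-up numbers: pupil about 5% larger than baseline, fixations about 20% longer.
print(cognitive_load_index(pupil_mm=[3.9, 4.0, 4.1],
                           baseline_pupil_mm=3.8,
                           fixation_durations_ms=[280, 310, 300]))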

Automotive applications

In recent years, eye-tracking technology has been investigated in the automotive domain in both passive and active ways. The National Highway Traffic Safety Administration (NHTSA) measured glance durations for secondary tasks undertaken while driving and used the results to promote safety by discouraging the introduction of excessively distracting devices in vehicles. In addition to distraction detection, eye tracking is also used to interact with in-vehicle infotainment systems (IVIS). Though initial research investigated the efficacy of eye-tracking systems for interaction with head-down displays (HDDs), these still required drivers to take their eyes off the road while performing a secondary task. Recent studies have investigated gaze-controlled interaction with head-up displays (HUDs), which eliminates eyes-off-road distraction. Eye tracking is also used to monitor drivers' cognitive load to detect potential distraction. Though researchers have explored different physiological parameters for estimating drivers' cognitive load, using ocular parameters opens a new way to employ existing eye trackers to monitor cognitive load in addition to IVIS interaction.
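
The glance-duration measurement described above can be sketched as follows: from a per-frame on-road/off-road gaze label, compute the duration of each off-road glance and flag the long ones. The 2-second single-glance and 12-second total thresholds below follow commonly cited distraction guidance but should be treated here as illustrative assumptions, as should the frame rate and example data.

# Sketch of off-road glance analysis: from a per-frame on-road/off-road
# gaze label, measure each off-road glance and flag long ones. Thresholds,
# frame rate and example data are illustrative assumptions.

def glance_analysis(on_road, fps=60, max_glance_s=2.0, max_total_s=12.0):
    """on_road: list of booleans per frame (True = gaze is on the road)."""
    glances, run = [], 0
    for frame_on_road in on_road + [True]:   # sentinel flushes the last run
        if not frame_on_road:
            run += 1
        elif run:
            glances.append(run / fps)
            run = 0
    total_off = sum(glances)
    return {
        "off_road_glances_s": [round(g, 2) for g in glances],
        "long_glances": sum(1 for g in glances if g > max_glance_s),
        "total_eyes_off_road_s": round(total_off, 2),
        "exceeds_total_budget": total_off > max_total_s,
    }

# 5 s of driving at 60 fps with one 1 s and one 2.5 s off-road glance.
frames = [True]*60 + [False]*60 + [True]*30 + [False]*150
print(glance_analysis(frames))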

