Several open-source solutions have been produced over the years to support eye-tracking research and applications. The availability of high-level development tools, accessible also to non-experienced programmers, has enabled researchers to share prototypes of new analysis techniques and custom controllers for eye-tracking equipment. The PyGaze software, for example, is an open-source package for creating eye-tracking experiments in Python (a minimal usage sketch is given at the end of this section). PyGaze implements methods for presenting visual and auditory stimuli and for collecting responses through standard and custom input devices. It is compatible with several commercial eye-trackers and provides developers with interfaces for implementing custom controllers. In addition, one of the main advantages of PyGaze lies in its access to all the libraries and frameworks already available for the Python programming language. A similar open-source framework, also written in Python, is GazeParser. GazeParser, originally developed for Windows, makes it possible to record eye movements using video-based techniques, to create an eye-tracking study with the PsychoPy and VisionEgg experimental-control libraries, and then to extract fixations and saccades.

Regarding data analysis, many software packages have been released to address the most common tasks: processing eye-movement data from static and dynamic scenes, detecting and filtering artefacts, detecting gaze events, generating AOIs (Areas of Interest), and visualizing data with heatmaps and gaze plots. Some of these frameworks are available for MATLAB (GazeAlyze, EyeMMV, EALab, SacLab), Python (PyGazeAnalyzer), and R (ETRAN-R).

Wearable solutions have introduced new technical challenges for eye-tracking technology because of their user-centered point of view. Contrary to static eye-trackers, the gaze coordinates are not referred to a fixed frame of reference (a display screen, for example). In mobile eye-trackers, gaze points are registered with respect to the scene camera (placed in front of the glasses), which captures different images depending on the user's point of view. The lack of a fixed frame presents a significant challenge for the analyst, who is confronted with a highly complex data stream. Currently, the definition of the AOIs and the associated gaze mapping rely on computer vision algorithms that still leave room for improvement in terms of robustness and ease of use (a homography-based sketch of this mapping is given at the end of this section). Some successful attempts in this direction have already been proposed by integrating gaze data with object recognition and machine learning algorithms.

The precision and accuracy of the collected gaze data represent another critical point, because they vary depending on the calibration procedure and the target distance. One study compares calibration accuracy and precision among three commercial models of wearable eye-trackers, including the Tobii Pro Glasses 2. Another important open issue is how to determine fixations and saccades with wearable devices, given the non-static nature of the observed scene. One line of work proposes Hidden Markov and Bayesian Mixture models for online recognition of gaze events; the results show that these probabilistic methods work well in such scenarios thanks to their ability to adapt to the viewing behavior and to changes in the scene. Another approach is an automated process based on the I-VT (Velocity-Threshold Identification) gaze filter, applied to different types of scene and eye motions recorded with a mobile eye-tracker (a minimal I-VT sketch closes this section).
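To make the PyGaze workflow described above concrete, the following is a minimal sketch of an experiment script. It follows the structure of published PyGaze examples (Display, Screen, and EyeTracker objects), but the tracker backend and screen settings normally come from the experiment's constants.py, so treat the details as illustrative rather than as the canonical API.

```python
# Minimal PyGaze session sketch: calibrate, show a fixation cross,
# record gaze for ~2 seconds, then shut down. Backend selection and
# display settings normally live in a constants.py file.
from pygaze.display import Display
from pygaze.screen import Screen
from pygaze.eyetracker import EyeTracker
import pygaze.libtime as timer

disp = Display()              # opens the experiment window
scr = Screen()                # off-screen canvas for drawing
tracker = EyeTracker(disp)    # connects to the configured tracker

tracker.calibrate()           # runs the tracker's calibration routine

scr.draw_fixation(fixtype='cross')
disp.fill(scr)                # copy the canvas to the display buffer
disp.show()                   # flip: make the fixation cross visible

tracker.start_recording()
timer.pause(2000)             # collect samples for about two seconds
x, y = tracker.sample()       # most recent gaze position, in pixels
tracker.stop_recording()

tracker.close()
disp.close()
```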
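The frame-of-reference problem described above is often attacked by registering each scene-camera frame against a fixed reference image of the environment, so that AOIs defined once on the reference image can be reused across frames. The sketch below illustrates the general idea with ORB features and a RANSAC homography in OpenCV; the function name and the choice of detector are assumptions for illustration, not a method prescribed by the works discussed.

```python
# Sketch: project a gaze point from scene-camera pixel coordinates
# into the pixel coordinates of a fixed reference image.
import cv2
import numpy as np

def map_gaze_to_reference(scene_frame, reference_img, gaze_xy):
    orb = cv2.ORB_create(nfeatures=2000)
    kp_s, des_s = orb.detectAndCompute(scene_frame, None)
    kp_r, des_r = orb.detectAndCompute(reference_img, None)

    # Brute-force Hamming matching with cross-checking for stability.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_s, des_r),
                     key=lambda m: m.distance)[:100]

    src = np.float32([kp_s[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_r[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # RANSAC discards matches that disagree with the dominant plane.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return None  # registration failed for this frame

    point = np.float32([[gaze_xy]])        # shape (1, 1, 2)
    return cv2.perspectiveTransform(point, H)[0, 0]
```

A homography is only valid for roughly planar scenes (a screen, a shelf, a poster), which is one reason the robustness of this mapping remains an open problem.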
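Finally, the I-VT idea mentioned above reduces to thresholding point-to-point angular velocity: slow intervals belong to fixations, fast ones to saccades. The sketch below is a generic textbook version, not the Tobii Pro implementation; the 30°/s threshold is a commonly used default and is an assumption here.

```python
# Sketch of velocity-threshold identification (I-VT): label each
# inter-sample interval by thresholding its angular velocity.
import numpy as np

def ivt_classify(x_deg, y_deg, t_s, threshold=30.0):
    """x_deg, y_deg: gaze angles in degrees; t_s: timestamps in seconds.
    Returns one 'fixation'/'saccade' label per inter-sample interval."""
    velocity = np.hypot(np.diff(x_deg), np.diff(y_deg)) / np.diff(t_s)
    return np.where(velocity < threshold, 'fixation', 'saccade')

# Tiny synthetic example: slow drift with one fast jump in the middle.
t = np.arange(6) * 0.02                        # 50 Hz timestamps
x = np.array([0.0, 0.1, 0.2, 5.0, 5.1, 5.2])   # jump between samples 2 and 3
y = np.zeros(6)
print(ivt_classify(x, y, t))
# -> ['fixation' 'fixation' 'saccade' 'fixation' 'fixation']
```

On wearable devices the velocity must be computed after compensating for head and scene motion, which is exactly why the adaptive, probabilistic variants discussed above are attractive in mobile scenarios.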