Eye-Tracking Applications

Our lab focuses on integrating eye-tracking into real-life applications and user-experience research to observe and quantify user attention, cognitive load, and interaction patterns. By gathering real-time gaze data, we can assess usability, measure reading flow, and evaluate how users interact with interfaces.


Our lab explores the integration of eye-tracking in virtual reality (VR) to enhance user interaction, performance analysis, and adaptive system design. By capturing real-time gaze data, we gain valuable insights into attention patterns, cognitive load, and decision-making processes within immersive environments. Beyond interaction, our research leverages eye-tracking data to study human behavior in virtual spaces, improving training simulations, situational awareness, and user experience design.

By leveraging real-time eye-tracking data, we focus on usability evaluation, measuring cognitive load, reading flow, and overall engagement. Through rigorous user testing, we analyze factors such as response accuracy, interface intuitiveness, and personalization effectiveness. Prioritizing user experience allows us to advance assistive technologies that enhance learning while maintaining natural, immersive interaction.
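As a simplified illustration of the kind of gaze metrics such evaluations rely on, the sketch below detects fixations with a dispersion-threshold (I-DT) approach and counts leftward jumps as a rough reading-flow indicator. The sample format, default thresholds, and function names are illustrative assumptions, not our actual analysis pipeline.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class GazeSample:
    t: float  # timestamp in seconds
    x: float  # horizontal gaze position in pixels
    y: float  # vertical gaze position in pixels

def _dispersion(window: List[GazeSample]) -> float:
    """Spatial spread of a window of samples: (max x - min x) + (max y - min y)."""
    xs = [s.x for s in window]
    ys = [s.y for s in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(samples: List[GazeSample],
                     dispersion_px: float = 35.0,
                     min_duration_s: float = 0.10) -> List[Tuple[float, float, float]]:
    """Dispersion-threshold (I-DT) fixation detection.

    Returns (centroid_x, centroid_y, duration_s) tuples.
    Thresholds are illustrative defaults, not calibrated values.
    """
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        # Grow a window until it spans at least the minimum fixation duration.
        j = i
        while j < n and samples[j].t - samples[i].t < min_duration_s:
            j += 1
        if j >= n:
            break
        if _dispersion(samples[i:j + 1]) <= dispersion_px:
            # Keep expanding while the dispersion stays below the threshold.
            while j + 1 < n and _dispersion(samples[i:j + 2]) <= dispersion_px:
                j += 1
            window = samples[i:j + 1]
            cx = sum(s.x for s in window) / len(window)
            cy = sum(s.y for s in window) / len(window)
            fixations.append((cx, cy, window[-1].t - window[0].t))
            i = j + 1
        else:
            i += 1
    return fixations

def regression_count(fixations: List[Tuple[float, float, float]]) -> int:
    """Count leftward fixation-to-fixation jumps as a crude proxy for
    reading regressions (re-reading); illustrative only."""
    return sum(1 for a, b in zip(fixations, fixations[1:]) if b[0] < a[0])
```

In practice, thresholds are calibrated per tracker and participant, and regressions would be computed per text line rather than over raw screen coordinates.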

We apply AI-based methods to real-time eye-tracking data, using neural networks to model user behavior and adjust system responses. This includes predicting gaze trajectories, distinguishing voluntary from involuntary blinks, and forecasting reading comprehension, thereby enabling more precise, data-driven adaptations of user interaction. For example, the SQHCI group has developed Eye-tracking Translation Software (ETS), a system that detects real-time cognitive load to identify challenging words, delivers unobtrusive in-line translations, and preserves immersive reading flow.
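The sketch below illustrates one possible heuristic in this spirit: flagging a word as challenging when the reader's accumulated fixation time on it rises well above their running baseline, then triggering an in-line translation. It is a hypothetical simplification for illustration only, not the ETS implementation; the class name, thresholds, and the upstream word-mapping and translation steps are assumed.

```python
from collections import defaultdict
from statistics import mean, pstdev

class ChallengingWordFlagger:
    """Illustrative heuristic for gaze-driven translation aids: flag a word
    when its accumulated fixation time is far above the reader's running
    baseline. NOT the ETS implementation; thresholds, word mapping, and the
    translation lookup are placeholders."""

    def __init__(self, z_threshold: float = 2.0, min_history: int = 20):
        self.z_threshold = z_threshold
        self.min_history = min_history
        self.fixation_time = defaultdict(float)  # word -> summed fixation seconds
        self.history = []                        # finalized per-word totals (baseline)

    def add_fixation(self, word: str, duration_s: float):
        """Accumulate fixation time on the word currently under the gaze
        (mapping a fixation centroid to a word is assumed to happen upstream)."""
        self.fixation_time[word] += duration_s

    def finalize_word(self, word: str) -> bool:
        """Call when the reader moves past the word; returns True if it looks
        'challenging' relative to the reader's baseline so far."""
        total = self.fixation_time.pop(word, 0.0)
        flagged = False
        if len(self.history) >= self.min_history:
            mu, sigma = mean(self.history), pstdev(self.history)
            flagged = sigma > 0 and (total - mu) / sigma > self.z_threshold
        self.history.append(total)
        return flagged

# Example: the overlay would react to flagged words once a baseline exists.
flagger = ChallengingWordFlagger(min_history=5)
for w, d in [("the", 0.18), ("cat", 0.21), ("sat", 0.19), ("on", 0.15), ("mat", 0.22)]:
    flagger.add_fixation(w, d)
    flagger.finalize_word(w)
flagger.add_fixation("ephemeral", 0.95)
if flagger.finalize_word("ephemeral"):
    print("show in-line translation for 'ephemeral'")  # placeholder action
```

A real system would likely combine several indicators (re-fixations, regressions, pupil dilation) rather than total fixation time alone before intervening.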


  • Dritsa, S., Mallas, A. and Xenos, M., 2023, November. Screen reading regions in social media comments: An eye-tracking analysis of visual attention on smartphones. In Proceedings of the 27th Pan-Hellenic Conference on Progress in Computing and Informatics (pp. 95-101). https://doi.org/10.1145/3635059.3635074
  • Minas, D., Theodosiou, E., Roumpas, K. and Xenos, M., 2025. Adaptive Real-Time Translation Assistance Through Eye-Tracking. AI, 6(1), p.5. https://doi.org/10.3390/ai6010005
