ETRA 2023: Tübingen, Germany
- Enkelejda Kasneci, Frédérick Shic, Mohamed Khamis:
Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, ETRA 2023, Tübingen, Germany, 30 May 2023 - 2 June 2023. ACM 2023
ETRA 2023 Short Papers
- Tim Rolff, Susanne Schmidt, Frank Steinicke, Simone Frintrop:
A Deep Learning Architecture for Egocentric Time-to-Saccade Prediction using Weibull Mixture-Models and Historic Priors. 1:1-1:8
- Wolfgang Fuhl, Susanne Zabel, Theresa Anisja Harbig, Julia Astrid Moldt, Teresa Festl-Wietek, Anne Herrmann-Werner, Kay Nieselt:
Area of interest adaption using feature importance. 2:1-2:7
- Daniel G. Krakowczyk, Paul Prasse, David R. Reich, Sebastian Lapuschkin, Tobias Scheffer, Lena A. Jäger:
Bridging the Gap: Gaze Events as Interpretable Concepts to Explain Deep Neural Sequence Models. 3:1-3:8
- Catarina Moreira, Diogo Miguel Alvito, Sandra Costa Sousa, Isabel Maria Gomes Blanco Nobre, Chun Ouyang, Regis Kopper, Andrew T. Duchowski, Joaquim Jorge:
Comparing Visual Search Patterns in Chest X-Ray Diagnostics. 4:1-4:6
- Nadine Marie Moacdieh, Michel Dibo, Zeina Halabi, Jumana Antoun:
Eye tracking to evaluate the effectiveness of electronic medical record training. 5:1-5:7
- Fei Chang, Jiabei Zeng, Qiaoyun Liu, Shiguang Shan:
Gaze Pattern Recognition in Dyadic Communication. 6:1-6:7
- Yanfei Hu Fleischhauer, Hemant Bhaskar Surale, Florian Alt, Ken Pfeuffer:
Gaze-based Mode-Switching to Enhance Interaction with Menus on Tablets. 7:1-7:8
- Ludwig Sidenmark, Mathias N. Lystbæk, Hans Gellersen:
GE-Simulator: An Open-Source Tool for Simulating Real-Time Errors for HMD-based Eye Trackers. 8:1-8:6
- Kenan Bektas, Jannis Strecker, Simon Mayer, Kimberly García, Jonas Hermann, Kay Erik Jenß, Yasmine Sheila Antille, Marc Solèr:
GEAR: Gaze-enabled augmented reality for human activity recognition. 9:1-9:9
- Ruoyan Kong, Ruixuan Sun, Charles Chuankai Zhang, Chen Chen, Sneha Patri, Gayathri Gajjela, Joseph A. Konstan:
Getting the Most from Eye-Tracking: User-Interaction Based Reading Region Estimation Dataset and Models. 10:1-10:7
- Ethan Wilson, Frederick Shic, Eakta Jain:
Introducing Explicit Gaze Constraints to Face Swapping. 11:1-11:7
- Cristina Palmero, Oleg V. Komogortsev, Sergio Escalera, Sachin S. Talathi:
Multi-Rate Sensor Fusion for Unconstrained Near-Eye Gaze Estimation. 12:1-12:8
- Naila Ayala, Diako Mardanbegi, Andrew T. Duchowski, Ewa Niechwiej-Szwedo, Shi Cao, Suzanne K. Kearns, Elizabeth L. Irving:
On The Visibility Of Fiducial Markers For Mobile Eye Tracking. 13:1-13:7
- Wolfgang Fuhl, Susanne Zabel, Theresa Anisja Harbig, Julia Astrid Moldt, Teresa Festl-Wietek, Anne Herrmann-Werner, Kay Nieselt:
One step closer to EEG based eye tracking. 14:1-14:7
- Karolina J. Krzys, Mubeena Mistry, Tyler Q. Yan, Monica S. Castelhano:
Predicting the Allocation of Attention: Using contextual guidance of eye movements to examine the distribution of attention. 15:1-15:10
- Minoru Nakayama, Wioletta Nowak, Anna Zarowska:
Prediction Procedure for Dementia Levels based on Waveform Features of Binocular Pupil Light Reflex. 16:1-16:7
- Philipp Stark, Tobias Appel, Milo J. Olbrich, Enkelejda Kasneci:
Pupil Diameter during Counting Tasks as Potential Baseline for Virtual Reality Experiments. 17:1-17:7
- Paul Prasse, David R. Reich, Silvia Makowski, Seoyoung Ahn, Tobias Scheffer, Lena A. Jäger:
SP-EyeGAN: Generating Synthetic Eye Movement Data with Generative Adversarial Networks. 18:1-18:9
- Johan Chandra, Nicholas Witzig, Jochen Laubrock:
Synthetic predictabilities from large language models explain reading eye movements. 19:1-19:7
- Erwan J. David, Jesús Gutiérrez, Melissa Le-Hoa Vo, Antoine Coutrot, Matthieu Perreira Da Silva, Patrick Le Callet:
The Salient360! Toolbox: Processing, Visualising and Comparing Gaze Data in 3D. 20:1-20:8
- Pelin Çelikkol, Jochen Laubrock, David Schlangen:
TF-IDF based Scene-Object Relations Correlate With Visual Attention. 21:1-21:6
- Jan Ehlers, Janine Grimmer:
Visual Center Biasing in a Stimulus-Free Laboratory Setting. 22:1-22:6
- Sonali Aatrai, Sparsh Kumar Jha, Rajlakshmi Guha:
Visual Perception and Performance: An Eye Tracking Study. 23:1-23:5
ETRA 2023 Doctoral Symposium
- Qasim Ali:
Analysis of Eye Tracking Data: Supporting Vision Screening with Eye Tracking Technologies. 24:1-24:3
- Alejandro Garcia De La Santa Ramos, Rafael Cabeza, Arantxa Villanueva:
Calibration free eye tracking solution for mobile and embedded devices. 25:1-25:3
- Gonzalo Garde:
Discussing the importance of calibration in low-cost gaze estimation solutions. 26:1-26:3
- Yasasi Abeysinghe:
Evaluating Human Eye Features for Objective Measure of Working Memory Capacity. 27:1-27:3
- Lena Stubbemann:
Eye Tracking for Virtual Customer Requirements Evaluations. 28:1-28:2
- Omar Namnakani:
Gaze-based Interaction on Handheld Mobile Devices. 29:1-29:4
- Adrian Vulpe-Grigorasi:
Multimodal machine learning for cognitive load based on eye tracking and biosensors. 30:1-30:3
ETRA 2023 Late-Breaking Work (Poster Abstracts)
- Owen Raymond, Yelaman Moldagali, Naser Al Madi:
A Dataset of Underrepresented Languages in Eye Tracking Research. 31:1-31:2
- Yuka Hayakawa, Saki Tanaka, Airi Tsuji, Kaori Fujinami, Junichi Yamamoto:
A Preliminary Investigation on Eye Gaze-based Estimation of Self-efficacy during a Dexterity Task. 32:1-32:2
- Wolfgang Fuhl, Anne Herrmann-Werner, Kay Nieselt:
A temporally quantized distribution of pupil diameters as a new feature for cognitive load classification. 33:1-33:2
- Lynn Knippertz, Anna Münz, Stefan Ruzika:
Automated Detection of Geometric Structures in Gaze Data. 34:1-34:2
- Li-Chen Fu, Robin Fischer:
Collaboration Assistance Through Object Based User Intent Detection Using Gaze Data. 35:1-35:2
- Michal Hochhauser, Kelsey Jackson Dommer, Adham Atyabi, Beibin Li, Yeojin A. Ahn, Madeline Aubertine, Minah Kim, Sarah G. Corrigan, Kevin A. Pelphrey, Frederick Shic:
Comparing Attention to Biological Motion in Autism across Age Groups Using Eye-Tracking. 36:1-36:3
- Dillon J. Lohr, Saide Johnson, Samantha Aziz, Oleg Komogortsev:
Demonstrating Eye Movement Biometrics in Virtual Reality. 37:1-37:2
- Monica S. Castelhano:
Deriving Cognitive Strategies from Fixations during Visual Search in Scenes. 38:1-38:2
- Alessandro Bruno, Marouane Tliba, Mohamed Amine Kerkouri, Aladine Chetouani, Carlo Calogero Giunta, Arzu Çöltekin:
Detecting colour vision deficiencies via Webcam-based Eye-tracking: A case study. 39:1-39:2
- Quentin Meteier, Elena Mugellini, Leonardo Angelini, Alain Adrian Verdon, Catherine Senn-Dubey, Jean-Michel Vasse:
Enhancing the Metacognition of Nursing Students Using Eye Tracking Glasses. 40:1-40:2
- Minoru Nakayama, Takahiro Ueno:
Estimation of Latent Attention Resources using Microsaccade Frequency during a Dual Task. 41:1-41:2
- Beryl Gnanaraj, Jaya Sreevalsan-Nair:
EyeExplore: An Interactive Visualization Tool for Eye-Tracking Data for Novel Stimulus-based Analysis. 42:1-42:2
- Yuki Kubota, Tomohiko Hayakawa, Masatoshi Ishikawa:
Foveated Noise Reduction: Visual Search Tasks under Spatio-Temporal Control Synchronized with Eye Movements. 43:1-43:2
- Hirotake Yamazoe:
Indirect gaze estimation from body movements based on relationship between gaze and body movements. 44:1-44:2
- Lynsay A. Shepherd, Andrea Szymkowiak:
Investigating Phishing Awareness Using Virtual Agents and Eye Movements. 45:1-45:2
- Moritz Langner, Peyman Toreini, Alexander Maedche:
Leveraging Eye Tracking Technology for a Situation-Aware Writing Assistant. 46:1-46:2
- Makoto Sei, Akira Utsumi, Hirotake Yamazoe, Jooho Lee:
Model-based deep gaze estimation using incrementally updated face-shape parameters. 47:1-47:2
- Jasmin L. Walter, Vincent Schmidt, Sabine U. König, Peter König:
Navigating Virtual Worlds: Examining Spatial Navigation Using a Graph Theoretical Analysis of Eye Tracking Data Recorded in Virtual Reality. 48:1-48:2
- Patrizia Lenhart, Enkeleda Thaqi, Nora Castner, Enkelejda Kasneci:
Old or Modern? A Computational Model for Classifying Poem Comprehension using Microsaccades. 49:1-49:2
- Tamsin Rogers, Naser Al Madi:
On the Pursuit of Developer Happiness: Webcam-Based Eye Tracking and Affect Recognition in the IDE. 50:1-50:2
- Yi-Wen Wang, Kelsey Jackson Dommer, Sara Jane Webb, Frederick Shic:
On the Value of Data Loss: A Study of Atypical Attention in Autism Spectrum Disorder Using Eye Tracking. 51:1-51:2
- Damian Hostettler, Kenan Bektas, Simon Mayer:
Pupillometry for Measuring User Response to Movement of an Industrial Robot. 52:1-52:2
- Daniel G. Krakowczyk, David R. Reich, Jakob Chwastek, Deborah N. Jakobi, Paul Prasse, Assunta Süss, Oleksii Turuta, Pawel Kasprowski, Lena A. Jäger:
pymovements: A Python Package for Eye Movement Data Processing. 53:1-53:2
- Rasha Sameer Moustafa, Harri Karhu, Sami Andberg, Roman Bednarik:
Seeing Through Their Eyes - A Customizable Gaze-Contingent Simulation of Impaired Vision and Other Eye Conditions Using VR/XR Technology. 54:1-54:2
- Hanna Julku, Caitlin Dawson, Jaana Simola:
The Effect of Curiosity on Eye Movements During Reading of Health Related Arguments. 55:1-55:2
- Michael Mühlbauer, Johannes Meyer:
The influence of pupil ellipse noise on the convergence time of a glint-free 3D eye tracking algorithm. 56:1-56:2
- Wolfgang Fuhl, Anne Herrmann-Werner, Kay Nieselt:
The Tiny Eye Movement Transformer. 57:1-57:2
- Hayden A. Mayer, Kelsey Jackson Dommer, Jenny Skytta, Dimitri Christakis, Sara Jane Webb, Frederick Shic:
Time of Day Effects on Eye-Tracking Acquisition in Infants at Higher Likelihood for Atypical Developmental Outcomes: Time of Day Effects on Eye Tracking Data Acquisition in Vulnerable Infants. 58:1-58:2
- Wolfgang Fuhl, Björn Severitt, Nora Castner, Babette Bühler, Johannes Meyer, Daniel Weber, Regine Lendway, Ruikun Hou, Enkelejda Kasneci:
Watch out for those bananas! Gaze Based Mario Kart Performance Classification. 59:1-59:2
- Margarita Ryzhova, Iza Skrjanec, Nina Quach, Alice Virginia Chase, Emilia Ellsiepen, Vera Demberg:
Word Familiarity Classification From a Single Trial Based on Eye-Movements. A Study in German and English. 60:1-60:2
PETMEI 2023 Session I
- Benedikt W. Hosp, Siegfried Wahl:
ZING: An Eye-Tracking Experiment Software for Organization and Presentation of Omnidirectional Stimuli in Virtual Reality. 61:1-61:4
- Mamoru Hiroe, Michiya Yamamoto, Takashi Nagamatsu:
Implicit User Calibration for Gaze-tracking Systems Using Saliency Maps Filtered by Eye Movements. 62:1-62:5
PETMEI 2023 Session II
- Jesse W. Grootjen, Henrike Weingärtner, Sven Mayer:
Highlighting the Challenges of Blinks in Eye Tracking for Interactive Systems. 63:1-63:7
- Benedikt W. Hosp, Siegfried Wahl:
ZERO: A Generic Open-Source Extended Reality Eye-Tracking Controller Interface for Scientists. 64:1-64:4
- Yu Wang, Wanglong Lu, Hanli Zhao, Xianta Jiang, Bin Zheng, M. Stella Atkins:
Detecting Blinks from Wearable Cameras using Spatial-Temporal-Aware Deep Network Learning. 65:1-65:7
ETVIS Session I: Visual Attention and Strategies
- Felix Sihan Wang, Quentin Lohmeyer, Andrew T. Duchowski, Mirko Meboldt:
Gaze is more than just a point: Rethinking visual attention analysis using peripheral vision-based gaze mapping. 66:1-66:7
- Kun-Ting Chen, Quynh Quang Ngo, Kuno Kurzhals, Kim Marriott, Tim Dwyer, Michael Sedlmair, Daniel Weiskopf:
Reading Strategies for Graph Visualizations that Wrap Around in Torus Topology. 67:1-67:7
ETVIS Session II: Tools and Applications
- Jaime Maldonado, Christoph Zetzsche:
Representing (Dis)Similarities Between Prediction and Fixation Maps Using Intersection-over-Union Features. 68:1-68:8
- Kun-Ting Chen, Arnaud Prouzeau, Joshua Langmead, Ryan T. Whitelock-Jones, Lawrence Lee, Tim Dwyer, Christophe Hurter, Daniel Weiskopf, Sarah Goodwin:
Gazealytics: A Unified and Flexible Visual Toolkit for Exploratory and Comparative Gaze Analysis. 69:1-69:7
- Kuno Kurzhals:
Privacy in Eye Tracking Research with Stable Diffusion. 70:1-70:7
Eyes4ICU
- Toshiya Isomoto, Shota Yamanaka, Buntarou Shizuki:
Reanalyzing Effective Eye-related Information for Developing User's Intent Detection Systems. 71:1-71:3
- Xianta Jiang, Ziang Wu, Jingjing Zheng, Bin Zheng, M. Stella Atkins:
Index Pupil Activity Echoing with Task Difficulty in Fitts' Law Setting. 72:1-72:3
- Qasim Ali, Ilona Heldal, Carsten Gunnar Helgesen, Are Dæhlen:
Towards Developing an Animation Kit for Functional Vision Screening with Eye Trackers. 73:1-73:3
- Santiago de Leon-Martinez, Róbert Móro, Mária Bieliková:
Eye Tracking as a Source of Implicit Feedback in Recommender Systems: A Preliminary Analysis. 74:1-74:3
- Filippo Baldisserotto, Krzysztof Krejtz, Izabela Krejtz:
A Review of Eye Tracking in Advanced Driver Assistance Systems: An Adaptive Multi-Modal Eye Tracking Interface Solution. 75:1-75:3
- Mehtap Çakir, Anke Huckauf:
Reviewing the Social Function of Eye Gaze in Social Interaction. 76:1-76:3
- Anna Warchol-Jakubowska, Izabela Krejtz, Krzysztof Krejtz:
An irrelevant look of novice tram driver: Visual attention distribution of novice and expert tram drivers. 77:1-77:3
- Valentin Foucher, Anke Huckauf:
Using Eye Tracking to detect Faking Intentions. 78:1-78:3
EduEye
- Sheida Marzban, Habil Gyöngyi Fábián, Béla Weiss:
The effect of intersemiotic relations on L2 learners' multimodal reading. 79:1-79:8
- Sean Anthony Byrne, Nora Castner, Ard Kastrati, Martyna Beata Plomecka, William Schaefer, Enkelejda Kasneci, Zoya Bylinskii:
Leveraging Eye Tracking in Digital Classrooms: A Step Towards Multimodal Model for Learning Assistance. 80:1-80:6
- Michael Burch:
Gaze-Based Monitoring in the Classroom. 81:1-81:3
- Tobias Appel, Kevin Kärcher, Hans-Dieter Körner:
Investigating Cognitive Load for Tasks with Mathematics and Chemistry Context through Eye Tracking. 82:1-82:6
EMIP
- Peng Kuang, Emma Söderberg, Diederick C. Niehorster, Martin Höst:
Applying Machine Learning to Gaze Data in Software Development: a Mapping Study. 83:1-83:7
- William Saranpää, Felix Apell Skjutar, Johan Heander, Emma Söderberg, Diederick C. Niehorster, Olivia Mattsson, Hedda Klintskog, Luke Church:
GANDER: a Platform for Exploration of Gaze-driven Assistance in Code Review. 84:1-84:7
- Florian Hauser, Lisa Grabinger, Jürgen Mottok, Hans Gruber:
Visual Expertise in Code Reviews: Using Holistic Models of Image Perception to Analyze and Interpret Eye Movements. 85:1-85:7
- Sören Sparmann, Carsten Schulte:
Analysing the API learning process through the use of eye tracking. 86:1-86:6
- Zubair Ahsan, Unaizah Obaidellah:
Is Clustering Novice Programmers Possible? Investigating Scanpath Trend Analysis in Programming Tasks. 87:1-87:7
- Ming Sun, Minoru Nakayama:
Program Code Navigation Model for Individuals based on LSTM with Co-clustering. 88:1-88:6
COGAIN 2023
- Mateusz Dubiel, Minoru Nakayama, Xin Wang:
Modelling Attention Levels with Ocular Responses in a Speech-in-Noise Recall Task. 89:1-89:7
- John Paulin Hansen, Per Bækgaard, Dagny Valgeirsdottir, Sofie Beier:
Universal Design of Gaze Interactive Applications for People with Special Needs. 90:1-90:7
- Shota Imamura, Jieun Lee, Makoto Itoh:
Control prediction based on cumulative gaze dwell time while browsing contents. 91:1-91:6
- Omar Namnakani, Penpicha Sinrattanavong, Yasmeen Abdrabou, Andreas Bulling, Florian Alt, Mohamed Khamis:
GazeCast: Using Mobile Devices to Allow Gaze-based Interaction on Public Displays. 92:1-92:8