COGAIN Symposium:
Communication by Gaze Interaction

Seattle, WA, USA, June 11, 2022


Important dates

Papers due: Feb 24, 2022
Notifications: Mar 20, 2022
Camera-ready: Apr 12, 2022


General co-chairs
Ken Pfeuffer
Minoru Nakayama

Program co-chairs
Augusto Esteves
Joshua Newn
COGAIN Symposium program overview

The COGAIN Symposium will be held on June 11 from 10:00 AM to 12:00 PM (UTC-7). This will be a hybrid event, with both live and pre-recorded presentations. If you as a presenter opt for the latter, please send us a pre-recorded video of your presentation (or a YouTube link) at least 24 hours before the start of the symposium: cogain2022@cogain.org (guidance for videos here).

Presentations in either format should last between 5 and 10 minutes.

  • Accepted papers: 6 short papers
  • Date: June 11, 2022
  • Time: 10:00 AM-12:00 PM (Seattle time)
  • Format: Video or Live Presentation
10:00 - 10:05 General introduction
10:05 - 10:45 Session 1 (chair: Augusto Esteves)
  • Paper 1: A Research on Time Series Evaluation of Cognitive Load Factors by Eye Movement Features (In-person presentation) Paper

  • Paper 2: Feasibility of a Device for Gaze Interaction by Visually-Evoked Brain Signals (Remote presentation) Paper

  • Paper 3: Usability of the super-vowel for gaze-based text entry Paper

10:45 - 11:25 Session 2 (chair: Joshua Newn)
  • Paper 4: User Perception of Smooth Pursuit Target Speed (In-person presentation) Paper

  • Paper 5: Attention of Many Observers Visualized by Eye Movements (Remote presentation) Paper

  • Paper 6: Look & Turn: One-handed and Expressive Menu Interaction by Gaze and Arm Turns in VR (In-person presentation) Paper

11:25 - 11:30 Break
11:30 - 11:55 Keynote: Tanya Jonker (Facebook Research)

Topic: Eye tracking and the future of AR

Abstract: Technology research is making rapid progress towards the next great wave of personal computing: wearable augmented reality. Despite incredible innovation in hardware and computer vision, the field has not yet reached its potential because these systems lack a personalized, contextually adaptive interface. I will discuss our vision for a human-centered, AI-driven, proactive AR system, and the role that eye tracking must play in realizing that vision. I will also share several eye-tracking-based research innovations from our lab and how they are being used to transform human interaction with AR. This talk is a call to action to shape the way in which people interact with AR and the world.

11:55 - 12:00 Conclusion