Eye movement patterns
What do scanpaths tell us? How do they differ between subjects, tasks, and levels of experience? And how do we do machine learning on eye-tracking data?
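One common way to make eye-tracking data amenable to machine learning is to summarize a recording as a fixed-length feature vector. The sketch below is only an illustration under assumed data (a hypothetical fixation table of x, y, and duration), not a method described here:

```python
import numpy as np

# Hypothetical fixation table: one row per fixation, columns (x_px, y_px, duration_ms).
fixations = np.array([
    [120, 300, 210],
    [400, 310, 180],
    [410, 150, 320],
    [130, 140, 250],
], dtype=float)

# Saccade amplitudes: Euclidean distance between consecutive fixation positions.
amplitudes = np.linalg.norm(np.diff(fixations[:, :2], axis=0), axis=1)

# A fixed-length feature vector that any standard classifier can consume.
features = np.array([
    fixations[:, 2].mean(),  # mean fixation duration
    fixations[:, 2].std(),   # variability of fixation durations
    amplitudes.mean(),       # mean saccade amplitude
    len(fixations),          # number of fixations
])
print(features)
```

Vectors like this one, computed per trial or per subject, can then be fed to an off-the-shelf classifier to predict, say, task or expertise.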
Everyone can do eye-tracking in the lab. But what about real-world applications? How can we strengthen eye-tracking technology so that it works in naturalistic scenarios, with ambient light, eyeglasses, and smudges and dirt on the lenses?
How good is your eye-tracker, not only in terms of tracking rate but also in accuracy and precision? How can you tell? And what can you do if the scenario does not allow for a more accurate recording?
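Accuracy and precision are typically estimated from a validation recording in which the subject fixates a known target: accuracy as the mean angular offset from the target, and precision as the RMS of sample-to-sample dispersion. A minimal sketch, assuming simulated gaze samples in degrees of visual angle around a target at the origin:

```python
import numpy as np

# Simulated validation data: 500 gaze samples (degrees of visual angle)
# recorded while the subject fixates a known target at (0, 0).
rng = np.random.default_rng(0)
gaze = rng.normal(loc=[0.4, -0.2], scale=0.15, size=(500, 2))

# Accuracy: mean angular offset between gaze samples and the true target.
offsets = np.linalg.norm(gaze - np.array([0.0, 0.0]), axis=1)
accuracy = offsets.mean()

# Precision: RMS of sample-to-sample angular distances (RMS-S2S dispersion).
s2s = np.linalg.norm(np.diff(gaze, axis=0), axis=1)
precision_rms = np.sqrt(np.mean(s2s ** 2))

print(f"accuracy = {accuracy:.2f} deg, precision (RMS-S2S) = {precision_rms:.2f} deg")
```

Note that the two measures capture different failure modes: a tracker can be precise but inaccurate (a systematic offset, often correctable by recalibration) or accurate on average but noisy.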
Data analysis beyond the heatmap. How do we make sense of that huge pile of data? Is it actually useful at all?
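One family of analyses that goes beyond heatmaps compares scanpaths directly, for example by encoding each fixation as an area-of-interest (AOI) label and computing a string edit distance between the resulting sequences. A small illustrative sketch with made-up AOI sequences:

```python
def levenshtein(a: str, b: str) -> int:
    """Edit distance between two AOI-label sequences (one letter per fixation)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

# Two hypothetical scanpaths over AOIs labeled A-D.
s1 = "ABBCAD"
s2 = "ABCAAD"
dist = levenshtein(s1, s2)
similarity = 1 - dist / max(len(s1), len(s2))
print(dist, round(similarity, 2))  # 2 0.67
```

Pairwise similarities computed this way can then be clustered or correlated with subject and task variables, which turns the "pile of data" into testable hypotheses rather than pictures.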