Indeed, on desktop the mouse cursor can be a decent proxy for gaze. I can't
remember if it was the particular paper you've quoted where I read about
it, but I seem to recall that the correlation varied quite a bit from one
person to another. Essentially, some people point the mouse at where they're
looking, but others don't.
This paper by Microsoft about cursor position is also relevant:
https://phabricator.wikimedia.org/T165272#3955001 but it shows the
limitations of predicting things from cursor position when the person
doesn't know what they're looking for when they land on the page. It works
really well for the use case of people searching for "Facebook" on Bing in
order to go to Facebook, which probably has some muscle memory to it:
people who do that already know where they're going to move their mouse.
This is the paper that got me interested in gaze tracking for performance:
https://phabricator.wikimedia.org/T165272#3933730 Its methodology has
limitations, with participants watching videos and playing a guessing game
rather than browsing naturally, but the results are impressive. Enough to
make me want to investigate what people look at on a Wikipedia page as it's
loading, to inform our decisions about how we prioritise page elements.
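To make that concrete, here is a minimal sketch (purely hypothetical, nothing we've built) of how gaze fixations from such a study could be bucketed by time since navigation start and by coarse page region, so we can see which elements draw attention early in the load. The CSV layout, region names and bounding boxes are all assumptions for illustration:

# Hypothetical sketch: count gaze fixations per (time bucket, page region)
# to see which elements get looked at early during page load.
# The CSV columns and region boxes below are assumptions, not real data.
import csv
from collections import Counter

# Rough bounding boxes (x0, y0, x1, y1) in page pixels -- placeholder values.
REGIONS = {
    "title": (0, 0, 1200, 120),
    "infobox": (800, 120, 1200, 900),
    "lead_paragraph": (0, 120, 800, 600),
    "table_of_contents": (0, 600, 400, 900),
}

def region_of(x, y):
    for name, (x0, y0, x1, y1) in REGIONS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return "other"

def attention_by_time_bucket(path, bucket_ms=500):
    """Count fixations per (time bucket, region) from CSV rows of
    participant, ms_since_nav_start, x, y."""
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            bucket = int(row["ms_since_nav_start"]) // bucket_ms
            counts[(bucket, region_of(float(row["x"]), float(row["y"])))] += 1
    return counts

if __name__ == "__main__":
    counts = attention_by_time_bucket("fixations.csv", bucket_ms=500)
    for (bucket, region), n in sorted(counts.items()):
        print(f"{bucket * 500:>6} ms  {region:<20} {n}")

Something like this would let us compare, say, how much early attention the infobox gets versus the lead paragraph before the page has finished rendering.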
The upside of doing a study with a device like the Tobii 4C is that we
could also verify the correlation between gaze and cursor position at the
same time.
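For that comparison, here is a minimal sketch of the analysis, assuming we export time-aligned samples per participant as (gaze_x, gaze_y, cursor_x, cursor_y) tuples; the layout is an assumption, nothing Tobii-specific:

# Hypothetical sketch: per-participant Pearson correlation between gaze and
# cursor position, to check how much the "cursor as gaze proxy" varies
# between people. The input layout is assumed, not a real export format.
import numpy as np

def gaze_cursor_correlation(samples_by_participant):
    """Return {participant: (corr_x, corr_y)} from time-aligned samples."""
    result = {}
    for participant, samples in samples_by_participant.items():
        gx, gy, cx, cy = np.array(samples, dtype=float).T
        result[participant] = (np.corrcoef(gx, cx)[0, 1],
                               np.corrcoef(gy, cy)[0, 1])
    return result

# Made-up numbers, just to show the expected output shape.
example = {
    "p1": [(100, 200, 110, 205), (300, 220, 305, 230), (500, 400, 498, 410)],
    "p2": [(100, 200, 640, 700), (300, 220, 600, 720), (500, 400, 660, 690)],
}
print(gaze_cursor_correlation(example))

A per-participant breakdown like this would show directly how much the proxy quality varies from one person to another, which is the variation I recall from the literature.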
Post by Aaron Halfaker
See also: Chen, M. C., Anderson, J. R., & Sohn, M. H. (2001, March). What
can a mouse cursor tell us more?: Correlation of eye/mouse movements on web
browsing. In CHI '01 Extended Abstracts on Human Factors in Computing
Systems (pp. 281-282). ACM. (I can provide the PDF on request -- too big
to attach)
Kim, N. W., Bylinskii, Z., Borkin, M. A., Gajos, K. Z., Oliva, A., Durand,
F., & Pfister, H. (2017). BubbleView: an interface for crowdsourcing image
importance maps and tracking visual attention. ACM Transactions on
Computer-Human Interaction (TOCHI), 24(5), 36.
https://arxiv.org/pdf/1702.05150
I don't know how readily available mouse-based attention tracking
solutions are, but from the literature it seems there are good options for
understanding attention through purely software means.
-Aaron
Post by Gilles Dubuc
Post by Tilman Bayer
Nice! Added a link at
https://meta.wikimedia.org/wiki/Research:Which_parts_of_an_article_do_readers_read#Eyetracking
Post by Gilles Dubuc
I've just discovered and received today an amazing piece of hardware
that a lot of you might find useful. It's called the Tobii Eye Tracker
4C <https://tobiigaming.com/product/tobii-eye-tracker-4c/>, which can
be used with the Tobii Pro Sprint <https://www.tobiipro.com/sprint/>
hosted service.
https://www.mediawiki.org/wiki/File:Tobii_Eye_Tracker_4C_demo.webm
This could allow us to do lab user testing where we record gaze cheaply
and very easily.
--
Tilman Bayer
Senior Analyst
Wikimedia Foundation
IRC (Freenode): HaeB