Space-Variant Spatio-Temporal Filtering of Video for Gaze Visualization and Perceptual Learning

Michael Dorr, Halszka Jarodzka, Erhardt Barth

Abstract

We introduce an algorithm for space-variant filtering of video based on a spatio-temporal Laplacian pyramid and use this algorithm to render videos in order to visualize prerecorded eye movements. Spatio-temporal contrast and colour saturation are reduced as a function of distance to the nearest gaze point of regard, i.e., non-fixated, distracting regions are filtered out, whereas fixated image regions remain unchanged. In an experiment, an expert's eye movements on instructional videos were visualized with this algorithm so that the novices' gaze was guided to relevant image locations; the results show that this visualization technique facilitates the novices' perceptual learning.
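The abstract describes gaze-contingent filtering in which fine detail is progressively attenuated with distance from the point of regard. The following is a minimal, spatial-only sketch of that idea, assuming a single gaze point per grayscale frame and a non-downsampled band-pass decomposition (differences of Gaussians) in place of the full spatio-temporal Laplacian pyramid; it also omits the colour-saturation reduction described in the paper. The function and parameter names (`foveate_frame`, `fovea_radius`) are illustrative, not from the original work.

```python
# Hedged sketch: gaze-contingent reduction of spatial detail.
# Assumes a single gaze point per frame; the paper's method additionally
# filters the temporal dimension and reduces colour saturation.

import numpy as np
from scipy.ndimage import gaussian_filter

def foveate_frame(frame, gaze_xy, n_levels=5, fovea_radius=64.0):
    """Attenuate fine spatial detail as a function of distance to the gaze point.

    frame        : 2-D float array (grayscale, values in 0..1)
    gaze_xy      : (x, y) gaze position in pixel coordinates
    fovea_radius : distance (pixels) over which each band fades out
    """
    h, w = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.hypot(xs - gaze_xy[0], ys - gaze_xy[1])

    # Band-pass decomposition: differences of progressively blurred copies
    # (a stand-in for the Laplacian pyramid, without downsampling).
    blurred = [frame]
    for level in range(1, n_levels + 1):
        blurred.append(gaussian_filter(frame, sigma=2.0 ** level))
    bands = [blurred[i] - blurred[i + 1] for i in range(n_levels)]
    lowpass = blurred[-1]

    # Reconstruct: keep each band fully near the gaze point and fade it out
    # with distance; finer bands (small i) vanish at shorter distances.
    out = lowpass.copy()
    for i, band in enumerate(bands):
        cutoff = fovea_radius * (i + 1)
        weight = np.clip(1.0 - (dist - cutoff) / cutoff, 0.0, 1.0)
        out += weight * band
    return np.clip(out, 0.0, 1.0)
```

Summing all bands plus the low-pass residual reproduces the original frame, so regions at the gaze point remain unchanged while peripheral regions lose high-frequency contrast, which is the qualitative behaviour the abstract describes.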
Original language: English
Title of host publication: ETRA '10: Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications
Number of pages: 8
Place of publication: New York, NY, USA
Publisher: ACM
Publication date: 22.03.2010
Pages: 307-314
ISBN (electronic): 978-1-60558-994-7
DOIs
Publication status: Published - 22.03.2010
Event: Eye Tracking Research and Applications - Austin, United States
Duration: 22.03.2010 - 24.03.2010
http://www.sis.uta.fi/cs/etra-2010/
