Digit-tracking as a new tactile interface for visual perception analysis
Guillaume Lio, Roberta Fadda, Giuseppe Doneddu, Jean-René Duhamel, Angela Sirigu
Abstract
Eye-tracking is a valuable tool in cognitive science for measuring how visual processing resources are allocated during scene exploration. However, eye-tracking technology is largely confined to laboratory settings, making it difficult to apply to large-scale studies. Here, we introduce a biologically inspired solution: a Gaussian-blurred image is presented on a touch-sensitive interface and is locally unblurred as the user slides a finger over the display. The user's finger movements thus provide a proxy for their eye movements and attention. We validated the method by showing strong correlations between attention maps obtained with finger-tracking and with conventional optical eye-tracking. Using neural networks trained to predict empirically derived attention maps, we established that the same high-level features hierarchically drive exploration with either method. Finally, the diagnostic value of digit-tracking was tested in autistic and brain-damaged patients. The rapid yet robust measures afforded by this method open the way to large-scale applications in research and clinical settings.
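The display principle summarized above (a blurred image that becomes locally sharp under the finger, from which smoothed attention maps are accumulated and correlated across modalities) can be sketched in a few lines. This is an illustrative reconstruction, not the authors' implementation: the function names, the Gaussian-aperture compositing, and all parameter values (`blur_sigma`, `aperture_sigma`, the fixation-map `sigma`) are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def digit_tracking_frame(image, finger_xy, blur_sigma=8.0, aperture_sigma=30.0):
    """One display frame: a blurred copy of a grayscale `image` that is
    locally sharp around the current finger position `finger_xy` = (x, y).
    Parameter values are illustrative, not those of the published method."""
    image = image.astype(float)
    blurred = gaussian_filter(image, sigma=blur_sigma)
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    x0, y0 = finger_xy
    # Gaussian aperture: 1 at the finger, decaying with distance, loosely
    # mimicking the fall-off of visual acuity away from the fovea.
    aperture = np.exp(-((xs - x0) ** 2 + (ys - y0) ** 2) / (2 * aperture_sigma ** 2))
    return aperture * image + (1.0 - aperture) * blurred

def attention_map(positions, shape, sigma=3.0):
    """Accumulate sampled finger (or gaze) positions into a smoothed,
    normalized attention map."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    m = np.zeros(shape, dtype=float)
    for x0, y0 in positions:
        m += np.exp(-((xs - x0) ** 2 + (ys - y0) ** 2) / (2 * sigma ** 2))
    return m / m.sum()

def map_correlation(map_a, map_b):
    """Pearson correlation of two flattened attention maps, as one simple
    way to compare finger-derived and eye-derived maps."""
    return np.corrcoef(map_a.ravel(), map_b.ravel())[0, 1]
```

Under this sketch, the finger position plays the role of the fovea: only the region under the digit is seen at full resolution, so the trajectory of unblurring approximates the trajectory of overt attention.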
Conflict of interest statement
The technology described in this paper is the object of a patent filed by the CNRS and the University of Lyon (Dispositif et procédé de détermination des mouvements oculaires par interface tactile [Device and method for determining eye movements via a tactile interface]. 2017, EP3192434A1).