Michael Spivey & Stephanie Huette

Methods workshop on measuring and testing the temporal unfolding of cognitive processes

Date: Wednesday, 11 May 2011
Time: 9:30-16:30
Location: Room 2.2, Studenternes Hus, Fredrik Nielsens Vej 4, 8000 Aarhus C



Michael Spivey is a professor at the University of California, Merced. His research interests include:

  • Interaction between language and vision
  • Sentence processing
  • Word recognition
  • Visual attention
  • Visual memory
  • Eye movements
  • Computational modeling

By using an eyetracker while simultaneously recording the streaming x,y coordinates of the participant's computer-mouse movements, he obtains an online measure of some of the probabilistic representations (or "tentative interpretations") computed in real time as the participant attempts to integrate various sources of visual and linguistic information.
One finding from this work is that spoken word recognition and syntactic parsing are immediately influenced by relevant visual context, thus challenging strictly modular theories of language processing. This and related work is described in his book, The Continuity of Mind (2007).
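The core of the method described above is a time-stamped stream of cursor coordinates, from which one can quantify how strongly the movement curved toward a competing response. The following is a minimal illustrative sketch, not code from the workshop: the function names and the simulated trajectory are assumptions, and `max_deviation` implements one common curvature index (the largest perpendicular distance from the straight start-to-end line).

```python
import math

def track_trajectory(get_position, duration_s=1.0, hz=60):
    # Sample (t, x, y) at a fixed rate, like a streamed cursor record.
    # get_position is a stand-in for the real-time mouse query.
    n = int(duration_s * hz) + 1
    return [(i / hz, *get_position(i / hz)) for i in range(n)]

def max_deviation(traj):
    # Largest perpendicular distance of the path from the straight
    # line joining its first and last points -- one common index of
    # how strongly a competing response "attracted" the cursor.
    (_, x0, y0), (_, x1, y1) = traj[0], traj[-1]
    dx, dy = x1 - x0, y1 - y0
    norm = math.hypot(dx, dy)
    return max(abs(dy * (x - x0) - dx * (y - y0)) / norm
               for _, x, y in traj)

# A curved, simulated reach from (0, 0) toward a target at (1, 1):
traj = track_trajectory(lambda t: (t, t ** 2))
print(round(max_deviation(traj), 3))  # prints 0.177
```

Real studies would of course sample an actual input device rather than a simulated path, but the same deviation measure applies directly to the recorded stream.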

Stephanie Huette is a Ph.D. student at the University of California, Merced. Her research investigates visual and linguistic mechanisms and their interactions, using time-sensitive measurements such as reach-tracking, computer mouse-tracking, eyetracking and computational modeling. These tools give insight not only into what a mental state is but, by examining intermediate moments in time, into how it comes to be; occasionally they reveal where it came from and predict what comes next. She researches the moments of hmmmmm... as well as the moments of aha! Through this we can not only describe how you perceive, think and act - we can also predict thoughts and actions, and occasionally even control them.

A MINDLab event. Spots for presentations and participants are limited. Register by sending an e-mail to fusaroli@gmail.com


Revised 28.03.2012