A key question for temporal processing research is how the nervous system extracts event duration, despite a notable lack of neural structures dedicated to duration encoding. This stands in stark contrast to the orderly arrangement of neurons tasked with spatial processing. In this study, we examine the linkage between the spatial and temporal domains. We use sensory adaptation techniques to generate aftereffects in which perceived duration is either compressed or expanded in the direction opposite to the adapting stimulus's duration. Our results indicate that these aftereffects are broadly tuned, extending over an area approximately five times the size of the stimulus. This region scales directly with the size of the adapting stimulus: the larger the adapting stimulus, the greater the spatial spread of the aftereffect. We construct a simple model to test predictions based on overlapping adapted versus non-adapted neuronal populations and show that our effects cannot be explained by any single, fixed-scale neural filtering. Rather, our effects are best explained by a self-scaled mechanism underpinned by duration-selective neurons that also pool spatial information across earlier stages of visual processing.
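The contrast between the fixed-scale and self-scaled accounts described above can be illustrated with a minimal sketch. This is not the authors' model; it simply assumes the adapted region is a Gaussian whose spatial spread is either a constant (fixed-scale filtering) or roughly five times the adapter size (self-scaled, per the reported tuning), and all parameter values are illustrative.

```python
import numpy as np

def aftereffect(test_pos, adapt_size, scale_with_size=True):
    """Predicted duration-aftereffect magnitude at a test position.

    The adapted region is modeled as a Gaussian centered on the adapter.
    Its spatial spread is either ~5x the adapting stimulus size
    (self-scaled account) or a fixed constant (fixed-scale account).
    Units and the 5x factor are illustrative assumptions.
    """
    spread = 5.0 * adapt_size if scale_with_size else 5.0  # deg of visual angle
    return np.exp(-np.asarray(test_pos) ** 2 / (2 * spread ** 2))

positions = np.array([0.0, 5.0, 10.0])  # test offsets from the adapter center
small = aftereffect(positions, adapt_size=1.0)  # small adapter
large = aftereffect(positions, adapt_size=3.0)  # large adapter
```

Under the self-scaled account, `large` exceeds `small` at every non-zero offset (the larger adapter spreads the aftereffect further), whereas setting `scale_with_size=False` makes the two curves identical regardless of adapter size, which is the pattern the data rule out.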
| Field | Value |
| --- | --- |
| Number of pages | 9 |
| Journal | Proceedings of the Royal Society B: Biological Sciences |
| Early online date | 27 Jul 2016 |
| Publication status | Published - 27 Jul 2016 |
Fingerprint: Dive into the research topics of 'Object size determines the spatial spread of visual time'. Together they form a unique fingerprint.
- Department of Optometry and Vision Sciences - Lecturer in Optometry
- School of Applied Sciences
- Centre for Vision across the Life Span - Member