From OpenWetWare

Revision as of 23:07, 29 January 2009 by Alex O. Holcombe (Talk | contribs)


SUPA: Sydney University Perception and Action Lab


• Alex Holcombe
• Polly Barr
• Charlie Ludowici
• Kim Ransley
• Ingrid Van Tongeren
• William Ngiam
• Fahed Jbarah
• Patrick Goodbourn


Skills Checklist
Python Programming
Psychopy/VisionEgg Installation Notes
R: analysis, plots, stats
Buttonbox with photocell
Programming Cheat Sheets

We're curious about temporal aspects of human visual processing: how quickly do different cortical modules and stages process information, and how are they coordinated in time? We use behavioral experiments, illustrated by the animations below, to compare the speed limits for different features and the dynamics of how those features are bound into a coherent percept. One coordination problem arises because an object moving across the visual field stimulates different populations of neurons in early visual cortex, so we're testing how the signals from those areas are combined by later stages of the brain. Overall, fast processes somehow work together with much more sluggish ones to yield conscious perception. Recently, we've begun experiments to see how temporal limits constrain our interactions with moving objects.

Collaborators:
• Alex White
• Christina Howard
• Jay Edelman
• Patrick Cavanagh
• Jun Saiki
• Gene Stoner
• Piers Howe
• Tanja Seizova-Cajic
• Nicholas O'Dwyer
• Clare Fraser
• University Perception Group
• Eric Altschuler

