From OpenWetWare

Revision as of 06:52, 19 April 2011 by Alex O. Holcombe (Talk | contribs)

[Lab logo] Lab and Office Location

Sydney University Perception and Action Lab (SUPA)


• Alex Holcombe
• Polly Barr
• Charlie Ludowici
• Kim Ransley
• Ingrid Van Tongeren
• William Ngiam
• Fahed Jbarah
• Patrick Goodbourn


Skills Checklist
• Python Programming
• PsychoPy/VisionEgg Installation Notes
• R analysis, plots, stats
• Buttonbox with photocell
• Programming Cheat Sheets

We're curious about temporal aspects of human visual processing: how quickly do different cortical modules and stages process information, and how are they coordinated in time? We use behavioral experiments, illustrated by the animations below, to compare the speed limits for different features and the dynamics of how those features are bound into a coherent percept. One coordination problem arises because a moving object stimulates different populations of neurons in early visual cortex as it crosses the visual field, so we test how later stages of the brain combine the signals from these different areas. Overall, fast processes somehow work together with very sluggish ones to yield conscious perception. Recently, we've begun experiments on how temporal limits constrain our interactions with moving objects.

Collaborators: Alex White, Christina J. Howard, Jay Edelman, Patrick Cavanagh, Jun Saiki, Gene Stoner, Eli Brenner, Sasha Klistorner, Tatjana Seizova-Cajic, Evelyn Smith-Bergelund, University Perception Group, Eric Altschuler, Daniel Linares, Maryam Vaziri-Pashkam...

