Our Research
[Image: photo of two birds overlaid with a typical set of eye movements]
Our perception is of a seamless visual world, but in reality a sharp image is available only at the very centre of gaze. Eye movements and visual memory work in concert so we can explore and understand our visual environment.

To interact with the world, your brain constructs and maintains internal representations of it. Our aim is to understand the nature of these representations and the computations performed on them to achieve behavioural goals. In the lab, we use visual psychophysics and memory tasks, extended reality (XR) technology, eye and body movement recordings, mathematical models and artificial neural networks. We also collaborate with researchers using brain imaging, recording and stimulation, and with neuropsychologists who study cognitive ageing, mental illness and neurological disorders.

A major focus of the lab is visual working memory. Our ability to recall details of what we have just seen is remarkably limited: our work has shown that the limit is not a maximum number of objects we can remember, but the total resolution with which visual information can be maintained. Visual memory acts like a resource that can be allocated to important information in our environment: we investigate how this resource is distributed between features of the visual scene and how this affects our perception, decisions and actions.
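As a rough illustration of what a resource-limited memory implies, the sketch below simulates recall of a circular feature (for example, a colour on a colour wheel) under two simplifying assumptions: precision per item falls as a power law of the number of items held in memory, and recall errors follow a von Mises distribution with that precision as its concentration. The function names and parameter values are illustrative, not our fitted models.

```python
import numpy as np

def simulate_recall_errors(set_size, n_trials=10_000, j1=20.0, alpha=1.0, rng=None):
    """Illustrative resource-model simulation (not the lab's fitting code).
    Total precision is fixed, so precision per item falls as j1 / set_size**alpha;
    recall errors for a circular feature are drawn from a von Mises distribution
    whose concentration equals the per-item precision."""
    rng = np.random.default_rng() if rng is None else rng
    precision_per_item = j1 / set_size ** alpha
    # Errors are deviations (in radians) of the reported value from the true value.
    return rng.vonmises(mu=0.0, kappa=precision_per_item, size=n_trials)

if __name__ == "__main__":
    for n in (1, 2, 4, 8):
        err = simulate_recall_errors(set_size=n)
        print(f"{n} items: SD of recall error = {np.degrees(err.std()):5.1f} deg")
```

Under these assumptions, doubling the number of items roughly halves the precision available to each, so recall error grows smoothly with memory load instead of hitting a fixed item limit.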

In everyday life, we redirect our gaze several times per second in order to extract detailed information from the world around us. These shifts of visual attention represent a simple case of exploration and decision-making behaviour. Our research has shown that working memory plays a vital role in bridging discrete transitions in visual input, so that processing does not have to begin anew after each eye movement.

In the brain, information about our environment and our planned actions is encoded in the joint spiking activity of populations of neurons. We develop models based on neural coding principles to identify mechanisms that are compatible with our experimental observations of human perception and behaviour. We also have long-standing interests in sensory prediction and motor learning, particularly in relation to tactile attenuation.
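For illustration only, here is a toy version of this idea, not our published models: a population of tuned, Poisson-spiking neurons shares a fixed spike budget across memorised items, and a remembered feature is read out by maximum-likelihood decoding. With more items, each item receives fewer spikes and the decoded estimate becomes more variable.

```python
import numpy as np

def decode_orientation(theta_true, set_size, n_neurons=64, total_gain=100.0, rng=None):
    """Toy population-coding sketch (assumptions: von Mises tuning curves, Poisson
    spiking, and a fixed total spike budget shared equally between items, so gain
    per item = total_gain / set_size).  Only one item's population is simulated;
    the other items enter solely through the reduced gain.  Returns the
    maximum-likelihood estimate of the encoded feature from one sample of spikes."""
    rng = np.random.default_rng() if rng is None else rng
    preferred = np.linspace(-np.pi, np.pi, n_neurons, endpoint=False)
    gain = total_gain / set_size
    # Expected spike count of each neuron: von Mises tuning centred on its preferred value.
    rates = gain * np.exp(np.cos(theta_true - preferred)) / (2 * np.pi * np.i0(1.0))
    spikes = rng.poisson(rates)
    # Maximum-likelihood decoding: evaluate the Poisson log-likelihood on a grid.
    candidates = np.linspace(-np.pi, np.pi, 360, endpoint=False)
    tuning = np.cos(candidates[:, None] - preferred)           # (n_candidates, n_neurons)
    lam = gain * np.exp(tuning) / (2 * np.pi * np.i0(1.0))     # expected counts per candidate
    loglik = tuning @ spikes - lam.sum(axis=1)                 # terms constant in theta dropped
    return candidates[np.argmax(loglik)]

if __name__ == "__main__":
    for n in (1, 2, 4, 8):
        errs = [decode_orientation(0.0, n) for _ in range(2000)]
        print(f"{n} items: decoding error SD = {np.degrees(np.std(errs)):4.1f} deg")
```

The increase in decoding variability with set size mirrors the increase in behavioural recall error, which is the kind of correspondence these models are built to test.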

Joining the lab

To apply for an internship or for an MPhil or PhD place in the Bays lab, please complete this form. If you are interested in a postdoctoral position, please contact me by email.

[Image: plots of reaching movement paths showing increasing scatter from left to right]
Reaching movements to a remembered target. As the number of locations held in memory at one time increases (from left to right; other targets not shown), each location gets a smaller fraction of working memory resources, leading to greater variability in recall.
News from @Bayslab.org
10 Jul 2024
A systematic dissection of the analogue report methodology and comparison with "gold-standard" 2AFC. Key findings: motor noise is negligible; analogue report error is a robust measure of memory fidelity, but may underestimate perceptual precision doi.org/10.3758/s13428
11 Jun 2024
Out now in NHB (finally!), a review of visual working memory from a computational perspective, with Sebastian Schneegans and co-authors. nature.com/articles/s4156…
14 May 2024
Nice commentary on our new eLife paper doi.org/10.7554/eLife.…
13 Feb 2024
Yes, I believe so. It's a dramatic demo (I tried it at VSS once) but doesn't need any extra mechanism to explain it over the standard illusion.
12 Feb 2024
The size-weight illusion is a by-product of efficient sensory coding adapted to the combinations of volume and mass found in everyday objects. New preprint biorxiv.org/content/10.110…
28 Sep 2023
RT: 1/ 🔥 Hot off the press: my latest collaborative paper. Ever wondered how Bayes' theo…
08 May 2023
RT: “Among articles stating that data was available upon request, only 17% shared data upon request.” https://t.co/L84lnKMgP…
13 Jan 2023
We measured how effectively observers can reallocate working memory resources to new visual items when old ones become obsolete - people are surprisingly good at it! New paper with @dataforyounz link.springer.com/article/10.375…
13 Jan 2023
RT: The difference between saccades and smooth pursuit eye movements, as demonstrated by sticking gross plastic thingies on…
10 Jan 2023
A critique of the psychological similarity account of working memory errors: collaborative work now out in JEP:LMC biorxiv.org/content/10.110…
07 Oct 2022
RT: New review on visual working memory with Sebastian Schneegans and co-authors: psyarxiv.com/kubr9/
04 Jul 2022
New work with Jess McMaster & others: we show swap errors (item confusions) in cued recall are not a strategic response to forgotten items, but instead occur at exactly the rate predicted by variability in recall of the cue features sciencedirect.com/science/articl…
17 Feb 2022
New in Psych Review with Sebastian Schneegans & Jess McMaster: comparing the roles of time and space in binding features in working memory pubmed.ncbi.nlm.nih.gov/35099211/