A pinboard by
Tristan Chaplin

Graduate Student, Monash University


Electrophysiology and machine learning to study how networks of neurons support motion perception

I study how the activity of neurons (brain cells) results in our ability to perceive moving objects. There is a specialised part of the brain, called the middle temporal area, in which the neurons respond to motion. The activity of these neurons indicates not only when something moves, but also its direction and speed.

In the first part of my project, I investigated how single neurons respond to motion when distracting, noise-like motion is also present. In the real world, there are often many things moving at the same time, and they can be partially obscured by other objects. Our brains must do the best they can with the information available so that we can see important events, like a fast-approaching car! I have found that some neurons are far more tolerant to distracting motion than others, and they likely play a key role in our perception of motion under these conditions.

Secondly, I am examining how networks of cells combine to support motion perception. While individual neurons are reasonably good at encoding motion, our motion perception abilities are no doubt the result of many neurons acting in concert. Understanding this is a difficult, multi-dimensional computational problem, and I am currently using machine learning methods to analyse large datasets to gain insight into this question. Preliminary analyses indicate that correlations in activity between some pairs of neurons are a key factor in determining how much information a population of neurons can represent.
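The intuition behind that preliminary finding can be sketched with a toy calculation (all numbers below are hypothetical, not results from my data): for a pool of identically tuned neurons with uniform noise correlation, the shared noise cannot be averaged away, so the pooled sensitivity saturates no matter how many neurons are added.

```python
import numpy as np

def pooled_dprime(n, dprime_single, c):
    """Discriminability (d') of the summed response of n identically tuned
    unit-variance neurons, each with single-neuron sensitivity
    dprime_single, sharing a uniform noise correlation c.

    Signal of the sum grows as n; variance of the sum is
    n + n*(n-1)*c, so for large n the pooled d' approaches
    dprime_single / sqrt(c) instead of growing without bound."""
    return n * dprime_single / np.sqrt(n + n * (n - 1) * c)

# Even a small correlation caps the information in a large population.
for c in (0.0, 0.1, 0.2):
    print(f"c = {c}: pooled d' for 100 neurons = "
          f"{pooled_dprime(100, 0.5, c):.2f}")
```

With c = 0 the pooled d' of 100 neurons is 10x the single-neuron value, but with c = 0.1 it is capped near 0.5 / sqrt(0.1) ≈ 1.6, regardless of population size.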

Finally, I have investigated whether the neurons responsible for seeing objects move are the same neurons that are responsible for hearing objects move, and additionally, whether the visual neurons can integrate information from the auditory system. For example, you may see a car approach you, but you can also hear it. If you can combine that information, perhaps you will be faster at getting out of the way! My findings demonstrate that perceiving visual and auditory motion involves two distinct brain systems, and that when we integrate audio-visual motion signals, we likely use yet another part of the brain.


Sensitivity of Neurons in the Middle Temporal Area of Marmoset Monkeys to Random Dot Motion.

Abstract: Neurons in the Middle Temporal area (MT) of the primate cerebral cortex respond to moving visual stimuli. The sensitivity of MT neurons to motion signals can be characterized by using random-dot stimuli, in which the strength of the motion signal is manipulated by adding different levels of noise (elements that move in random directions). In macaques, this has allowed the calculation of "neurometric" thresholds. We characterized the responses of MT neurons in sufentanil/nitrous oxide anesthetized marmoset monkeys, a species which has attracted considerable recent interest as an animal model for vision research. We found that MT neurons show a wide range of neurometric thresholds, and that the responses of the most sensitive neurons could account for the behavioral performance of macaques and humans. We also investigated factors that contributed to the wide range of observed thresholds. The difference in firing rate between responses to motion in the preferred and null directions was the most effective predictor of neurometric threshold, whereas the direction tuning bandwidth had no correlation with the threshold. We also showed that it is possible to obtain reliable estimates of neurometric thresholds using stimuli that were not highly optimized for each neuron, as is often necessary when recording from large populations of neurons with different receptive fields concurrently, as was the case in this study. These results demonstrate that marmoset MT shows an essential physiological similarity to macaque MT, and suggest that its neurons are capable of representing motion signals that allow for comparable motion-in-noise judgments.
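As a rough sketch of the neurometric-threshold procedure described above (simulated spike counts, not data from the paper): ROC analysis compares the spike-count distributions for preferred- versus null-direction motion at each coherence level, and the threshold is the coherence at which the ROC area reaches a criterion. The rates, criterion placement, and interpolation below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def roc_area(pref, null):
    """Probability that a randomly drawn preferred-direction response
    exceeds a randomly drawn null-direction response (area under the
    ROC curve), with ties counted as half."""
    pref = np.asarray(pref)[:, None]
    null = np.asarray(null)[None, :]
    return (pref > null).mean() + 0.5 * (pref == null).mean()

# Simulated Poisson spike counts whose preferred-direction rate grows
# with motion coherence (hypothetical tuning, for illustration only).
coherences = np.array([0.05, 0.1, 0.2, 0.4, 0.8])
areas = []
for c in coherences:
    pref = rng.poisson(10 + 40 * c, size=1000)           # preferred direction
    null = rng.poisson(max(10 - 40 * c, 1), size=1000)   # null direction
    areas.append(roc_area(pref, null))

# Neurometric threshold: coherence at which the ROC area crosses 0.82,
# a criterion in the spirit of the classic macaque studies; simple
# linear interpolation stands in for the usual Weibull fit.
threshold = np.interp(0.82, areas, coherences)
print("ROC areas:", np.round(areas, 2), "threshold:", threshold)
```

In the published analyses the neurometric curve is typically fit with a Weibull function rather than interpolated, but the ROC computation itself is the core of the method.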

Pub.: 24 Jun '17, Pinned: 27 Aug '17

Estimates of the contribution of single neurons to perception depend on timescale and noise correlation.

Abstract: The sensitivity of a population of neurons, and therefore the amount of sensory information available to an animal, is limited by the sensitivity of single neurons in the population and by noise correlation between neurons. For decades, therefore, neurophysiologists have devised increasingly clever and rigorous ways to measure these critical variables (Parker and Newsome, 1998). Previous studies examining the relationship between the responses of single middle temporal (MT) neurons and direction-discrimination performance uncovered an apparent paradox. Sensitivity measurements from single neurons suggested that small numbers of neurons may account for a monkey's psychophysical performance (Britten et al., 1992), but trial-to-trial variability in activity of single MT neurons is only weakly correlated with the monkey's behavior, suggesting that the monkey's decision must be based on the responses of many neurons (Britten et al., 1996). We suggest that the resolution to this paradox lies (1) in the long stimulus duration used in the original studies, which led to an overestimate of neural sensitivity relative to psychophysical sensitivity, and (2) in mistaken assumptions (because no data were available) about the level of noise correlation in MT columns with opposite preferred directions. We therefore made new physiological and psychophysical measurements in a reaction time version of the direction-discrimination task that matches neural measurements to the actual decision time of the animals. These new data, considered together with our recent data on noise correlation in MT (Cohen and Newsome, 2008), provide a substantially improved account of psychometric performance in the direction-discrimination task.
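The first point in this abstract, that sensitivity estimates depend on integration time, can be illustrated with a toy Poisson model (the firing rates are hypothetical, not taken from the paper): the spike-count signal grows linearly with stimulus duration while the count noise grows only as its square root, so d' scales as sqrt(t) and long stimuli inflate apparent neural sensitivity relative to the animal's shorter decision time.

```python
import numpy as np

def dprime_vs_duration(rate_pref, rate_null, t):
    """d' for Poisson spike counts accumulated over t seconds.

    Mean counts grow linearly with t, but the count standard deviation
    grows only as sqrt(t), so sensitivity scales as sqrt(t)."""
    mu_p, mu_n = rate_pref * t, rate_null * t
    return (mu_p - mu_n) / np.sqrt((mu_p + mu_n) / 2)

# Doubling the counting window raises d' by sqrt(2):
for t in (0.25, 0.5, 1.0, 2.0):
    print(f"t = {t:.2f} s: d' = {dprime_vs_duration(30, 20, t):.2f}")
    # -> 1.00, 1.41, 2.00, 2.83
```

Under these assumptions, measuring over a 2 s stimulus instead of the animal's ~0.5 s decision window doubles the estimated neural sensitivity, in line with the overestimate the abstract describes.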

Pub.: 22 May '09, Pinned: 27 Aug '17

Sound facilitates visual learning.

Abstract: Numerous studies show that practice can result in performance improvements on low-level visual perceptual tasks [1-5]. However, such learning is characteristically difficult and slow, requiring many days of training [6-8]. Here, we show that a multisensory audiovisual training procedure facilitates visual learning and results in significantly faster learning than unisensory visual training. We trained one group of subjects with an audiovisual motion-detection task and a second group with a visual motion-detection task, and compared performance on trials containing only visual signals across ten days of training. Whereas observers in both groups showed improvements of visual sensitivity with training, subjects trained with multisensory stimuli showed significantly more learning both within and across training sessions. These benefits of multisensory training are particularly surprising given that the learning of visual motion stimuli is generally thought to be mediated by low-level visual brain areas [6, 9, 10]. Although crossmodal interactions are ubiquitous in human perceptual processing [11-13], the contribution of crossmodal information to perceptual learning has not been studied previously. Our results show that multisensory interactions can be exploited to yield more efficient learning of sensory information and suggest that multisensory training programs would be most effective for the acquisition of new skills.

Pub.: 25 Jul '06, Pinned: 24 Aug '17