Graduate Student, Monash University
Electrophysiology and machine learning to study how networks of neurons support motion perception
I study how the activity of neurons (brain cells) results in our ability to perceive moving objects. There is a specialised part of the brain, called the middle temporal area, in which the neurons respond to motion. The activity of these neurons indicates not only when something moves, but also its direction and speed.
In the first part of my project, I investigated how single neurons respond to motion when distracting, noise-like motion is also present. In the real world, there are often many things moving at the same time, and they can be partially obscured by other objects. Our brains must do the best they can with the information available so we can see important events, like a fast-approaching car! I have found that some neurons are far more tolerant of distracting motion than others, and they likely play a key role in our perception of motion in these conditions.
Secondly, I am examining how networks of cells combine to support motion perception. Whilst individual neurons are reasonably good at encoding motion, our motion perception abilities are no doubt the result of the activity of many neurons acting in concert. Understanding this is a difficult, multi-dimensional computational problem, and I am currently using machine learning methods to analyse large datasets to gain insight into this question. Preliminary analyses indicate that correlations in activity between some pairs of neurons are a key factor in determining how much information a population of neurons can represent.
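The "correlations in activity" mentioned above are usually quantified as noise correlations: Pearson correlations of the trial-to-trial fluctuations of pairs of neurons around their mean response to the same stimulus. A minimal sketch, using simulated spike counts with a shared noise source standing in for real recordings (all numbers are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated spike counts: trials x neurons, with a shared noise source
# (hypothetical data standing in for real multi-neuron recordings).
n_trials, n_neurons = 200, 4
shared = rng.normal(0, 1, size=(n_trials, 1))           # common fluctuation
private = rng.normal(0, 1, size=(n_trials, n_neurons))  # independent noise
counts = 20 + 3 * (0.6 * shared + 0.8 * private)        # mean rate ~20 spikes

# Noise correlation: correlate each pair's trial-to-trial residuals
# after subtracting the mean response to the (fixed) stimulus.
residuals = counts - counts.mean(axis=0)
corr = np.corrcoef(residuals.T)

print(np.round(corr, 2))  # off-diagonal entries are the pairwise noise correlations
```

Because the shared source contributes 0.36 of each neuron's unit variance here, the off-diagonal entries come out positive, mimicking the weak positive noise correlations typically reported in cortex.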
Finally, I have investigated whether the neurons responsible for seeing objects move are the same neurons responsible for hearing objects move, and additionally, whether the visual neurons can integrate information from the auditory system. For example, you may see a car approach you, but you can also hear it. If you can combine that information, perhaps you will be faster at getting out of the way! My findings clearly demonstrate that perceiving visual and auditory motion involves two distinct brain systems, and that when we integrate audio-visual motion signals, we likely use yet another part of the brain.
Abstract: Neurons in the Middle Temporal area (MT) of the primate cerebral cortex respond to moving visual stimuli. The sensitivity of MT neurons to motion signals can be characterized by using random-dot stimuli, in which the strength of the motion signal is manipulated by adding different levels of noise (elements that move in random directions). In macaques, this has allowed the calculation of "neurometric" thresholds. We characterized the responses of MT neurons in sufentanil/nitrous oxide anesthetized marmoset monkeys, a species which has attracted considerable recent interest as an animal model for vision research. We found that MT neurons show a wide range of neurometric thresholds, and that the responses of the most sensitive neurons could account for the behavioral performance of macaques and humans. We also investigated factors that contributed to the wide range of observed thresholds. The difference in firing rate between responses to motion in the preferred and null directions was the most effective predictor of neurometric threshold, whereas the direction tuning bandwidth had no correlation with the threshold. We also showed that it is possible to obtain reliable estimates of neurometric thresholds using stimuli that were not highly optimized for each neuron, as is often necessary when recording from large populations of neurons with different receptive fields concurrently, as was the case in this study. These results demonstrate that marmoset MT shows an essential physiological similarity to macaque MT, and suggest that its neurons are capable of representing motion signals that allow for comparable motion-in-noise judgments.
Pub.: 24 Jun '17, Pinned: 27 Aug '17
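The neurometric threshold described in the abstract above can be sketched in a few lines. This is a toy simulation, not the marmoset data: the Poisson firing rates and trial counts are invented, the ideal-observer performance at each coherence is the standard ROC area between preferred- and null-direction response distributions, and a simple interpolation at 82% correct stands in for the full Weibull fit normally used:

```python
import numpy as np

rng = np.random.default_rng(1)

def roc_area(pref, null):
    """Ideal-observer probability of choosing the preferred direction:
    the fraction of (pref, null) response pairs with pref > null (ties count half)."""
    p, n = pref[:, None], null[None, :]
    return (p > n).mean() + 0.5 * (p == n).mean()

# Hypothetical responses of one MT neuron to random-dot stimuli at several
# motion coherences, for motion in the preferred vs. the null direction.
coherences = np.array([0.05, 0.1, 0.2, 0.4, 0.8])
areas = []
for c in coherences:
    pref = rng.poisson(20 + 40 * c, size=100)  # rate grows with signal strength
    null = rng.poisson(20 - 15 * c, size=100)  # rate falls in the null direction
    areas.append(roc_area(pref, null))
areas = np.array(areas)

# Neurometric threshold: the coherence at which the ideal observer reaches
# 82% correct (the criterion applied to Weibull fits), here estimated by
# linear interpolation of the ROC areas.
threshold = np.interp(0.82, areas, coherences)
print(f"ROC areas: {np.round(areas, 2)}, threshold ~ {threshold:.2f}")
```

With these made-up rates the ROC area rises smoothly from near chance (0.5) toward 1.0 as coherence increases, and the threshold falls in the 0.1–0.3 coherence range; a more sensitive neuron (steeper rate difference between preferred and null) would yield a lower threshold.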
Abstract: The relationship between neuronal activity and psychophysical judgement has long been of interest to students of sensory processing. Previous analyses of this problem have compared the performance of human or animal observers in detection or discrimination tasks with the signals carried by individual neurons, but have been hampered because neuronal and perceptual data were not obtained at the same time and under the same conditions. We have now measured the performance of monkeys and of visual cortical neurons while the animals performed a psychophysical task well matched to the properties of the neurons under study. Here we report that the reliability and sensitivity of most neurons on this task equalled or exceeded that of the monkeys. We therefore suggest that under our conditions, psychophysical judgements could be based on the activity of a relatively small number of neurons.
Pub.: 07 Sep '89, Pinned: 31 Aug '17
Abstract: The sensitivity of a population of neurons, and therefore the amount of sensory information available to an animal, is limited by the sensitivity of single neurons in the population and by noise correlation between neurons. For decades, therefore, neurophysiologists have devised increasingly clever and rigorous ways to measure these critical variables (Parker and Newsome, 1998). Previous studies examining the relationship between the responses of single middle temporal (MT) neurons and direction-discrimination performance uncovered an apparent paradox. Sensitivity measurements from single neurons suggested that small numbers of neurons may account for a monkey's psychophysical performance (Britten et al., 1992), but trial-to-trial variability in activity of single MT neurons is only weakly correlated with the monkey's behavior, suggesting that the monkey's decision must be based on the responses of many neurons (Britten et al., 1996). We suggest that the resolution to this paradox lies (1) in the long stimulus duration used in the original studies, which led to an overestimate of neural sensitivity relative to psychophysical sensitivity, and (2) in mistaken assumptions (because no data were available) about the level of noise correlation in MT columns with opposite preferred directions. We therefore made new physiological and psychophysical measurements in a reaction time version of the direction-discrimination task that matches neural measurements to the actual decision time of the animals. These new data, considered together with our recent data on noise correlation in MT (Cohen and Newsome, 2008), provide a substantially improved account of psychometric performance in the direction-discrimination task.
Pub.: 22 May '09, Pinned: 27 Aug '17
Abstract: Behavior relies on the distributed and coordinated activity of neural populations. Population activity can be measured using multi-neuron recordings and neuroimaging. Neural recordings reveal how the heterogeneity, sparseness, timing, and correlation of population activity shape information processing in local networks, whereas neuroimaging shows how long-range coupling and brain states impact on local activity and perception. To obtain an integrated perspective on neural information processing we need to combine knowledge from both levels of investigation. We review recent progress of how neural recordings, neuroimaging, and computational approaches begin to elucidate how interactions between local neural population activity and large-scale dynamics shape the structure and coding capacity of local information representations, make them state-dependent, and control distributed populations that collectively shape behavior.
Pub.: 12 Feb '15, Pinned: 24 Aug '17
Abstract: Population codes assume that neural systems represent sensory inputs through the firing rates of populations of differently tuned neurons. However, trial-by-trial variability and noise correlations are known to affect the information capacity of neural codes. Although recent studies have shown that stimulus presentation reduces both variability and rate correlations with respect to their spontaneous level, possibly improving the encoding accuracy, whether these second order statistics are tuned is unknown. If so, second-order statistics could themselves carry information, rather than being invariably detrimental. Here we show that rate variability and noise correlation vary systematically with stimulus direction in directionally selective middle temporal (MT) neurons, leading to characteristic tuning curves. We show that such tuning emerges in a stochastic recurrent network, for a set of connectivity parameters that overlaps with a single-state scenario and multistability. Information theoretic analysis shows that second-order statistics carry information that can improve the accuracy of the population code.
Pub.: 24 Jul '13, Pinned: 31 Aug '17
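One way to see the abstract's point that second-order statistics need not be detrimental is through the linear Fisher information, I = f'ᵀ C⁻¹ f', where f' is the vector of tuning-curve slopes and C the noise covariance. The numbers below are purely illustrative (not from the paper): with two pools of oppositely tuned neurons, uniform positive correlations actually increase the information, because the correlated noise lies along a direction the decoder can ignore:

```python
import numpy as np

# Linear Fisher information: I = f'^T C^{-1} f'
def fisher_info(fprime, cov):
    return fprime @ np.linalg.solve(cov, fprime)

# Toy population: two pools with opposite tuning slopes, unit variances.
fprime = np.array([1.0, 1.0, -1.0, -1.0])
var = np.ones(4)

C_indep = np.diag(var)  # independent noise

# Uniform positive noise correlation of 0.3 between all pairs.
rho = 0.3
C_corr = np.full((4, 4), rho) + np.diag(var - rho)

print(fisher_info(fprime, C_indep))  # 4.0
print(fisher_info(fprime, C_corr))   # larger than 4.0: these correlations help
```

Here f' sums to zero, so the uniformly correlated noise component is orthogonal to the informative direction and the correlated population carries more information (4/0.7 ≈ 5.7) than the independent one. If the correlations instead varied systematically with the stimulus, as the abstract reports, they could carry additional information in their own right.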
Abstract: It is well known that the nervous system combines information from different cues within and across sensory modalities to improve performance on perceptual tasks. In this article, we present results showing that in a visual motion-detection task, concurrent auditory motion stimuli improve accuracy even when they do not provide any useful information for the task. When participants judged which of two stimulus intervals contained visual coherent motion, the addition of identical moving sounds to both intervals improved accuracy. However, this enhancement occurred only with sounds that moved in the same direction as the visual motion. Therefore, it appears that the observed benefit of auditory stimulation is due to auditory-visual interactions at a sensory level. Thus, auditory and visual motion-processing pathways interact at a sensory-representation level in addition to the level at which perceptual estimates are combined.
Pub.: 01 Dec '11, Pinned: 27 Aug '17
Abstract: Numerous studies show that practice can result in performance improvements on low-level visual perceptual tasks [1-5]. However, such learning is characteristically difficult and slow, requiring many days of training [6-8]. Here, we show that a multisensory audiovisual training procedure facilitates visual learning and results in significantly faster learning than unisensory visual training. We trained one group of subjects with an audiovisual motion-detection task and a second group with a visual motion-detection task, and compared performance on trials containing only visual signals across ten days of training. Whereas observers in both groups showed improvements of visual sensitivity with training, subjects trained with multisensory stimuli showed significantly more learning both within and across training sessions. These benefits of multisensory training are particularly surprising given that the learning of visual motion stimuli is generally thought to be mediated by low-level visual brain areas [6, 9, 10]. Although crossmodal interactions are ubiquitous in human perceptual processing [11-13], the contribution of crossmodal information to perceptual learning has not been studied previously. Our results show that multisensory interactions can be exploited to yield more efficient learning of sensory information and suggest that multisensory training programs would be most effective for the acquisition of new skills.
Pub.: 25 Jul '06, Pinned: 24 Aug '17