PostDoc, Columbia University (NY)
Blind and blindfolded subjects navigate virtual reality mazes during fMRI
When we navigate, many visual regions of the brain are recruited - but is this because these regions merely serve as a gateway for visual information, or because they play an active part in navigation itself? To explore this, we examined what happens when humans navigate without vision, using a virtual cane that provides auditory cues: people who are blind from birth and blindfolded sighted people were compared to people navigating visually. We found that visual regions were recruited in all three groups, showing that they indeed play an inherent navigational role, even when navigating in the absence of vision.
Abstract: Graphical virtual environments are currently far from accessible to blind users, as their content is mostly visual. This is especially unfortunate as these environments hold great potential for this population for purposes such as safe orientation, education, and entertainment. Previous tools have increased accessibility, but there is still a long way to go. Visual-to-audio Sensory-Substitution-Devices (SSDs) can increase accessibility generically by sonifying on-screen content regardless of the specific environment, and offer increased accessibility without the use of expensive dedicated peripherals like electrode/vibrator arrays. Using SSDs virtually utilizes similar skills as when using them in the real world, enabling both training on the device and training on environments virtually before real-world visits. This could enable more complex, standardized and autonomous SSD training and new insights into multisensory interaction and the visually-deprived brain. However, whether congenitally blind users, who have never experienced virtual environments, will be able to use this information for successful perception and interaction within them is currently unclear. We tested this using the EyeMusic SSD, which conveys whole-scene visual information, to perform virtual tasks otherwise impossible without vision. Congenitally blind users had to navigate virtual environments and find doors, differentiate between them based on their features (Experiment 1: task 1) and surroundings (Experiment 1: task 2), and walk through them; these tasks were accomplished with 95% and 97% success rates, respectively. We further explored the reactions of congenitally blind users during their first interaction with a more complex virtual environment than in the previous tasks: walking down a virtual street, recognizing different features of houses and trees, navigating to cross-walks, etc. Users reacted enthusiastically and reported feeling immersed within the environment.
They highlighted the potential usefulness of such environments for understanding what visual scenes are supposed to look like, noted their potential for complex training, and suggested many future environments they wished to experience.
Pub.: 18 Feb '16, Pinned: 17 Oct '17
Abstract: Virtual environments are becoming ubiquitous, and are used in a variety of contexts, from entertainment to training and rehabilitation. Recently, technology for making them more accessible to blind or visually impaired users has been developed, by using sound to represent visual information. The ability of older individuals to interpret these cues has not yet been studied. In this experiment, we studied the effects of age and sensory modality (visual or auditory) on navigation through a virtual maze. We added a layer of complexity by conducting the experiment in a rotating room, in order to test the effect of the spatial bias induced by the rotation on performance. Results from 29 participants showed that with the auditory cues, participants took a longer time to complete the mazes, took a longer path length through the maze, paused more, and had more collisions with the walls, compared to navigation with the visual cues. The older group took a longer time to complete the mazes, paused more, and had more collisions with the walls, compared to the younger group. There was no effect of room rotation on performance, nor were there any significant interactions among age, feedback modality and room rotation. We conclude that there is a decline in performance with age, and that while navigation with auditory cues is possible even at an old age, it presents more challenges than visual navigation.
Pub.: 24 Mar '16, Pinned: 17 Oct '17
Abstract: One of the most stirring statistics in relation to the mobility of blind individuals is the high rate of upper body injuries, even when using the white-cane. We here addressed a rehabilitation-oriented challenge: providing a reliable tool for blind people to avoid waist-up obstacles, one of the impediments to their successful mobility using currently available methods (e.g., the white-cane). We used the EyeCane, a device we developed which translates distances from several angles into haptic and auditory cues in an intuitive and unobtrusive manner, serving both as a primary and secondary mobility aid. We investigated the rehabilitation potential of such a device in facilitating visionless waist-up body protection. After ∼5 minutes of training with the EyeCane, blind participants were able to successfully detect and avoid obstacles waist-high and up. Their success rate was significantly higher than when using the white-cane alone. As avoidance of obstacles required participants to perform an additional cognitive process after detection, the avoidance rate was significantly lower than the detection rate. Our work has demonstrated that the EyeCane has the potential to extend the sensory world of blind individuals by expanding their currently accessible inputs, and has offered them a new practical rehabilitation tool.
Pub.: 06 Feb '17, Pinned: 17 Oct '17
Abstract: How do the anatomically consistent functional selectivities of the brain emerge? A new study by Bola and colleagues reveals task selectivity in auditory rhythm-selective areas in congenitally deaf adults perceiving visual rhythm sequences. Here, we contextualize this result with accumulating evidence from animal and human studies supporting sensory-independent task specializations as a comprehensive principle shaping brain (re)organization.
Pub.: 08 Apr '17, Pinned: 17 Oct '17