21st May 2018
Curated by Endre Szvetnik
Researchers using AI for a navigational problem ended up with a system that developed virtual brain cells similar to those helping mammals orient themselves.
In 10 seconds? Artificial neural networks looking for the best path can now learn to mimic the functions of mammals’ navigational brain cells, which help us intuitively choose shortcuts. (Read the science)
What, AI now has brain cells? Sort of! But the cool bit is that the scientists didn’t set out to create brain cells. They fed a blank ‘neural network’ inputs simulating the part of the brain responsible for spatial awareness. To their surprise, they ended up with an AI that mimicked grid cells, which form an imaginary hexagonal grid helping mammals to orient themselves. (More about grid cells)
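To make the setup concrete, here is a minimal sketch of the kind of task the network was given: it sees only a stream of velocities and must work out where it is, i.e. integrate its own movement over time. The arena size, step counts and noise levels below are illustrative placeholders, not values from the study.

```python
# Minimal sketch of a path-integration task (illustrative parameters only):
# the network would receive the velocity stream and be trained to output
# the true position -- the kind of training that produced grid-like cells.
import numpy as np

rng = np.random.default_rng(0)

def simulate_trajectory(steps=100, arena=2.2):
    """Random walk in a square arena; returns velocity inputs and true positions."""
    pos = np.full(2, arena / 2)           # start in the centre of the arena
    velocities, positions = [], []
    for _ in range(steps):
        v = rng.normal(0, 0.05, size=2)   # small random velocity at each step
        pos = np.clip(pos + v, 0, arena)  # the walls keep the agent inside
        velocities.append(v)
        positions.append(pos.copy())
    return np.array(velocities), np.array(positions)

vels, poss = simulate_trajectory()
```

The interesting part in the actual research is that solving this kind of dead-reckoning task was enough for grid-like activity patterns to emerge on their own.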
And why are these virtual grid cells big news? Because they allowed the system to beat humans and conventional AI at finding shortcuts. In the study, both the grid-equipped AI and a conventional one learnt to navigate a maze past a closed door. When the door was later opened, only the grid-equipped AI recognised it as a shortcut, letting it shave two thirds off the journey. (Read more)
How did this AI learn to find the shortcut? Oddly enough, the researchers don’t know exactly, but they do know what helped: they gave the AI ‘rewards’ when it performed well, a method called reinforcement learning. This resulted in the AI developing a superior ability to find shortcuts. (More about reinforcement learning)
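The reward idea can be shown with a toy example. Below is a generic tabular Q-learning agent in a five-state corridor, not DeepMind’s actual agent: it gets a reward only at the goal, and the update rule gradually propagates that reward backwards until the agent prefers moving towards it from every state.

```python
# Toy reinforcement learning: tabular Q-learning in a 5-state corridor.
# A generic illustration of learning from rewards, not the paper's agent.
import random

random.seed(0)
N = 5                                 # states 0..4; the goal is state 4
q = [[0.0, 0.0] for _ in range(N)]    # Q-values per state for actions 0=left, 1=right

def step(state, action):
    nxt = max(0, min(N - 1, state + (1 if action == 1 else -1)))
    reward = 1.0 if nxt == N - 1 else 0.0   # reward only on reaching the goal
    return nxt, reward

for _ in range(200):                  # training episodes
    s = 0
    while s != N - 1:
        # mostly greedy, occasionally explore a random action
        a = random.randrange(2) if random.random() < 0.1 else max((0, 1), key=lambda x: q[s][x])
        nxt, r = step(s, a)
        # Q-learning update: nudge the value toward reward + discounted future value
        q[s][a] += 0.5 * (r + 0.9 * max(q[nxt]) - q[s][a])
        s = nxt

# After training, the greedy policy heads right (towards the reward) everywhere.
policy = [max((0, 1), key=lambda x: q[s][x]) for s in range(N - 1)]
```

The key point is that nobody tells the agent *how* to reach the goal; the behaviour falls out of chasing rewards, which is also why the researchers can’t fully explain the strategies their AI discovered.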
Amazing, so AI is now learning from animals? We could say so. Other researchers successfully trained a neural network to behave like a dog and found it developed visual intelligence, recognising food, living beings and obstacles. The program also learnt what surfaces dogs would run on and what they would avoid. Coding this data into the system would have taken a lot more effort. (Read more)
How does all this advance the field? These results reinforce the links between neuroscience and AI design. ‘Virtual’ grid cells confirm existing knowledge about their real-world counterparts and suggest ways to map brain functions. They also throw up ideas about how to develop AI navigation – such as for drones to seek out survivors in a collapsed building. Looks like animals can teach robots a trick or two!
A little help from a friend called Kelp
University of Washington researchers used a malamute dog, Kelp, to gather canine data.
They mounted a GoPro camera on his head and attached motion sensors to his body.
With this kit, they recorded Kelp going about his daily routine and matched the motion data with the video.
Once this data was fed into the neural network, the AI was able to predict the dog’s behaviour from visual cues.
For example, if someone threw a stick, the virtual dog would run to fetch it.
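The pairing of video with recorded actions boils down to a supervised learning problem: map what the dog sees to what the dog does next. Here is a deliberately tiny sketch using a nearest-neighbour lookup; the two features and the action labels are invented placeholders (the real study extracted features from GoPro frames with a deep network).

```python
# Tiny sketch of "predict the dog's action from what it sees".
# Features and labels are invented placeholders for illustration only.
import numpy as np

# Each frame is summarised by two toy features
# (e.g. "stick visible", "how far away it is"), labelled with the recorded action.
X_train = np.array([[1, 0.9], [1, 0.7], [0, 0.1], [0, 0.3]])
y_train = np.array(["fetch", "fetch", "sit", "sit"])

def predict(x):
    """1-nearest-neighbour: copy the action from the most similar training frame."""
    dists = np.linalg.norm(X_train - x, axis=1)
    return y_train[np.argmin(dists)]

print(predict(np.array([1, 0.8])))   # a frame with a thrown stick -> "fetch"
```

With enough paired frames and motion readings, the same idea scales up to the visual intelligence the researchers observed, without anyone hand-coding what food, obstacles or walkable surfaces look like.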
(Psst, Endre distilled 12 research papers to save you 801.9 min)