Improving AI by assessing where your eyes will look

Posted Dec 15, 2018 by Tim Sandle
Using precise brain measurements, scientists have successfully predicted how people's eyes move when viewing natural scenes. Understanding this is key to improving vision-based artificial intelligence.
Microsoft's Seeing AI uses artificial intelligence to work out what's going on in the world and provide an audio description to the user
© Microsoft
The new advance in understanding the human visual system is expected to lead to improvements across a range of artificial intelligence efforts, especially those based on visual systems. This could aid the application of emergent technologies like driverless cars. Another application is the use of embedded vision to augment traditional law enforcement techniques in real-world surveillance settings, using artificial intelligence to assist with image recognition and verification.
To improve vision-based artificial intelligence, scientists from Yale University undertook a study into how people's eyes move when viewing natural scenes, such as taking in the vista of a mountain range. Such developments fall within the science of computer vision, which covers processes for capturing and analyzing real-world images and video so that machines can extract meaningful, contextual information from the physical world. This type of image recognition works by building a neural network that processes the individual pixels of an image.
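To make the pixel-processing idea concrete, here is a minimal sketch of the convolution operation at the heart of such networks: a small kernel slides over the image's pixels, and each output value summarizes one local patch. This is an illustrative toy, not the networks used in the study; the `conv2d` function and the edge-detector kernel are assumptions for demonstration.

```python
import numpy as np

def conv2d(image, kernel):
    """Slide a kernel over the image, producing a feature map.

    Each output cell is the sum of an image patch multiplied
    element-wise by the kernel -- the basic pixel-level operation
    in a convolutional neural network.
    """
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy 6x6 "image": dark left half, bright right half.
image = np.zeros((6, 6))
image[:, 3:] = 1.0

# A simple vertical-edge-detector kernel.
edge_kernel = np.array([[1, 0, -1],
                        [1, 0, -1],
                        [1, 0, -1]])

feature_map = conv2d(image, edge_kernel)
# The feature map responds strongly only where the dark/bright
# boundary falls inside the kernel's window.
```

Stacking many such learned kernels, with nonlinearities in between, is what lets a deep network turn raw pixels into recognizable objects.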
The new study found that by analyzing brain responses to complex, natural scenes, scientists can now predict where people will direct their attention and gaze. Having established this, it has been possible to apply the findings to deep convolutional neural networks, which are commonly used in artificial intelligence, providing a basis for improving image recognition.
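One simple way such gaze predictions could feed into a network is as a saliency map: a predicted fixation-density map that re-weights the input so downstream processing emphasizes the regions people actually look at. The sketch below is a hypothetical illustration of that idea, not the paper's method; the Gaussian saliency model and both function names are assumptions.

```python
import numpy as np

def gaussian_saliency(h, w, cy, cx, sigma):
    """A toy predicted fixation-density map: a Gaussian centred on
    the location a viewer is expected to look at first."""
    ys, xs = np.mgrid[0:h, 0:w]
    sal = np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2 * sigma ** 2))
    return sal / sal.sum()  # normalize to a probability map

def attention_weighted_input(image, saliency):
    """Re-weight pixels by predicted gaze, so a downstream
    convolutional network emphasises fixated regions."""
    return image * (saliency / saliency.max())

# Toy 8x8 grayscale image with a predicted fixation at (3, 4).
image = np.random.default_rng(0).random((8, 8))
sal = gaussian_saliency(8, 8, cy=3, cx=4, sigma=2.0)
weighted = attention_weighted_input(image, sal)
# Pixels near the predicted fixation keep their value;
# pixels far away are suppressed toward zero.
```

In practice, a saliency map like this could also be learned jointly with the recognition network rather than supplied as a fixed mask.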
Commenting on the research, Marvin Chun from Yale said: "We are visual beings and knowing how the brain rapidly computes where to look is fundamentally important...The work represents a perfect marriage of neuroscience and data science."
The study has been published in the journal Nature Communications. The research paper is titled "Predicting eye movement patterns from fMRI responses to natural scenes."