Mind-reading AI recreates what you’re looking at with amazing accuracy

Michael Le Page in New Scientist:

Artificial intelligence systems can now create remarkably accurate reconstructions of what someone is looking at based on recordings of their brain activity. These reconstructed images are greatly improved when the AI learns which parts of the brain to pay attention to. “As far as I know, these are the closest, most accurate reconstructions,” says Umut Güçlü at Radboud University in the Netherlands. Güçlü’s team is one of several around the world using AI systems to work out what animals or people are seeing from brain recordings and scans. In one previous study, his team used a functional MRI (fMRI) scanner to record the brain activity of three people as they were shown a series of photographs.

In another study, the team used implanted electrode arrays to directly record the brain activity of a single macaque monkey as it looked at AI-generated images. This implant was done for other purposes by another team, says Güçlü’s colleague Thirza Dado, also at Radboud University. “The macaque was not implanted so that we can do reconstruction of perception,” she says. “That is not a good argument to do surgery on animals.” The team has now reanalysed the data from these previous studies using an improved AI system that can learn which parts of the brain it should pay most attention to.
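The excerpt doesn't spell out how "learning which parts of the brain to pay most attention to" is implemented; the Radboud team's actual architecture isn't described here. As a rough, illustrative sketch only, a learned attention weighting over voxel responses feeding an image-feature readout might look something like this (the module name, shapes, and PyTorch framing are all assumptions, not the team's method):

```python
# Illustrative toy sketch of "learning which voxels to attend to".
# The actual model used by the Radboud group is not described in the excerpt;
# every name, shape, and the attention scheme here is an assumption.
import torch
import torch.nn as nn

class VoxelAttentionDecoder(nn.Module):
    """Map brain-activity vectors to an image-feature latent, with learned per-voxel weights."""

    def __init__(self, n_voxels: int, latent_dim: int):
        super().__init__()
        # One learnable score per voxel; softmax turns the scores into attention weights.
        self.voxel_scores = nn.Parameter(torch.zeros(n_voxels))
        # Linear readout from attention-weighted activity to a latent code that an
        # image generator (e.g. a pretrained decoder) could turn back into a picture.
        self.readout = nn.Linear(n_voxels, latent_dim)

    def forward(self, voxel_activity: torch.Tensor) -> torch.Tensor:
        # voxel_activity: (batch, n_voxels) fMRI or electrode responses.
        weights = torch.softmax(self.voxel_scores, dim=0)   # (n_voxels,)
        attended = voxel_activity * weights                  # emphasise informative voxels
        return self.readout(attended)                        # (batch, latent_dim)

if __name__ == "__main__":
    model = VoxelAttentionDecoder(n_voxels=5000, latent_dim=512)
    fake_scan = torch.randn(8, 5000)   # stand-in for 8 recorded brain responses
    print(model(fake_scan).shape)       # torch.Size([8, 512])
```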

More here.