An actual working mind probe
This is incredible…researchers at Berkeley have developed a system that reads people’s minds while they watch a video and then roughly reconstructs what they were watching, using clips drawn from thousands of hours of YouTube videos. This short demo shows how it works:
Nishimoto and two other research team members served as subjects for the experiment, because the procedure requires volunteers to remain still inside the MRI scanner for hours at a time.
They watched two separate sets of Hollywood movie trailers, while fMRI was used to measure blood flow through the visual cortex, the part of the brain that processes visual information. On the computer, the brain was divided into small, three-dimensional cubes known as volumetric pixels, or “voxels.”
“We built a model for each voxel that describes how shape and motion information in the movie is mapped into brain activity,” Nishimoto said.
The brain activity recorded while subjects viewed the first set of clips was fed into a computer program that learned, second by second, to associate visual patterns in the movie with the corresponding brain activity.
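To make that training step concrete, here is a minimal sketch (not the researchers’ actual code) of fitting one linear encoding model per voxel, mapping per-second movie features to measured voxel responses. The feature matrix, response matrix, array sizes, and the choice of ridge regression are all placeholder assumptions standing in for the study’s motion-energy model.

```python
# Sketch of per-voxel encoding models: a linear map from movie features
# (e.g. shape/motion filter outputs) to fMRI voxel responses.
# All data below is random placeholder data with made-up dimensions.
import numpy as np
from sklearn.linear_model import Ridge

n_seconds, n_features, n_voxels = 7200, 2000, 500   # assumed sizes

# One row per second of the training trailers: visual features on one side,
# the measured response of every voxel on the other.
X_train = np.random.randn(n_seconds, n_features)     # movie features
Y_train = np.random.randn(n_seconds, n_voxels)       # voxel responses

# Fit an independent regularized weight vector for each voxel, describing
# how shape and motion information maps into that voxel's activity.
encoding_model = Ridge(alpha=1.0)
encoding_model.fit(X_train, Y_train)

# Given features for a new clip, predict the activity it should evoke.
X_new = np.random.randn(60, n_features)               # 60 s of a new clip
Y_pred = encoding_model.predict(X_new)                # predicted voxel activity
```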
Brain activity evoked by the second set of clips was used to test the movie reconstruction algorithm. This was done by feeding 18 million seconds of random YouTube videos into the computer program so that it could predict the brain activity that each film clip would most likely evoke in each subject.
Finally, the 100 clips that the computer program decided were most similar to the clip that the subject had probably seen were merged to produce a blurry yet continuous reconstruction of the original movie.
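The reconstruction step, roughly: predict the brain activity each library clip would evoke, rank the clips by how well that prediction matches the activity actually measured, and average the top 100. The sketch below is a simplification under assumed details (correlation scoring, frame averaging, invented names), not the published decoder.

```python
# Sketch of the reconstruction step: score a library of candidate YouTube
# clips by how closely their *predicted* brain activity matches the activity
# measured while the subject watched the test clip, then average the frames
# of the best matches into one blurry reconstruction.
import numpy as np

def reconstruct(measured_activity, library_features, library_frames,
                encoding_model, top_k=100):
    """measured_activity: (n_voxels,) fMRI pattern for one second of viewing.
    library_features:     (n_clips, n_features) features of candidate clips.
    library_frames:       (n_clips, H, W) one representative frame per clip.
    """
    # Predict the voxel pattern each candidate clip would most likely evoke.
    predicted = encoding_model.predict(library_features)      # (n_clips, n_voxels)

    # Rank clips by correlation between predicted and measured activity.
    z_meas = (measured_activity - measured_activity.mean()) / measured_activity.std()
    z_pred = (predicted - predicted.mean(axis=1, keepdims=True)) \
             / predicted.std(axis=1, keepdims=True)
    similarity = z_pred @ z_meas / len(measured_activity)

    # Average the frames of the top-k most similar clips.
    best = np.argsort(similarity)[::-1][:top_k]
    return library_frames[best].mean(axis=0)
```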
The kicker: “the breakthrough paves the way for reproducing the movies inside our heads that no one else sees, such as dreams and memories”. First time-travelling neutrinos and now this…what a time to be alive. (via β essl)