MRI, which stands for magnetic resonance imaging, is often used in medicine to view the internal structure of living things. The method is preferred when a condition cannot be assessed from outside the body, and it plays an important role in diagnosing and treating a number of diseases.
According to a study published on bioRxiv, scientists have made a significant advance using an MRI machine. Researchers at the University of Texas in the US have found a way to "read minds" using MRI data.
Mind-reading method using an MRI machine could be used in brain-computer interfaces

To achieve this, the researchers developed an algorithm that can be described as a "decoder". Working from data produced by an ordinary MRI machine, it reconstructs the words a person hears or thinks. While there have been similar studies before, the researchers stated that the new method is a first of its kind.
Alexander Huth, one of the authors of the study, commented on the research: "If you had asked any neuroscientist in the world twenty years ago if this was possible, they would have laughed." Another researcher, Yukiyasu Kamitani, added that the exciting study could form a basis for brain-computer interface applications.
Using MRI data in such studies is quite difficult, because the signal is slow compared to human thought. Instead of neural activity, which unfolds in milliseconds, MRI devices measure changes in blood flow, and such changes can take several seconds. Huth argues that the study works because the system does not decipher language word by word, but instead distinguishes the meaning of a sentence or thought.
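To illustrate why word-by-word decoding is so hard with fMRI, the sketch below (a simplification, not the study's actual model; the hemodynamic response shape and the 2-second sampling interval are assumptions chosen for illustration) convolves a fast word-level signal with a slow blood-flow response, showing how individual words blur together into a signal that only reflects slower, sentence-level content.

```python
import numpy as np

# Toy illustration: neural events occur on a sub-second, per-word timescale,
# but fMRI measures a blood-flow (BOLD) response that rises and falls over
# several seconds, so individual words get smeared together.

dt = 0.1                      # simulation step in seconds
t = np.arange(0, 30, dt)      # 30 seconds of "listening"

# One brief neural event per word, roughly every 0.4 s (fast timescale).
word_onsets = np.arange(1.0, 20.0, 0.4)
neural = np.zeros_like(t)
neural[np.searchsorted(t, word_onsets)] = 1.0

# A crude hemodynamic response function: slow rise, slower decay (assumed shape).
hrf_t = np.arange(0, 20, dt)
hrf = (hrf_t ** 5) * np.exp(-hrf_t)   # gamma-like bump peaking around 5 s
hrf /= hrf.sum()

# The measured signal is (approximately) the neural events convolved with this
# slow response, then sampled only every ~2 s, as a typical scanner would.
bold = np.convolve(neural, hrf)[: len(t)]
tr = 2.0
sample_idx = np.arange(0, len(t), int(tr / dt))

print(f"{len(word_onsets)} words produced only {len(sample_idx)} fMRI samples")
print("Sampled BOLD values:", np.round(bold[sample_idx], 3))
```

Running this shows dozens of word events collapsing into a handful of slowly varying samples, which is why the decoder has to target meaning rather than individual words.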
The algorithm managed to make sense of thoughts even during a silent movie

In the experiment, the subjects first listened to 16 hours of podcasts and stories so that their thoughts could be decoded. The experts used this data to train the algorithm and stated that the system can relate changes in blood flow to what the subjects were listening to. Huth also said that the method captures these changes in blood flow "pretty well" and that the results are promising.
In addition, the method managed to decode meaning even while participants watched a silent movie, showing that the decoder is not limited to spoken language. The researchers added that the method could help us better understand how different parts of the brain play a role in making sense of the world.
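As a rough sketch of how such a decoder might be trained (an illustrative simplification with synthetic data, not the authors' actual pipeline; the ridge regression, the random "semantic feature" vectors, and the candidate-scoring step are all assumptions), one can fit a model that predicts blood-flow patterns from the semantic features of heard sentences, then decode a new scan by asking which candidate sentence's predicted pattern best matches the observed one.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

n_train, n_feat, n_voxels = 500, 64, 200   # toy sizes (assumptions)

# Stand-in "semantic features" for each heard sentence (in the real study
# these would come from a language model, not random vectors).
X_train = rng.normal(size=(n_train, n_feat))

# Simulated voxel responses: a hidden linear mapping plus measurement noise.
true_weights = rng.normal(size=(n_feat, n_voxels))
Y_train = X_train @ true_weights + 0.5 * rng.normal(size=(n_train, n_voxels))

# 1) Training: learn to predict blood-flow patterns from sentence meaning.
encoder = Ridge(alpha=1.0)
encoder.fit(X_train, Y_train)

# 2) Decoding a new scan: score candidate sentences by how well their
#    predicted brain response matches the observed one, and pick the best.
candidates = rng.normal(size=(10, n_feat))       # 10 candidate sentence features
true_idx = 3
observed = candidates[true_idx] @ true_weights + 0.5 * rng.normal(size=n_voxels)

predicted = encoder.predict(candidates)          # predicted response per candidate
errors = np.linalg.norm(predicted - observed, axis=1)
print("Decoded candidate:", int(np.argmin(errors)), "(true was", true_idx, ")")
```

The key design choice this illustrates is that the system compares meanings against the recorded blood-flow patterns rather than trying to read out words one by one.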
Finally, it is worth noting that the algorithm has some shortcomings. The method failed to detect who said what in the podcast recordings; in other words, it understands what is going on but has trouble identifying the source. Experts hope the algorithm can lay the groundwork for brain-computer interfaces and help develop technologies for people who cannot speak.