
Meta now has an AI that can read your mind and draw your thoughts

WHY THIS MATTERS IN BRIEF

Technologies such as Brain Machine Interfaces (BMI) and AI are getting ever better at reading, or sensing, your thoughts and translating them into images, sound, text, and video, opening up new possibilities.

 


After years of hard work and development, Meta has finally unveiled a groundbreaking Artificial Intelligence (AI) system that can almost instantaneously decode visual representations in the brain.

Meta’s AI system captures thousands of brain activity measurements per second and then reconstructs how images are perceived and processed in our minds, according to a new research paper published by the company.

 


 

“Overall, these results provide an important step towards the decoding – in real time – of the visual processes continuously unfolding within the human brain,” the report said.

The technique leverages magnetoencephalography (MEG) to provide a real-time visual representation of thoughts.

MEG itself is not an AI but a non-invasive brain scanning technique that measures the tiny magnetic fields produced by neural activity. And Meta's system isn't the first mind-reading AI: as recently reported, a study led by the University of California, Berkeley showcased the ability of AI to recreate music by scanning brain activity, and elsewhere we've seen AI streaming video from people's minds.
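Conceptually, decoders like this align brain recordings with image embeddings and then retrieve (or generate) the image whose embedding best matches the brain signal. Below is a minimal retrieval-style sketch in Python; the sensor count, window length, embedding size, random "weights," and candidate image bank are all invented for illustration. Meta's actual system uses trained deep networks, not random projections:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 273 MEG sensors, 181 time points per window
# (roughly one second of activity), projected into a 512-d embedding space.
N_SENSORS, N_TIMES, EMB_DIM = 273, 181, 512

def encode_meg(window: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Map one MEG window to a unit-length embedding.

    A linear projection stands in for what would be a trained network.
    """
    emb = window.ravel() @ weights
    return emb / np.linalg.norm(emb)

# Stand-in "trained" weights and a bank of pre-computed image embeddings,
# one per candidate image the decoder could retrieve.
weights = rng.standard_normal((N_SENSORS * N_TIMES, EMB_DIM))
image_bank = rng.standard_normal((1000, EMB_DIM))
image_bank /= np.linalg.norm(image_bank, axis=1, keepdims=True)

# Decode: embed the incoming MEG window, then pick the candidate image
# whose embedding is most similar (cosine similarity via dot product).
window = rng.standard_normal((N_SENSORS, N_TIMES))
brain_emb = encode_meg(window, weights)
best_image = int(np.argmax(image_bank @ brain_emb))
print(f"closest candidate image: #{best_image}")
```

Because every step is a matrix operation, a pipeline like this can keep up with the thousands of measurements per second MEG produces, which is what makes the real-time decoding described in the paper plausible.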

 


 

In the former experiment, participants listened to Pink Floyd’s “Another Brick in the Wall,” and the AI was able to generate audio resembling the song using only data from their brains.

Furthermore, advancements in AI and neurotechnology have led to life-changing applications for individuals with physical disabilities. A recent report highlighted a medical team’s success in implanting microchips in a quadriplegic man’s brain. Using AI, they were able to “relink” his brain to his body and spinal cord, restoring sensation and movement and letting him drive a car again. Such breakthroughs hint at the transformative potential of AI in healthcare and rehabilitation.

 


 

The potential applications of such technology are vast, from enhancing Virtual Reality (VR) experiences to potentially aiding those who have lost their ability to speak due to brain injuries.

It’s essential to approach such advancements with a balanced perspective, however. The Meta researchers noted that while the MEG decoder is swift, it’s not always precise in image generation. The images it produces represent only higher-level characteristics of the perceived image, such as object categories, but might falter in detailing specifics.

The implications of this technology are profound. Beyond its immediate applications, understanding the foundations of human intelligence and developing AI systems that think like us and that can also read our thoughts could redefine our relationship with technology – again.

 


 

“The rapid advances of this technology raise several ethical considerations, and most notably, the necessity to preserve mental privacy,” the researchers warned. Ultimately, while AI can now paint our thoughts, it’s up to us to ensure the canvas remains our own.
