Elon Musk has undoubtedly been the boldest broligarch when it comes to brain-machine interfaces. But Mark Zuckerberg is hot on his heels.
Shortly after Musk co-founded Neuralink in 2016 (the company has since implanted chips in three human brains), Meta (then Facebook) launched its own neurotechnology research, announcing plans to build technology that would let people type with their brains and hear language through their skin.
Since then, Meta-funded researchers have found ways to decode speech from activity recorded by electrodes surgically implanted in people's brains. Brain surgery may feel worthwhile to paralyzed people who want to regain their ability to communicate, but such invasive devices are a tough sell for anyone who just wants to type faster. A commercial device for ordinary consumers would need to be wearable and removable, not permanently implanted.
Meta abandoned its effort to build a consumer brain-computer interface several years ago, after deciding its brain-reading headbands weren't ready for primetime. Instead of developing new gadgets directly, the company has been investing in slower-burn neuroscience research, hoping that studying the brain will help it build AI that excels at the things humans are good at, like processing language. Some of that research has stayed focused on mind reading: specifically, deciphering how the brain produces sentences.
This month, however, Meta achieved a breakthrough.
In collaboration with the Basque Center on Cognition, Brain and Language, researchers at Meta's Fundamental AI Research (FAIR) lab managed to accurately decode sentences from brain signals recorded outside the skull. No surgery required.
Granted, this was in the lab. But these findings represent a major step toward the wearable mind-reading device Zuckerberg promised eight years ago.
And as brain-computer devices approach commercial viability in the not-so-distant future, we need to grapple with what it means for Meta to become their gatekeeper. In the lab, mind-reading technology promises to reveal previously unknowable information about how our brains construct thoughts, make decisions, and guide actions. Out in the world, tech companies can misuse brain data unless regulations are established and enforced to stop them.
Meta can decode unspoken sentences from your brain's magnetic fields
Until a few years ago, researchers couldn't decipher unspoken language without burying electrodes in the brain, which requires surgery. In 2023, scientists at the University of Texas used fMRI, combined with a version of the AI model that powers ChatGPT, to decode the gist of unspoken sentences from brain activity. But fMRI machines cost millions of dollars and outweigh a fully grown elephant, limiting their usefulness outside the lab.
Because neuroscientists generally can't stick recording devices inside living human brains, most research on the human brain involves measuring proxies for neural activity. fMRI scanners measure how much blood flows to brain cells as they work, which lags the activity itself by a beat. Another technique, magnetoencephalography (MEG), measures the magnetic fields that brain cells generate when they transmit electrical signals. Neither technique can track what individual cells are doing, but both provide rough snapshots of brain activity patterns during tasks like reading or typing. And unlike fMRI, MEG can record the brain in near real time.
So Meta researchers recruited 35 volunteers to type sentences on a keyboard while sitting in a MEG scanner, which looks like a salon hair-drying chair from space. Some participants also had electroencephalography (EEG) electrodes gelled to their faces and scalps to record the electrical signals their brain cells emitted through the skull.
Each person's brain activity helped train AI models to guess what they were typing. Essentially, one part of the model learned to match patterns of brain activity to the letters being typed at that moment. The researchers fed another part of the model a pile of Wikipedia articles to teach it how text works: which letters and words tend to sit next to each other in different contexts. With that information, if someone meant to type "I love you" but their brain signals decoded as "I lovr yoi," the model could infer the letters and words that actually make sense in context.
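Meta hasn't published the details of this correction step, but the general idea, fusing a noisy character decoder with a prior over plausible text, can be sketched. The toy Python example below is purely illustrative and everything in it is invented (the vocabulary, the decoded input, the scoring): it simply snaps each noisily decoded word to the nearest word in a small vocabulary by edit distance, standing in for the much richer statistics a real language model would bring.

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance between two strings, via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]

# Hypothetical vocabulary standing in for a trained language model's lexicon.
VOCAB = ["i", "love", "you", "going", "to", "lunch", "soon"]

def correct(decoded: str) -> str:
    """Snap each noisily decoded word to its nearest vocabulary word."""
    return " ".join(min(VOCAB, key=lambda w: edit_distance(word, w))
                    for word in decoded.lower().split())

print(correct("I lovr yoi"))  # -> "i love you"
```

A real system would score whole candidate sentences with context (so that "lovr" near "you" favors "love" over, say, "lower"), but even this crude per-word correction shows how textual priors can clean up a noisy neural decoder.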
Using EEG, which is far more portable than fMRI or MEG scanners, Meta's researchers could use AI to decode the exact characters someone had typed about a third of the time. That may not sound impressive until you remember that EEG records brain cells through electrodes outside the skull. It's like trying to eavesdrop on a conversation in a crowded bar while standing outside with a glass pressed to the wall. Given all that noise, catching even a third of the conversation is quite a feat.
MEG captures brain activity more faithfully than EEG because magnetic signals are not distorted by the skull the way electrical signals are. By feeding MEG data to their AI models, Meta's researchers accurately decoded 70 to 80 percent of what people typed, blowing previous models out of the water. So if Meta wants to build a mind-reading headband, recording magnetic fields may be its best bet.
Like fMRI machines, the MEG scanners used in this study were huge and expensive. But helmet-like wearable MEG scanners weighing only a few pounds already exist, and they are even more sensitive than the stationary kind. These portable MEG devices are a few pounds heavier, and arguably goofier-looking, than Meta's newest VR headset, the Quest 3. They can't yet function outside specialized magnetically shielded rooms, which aren't open to the public, but it's not hard to imagine a future where they could.
Tech companies won't protect our brain data unless we make them
Meta isn't the only tech giant investing heavily in neuroscience research. Google and Microsoft both have teams dedicated to brain research, and Nvidia and IBM both collaborate with neuroscience research institutes.
AI and neuroscience have a long history of cross-pollination. The brain has many features that technology developers would love to replicate in computers, like energy efficiency and the ability to learn without massive training datasets. And tech companies build tools that neuroscientists want to use. (Using noninvasive brain scans to diagnose mental illness has been a neuroscience pipe dream for decades; it would be enormously useful for practitioners if diagnosing depression were as simple as a quick EEG scan.)
Here, Meta used the brain data it collected to study how the brain transforms abstract ideas into words, syllables, and letters, with an eye toward its long-term goal of building AI that can do the same.
The data supports a long-standing hypothesis among neuroscientists and linguists: the brain constructs speech from the top down. When you prepare to say something, your brain first represents the whole idea ("I'm going to get lunch soon"), then zooms in to a word ("going"), then to a single syllable ("go"). As you type, your brain focuses on each individual letter ("g," "o"). Meta's researchers saw these representations (context, words, syllables, letters) overlap during language production, their strengths peaking and fading at different times.
Understanding language production could, in theory, help Meta achieve its stated goal of "restor[ing] communication for those who have lost the ability to speak." And indeed, millions of people live with traumatic brain injuries, strokes, or other neurological disorders that make speaking difficult. A wearable device that makes communication easier could be an overwhelmingly positive force in someone's life.
But we know that isn't Meta's only motivation. For Silicon Valley, the brain represents the ultimate barrier between humans and their devices.
A quick sanity check: Meta's goal was never to fuse humans with computers (that's Musk's thing), but to sell a portable, removable headset that someone could use to type or play video games with their mind. To unveil such a device, Meta will need to clear two huge technical hurdles and an even larger ethical one.
First, it needs to decode unspoken thoughts from outside the skull. Check.
Second, it needs to do so with a device that someone could reasonably afford, keep in their home, and wear on their head. For now, that remains quite far off.
Most importantly, by the time these devices exist, we need robust protections for people's cognitive liberty, our fundamental right to control our own consciousness. The time for those safeguards isn't after the devices hit stores. It's now.
"Facebook is already very good at peering into your brain without needing electrodes or fMRI," Roland Nadler, a neuroethicist at the University of British Columbia, told my colleague Sigal Samuel in 2019.
Meta already uses AI to extrapolate mental health status from our digital footprints. It uses AI to flag, and sometimes remove, posts about self-harm and suicide, and it can trigger nonconsensual "wellness checks" when it detects concerning Messenger or WhatsApp messages.
Many people have given up on digital privacy entirely, weighing how much convenience they gain by handing over the personal data that connects them to food delivery, remote work, and online friends. Plenty of us feel uneasy about how much personal information companies take from us, yet believe we have no real control over our privacy.
Last year, neuroscientists, lawyers, and lawmakers began passing laws that explicitly include neural data in state privacy statutes. Some small neurotechnology companies are already collecting brain data from consumer products. Before a company as large as Meta can do the same, stronger protections should be in place.
Zuckerberg has spent the past couple of months racing to remake Meta. His company is unlikely to treat our most private data with care unless it is forced to.
But in a world where Meta-branded brain-to-text headbands are as normal as keyboards, sharing brain data could feel like a prerequisite for participating in ordinary life. Imagine a workplace where, instead of handing you a monitor and keyboard, the office hands you a text-generating helmet to strap on. Opting out of brain-to-text devices might feel something like avoiding smartphones today: possible, but certainly not the path of least resistance.
With few guarantees of our mental security, we each need to decide whether the convenience of controlling things with our minds is worth colonizing our last truly private space.
A version of this story originally appeared in the Future Perfect newsletter. Sign up here!