In 2021, it should come as no surprise that Mark Zuckerberg and Elon Musk are competing to implant microchips inside our brains. Before cancelling the project in July, Facebook was tinkering with a headset that would allow you to text by thinking. In April, Musk’s neurotechnology company Neuralink released footage of a microchip-wielding macaque playing Pong with its mind. (A man equipped with another microchip challenged the monkey — no further word on the match, but I put $10 on the primate.)
These are just a couple of examples of so-called brain-computer interfaces (BCIs), and while an animal playing Pong with its mind may sound like a publicity stunt, BCIs are a very real — and kinda alarming — emerging technology. Let’s explore their applications and some big, bad ethical concerns so you’re prepared when the tech dweebs progress from macaques to people.
Wait, what’s a brain-computer interface again?
A gadget that collects brain signals, analyzes them and translates them into commands, which are then broadcast to a device that performs the desired action. Case in point: Facebook’s purported BCI was intended to transcribe your thoughts into text messages so you wouldn’t have to type them out with your actual thumbs.
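If you like thinking in code, that acquire-analyze-translate loop can be sketched in a few lines. This is purely a toy illustration — every function name, threshold and command here is made up, and real BCIs involve electrode hardware and far more sophisticated decoding:

```python
# Toy sketch of the BCI pipeline described above: acquire a signal,
# extract a feature from it, and map that feature to a device command.
# All names and numbers are invented for illustration only.
import random

def acquire_signal(n_samples=64):
    # Stand-in for reading raw voltages from an electrode or headset.
    return [random.gauss(0.0, 1.0) for _ in range(n_samples)]

def extract_feature(samples):
    # Crude "analysis" step: the signal's average power.
    return sum(s * s for s in samples) / len(samples)

def decode_command(power, threshold=1.0):
    # "Translation" step: turn the feature into a command for a device.
    return "MOVE_CURSOR" if power > threshold else "IDLE"

command = decode_command(extract_feature(acquire_signal()))
print(command)  # either "MOVE_CURSOR" or "IDLE", depending on the noise
```

The real engineering challenge lives almost entirely inside that middle step — figuring out which patterns in the raw signal actually correspond to an intention.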
And people already have them implanted into their brains?
A few do! In an experiment reported in the journal Nature earlier this year, a BCI created by a team of researchers called BrainGate translated a paralyzed man’s thoughts into text with 94 percent accuracy. Keep in mind, however, that BCIs don’t necessarily have to be implanted — they can also take the form of wearable devices, like Facebook’s proposed headset.
What else can they do?
All sorts of things. Researchers have already used BCIs to make mind-controlled drones and video games. BCIs have also enabled rudimentary brain-to-brain communication akin to telepathy (though that still needs a lot of work). A Chinese study found that they can effectively monitor the performance of bus drivers, which could be expanded to other kinds of employees. There are even in-the-works products like the Neurodildo, a BCI-controlled sex toy.
Hell, even the consumer market has BCIs at the ready: A Japanese company called Neurowear released a set of mind-controlled cat ears and a fluffy tail that wags when the wearer is excited. They also put out headphones that match songs to your mood by scanning your brainwaves.
Moreover, BCIs have heaps of potential applications, depending on how quickly and effectively the technology evolves. For example, we already have microchips that manage Parkinson’s disease symptoms, but BCIs could, in theory, mitigate other ailments like depression or Alzheimer’s. It’s possible that they could be used for military training as well, especially when combined with video games to recreate realistic scenarios. Likewise, BCIs can help a person better control robotic limbs, essentially turning their users into real-life Terminators.
One of the more promising potential applications of BCIs is in the field of brain research. Case in point: Some researchers have suggested that we use BCIs and virtual-reality gaming to research paraphilia, or “unusual” sexual preferences. Together, BCIs and VR can create extremely immersive experiences, and they allow scientists to collect data directly from the brain, rather than relying on possibly biased self-reported data.
In short, there’s a world of possibilities.
Sounds cool, but scary. Should I be concerned?
Well, yes and no. As you can imagine, BCIs pose a number of ethical problems — e.g., closely monitoring the performance of bus drivers (or other workers that toil away in dangerous conditions) could prevent accidents, but it’s the same kind of micromanaging that results in Amazon workers peeing in bottles.
The most blatant ethical concern, however, is the issue of data. If we’re pulling data directly from a person’s brain, that takes Big Data and privacy breaches to a whole new level. “If you have a lot of data about people’s thoughts and emotions, you could presumably use it to manipulate them,” says Steffen Steinert, author of Wired Emotions: Ethical Issues of Affective Brain-Computer Interfaces. An obvious example of this is selling things to people using their data or creating a second Facebook-Cambridge Analytica data scandal, but BCIs could potentially be employed to influence a person’s emotional state, too — say, if an employer uses BCIs to detect an employee’s negative brain waves, they could respond by adjusting the music in the office to change those emotions.
This isn’t inherently a bad thing. After all, it would be kinda cool if BCIs could change an ebook or video game based on a user’s mood. But in the wrong hands, a person’s autonomy could be at risk. Likewise, in the workplace, actively manipulating a person’s mood creates an issue of responsibility. One hypothetical: If an employer uses a BCI to influence an employee’s actions and an accident happens, it’s tough to pin down the blame.
Jakub Binter, one of the researchers behind the previously mentioned use of BCIs and virtual-reality gaming, also expresses some concern about the extreme immersiveness of this technology, especially because he says it’s realistic enough to cause PTSD in certain scenarios. On the flip side, he worries that BCI and VR porn could potentially be a “superstimulus” that results in us spending even less time with real humans.
That said, BCIs are a bit like cars: They can be incredibly harmful if used in certain ways, but immensely useful, too. It’s also important to remember that while medical and gaming uses of BCIs in particular are somewhat advanced, much of this technology is still probably decades away.
What’s the holdup?
For one, the technology isn’t quite there. The human skull is pretty good at blocking brain signals from being read clearly, and in order for a BCI to work well, it requires really detailed information. A.I. technology could use a boost as well to help make sense of the brain waves that BCIs detect.
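To get a feel for why the skull matters, here’s a back-of-the-envelope comparison of signal quality inside versus outside the head. The attenuation and noise figures are made-up stand-ins, not real measurements — the point is only that a weaker signal buried in more noise has a far worse signal-to-noise ratio:

```python
# Toy illustration: the skull and scalp weaken (attenuate) brain signals
# and add noise, so a wearable sees a much murkier picture than an implant.
# All numbers below are invented for illustration.

def snr(attenuation, noise_std):
    # Mean power of a unit-amplitude sine wave is 0.5; scale it by the
    # attenuation and compare it to the noise power (variance).
    signal_power = 0.5 * attenuation ** 2
    return signal_power / noise_std ** 2

implant = snr(attenuation=1.0, noise_std=0.1)   # electrode near the cortex
wearable = snr(attenuation=0.1, noise_std=0.3)  # reading through the skull
print(implant > wearable)  # True: the implant sees a far cleaner signal
```

That gap is the trade-off at the heart of the field: implants get cleaner data, wearables get easier adoption.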
The other hurdle is public acceptance. Less invasive BCIs tend not to work as well, but few people are willing to have a microchip implanted into their brain. Not to mention, these microchips are likely to move and corrode over time, so we’re still working out the kinks when it comes to installing and maintaining them.
How do we make sure BCIs aren’t used to make an army of robo-humans?
Basically, the government needs to step up and proactively formulate regulations so we don’t have another situation where Zuckerberg is standing trial and regulators don’t even understand the tech. “If we wait too long to think about this, it will be hard to catch up with the technology,” Steinert says. “The sooner we figure out the ethics and regulations, the better.”
Likewise, the government should work harder to educate the public about emerging technologies like BCIs, because right now they’re mostly being discussed in scientific journals or on Elon Musk’s Twitter, which isn’t exactly comforting.
Lastly, we can’t let sensationalism distract us from the real issues. Everyone’s worried about BCIs being used to control our thoughts, but the most likely problem is once again that Big Data will use them to collect even more information on us, then use that to influence our habits and make loads of money.
Sounds like the future will be totally normal and not at all terrifying, huh?