Brain-computer interfaces (BCIs) offer a new way for humans to interact with technology, but they also present significant risks. As we’ve previously pointed out, every new technology is also a new opportunity for hackers, and the coming age of BCIs is no exception.
The idea of hacking a BCI sounds like science fiction, because at first glance it appears that hackers might be able to read your mind. What if the first sign you’ve been hacked is a strange voice in your head? Or your innermost thoughts being posted on an internet message board somewhere?
In reality, the information gathered by BCIs is far less sophisticated than fully formed thoughts. Nonetheless, BCIs do raise some genuine security issues, and in this article we'll explore them.
Reading Your Thoughts?
The first wave of BCIs is now reaching the market, and the range of applications they are being deployed for is certainly impressive. Some of these systems offer users a way of measuring and recording their stress levels, controlling apps, and monitoring their emotions. In the medical field, BCIs also show a lot of promise: medical researchers use them to help those with spinal injuries move paralyzed limbs and regain a lost sense of touch.
It’s important to recognize, however, that these systems are not literally “reading the minds” of their users. Brain scanning technology is still a long way from being able to convert electrical fluctuations into semantic content. Instead, these systems look at the aggregate activity in relatively large parts of the brain. In other words, and despite much speculation to this effect, no one is likely to be stealing your thoughts anytime soon.
This is not to say that the information BCIs collect isn’t valuable. Even though the level of detail available to those who manufacture BCIs – and therefore to hackers – falls far short of fully formed thoughts, it could still be lucrative in the wrong hands. For example, researchers have already shown that BCIs could be used to coax people into disclosing information such as their PINs or their religious convictions (or lack thereof).
Apply that to the arena of online reputations, which are gold in the Internet Age, and it’s easy to see how an attack launched against a high-profile individual could ruin a professional reputation built over many years. Seen from this perspective, BCI manipulation has serious real-world consequences.
Other forms of attack remain speculative, but they may become feasible in the near future. If, for instance, BCIs are eventually used to control vehicles (as is being researched by the US military), then the consequences of a hack could be severe. Similarly, researchers have identified malicious external stimuli – showing users particular pieces of content in order to gauge their involuntary reactions, for instance – as one of the most potentially damaging attacks against BCIs.
Perhaps the most important piece of data that could be stolen from a BCI, though, relates to security itself. Researchers at Israel's Ben-Gurion University of the Negev wrote in a recent paper about the possibility of using brain activity data collected from BCIs as a secure form of biometric authentication. Unlike a fingerprint, they say, brain waves are invisible, and therefore extremely difficult to fake.
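To make the brainwave-authentication idea concrete, here is a deliberately simplified toy sketch, not any vendor's or the researchers' actual scheme. It assumes a hypothetical feature extractor has already reduced a user's EEG signal to a small vector of band-power values (delta, theta, alpha, beta, gamma), and it accepts a login attempt when the sample is sufficiently similar to the enrolled template; all numbers and names below are illustrative assumptions.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def authenticate(template, sample, threshold=0.95):
    """Accept the sample only if it closely matches the enrolled template."""
    return cosine_similarity(template, sample) >= threshold

# Hypothetical band-power features (delta, theta, alpha, beta, gamma),
# one vector captured at enrollment and two at login -- toy values only.
enrolled = [0.42, 0.31, 0.55, 0.18, 0.07]
genuine_attempt = [0.40, 0.33, 0.54, 0.19, 0.08]
impostor_attempt = [0.10, 0.60, 0.20, 0.45, 0.30]

print(authenticate(enrolled, genuine_attempt))   # True: matches the template
print(authenticate(enrolled, impostor_attempt))  # False: different signature
```

A real system would have to contend with session-to-session variability in brain signals, liveness checks, and secure storage of the template itself; the point here is only that the "password" is a measured physiological pattern rather than a memorized secret.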
The irony here is that, if BCIs become a popular way of authenticating users in secure systems, hackers stealing our thoughts will be the least of our worries: a hacked BCI could hand cybercriminals the keys to the rest of our online lives.
Security and Privacy
To give credit where it is due, many manufacturers of BCIs are taking the security of these devices seriously. They recognize that BCIs will require a multi-level security system – one that secures everything from the wireless signal leaving the brainwave reader to the server on which the data are stored.
Unfortunately, as we’ve seen with the IoT, we haven’t yet developed cybersecurity systems that can do this: encryption is still not a given in IoT devices, for instance, because many simply don’t have the computing power to perform it on the fly. The idea that BCIs will manage it, given the far more complex data they share, is a little hard to believe.
In fact, the release of the first generation of BCIs has many in the cybersecurity industry worried because we haven’t yet learned how to adequately work with the systems we already have. Users are uncomfortable with Facebook collecting information on them, despite the average Facebook session lasting 10 minutes. Imagine the outcry if these same companies were collecting information directly from your thoughts, 24 hours a day.
For this reason, it is not so much the technical security of BCIs that is most concerning – as these devices develop, so will the security systems designed to protect them. Rather, it’s that we don’t yet have a system in place for assessing which data can legitimately be collected through these devices, or for making users aware of what is collected.
The Bottom Line
The question we started with, then, slightly misses the point. The danger of BCIs is not that hackers might break in and steal your deepest, darkest secrets. It’s that, when these devices finally reach mainstream consumer markets, they are likely to be tied into tech ecosystems built by Google, Facebook, and Apple. Since these companies make much of their revenue through advertising, it’s not difficult to imagine BCIs being used to collect data for targeting advertisements.
Perhaps, then, the question should be turned around. Ethical hackers, if they manage to steal the plans and thoughts of tech leaders, might be doing us all a favor.
Note: This blog article was written by a guest contributor for the purpose of offering a wider variety of content for our readers. The opinions expressed in this guest author article are solely those of the contributor and do not necessarily reflect those of GlobalSign.