I just saw this article about the potential to involuntarily extract information from someone’s brain using an off-the-shelf brain-computer interface (BCI), such as the systems we’ve previously blogged about, and decided to take a quick side track from our Interface/Off series to address it. The idea is to record signals with an electroencephalography (EEG) headset and process the measurements taken while the subject views or thinks about various stimuli in order to extract meaningful data. In the study referenced in the article, performed by researchers from UC Berkeley, Oxford, and the University of Geneva, the extracted data included ATM PINs and home addresses.
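To make the attack concrete, here is a toy sketch of the general idea: flash candidate digits at the subject, record an EEG epoch after each flash, and look for the stimulus that elicits the strongest averaged response in the window where a recognition-related deflection (a "P300") would appear. This is not the researchers' actual pipeline; the sampling rate, noise model, response amplitude, and timing window below are all assumptions chosen purely for illustration, and the "EEG" is simulated.

```python
# Toy P300-style digit-guessing sketch with simulated single-channel EEG.
# All parameters (256 Hz rate, noise level, 250-450 ms window, response
# amplitude) are illustrative assumptions, not values from the study.
import random

random.seed(42)

FS = 256                                          # assumed sampling rate (Hz)
EPOCH_LEN = FS                                    # one second of EEG per flash
P300_WINDOW = (int(0.25 * FS), int(0.45 * FS))    # ~250-450 ms post-stimulus

def simulate_epoch(is_target):
    """Return one simulated EEG epoch (list of floats). When the flashed
    digit matches the secret, add a small positive deflection ~300 ms in."""
    epoch = [random.gauss(0.0, 1.0) for _ in range(EPOCH_LEN)]
    if is_target:
        for i in range(*P300_WINDOW):
            epoch[i] += 1.5   # assumed response amplitude (generous SNR)
    return epoch

def mean_window_amplitude(epochs):
    """Average the epochs sample-wise, then average over the P300 window.
    Averaging many epochs suppresses the random background noise."""
    lo, hi = P300_WINDOW
    n = len(epochs)
    avg = [sum(e[i] for e in epochs) / n for i in range(lo, hi)]
    return sum(avg) / len(avg)

def guess_digit(recordings):
    """recordings maps digit -> list of epochs recorded when that digit
    flashed. Guess the digit with the largest averaged response."""
    return max(recordings, key=lambda d: mean_window_amplitude(recordings[d]))

# Subject "knows" the secret digit; flash each digit 20 times and record.
secret = 7
recordings = {d: [simulate_epoch(d == secret) for _ in range(20)]
              for d in range(10)}
print(guess_digit(recordings))
```

With 20 repetitions per digit, the background noise averages down far below the injected response, so the guess reliably lands on the secret digit here. Real EEG is far noisier, which is one reason the study's actual success rates were modest rather than certain.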
I find these results fascinating, exciting, and a little bit disconcerting. Despite the modest success rate of this initial study (a 10-40% chance of obtaining useful information), it is clear that the potential exists to cross a threshold of human privacy that has never before been violated: the sanctity of private thought. Obviously, this has the potential to change the world in fairly fundamental ways. I don’t actually think we are on the cusp of a time in which your thoughts can be plucked out of your head right and left (if nothing else, the need for the subject to wear an EEG headset is a limitation that is unlikely to be surmounted any time soon), but these results raise a really interesting discussion about the ethics of progress.
This is not a new debate – people have been arguing over the benefits and drawbacks of scientific and technological development for centuries, in contexts from economic (robots will steal our jobs!) to medical (cloning, gene therapy, etc.) to apocalyptic (nuclear, biological, and chemical weapons). However, a significant difference in this version of the debate is the ubiquity of the technology involved. I frequently write on this blog about how exciting and powerful it is that the tools and materials needed to develop smart products and mechatronic systems are so accessible and inexpensive, but this can be a double-edged sword when the resulting technology has the potential for misuse or abuse. For example, the Emotiv and Neurosky BCIs cost around $200-300, including access to their APIs.
I think this post is already long enough, so instead of getting into a detailed look at the philosophy and ethics of science and engineering, I’ll just give my two cents on the big picture and leave it there for now. I believe it is impossible, and usually counterproductive, to try to restrict the development of science or technology, all the more so when there are no natural barriers (such as enormous capital requirements). I also believe that there is inherent good in the pursuit and acquisition of knowledge. However, I think that as scientists, developers, and engineers, we have a responsibility to let our work be guided by our personal morals and ethics. Hopefully that is enough to ensure that none of us has to worry about stolen thoughts any time soon.