Melanie Swan
Philosophy and Economic Theory, the New School for Social Research
Neural Data Privacy Rights: An Invitation For Progress In The Guise Of An Approaching Worry

A worry that is not yet on the scientific or cultural agenda is neural data privacy rights. Not even biometric data privacy rights are in purview yet, which is surprising given the personal data streams amassing from quantified self-tracking activities. There are several reasons why neural data privacy rights could become an important concern. First, personalized neural data streams are already becoming available from sleep-monitoring devices, and this could eventually expand to include data from eye-tracking glasses, continuously worn consumer EEGs, and portable MRIs. At some point, the validity and utility of neural data may be established through correlation with a variety of human health states and measures of physical and mental performance. Despite the sensitivity of these data, securing them may be practically impossible. Malicious hacking of personal biometric data could occur and would require an in-kind response. Finally, there could be many required and optional uses of personal biometric and neurometric data in the future, for which different permissioning models would be needed.

Personal biometric data streams are growing as individuals engage in quantified self-tracking with smartphone applications, biomonitoring gadgets, and other Internet-connected tools. The adoption of wearable electronics (e.g., smartwatches, disposable patches, augmented eyewear) could hasten this further, and might proceed even faster than the adoption of tablets (to date, the most quickly adopted electronics platform). This could allow the unobtrusive collection of vast amounts of previously unavailable objective metric data. Not only could this include biometrics such as cortisol (stress) levels, galvanic skin response, heart rate variability, and neurotransmitter levels (e.g., dopamine, serotonin, oxytocin), but also robust neurometrics such as brain signals and eye-tracking data that were formerly obtainable only through lab-based equipment. These data might then be mapped to predict an individual's mental state and behavior.

Objective metrics could prompt growth in many scientific fields, yielding a new understanding of cognition and emotion and possibly a means to address long-standing neuroscience problems such as consciousness. The potential ability to apply objective metrics and quantitative definitions to mental processes also raises the issue of neural data privacy rights, especially if technological advancement makes it increasingly easy to detect the states of others (imagine a ceiling-based reader detecting the states of a whole room full of people). Biometric data is sensitive as a health data privacy concern, and neural data even more so. There is something deeply personal about the brain, and the immediate reaction is toward strong privacy in this area. For example, many individuals are willing to share their personal genomic data generally, but not their Alzheimer's disease risk profile. Neural data privacy rights could be a worry, but they are, overall, an invitation for progress, since tools that could help are already in development: diverse research ecosystems, tiered user participation models, and a response to malicious hacking.

Given the high potential value of neural data to science, it is likely that privacy models will be negotiated so that projects can move forward. There could be pressure to achieve scale quickly, both in the amount and types of data collected and in the validity and utility of these data (still at issue in some areas of personalized genomics). Raw data streams need to be linked to neurophysiological states. An ecosystem of open and closed research models is already evolving to accommodate different configurations of those conducting and participating in research. One modern means of achieving scale is crowdsourcing, both for data acquisition and for analysis. This could be particularly true here as low-cost tools for neural data monitoring become consumer-available and interested individuals contribute their information to an open data commons. Different levels of privacy preference can be accommodated as the small percentage of individuals who are comfortable sharing their data opt in to create a valuable public good usable by all. Even more than in genomics (though not yet in longitudinal phenotypic data), open-access data could become a norm in neural studies.

Perhaps not initially, but in a more mature, mainstream future for neural data, a granular tiered permissioning system could be implemented for enhanced privacy and security. A familiar example is the access tiers (e.g., family, friends) used in social networks such as Facebook and Google Plus. With neural data, similar (and greater) specificity could exist: for example, permissioning professional colleagues into certain neural data streams at certain times of day. However, there may be limitations related to the current lack of understanding of neural data streams generally, and of how signals may be transmitted, processed, and controlled.
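The kind of granular, tiered scheme described above can be sketched in code. The following Python fragment is a minimal illustration only, assuming hypothetical tier names, stream labels, and daily time windows; it is not based on any existing neural-data platform.

```python
from dataclasses import dataclass, field
from datetime import time

# Hypothetical sketch: tiered, time-windowed permissions for neural data
# streams. All tier and stream names here are illustrative assumptions.

@dataclass
class Grant:
    tier: str        # e.g. "family", "friends", "colleagues"
    streams: set     # stream labels this tier may read
    start: time      # daily window during which access is allowed
    end: time

@dataclass
class PermissionPolicy:
    grants: dict = field(default_factory=dict)  # tier -> Grant

    def allow(self, tier, streams, start, end):
        self.grants[tier] = Grant(tier, set(streams), start, end)

    def can_access(self, tier, stream, at):
        grant = self.grants.get(tier)
        if grant is None or stream not in grant.streams:
            return False
        return grant.start <= at <= grant.end

policy = PermissionPolicy()
# Colleagues may see an EEG-derived focus metric only during work hours.
policy.allow("colleagues", {"eeg_focus"}, time(9, 0), time(17, 0))
# Family may see sleep data at any time of day.
policy.allow("family", {"sleep"}, time(0, 0), time(23, 59))

print(policy.can_access("colleagues", "eeg_focus", time(10, 30)))  # True
print(policy.can_access("colleagues", "eeg_focus", time(20, 0)))   # False
print(policy.can_access("friends", "sleep", time(12, 0)))          # False: no grant
```

The design choice mirrors the social-network access tiers mentioned above, with one extra dimension (time of day) per the colleagues example; a real system would also need revocation, auditing, and some way to define what a "stream" even is for neural data.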

The malicious hacking of neural data streams is a potential problem. Issues could arise both in hacking external data streams and devices (similar to any other data security breach) and in hacking communication going back into the human. The latter is too far ahead for useful speculation, but the precedent could be that of spam, malware, and computer viruses. These are 'Red Queen' problems, where perpetrators and responders compete in lock-step, effectively running to stay in place, each innovating incrementally to temporarily out-compete the other. Malicious neural data stream hacking would likely not occur in a vacuum: unfortunate side effects would be expected, and responses analogous to anti-virus software would be required.

In conclusion, neural data privacy rights are a concern that has not yet fallen into scientific or cultural purview. Rather than being an inhibitory worry, this area provides an invitation to advance to a next node in societal progress. The potential long-term payoff of a continuous bioneurometric information climate is significant. Objective-metrics data collection and its permissioned broadcast might greatly improve both knowledge of the self and the ability to understand and collaborate with others. Just as personalized genomics has helped to destigmatize health issues, neural data could similarly help to destigmatize mental health and behavioral issues, especially by inferring the issues of the many from the data of the few. Improved interactions with the self and others could free humanity to focus on higher levels of capacity, spending less time, emotion, and cognitive load on evolutionary-relic communication problems while transitioning to a truly advanced society.