Neurotechnology (neurotech) is a class of technology that captures data directly from the brain and nervous system. The acclaimed sci-fi anthology series ‘Black Mirror’ does a great job of showing the capabilities of this technology: in one episode (‘The Entire History of You’), characters can record, review and replay all of their past memories in real time using a memory chip device that interacts with the brain. Although this is a fictitious example from a TV series, real-life neurotech projects are in fact already underway and developing as rapidly as advancements in AI. As confirmed in the UK Information Commissioner’s report, neurotechnology is already being employed in the UK healthcare sector and is fast being developed for application in the areas of personal welfare, sports, marketing and workplace surveillance.

For all the positive technological advancements this field heralds, the ICO takes the view that the uniquely sensitive nature of neurodata raises critical data privacy concerns. It is also concerned that inherent bias and erroneous data could be incorporated into the technology if such systems are not developed and tested appropriately.

The ICO’s recent report seeks to shed light on the risks of neurotech and, in doing so, set the stage for more tailored guidance to regulate the activities of participants and tech developers in this sector, as the ICO anticipates that neurotech will become widespread over the next decade. We’ve distilled some of its notable points below:

Key points from the report:

Areas of significant risk: The report notes that processing neurodata poses significant risks to the information rights of individuals in three distinct ways:

  • The automatic and intrinsic character of neurodata, which is generated subconsciously, meaning individuals have no direct control over the exact information that is disclosed;
  • The likelihood of corporations creating large-scale, complex data sets about individuals, which could lead to organisations drawing precise inferences about highly sensitive information, such as mental health;
  • Neurotechnology has the capacity to influence neuro-patterns and alter behaviour, in addition to monitoring and collecting neurodata. This could heighten the risks associated with the automated use of personal information and result in a lack of transparency and understanding about why and how companies use personal information.

Current regulatory issues: The report also emphasises several regulatory difficulties concerning the handling of personally identifiable neurodata, such as:

  • Regulatory definitions: For one, there is no explicit definition of neurodata as either a specific form of personal information or special category data under the UK GDPR.
  • Neuro-discrimination: Where devices are not tested and evaluated on a wide range of people to ensure that data collection remains accurate and reliable, risks of discrimination can arise.
  • Consent, neurodata and appropriate bases of processing: An example is given of a person using a neurotech (EEG) headset to improve their online gaming performance. In such situations, can the user (data subject) fully comprehend the nature of the information that they are about to reveal? Can the organisation (controller/processor) be fully aware of this either?
  • Closed-loop processing: This sort of processing involves the use of AI or machine learning (ML) to take automated action unprompted by the user and without any significant human intervention. It was noted that this can heighten the risk of inappropriate automated processing under Article 22 of the UK GDPR.
  • Other considerations include accuracy and data minimisation; neurodata and research (in connection with data sharing and transparency); and information rights (i.e. the rights to erasure, rectification and portability).

What to expect:

The ICO plans to publish new neurodata guidance by 2025 to address these challenges specifically. The guidance will cover the interpretation of basic legal and technical neurotechnology definitions, relate this to existing ICO guidance and provide sector-specific case studies, among other things. While we wait and keep an eye on developments, organisations that use or are considering deploying neurotech should begin thinking about any data protection issues that may arise from the use of such technologies. Indeed, businesses that engage or expect to engage with neurotech should consider the impact of applying the data protection framework to such practices, allowing sufficient time to engage and consult on the inevitable thorny aspects of compliance. We are sure the ICO would welcome your thoughts.

(Co-authored with Amaka Uzoukwu)

Author

Vin leads our London Data Privacy practice and is a member of our Global Privacy & Security Leadership team. He brings over 22 years' experience in this specialist area, advising clients across data-rich sectors including retail, financial services/fintech, life sciences, healthcare, proptech and technology platforms.

Author

Chiemeka works as a privacy specialist in Baker McKenzie's Intellectual Property & Technology Practice Group and is based in the firm's London office. He is a Nigerian-qualified lawyer who focuses on data protection, privacy, and technology transactions.