Neurotechnology Will Spur Novel Privacy Issues and Regulations

Jan. 9, 2024, 9:30 AM UTC

We’ve all seen the movies where superheroes and wizards can read minds. What if everyone in real life had that power? Your partner could know what you were thinking during an argument. Your boss could know how many times you thought about quitting. Thanks to neurotechnology, these are no longer purely hypothetical scenarios.

While neurotechnology has already led to major medical advancements, it also raises privacy concerns that could soon receive greater regulatory focus. Neurotechnology companies should establish industry standards and best practices to develop workable frameworks to guide future legislation.

How It Works

Neurotechnology ranges from non-invasive electroencephalogram (EEG) headbands used to improve your golf game to invasive brain implants paired with artificial intelligence to enable amyotrophic lateral sclerosis, or ALS, patients to speak again.

Researchers have used neurotechnology such as MRI scans to determine previously unknowable information: which picture an individual is viewing out of a database of 100,000 images, for example, or what choice an individual will make 11 seconds before they are even conscious of making it.

Businesses are investing heavily in neurotechnology. Elon Musk’s Neuralink Corp., which is developing an implantable brain chip, has gained approval from the Food and Drug Administration to begin human trials. Meta recently teased a wristband “brain-machine interface” that reads the electrical nerve impulses sent from the brain to direct arm muscles to move. AI then translates these impulses into commands that allow users to control electronic devices such as VR headsets.

Neurotechnology generates unique neurodata: information collected from an individual’s brain, spinal cord, or nervous system, such as brain waves, electrical nerve impulses, and the areas of the brain engaged during particular actions or thoughts.

Neurodata can be used to create a “brain fingerprint,” measuring brainwaves with EEG to identify individuals, particularly in criminal settings. It can even provide insights into an individual’s thoughts, memories, emotions, biases, attention, preferences, or intentions. Essentially, it could allow the public and institutions, including the government, to access your mind, the one place that’s still private—for now.

Privacy Implications

Chile is so far the only country to have enacted a law addressing neurorights, requiring scientific and technological developments to “especially protect brain activity as well as the information [collected] from it.”

Neither the US federal government nor the states have passed legislation directly applying privacy principles to neurodata; however, certain categories of neurodata may be covered by existing laws in limited circumstances.

For example, the Health Insurance Portability and Accountability Act (HIPAA) would apply to neurodata considered protected health information, or PHI. But neurodata derived from implanted devices inserted for commercial purposes, such as those proposed by Neuralink, likely wouldn’t qualify as PHI, rendering HIPAA inapplicable.

Other laws may partially address neuroprivacy concerns:

  • The Health Breach Notification Rule applies to non-PHI neurodata that constitutes a personal health record. The rule requires notice if an individual’s personal health record has been disclosed without the individual’s authorization. Such authorization is typically obtained by disclosing the collection and uses of data via a privacy policy. However, given the complexities and privacy implications of neurodata, that process likely doesn’t give an individual adequate notice or information to provide meaningful authorization.
  • State privacy laws may apply to neurodata that can be linked to an identifiable consumer or household. Some laws may also limit the use of neurodata to advertise to consumers in those states.
  • Biometric privacy laws would likely apply to any neurodata used to identify an individual (e.g., brain fingerprints).
  • The Fourth and Fifth Amendments to the US Constitution may impose guardrails on law enforcement seeking to use defendants’ neurodata to aid criminal investigations. It is difficult, however, to predict how a court would apply current jurisprudence to neurodata.
  • Laws applicable to workplace monitoring of employees, including wiretapping laws, likely will apply to neurodata used for that purpose. Notably, some employers already monitor employee attention using EEG.

Next Steps

Neurotechnology and neurodata may lead to incredible advances that should be encouraged, such as enabling paralyzed patients to walk.

Still, companies in the neurotech space shouldn’t expect spotty regulation to last forever. Neurodata’s sensitivity and value make it a prime candidate for legislative and regulatory focus. For example, in April 2023, Washington state passed the My Health My Data Act specifically to regulate consumer health data, possibly including neurodata.

Although neurodata legislation and regulations may be inevitable, neurotech companies and trade groups can be proactive. By developing neurodata industry standards and best practices now, companies can offer legislators frameworks that are workable while still providing strong protections for consumers. If done well, such frameworks may serve as models for future laws.

When considering what industry standards could look like, companies could draw on principles that consumers and regulators will likely expect: transparency in privacy policies, autonomy through informed consent, and security via strong safeguards for neurodata.

Additionally, neurotech companies could consider providing internal training on permitted (and prohibited) uses of neurodata, such as whether it can be used for marketing or for training AI models.

The story of neurotechnology, neurodata, and the potential impacts on society has just begun, but steps can be taken now to protect neuroprivacy.

This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law and Bloomberg Tax, or its owners.

Author Information

Sara Pullen Guercio is an associate in Alston & Bird’s technology and privacy group and former ICU nurse.

Alston & Bird’s Daniel Felz contributed to this article.


To contact the editors responsible for this story: Daniel Xu at dxu@bloombergindustry.com; Melanie Cohen at mcohen@bloombergindustry.com
