- Neurotech ranges from medical devices to consumer gadgets
- State, federal privacy efforts pile on amid data misuse fears
Health gadgets and medical devices that track brainwaves—and the companies that make them—are at the center of an escalating debate over privacy protections.
State and federal lawmakers worried about data misuse, manipulation, and even “brain-control weaponry” want to restrict how neurotechnology companies, from medical implant developers to consumer gadget makers, collect, use, and share neural data.
States are leading the charge—with California and Colorado having already enacted neural data-focused laws, and at least seven other states considering them.
National legislation appears less likely, but some politicians are taking action. On April 28, Senators Charles Schumer, Maria Cantwell, and Edward Markey, all Democrats, urged the Federal Trade Commission to investigate whether neurotechnology companies were engaging in deceptive or unfair practices related to neural data. The FTC confirmed it received the letter, but declined to comment.
Neural data encompasses all the information generated by nervous system activity. It offers clues about general health and can be used to treat paralysis or even predict seizures.
The regulatory push is creating costly red tape and raising uncertainty, companies told Bloomberg Law. Lawmakers have cast a wide net, which concerns not just their intended targets—largely consumer-facing products—but also companies in the medical space that must already comply with the federal Health Insurance Portability and Accountability Act, or HIPAA.
“What does become clear out of what is currently proposed or implemented is that there’s insufficient differentiation of what is under that big umbrella of brain-computer interface,” said Florian Solzbacher, co-founder of Blackrock Neurotech, a developer of neural implants.
“Any regulation needs to be intelligent enough to recognize that those exist,” he added, “and recognize that they have differences that may require different handling.”
Apples to Oranges?
Some neurotechnology companies operate solely within the medical realm, like Blackrock Neurotech. Others, like Neurable or Neurode, market directly to consumers, saying their headsets can measure and improve productivity and focus.
Others fall into the “invasive” bucket, making implantable devices that let patients control prosthetics or maneuver wheelchairs with their brainwaves. Neuralink, for example, says its implants allow patients paralyzed after injuries to play chess and video games.
Businesses elsewhere could use neural data to monitor worker attentiveness or better tailor advertising to consumer tastes, according to Morrison Foerster. It is these areas, with their potential for manipulating and monetizing sensitive data, that regulations are meant to address.
Some companies are already preparing for compliance, setting and following self-governance principles. Others are trying to keep up with state proposals, monitoring when—or if—they might fall into scope.
“We can’t afford to ignore that,” Solzbacher said.
Others—often small, one-stop shops—are waiting for a consensus to emerge before dedicating their limited resources to compliance.
“My position is—and I may get in trouble for saying this—but I’ll wait for someone to come to me,” said Ryan Field, CEO of Kernel, which sells brain-scanning helmets to researchers. “You come to me and say I’m infringing on something, or I’m in violation of the thing, and then we’ll respond to it.”
Since Kernel isn’t currently selling consumer products, Field said he’s stayed “away from these privacy laws” and built his research around a consent-centered process. If Kernel did launch a clinical product, it would fall under HIPAA’s scope.
Investing in Data Governance
Waiting also carries risk.
Entities that handle brain wave data should think about privacy and ethics early on—even if they’re not yet in lawmakers’ crosshairs, said Linda K. Clark, partner in Morrison Foerster’s privacy and data security group.
“The time to invest in this is now,” she said, as safeguards are harder to implement once data has been collected.
Step one: Create a governance model for all neural data that aligns with your company’s values and risk tolerance. Step two: Create a process to audit and monitor compliance with your model.
“This is a message that we continue to drill on,” Clark added. “There are five or 10 things that you can do right now to try and reduce your risk.”
For those in scope, the compliance journey should have already begun. But a 2024 report from The Neurorights Foundation, which supports neural data privacy protections, found “substantial gaps” in the practices of companies with products already on the market.
The report cited several companies as not having policies explaining how neural data might be collected, stored, shared, or secured, nor information about consumers’ rights to their data. Most companies allowed the sharing of personal information with corporate partners, research affiliates, and government and law enforcement bodies, it said.
Adam Molnar, co-founder of Neurable, said his company based its practices on state privacy laws, even before neural data protections were added. When Neurable first released its products, users had to opt in for every data point collected. When that became too burdensome, Neurable determined which data could be shared with privacy protections “while giving us flexibility” to troubleshoot when the app crashed.
Regulatory Pushback
The laws and proposed bills so far don’t introduce concepts “that are foreign to existing regulatory bodies or agencies in the medical space,” said Sumner L. Norman, co-founder and CEO of Forest Neurotech, a research non-profit.
But attempts to future-proof laws for hypothetical use cases risk adding compliance burdens to what is “already one of the most regulated places on earth,” he said.
The compliance costs imposed by a patchwork of privacy laws could limit businesses to certain states or solely to the academic realm, company leaders said.
Even companies with mature data governance will have to grapple with how to obtain meaningful, informed consent when some users don’t understand the technology or the inferences that brain data can enable.
And some of neurotechnology’s capabilities might be overblown.
“There are things that you can do,” Field said. “But they’re not the sci-fi future that a lot of people have painted to justify the existence of these state laws.”