Privacy has traditionally been defined as the right to seclude oneself from the public and the right to be left alone. But in the digital realm, privacy requires more than physical boundaries; it demands digitally adapted rights and protections over one’s personal data—especially sensitive information.
Today, people increasingly rely on digital platforms for health services, such as health-tracking apps and devices, mental health chatbots, and online patient portals. These platforms often collect personal data continuously, raising privacy challenges.
Congress should work to establish a federal privacy statute that would enable individuals to delete their personal data. Such a law would ensure that privacy rights keep up with evolving data practices.
A right to delete personal data has taken on increased prominence as other tools for exercising data privacy haven’t proven as effective as hoped. Besides deletion, individuals can withhold data or seek to obscure it using tools such as virtual private networks (VPNs), which encrypt a user’s traffic over the internet.
Yet these protections may be more illusory than real: As data sharing becomes essential for accessing everyday services, including health care, fully opting out by withholding data is becoming less feasible. And companies can block users who connect through VPNs or otherwise try to obscure their data. Sophisticated users may circumvent such barriers, but average users may not.
Hence the emphasis on data deletion. Although it faces complex challenges, deletion is emerging as a strategic pillar of data privacy protection and may become a fundamental element of a future US federal privacy law.
This shift is already evident in the legal system. As we discuss below, courts and regulators have resorted to data deletion as the central privacy remedy in class-action settlements in the face of mounting data privacy violations. But outside of litigated settlements, how well is data deletion working for the average privacy-concerned American?
One major challenge is that data deletion depends on companies or platforms granting and honoring deletion requests. The US has no general federal privacy law or enforceable mechanism requiring companies to universally comply with such requests. In the absence of a unified privacy framework, protections rely on fragmented sectoral statutes, such as HIPAA for certain health data, the GLBA for financial data, and COPPA for children’s online data, alongside a patchwork of state laws that diverge in maturity, definitions, enforcement, and scope.
Recent landmark court actions illustrate both the power and the limits of personal data deletion as a remedy under existing US privacy and consumer‑protection laws.
In In re Flo Health, Inc. (2021), the Federal Trade Commission found that the Flo fertility app had shared users’ sensitive health data with third parties, even though it had promised to keep that information private.
As part of the settlement, the FTC ordered Flo to instruct those third parties to delete the data and to notify users about the disclosure. The settlement also required Flo to obtain affirmative express consent from its users before sharing health data with third parties. But the decision applied only to Flo and didn’t set any broader rules for the growing wellness-tech industry.
In ACLU v. Clearview AI (2022), Clearview AI was sued for collecting billions of publicly available photos to build a vast facial recognition database, then selling or providing access to that database to private companies and law enforcement agencies.
The settlement banned Clearview from selling access to its database to most private entities nationwide and barred its use by any Illinois entity for five years. It also required Clearview to delete faceprints it had collected before ending those services for its clients and other entities. But the settlement doesn’t prevent Clearview from re-creating faceprints where relevant Illinois privacy law permits.
These cases show that regulators are increasingly turning to data deletion as a remedy for sensitive data misuse, but its effect remains limited without stronger statutory privacy protections. Enforcement is case-by-case, applies only to specific companies, and often lacks technical safeguards to ensure complete deletion.
The Clearview case highlights this technical limitation. Although the company was required to delete certain data, it retained the ability to regenerate faceprints so long as it complies with relevant but narrow state privacy laws. As artificial intelligence technologies advance, it has become easier than ever to piece together different datasets or reassemble seemingly unrelated details to reveal private information and run large-scale analytics.
In response, new tools are being developed to address these technical challenges at different stages of data processing. Differential privacy, for instance, adds a layer of protection by introducing statistical noise into datasets or query results to obscure individual identities before analysis, while machine unlearning aims to remove the influence of specific data from AI models after they have been trained.
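To make the first idea concrete, here is a minimal sketch of the Laplace mechanism, the textbook way differential privacy adds calibrated noise to a count query. The function and data below are hypothetical illustrations, not any company’s implementation.

```python
import numpy as np

def dp_count(records, predicate, epsilon=1.0):
    """Differentially private count of records matching `predicate`.

    A count query has sensitivity 1 (adding or removing one person
    changes it by at most 1), so Laplace noise with scale 1/epsilon
    yields epsilon-differential privacy: the published number does
    not reveal any single individual's presence in the dataset.
    """
    true_count = sum(1 for r in records if predicate(r))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Hypothetical example: report how many app users logged a symptom,
# without letting the published figure expose any one user.
symptom_logged = [True, False, True, True, False, True]
print(dp_count(symptom_logged, lambda logged: logged, epsilon=0.5))
```

Smaller values of epsilon add more noise and give stronger privacy at the cost of accuracy; the same tradeoff governs production deployments of the technique.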
But because data spreads rapidly and widely once collected, deletion capability must be built into systems from the start to be effective. Any future national privacy law should require technologies to be designed from the outset to enable data deletion.
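One established engineering pattern for deletion-by-design, offered here only as an illustrative sketch rather than anything current law mandates, is “crypto-shredding”: encrypt each user’s records under a per-user key, so that destroying the key renders every copy of the data unreadable, including copies in backups or downstream systems. The class and method names below are hypothetical.

```python
from cryptography.fernet import Fernet

class UserVault:
    """Illustrative crypto-shredding store (hypothetical, not a standard API).

    Each user's records are encrypted under a per-user key; deleting the
    key effectively deletes the data, even where ciphertext copies persist.
    """

    def __init__(self):
        self._keys = {}      # user_id -> per-user encryption key
        self._records = {}   # user_id -> list of encrypted blobs

    def store(self, user_id, data: bytes):
        key = self._keys.setdefault(user_id, Fernet.generate_key())
        self._records.setdefault(user_id, []).append(Fernet(key).encrypt(data))

    def read(self, user_id):
        # Raises KeyError once the key has been shredded.
        key = self._keys[user_id]
        return [Fernet(key).decrypt(blob) for blob in self._records.get(user_id, [])]

    def delete_user(self, user_id):
        # Crypto-shredding: destroy only the key; all ciphertext, wherever
        # it has spread, becomes permanently unreadable.
        self._keys.pop(user_id, None)
```

The appeal of this design, given how the cases above show data copied to third parties is hard to chase down, is that key destruction makes downstream copies useless without having to locate them.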
The US now faces two paths: enact a federal privacy law centered on individuals’ control over their data, or continue relying on the patchwork of sectoral and state laws.
A federal statute would set consistent standards, allow rigorous oversight, and promote predictable compliance. With data deletion rights as a central element, it could reshape the entire data ecosystem, pushing institutions to redesign systems that track personal data and honor deletion requests, making usage traceable and auditable. It would also make it easier to detect data brokers operating against users’ privacy preferences, enabling enforcement and public accountability.
By contrast, maintaining the status quo means allowing current data practices to outpace individuals’ ability to protect their privacy.
Without effective national privacy protections, Americans will continue to face a troubling choice: access essential services only by giving up sensitive personal information without protections, or avoid them to protect their privacy. Data deletion could shift that balance, serving as a safeguard for individuals’ control over their privacy.
This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law, Bloomberg Tax, and Bloomberg Government, or its owners.
Author Information
Guillermo Astudillo is a graduate student in psychology at Harvard University whose research focuses on data privacy.
I. Glenn Cohen is a law professor, deputy dean, and faculty director of the Petrie-Flom Center at Harvard Law School.