Insurers and health tech companies developing mobile apps to let patients track Covid-19 symptoms and connect with doctors need to be mindful that their data storage practices don’t run afoul of federal and state privacy laws, attorneys said.
The new coronavirus crisis is still evolving and the federal government is trying to keep up by waiving certain federal privacy rules, in particular under the Health Insurance Portability and Accountability Act. But developers of mobile apps and websites aimed at fighting the virus still have to navigate state privacy laws and a host of other regulations, like those from the Federal Trade Commission.
“In my opinion, the health-care industry is the most regulated of all,” Kevin Haynes, chief privacy officer in the Nemours Children’s Health System Office of Compliance, said.
Health insurer Anthem Inc. announced March 17 that it's making a coronavirus assessment available through its Sydney Care mobile app. Alphabet Inc.'s health-care unit Verily Life Sciences launched a web-based assessment platform March 15. Both products are intended to help consumers quickly and safely evaluate their symptoms, analyze their risk of having Covid-19, and potentially get tested.
But privacy issues could arise around what type of data is being collected, how patients' health information is currently being used, and how it will be used moving forward.
Companies need to ensure their privacy policies clearly state how they collect and store data in order to stay in line with the FTC’s prohibition against deceptive commercial practices, attorneys say.
Verily and other companies operating in California must also comply with the state’s new privacy law, which prohibits them from selling a consumer’s data without explicit permission. It also gives consumers the right to sue if their information is stolen due to company negligence.
Federal Privacy Watchdog
The FTC issued guidance in 2016 listing the federal laws that may apply to mobile apps. Those include the FTC Act, the FTC's Health Breach Notification Rule, HIPAA, and the Federal Food, Drug, and Cosmetic Act.
The type of health data that is collected and how it’s compiled will determine which federal regulators will be alerted should a privacy issue come up, said Lee Kim, director of privacy and security at the Healthcare Information and Management Systems Society.
“There are a few things to keep in mind. It always goes to what’s my universe and what’s the data that’s being transacted?” Kim said. “The FTC acts as a consumer privacy watchdog through its enforcement powers under Section 5 of the FTC Act.”
Section 5 of the FTC Act prohibits unfair or deceptive business practices. It comes into play when companies fail to safeguard a consumer’s personal information.
Anthem’s Sydney Care app follows all applicable policies under the FTC Act, in addition to being HIPAA-compliant, Rajeev Ronanki, the insurer’s chief digital officer, said.
“Sydney Care keeps user information in the user’s profile, subject to the terms and conditions of use, so we can provide users with the most relevant, personalized information,” he said. “We do not sell or rent health data to third parties. Any physician-verified information provided during user chats is used to improve the app’s medical accuracy and conversational skills.”
Verily Life Sciences recently launched its Project Baseline website to help screen patients for coronavirus in four California counties. The company expanded to two more sites March 23. A Google account is required to access the platform and to authenticate users.
That requirement has given some lawmakers, consumers, and health tech advocates pause over how sensitive health data will be stored. Google faced backlash in 2019 over concerns that its Ascension health system partnership meant the tech giant would have access to millions of patients’ data.
Five senators—Sens. Cory Booker (D-N.J.), Kamala Harris (D-Calif.), Bob Menendez (D-N.J.), Richard Blumenthal (D-Conn.), and Sherrod Brown (D-Ohio)—wrote a letter on March 18 to Vice President Mike Pence and Google CEO Sundar Pichai over privacy concerns surrounding the Verily project.
“We are concerned that neither the Administration nor Google has fully contemplated the range of threats to Americans’ personally identifiable information,” the senators said, also referencing the Google-Ascension partnership.
A Verily spokesperson said the company is separate from Google, and that it uses Google’s infrastructure to ensure the safe encryption of health information.
That data will never be joined with information consumers have stored in Google products without their explicit permission, according to the company’s website.
The data that consumers share during the screening or testing process may be shared with health-care professionals, the clinical laboratory that processes specimens, the California Department of Public Health, and potentially other federal, state, and local health authorities, as requested or mandated for public health purposes, according to its site.
Medical Advice Not Allowed
App and website developers also need to be careful of patients trying to self-diagnose based on the products’ assessment results, attorneys said.
Companies should include disclaimers to reduce the chance of self-diagnoses, and to avoid allegations that the app is providing medical advice, Linda A. Malek, chair of the health-care and privacy and cybersecurity practice groups at Moses & Singer LLP, said.
Products that appear to give medical advice “could result in a violation of state corporate practice of medicine rules,” Malek said.
A patient’s self-diagnosis could also land their sensitive information in the wrong hands, according to Kevin Coy, a partner and co-chair of the privacy and consumer regulatory practice at Arnall Golden Gregory LLP.
“In addition to the medical concerns and accuracy concerns, individuals trying to self-diagnose should be sensitive to what information they provide to any application or website and only provide the minimum amount necessary,” Coy said.
“Individuals should be on guard against potential bad actors using the crisis as a means of tricking consumers to provide social security numbers, payment information, or other sensitive data so that they can engage in fraud, identity theft or the sale of bogus treatments.”