Amazon.com Inc., Alphabet Inc.'s Google and other tech companies could face lawsuits and government crackdowns over smart speakers and video assistants if the companies fail to protect users’ privacy, attorneys told Bloomberg Law.
Amazon’s Echo, Google’s Home Hub, Facebook’s Portal, and other smart home devices are projected to rake in $3.8 billion in revenue this holiday season—a 93 percent jump from 2017, according to the Consumer Technology Association. Demand for smart speaker units, which had about 27 million shipments last year, will rise to nearly 44 million shipments this year and nearly 57 million shipments in 2019, CTA projected.
If Amazon, Apple Inc. and other companies aren't careful, the popular devices could bring costly class actions, regulatory enforcement, and reputational risks that can take a significant toll on revenue, attorneys said.
There’s a growing concern about the invasion of privacy risks that come with internet of things smart devices. About 55 percent of consumers surveyed by PwC in 2018 said that IoT products, including smart home devices, threaten their personal privacy.
In-home personal assistants “can collect an enormous amount of data about you and your personal habits,” Melissa Kern, privacy and information security law partner at Frost Brown Todd LLC, told Bloomberg Law. Kern said that users are sometimes aware that their data is being collected, but that companies also sometimes collect it without informing users.
Echo and Home Hub capture consumer data when a “wake word” is used, such as “Alexa” or “Ok Google,” according to the companies’ terms of service. The devices don’t record all conversations, and data isn’t sent to the cloud unless a wake word activates the device, according to the policies.
A Google spokesperson told Bloomberg Law in an email that “all the devices that come with the Google Assistant are designed with privacy in mind.” Speech data is only processed after a wake word is used, “otherwise, the audio snippet stays local on the device and is discarded,” the spokesperson said. Google takes data security seriously and provides regular updates to protect data, the spokesperson said.
Amazon also designs its Echo devices with privacy in mind, including multiple layers of privacy protections, a company spokesperson told Bloomberg Law in an email. For example, Echo devices only upload information to the cloud when a wake word is used, and customers can review or delete any voice recording tied to their accounts, the spokesperson said.
Device makers face legal risks “if there is a security breach that results in unauthorized access to the personal data they have collected, whether due to a security flaw or not,” Kern said. Companies face the most financial risks if they “knew of the flaw and failed to take adequate steps to address it,” she said.
If hackers access an in-home personal assistant, the device maker may have to notify users under state data breach notification laws. Notifying customers can be expensive and may be required even "when there is no known privacy or security flaw in a product," Kern said. "Many state data breach notification laws allow the state attorney general to enforce and impose civil fines and damages," she said.
The Federal Trade Commission could come down on device makers that break data collection promises, collect children's data without permission, or fail to adequately secure the consumer data they collect, Andi Wilson Thompson, policy analyst at New America's Open Technology Institute, told Bloomberg Law. The FTC in January brought an enforcement action against IoT company VTech because it didn't properly secure data about children, she said.
Customers should read terms of service and understand how a smart home device protects their most sensitive data, Raj Samani, chief scientist at McAfee Inc., told Bloomberg Law. Some devices, he said, have shown vulnerabilities that make it easy to eavesdrop on children's locations.
For its part, Facebook limits the data it collects and takes steps to encrypt communications data, according to a Nov. 7 statement. The social-media giant said that it doesn’t collect data through Portal for advertising, but such information could be used “to improve the product.” A company spokesperson declined to comment on specific privacy concerns.
Consumers, as well as device makers, should be fully aware of what data is being collected, how it's used, and whether it's sold to third parties, Kern said. Giving consumers full knowledge of how their data is used will help grow revenue by limiting customer privacy concerns, she said.
Amazon and Apple didn’t immediately respond to Bloomberg Law’s email requests for comment.
Suppliers and manufacturers of in-home personal assistants have tried to minimize privacy risks by testing devices for security flaws before they hit the market, cybersecurity professionals said.
Amazon and Google run product security tests to check for major device flaws, Steven Andres, lecturer at San Diego State University and managing principal at Special Ops Security, told Bloomberg Law. Amazon and other large tech companies test the data pathways from the hardware device to the cloud, he said.
Large tech companies “know that the currency of the realm is user data, so if people do not feel comfortable with their handling of data they don’t stand a chance of having consumers install what amount to recording devices in their homes,” Andres said.
In-home personal assistants inadvertently can pick up conversations and other sensitive data, Thompson said. Devices can “misinterpret conversation as commands, responding to something that sounds similar to the wake word, and executing commands like sending messages to your contacts based on a conversation you don’t know that it is listening to,” she said.
Device makers should go one step further and ensure that third-party data sharing contracts uphold security standards, Kern said.