Tech companies have long encouraged putting listening devices in homes and pockets, attempting to convince consumers to rely on their voice assistants for any little need that pops up. But some are growing concerned that these devices are recording even when they're not supposed to, and they're taking their fears to the courts.
On Thursday, a judge ruled that Apple will have to continue fighting a lawsuit brought by users in federal court in California, alleging that the company’s voice assistant Siri has improperly recorded private conversations.
The judge said that most of the lawsuit could move forward, despite Apple's request to have it thrown out. Judge Jeffrey S. White, of federal district court in Oakland, did dismiss one piece involving users' economic harm. But he ruled that the plaintiffs, who are seeking class-action status, could continue pursuing claims that Siri turned on unprompted, recorded conversations it shouldn't have and passed the data along to third parties, thereby violating user privacy.
The case is one of several brought against Apple, Google and Amazon alleging that their voice assistants violated users' privacy. The technologies, often referred to by their names (Siri, Alexa and, predictably, Google), are meant to help with everyday tasks. They connect to speakers and can play music, set a timer or add an item to a shopping list. (Amazon founder Jeff Bezos owns The Washington Post.)
The companies deny that they are listening to conversations for any purpose other than the intended ones of helping with tasks or playing music. Amazon did not immediately respond to requests for comment. Apple pointed to court filings, and Google said it will fight the lawsuit.
This type of technology is designed to listen for its wake word, said Noah Goodman, an associate professor of computer science and psychology at Stanford University. This is a challenging task because voices often vary significantly from person to person.
Try as the companies might, it’s unlikely they can “get rid of false alarms completely,” he said.
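Goodman's point about false alarms comes down to a threshold trade-off. The sketch below is a minimal, hypothetical Python illustration (the scoring function and numbers are invented for this example, not any vendor's actual detector): an acoustic model assigns each snippet of audio a confidence score, and because the score distributions for genuine wake words and ordinary speech overlap, any fixed threshold trades missed activations against false ones.

```python
import random

# Hypothetical wake-word detector sketch. A real system scores short audio
# windows with an acoustic model; here we fake the scores to show the
# threshold trade-off Goodman describes. All numbers are illustrative.

random.seed(0)

def detector_score(contains_wake_word: bool) -> float:
    """Stand-in for an acoustic model's confidence score in [0, 1].

    Scores for genuine wake words and for ordinary speech overlap,
    because voices, accents, and background noise vary.
    """
    if contains_wake_word:
        return min(1.0, max(0.0, random.gauss(0.8, 0.12)))
    return min(1.0, max(0.0, random.gauss(0.3, 0.15)))

THRESHOLD = 0.6  # tuning knob: higher = fewer false wakes, more missed wakes

wake_clips = [detector_score(True) for _ in range(10_000)]
other_clips = [detector_score(False) for _ in range(10_000)]

missed = sum(s < THRESHOLD for s in wake_clips) / len(wake_clips)
false_wakes = sum(s >= THRESHOLD for s in other_clips) / len(other_clips)

print(f"missed wake words: {missed:.1%}")
print(f"false activations: {false_wakes:.1%}")
# Because the two score distributions overlap, no threshold drives both
# error rates to zero: lowering one raises the other.
```

Raising the threshold quiets false activations like the ones alleged in the lawsuits, but at the cost of the assistant ignoring users who actually said the wake word, which is why, as Goodman notes, false alarms can be reduced but not eliminated.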
This suit, along with a similar one against Google that is also making its way through the federal court system in California, threatens to plunge the companies once again into hot water over the way they handle the private information they collect from millions of users. Voice assistants have skyrocketed in popularity; eMarketer estimated late last year that 128 million people in the U.S. would use one at least monthly.
But as the devices grow in popularity, more people are waking up to concerns that they might be listening a little too closely for comfort.
A Washington Post investigation in 2019 found that Amazon kept a copy of everything Alexa records after it thinks it hears its name, even if users didn't realize it.
The Google suit, brought by the same plaintiffs' lawyers, alleges that the company should not be using information gathered when the assistant incorrectly turns on to target advertising, according to Reuters.
In responses to the lawsuit, Apple says it does not sell Siri recordings and that the recordings are not associated with an "identifiable individual."
“Apple believes that privacy is a fundamental human right and designed Siri so users could enable or disable it at any time,” the company said in its motion to dismiss. “Apple actively works to improve Siri to prevent inadvertent triggers and provides visual and audio cues (acknowledged by several Plaintiffs) so users know when Siri is triggered.”
In an emailed statement Thursday, Google said it keeps information secure.
“By default, we don’t retain your audio recordings and make it easy to manage your privacy preferences, with things like simple answers to your privacy questions or enabling Guest Mode,” spokesman José Castañeda said. “We dispute the claims in this case and will vigorously defend ourselves.”
The voice assistants are supposed to turn on only when prompted, by saying "Hey, Siri," for example. But the lawsuit alleges that plaintiffs saw their devices activate even when they didn't call out the wake word. Those conversations were recorded without their consent, they allege, and the information was then used to target advertisements at them and sent to third-party contractors for review.
“These conversations occurred in their home, bedroom, and car, as well as other places where Plaintiffs Lopez and A.L. were alone or had a reasonable expectation of privacy,” the lawsuit alleges.
In 2019, Apple largely suspended a practice that allowed human reviewers to listen to and "grade" recordings made with Siri. At the time, it said it would use computer-generated transcripts for review instead. Tech companies say they use these reviews to learn what is and isn't working in order to improve their products.
Just months later, the Associated Press reported that Apple had once again started using human reviewers to listen to recordings, giving users the option to opt out.
The lawsuits press the companies to contend with what they do once their devices hear something they weren't intended to hear. Nicole Ozer, the technology and civil liberties director of the ACLU of California, said the suits are a sign that people are realizing how much information the voice technology is collecting.
“I think this lawsuit is part of people finally starting to realize that Siri doesn’t work for us, it works for Apple,” she said.