Watch your language. The digital assistants might be listening.
Aided by those assistants, millions of gadgets today—from Apple’s iPhone to Mattel’s Hello Barbie—can receive voice commands, understand what you’re saying, and respond in a relatively intelligible fashion.
Amazon’s Alexa, Apple’s Siri, Google Now, and Microsoft’s Cortana presumably generate vast amounts of data. How the companies behind these top voice-driven digital assistants manage that data concerns security and privacy advocates: When are the assistants recording? Who can access the recordings, and how secure are they?
Here’s what we found out.
When are the assistants listening and recording?
All four can continually listen for a phrase like “Hey Siri” that triggers them to pay attention to what you say next.
Once activated, they record and upload your voice to the cloud. While recording, Echo turns on a blue light ring, Cortana displays a throbbing blue circle, and Siri displays, “Go ahead, I’m listening…”
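That trigger-then-upload flow can be sketched in a few lines. This is a toy illustration, not any vendor's actual implementation: the `detect_wake_word` function is a hypothetical stand-in for the on-device keyword spotter, and the "upload" is just a local list.

```python
# Toy sketch of an always-on wake-word loop. Until the trigger phrase
# is detected locally, nothing is sent anywhere.

WAKE_WORD = "hey siri"

def detect_wake_word(audio_chunk: str) -> bool:
    # Hypothetical stand-in for an on-device keyword spotter.
    return WAKE_WORD in audio_chunk.lower()

def process_stream(chunks):
    """Scan incoming audio; only speech following the trigger is
    'uploaded' (here, simply collected) for cloud processing."""
    uploads = []
    listening = False
    for chunk in chunks:
        if listening:
            uploads.append(chunk)  # this is what the cloud would see
            listening = False
        elif detect_wake_word(chunk):
            listening = True       # trigger heard: record what's next
    return uploads
```

In this sketch, ambient conversation before the wake word never leaves the device, which is the behavior the vendors describe.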
Their automated features can result in unintended consequences. Some people listening to a recent NPR report about Alexa, for example, said the report activated and prompted their own Echo devices to reset thermostats or launch other NPR programs.
What other data do the assistants access?
Besides your voice, virtual assistants can capture data related to your contacts, calendar, browsing histories, location, music library, purchases, and other personal preferences. They can use it to initiate phone calls, schedule appointments, pull up traffic or weather reports en route to your destination, or suggest nearby restaurants.
Amazon, Google, and Microsoft associate this data with your username, encrypt it, then store it indefinitely in the cloud. Apple associates voice recordings with a randomly generated identification number, stores them securely for six months, then assigns a new identifier so the recordings are no longer connected to your account.
Could they identify you simply by your voice?
“Your voice is inherently unique,” says Lynn Terwoerds, executive director of the Voice Privacy Alliance, a new industry group that promotes best practices for voice interfaces. “People have questions around how something so personal can be used for biometrics.”
But actually identifying your voice at random isn’t as easy as it may seem, says Tim Tuttle, founder and CEO of MindMeld, which develops intelligent voice interfaces for devices and applications.
“To identify someone uniquely from their voice, you’d need to have a database with voice samples from every one of the 2 billion people with smartphones,” Tuttle says. “And then you’d have to train a machine-learning model to pick one out from all the others. I don’t think any company has collected a voice print database that large.”
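Tuttle's point can be made concrete with a toy example. Speaker identification typically means comparing a "voice print" (a numeric embedding of a voice) against every enrolled print and picking the closest match; the embeddings and names below are invented for illustration, and real systems use neural speaker encoders rather than three-number vectors.

```python
import math

# Toy speaker identification: match a voice print against an enrolled
# database by cosine similarity. All vectors and names are made up.

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def identify(sample, database):
    """Return the enrolled speaker whose stored print best matches the
    sample -- note this scans the entire database."""
    return max(database, key=lambda name: cosine_similarity(sample, database[name]))

enrolled = {
    "alice": [0.9, 0.1, 0.3],
    "bob":   [0.2, 0.8, 0.5],
}
```

The scan works fine with two enrolled speakers; the difficulty Tuttle describes is that picking one stranger out of billions requires both a print for everyone and a model discriminating enough to separate them all.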
Could law enforcement agencies or hackers gain access to your recordings?
When served with a court order, each company will either surrender your personal data to the relevant legal authority or challenge it. Apple says that in 2015, it received about 2,000 requests from U.S. law enforcement agencies for iCloud account data, and it complied with more than 80 percent of them.
So if you ask Siri how to murder your spouse, and his or her death appears suspicious, don’t be surprised if your Siri recordings—along with your Web searches—are introduced as evidence in court.
Could your digital assistant be hacked to eavesdrop on you?
Possibly. So far, though, the only known attacks on voice assistants have occurred in a lab.
Researchers in France last year demonstrated a way to remotely control phones using Siri or Google Now. The hack required each handset to use an external microphone and have the voice assistant enabled from the lock screen.
An April 2015 report by security vendor Veracode detailed flaws in voice-activated home automation hubs from Ubi and Wink that would enable them to be used as listening devices. (These devices have since been modified or removed from the market.)
And Mattel’s voice-driven Hello Barbie doll initially came with a wide range of security flaws, some of which could theoretically allow hackers to eavesdrop on kids. ToyTalk, which built and manages Hello Barbie’s voice technology, says it has since addressed the “relevant and credible vulnerability issues.”
Can you delete the recordings?
Amazon, Microsoft, and Google let you individually view and delete your recordings online. Apple doesn’t show you individual recordings to delete, but it automatically deletes all of the information it’s gathered after you deactivate Siri.
Be aware that deactivating your digital assistant also nukes any data the service uses to customize the information it provides, such as your favorite restaurants or sports teams. So after reactivating it, you essentially start from scratch.
And remember that the sky is not falling, Terwoerds says. Voice interfaces are here to stay; the challenge is to make them secure out of the box.
“We want to help developers build trust with customers by understanding the emotional issues they have with voice, and addressing them up front,” she says.