Your Digital Assistant, Your Privacy


By Gille Ann Rabbin, Esq., CIPP/US

About a month ago my family sat down to dinner, and midway through the meal my son mentioned my husband’s “man cave.” My husband corrected him, stating that the room in question was, in fact, his office. Inconsequential table banter, no doubt, but later that evening my husband received an email advertising décor options for the perfect man cave.

Coincidence? Or was my husband’s iPhone spying on us?

Digital assistants like Apple’s Siri, Amazon’s Alexa, Google’s Assistant, and Microsoft’s Cortana are present in millions of consumers’ homes, and voice search has become more prevalent. Virtual voice assistants use artificial intelligence to analyze speech and respond to our voices. Search requests and location data can be linked to our user accounts and, when compiled, can produce profiles of our personal lives, revealing our interests, preferences, and routines. These profiles are what enable the devices to deliver accurate results.

But questions have arisen regarding what these devices do with our data and how they handle privacy. Consumers have become increasingly fearful, with a recent study showing that 41% of virtual assistant users are concerned about privacy, trust, and eavesdropping.

In April, the media reported that thousands of Amazon employees were accessing voice recordings and text transcripts of Alexa interactions. According to Amazon, this data was being reviewed by humans to help improve Alexa. At the time, Amazon also said that consumers who didn’t want their data retained simply had to manually delete the audio files.

Subsequently, Delaware Senator Chris Coons sent a letter to Amazon inquiring about Alexa’s data-retention practices. Amazon’s response, published in the media in early July, called into question its earlier statements about consumers’ ability to delete their information. In part, Amazon acknowledged that not all records of conversations with Alexa can be deleted by consumers, even when a consumer manually removes the audio files from their account. Transactional data, which likely contains personal data, is retained, for example.

In response to Amazon’s recent admissions, Senator Coons stated that "Amazon's response leaves open the possibility that transcripts of user voice interactions with Alexa are not deleted from all of Amazon's servers, even after a user has deleted a recording of his or her voice…the extent to which this data is shared with third parties, and how those third parties use and control that information, is still unclear."

Why is our data being retained? Who has access to it? Will it be shared or sold? How securely is it being kept? Is it accessible to hackers?

The takeaway is this: if you’re thinking about buying a virtual voice assistant, think carefully about whether the convenience of having one outweighs the cost to your privacy. If you decide to proceed, read the product’s consumer privacy disclosures to learn as much as possible about how your privacy will be handled, and don’t use the product until you’re fully informed.

The same goes for any new technology you’re considering. If you don’t understand how it works, or aren’t sure you do, don’t use it.