The Privacy Implications of Virtual Personal Assistants

Be Careful with Your ‘Wake Word’

By Andrew Levchuk and Lauren Ostberg

Everyone is now familiar with Alexa, Siri, or Google Assistant, virtual personal assistants (VPAs) marketed by Amazon, Apple, and Google, respectively.

VPAs contain voice-activated applications that promise users a chipper, responsive intelligence for dealing with everyday tasks like phone calls, calendar reminders, coffee orders, streaming entertainment, and list making. In the courtroom, however, law enforcement, digital privacy activists, technology companies — and, yes, Alexa herself — have been exploring the First and Fourth Amendment implications of VPAs’ eclectic résumé.

While VPAs are working for their users, they are also working for Google, Amazon, Apple, and other companies interested in consumers’ habits, interests, and data. Alexa, for example, is regularly ‘listening’ and scanning for her ‘wake word.’ When she hears it, she records the vocal input and her response, then uploads that data to a server in the cloud, effectively reporting it up the chain to her digital overlords at Amazon.

According to the Alexa terms of use, Amazon retains these ‘Alexa interactions,’ which include music playlists and shopping lists, in addition to ‘vocal input,’ for an unspecified amount of time. This is allegedly to provide, personalize, and improve those services, but it is also undoubtedly to provide those technology companies with a valuable, veritable harvest of data.

Looking to access that data, law enforcement is now attempting to identify or eliminate suspects in its investigations with information created by VPAs.

First Amendment

These competing interests came to a head in State v. Bates, a murder case filed in the Arkansas Circuit Court. One witness interviewed during the investigation mentioned hearing music playing during the night in question. Police pursued warrants for multiple digital devices, including the suspect’s Amazon Echo, which played music through a voice command to Alexa.

Amazon moved to quash the subpoena: it did not want Alexa’s recordings, and with them its proprietary data, on the public record, nor would it have been good for Alexa’s public image if she disclosed information her user believed to be private.

Amazon invoked the First Amendment, which prohibits laws “abridging the freedom of speech,” in its defense. First, Amazon argued that users’ requests to Alexa were protected speech because they were exercises of a right to anonymously browse and purchase expressive materials — in this case, audio books, music, and podcasts — without fear of government discovery.

Amazon also argued that Alexa’s response “constitutes Amazon’s First Amendment-protected speech,” and went on to say that “Alexa’s decision about what information to include in its response, like the ranking of search results, is ‘constitutionally protected opinion.’” It bears repeating that Amazon argued that “Alexa’s decision” — i.e., the decision of a VPA — was “constitutionally protected opinion.”

Alexa was not only being asked to testify against her user; now, she was being imbued with her own perspective. The extent to which the result of proprietary algorithms is ‘speech,’ and the extent to which such speech may be protected, is uncharted legal ground.

The court did not need to address these open questions about the First Amendment’s relationship to a VPA’s speech, because Bates eventually consented to have the recordings released, and the prosecutor dismissed the case (“Alexa, share my alibi”).

Fourth Amendment

Also not addressed by the court, but relevant when considering your VPA’s loyalty, is the ‘third-party doctrine,’ which essentially holds that a person has no reasonable expectation of privacy for Fourth Amendment purposes in information voluntarily shared with a third party, such as an Internet service provider or cell-phone provider. Anything communicated to your VPA is arguably not covered by the Fourth Amendment, because by communicating with your VPA, you have voluntarily shared information with the VPA’s digital overlord (e.g., Amazon in the case of Alexa).

Given its breadth in the digital age, the third-party doctrine is now under assault in the courts. The Supreme Court recently held in Carpenter v. United States that access to a person’s historical cell-site records — geographic records of the particular cell towers a person’s phone has been near — is a Fourth Amendment search because it violates the person’s “legitimate expectation of privacy in the record of his physical movements.” We should expect the attacks on the third-party doctrine to continue.

More generally, electronic evidence of the sort generated by VPAs and other devices is becoming a focus of law-enforcement investigations. For example, a warrant issued in 2017 in Minnesota sought personal details of anyone searching for a victim’s name in Google. Internet searches can be conducted on VPAs, so VPA users will likely be subject to similar warrants in the future.

Whether you are slipping Siri secrets about your business practices, asking Alexa to order cleaning supplies, or using any of various other VPAs to verify an address, be aware that your assistant — that chipper, algorithm-driven intelligence — serves multiple masters.

Perhaps when we use the wake word “Alexa,” Alexa should respond with, “You have the right to remain silent.”

Andrew Levchuk is counsel and Lauren Ostberg is an associate at the Springfield-based law firm Bulkley Richardson. Levchuk is a 24-year veteran of the U.S. Department of Justice and now focuses on litigation and leading the cybersecurity practice. Ostberg’s practice consists of cybersecurity, commercial litigation, and intellectual-property matters.