Negotiating Privacy with Voice Assistants

A post-phenomenological approach to privacy and data sovereignty in human-machine interaction

When you imagine HCI as a spectrum, you have machines on one side and humans on the other. For the longest time, HCI sat closer to the machine side. Starting with punch cards, then the keyboard and later the mouse, we always had to learn how to talk to machines. Touch marked a tipping point, and with voice control we are now decidedly closer to the human side.

Human communication relies on more than one channel. Oral communication always happens within a context: cues are exchanged, eye contact and body posture signal attention, face and hands add intent. These cues are missing in the interaction with a voice assistant. Despite sitting on the human side of the HCI spectrum, communicating with such assistants is therefore not always easy.

Within this framing, the problem of privacy is of special concern. Voice assistants are by default always on and listening. More often than not, the assistant misunderstands commands, something I have experienced myself. This can lead to severe privacy breaches: accidental recordings of the user are sent off to be checked by human reviewers, making intimate content available to an undisclosed audience.

What the legal-juridical perspective frames as privacy, the user experiences as intimacy. Taken together, the problem arises that users give up privacy/intimacy in order to use this technology, with few possibilities to negotiate this trade-off.

These findings are rooted in my own fieldwork as well as in the related literature.

I would like to answer this question from a user-experience as well as an interaction-design perspective, asking whether the introduction of physical cues can help the user negotiate privacy/intimacy with the machine. Physical cues could be a stronger visual representation of the assistant's state to inform the user, as well as touch, movement, or a beacon object that signals certain intents to the assistant.
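To make the idea of a "stronger visual representation of the assistant's state" concrete, the device's state could be modelled as a small state machine whose current state drives an unambiguous physical cue, such as an LED visible from across the room. This is a minimal illustrative sketch, not part of the proposal; the state names and cue mapping are assumptions.

```python
from enum import Enum

class AssistantState(Enum):
    IDLE = "idle"              # microphone open only for the wake word
    LISTENING = "listening"    # actively recording a command
    PROCESSING = "processing"  # audio is being sent away and analysed
    MUTED = "muted"            # microphone disabled in hardware

# Hypothetical mapping of internal states to a visual cue the user
# can read at a glance, so the device's behaviour is never opaque.
STATE_CUE = {
    AssistantState.IDLE: "off",
    AssistantState.LISTENING: "pulsing blue",
    AssistantState.PROCESSING: "solid blue",
    AssistantState.MUTED: "red",
}

def cue_for(state: AssistantState) -> str:
    """Return the visual cue that tells the user what the device is doing."""
    return STATE_CUE[state]
```

The point of the sketch is that every state a user might worry about (is it recording? is audio leaving the house?) has exactly one distinct, always-visible cue, rather than being hidden in an app.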

Having more control over the interaction with voice assistants could restore trust that was lost while this technology was being established, opening the way for its application in areas other than consumer tech. Introducing other forms of interaction could also diversify a market dominated by Amazon and Google, and establish niches for open-source approaches that already exist but still model themselves on the market leaders.


What effects do physical cues have on the negotiation of privacy between voice assistants and their users?