Conditioning the User
More often than not, the user has to be conditioned into a way of communicating or behaving with the assistant according to the wishes of the manufacturer and developer of the device. Users hardly ever have a say in this themselves.
Behind the acquisition of a voice assistant device, or devices in many cases, lies a specific image of a promised future: home automation, kitchen and household aid, an extension of the brain.
These can roughly be merged into one overarching theme of control and convenience, also known as the preservation of energy by offloading tasks to the computer and its attached devices. The user becomes conditioned into believing in this future through advertisement and clever storytelling. One is promised a simpler life, sometimes also a more social or more creative one.
In my research, I did not find this promised future to manifest itself, leaving users more often than not frustrated with the buggy, glitchy, and dumb technology they brought into their homes. Nonetheless, users stick with their assistant's limited capabilities, but need to drastically change their communication, language, and behavior towards and with the assistant.
Users feel that they are having an unnatural interaction with the assistant. They need to change their speech patterns and the language they usually use, or they receive erratic responses from the devices. The user thus gets conditioned into behaving as the manufacturer of the device intends, accommodating its limited capabilities; otherwise, no interaction would be possible at all.
They try repeatedly to play music, but the device responds with "Je suis désolé, il y a un problème" ("I am sorry, there is a problem"). At other times, the device misunderstands the command; either way, it does not work. The music they wish for can't be played. They give up, slightly annoyed, and concentrate on the situation around the table again, eating and socializing.
Of course, there are also good moments, and most of the time, the assistant does what it is asked. But this bond, and the emotions it evokes, are mediocre at best. What sticks with the user is mostly frustration, sometimes anger, and in a few cases, hate and rage. It is certainly not the promised future that the user was tricked into buying into.
By conditioning the user, by controlling the technological narrative and the users' imagination, GAFA can bring about their utopia. Users know very well about the unsustainability of voice assistant technologies, electronics, and the cloud. But they don't see any alternative; they can't imagine one. And it is the same with privacy issues. Users would rather have an anonymous big-data interface in their home, recording their family, than a human researcher observing them.
Yes, many of these, let’s say, gadgets are not very sustainable in principle. Because they are rather cheap products and are thrown on the market relatively cheaply so that they can be distributed as quickly as possible. From that point of view, yes, it’s now probably not necessarily the most sustainable.