Regular, intensive use of intelligent personal assistants such as Amazon's "Alexa", Google's "Assistant" or Apple's "Siri" over an extended period can cause stress. Researchers at Ruhr University Bochum (RUB) reach this conclusion in a new study published in the “Journal of the Academy of Marketing Science”.

Human-technology relationship

RUB researcher Sascha Alavi and colleagues from the University of Neuchâtel conducted surveys with more than 1,000 users of intelligent voice assistants, as well as qualitative in-depth interviews with eleven users. The result: many of them attribute an almost human mind to intelligent personal assistants.

“Our study results initially confirm a positive effect of the human-technology relationship,” says Alavi. At the same time, the team was also able to show for the first time that attributing human characteristics to the technology can have negative effects.

First the honeymoon, then the problems

“According to our study, around 30 percent of users sometimes feel their identity is threatened by the systems. They perceive the assistants as rivals. They worry about their independence and see their privacy at risk,” explains Alavi. These damaging effects occur particularly frequently when there is already a close, long-term relationship between the person and the assistance system.

“There are no problems in the first eight months. You could talk about a ‘honeymoon effect’,” says Alavi. Then the stressful effect sets in. “For around 20 percent of consumers, intensive use of artificial intelligence assistants over a period of eight months affects their well-being,” explains Alavi. The stress mainly comes from people's worries that such technologies could one day replace them.

Original publication: Ertugrul Uysal, Sascha Alavi, Valéry Bezençon: Trojan horse or useful helper? A relationship perspective on artificial intelligence assistants with humanlike features, in: Journal of the Academy of Marketing Science, 2022, DOI: 10.1007/s11747-022-00856-9

Source: pte

