A university study finds that people increasingly attribute lifelike qualities to virtual assistants, and warns that this may lead them to reveal more personal information to the companies behind those assistants than they otherwise would.
“These agents are data gathering tools that companies are using to sell us stuff,” said Edward Lank, a professor at the University of Waterloo’s David R. Cheriton School of Computer Science, in a statement. “People need to reflect a little to see if they are formulating impressions of these agents rather than seeing them as just a piece of technology and trusting them in ways based on these impressions.”
Researchers had 20 subjects interact with Alexa, Google Assistant and Siri, then asked them about the personalities of the virtual agents and had them create an avatar for each. Alexa was seen as genuine and caring, while Siri was viewed as disingenuous and cunning. Alexa’s individuality was commonly described as neutral and ordinary, while participants considered the individuality of Google Assistant — and especially Siri — more defined and pronounced.
The researchers also found that participants described varied perceptions of height, age, style and hair when asked to physically describe the virtual agents. Alexa, for instance, was seen as average height or slightly shorter and older than the others. Her hair was described as darker, wavy and worn down.
Asked about privacy implications, Mr. Lank wrote in an email to MediaPost, “Understanding how and why users may trust technologies such as conversational agents allows us to better understand how to train [users], regulate [information] and design [products] in a way that balances empowering users against respect for privacy.”
Infusing virtual assistants with personality, rather than limiting them to robotic responses, is expected to drive engagement and usage. Alexa is often called out for her surprising answers to questions about her tastes and opinions.
“We knew we wanted the assistant to have characteristics that were important to us at Amazon as builders,” an Amazon vice president for Alexa experience and Echo devices told Variety last year. “Smart, humble, helpful, sometimes funny.”
In late November, Amazon told app developers it was looking to add feelings like disappointment or excitement to Alexa’s responses.
- People too trusting of virtual assistants – University of Waterloo
- How Humanizing Virtual Assistants Earns Consumer Trust – MediaPost
- How Alexa Got Her Personality – Variety
- Use New Alexa Emotions and Speaking Styles to Create a More Natural and Intuitive Voice Experience – Amazon
- Alexa, Should We Trust You? – The Atlantic