Does humanizing virtual assistants undermine consumer privacy?

Discussion
Jan 13, 2020
Tom Ryan

A university study finds that people increasingly attribute lifelike qualities to virtual assistants and warns this may lead users to reveal more personal information to the companies behind them than they otherwise would.

“These agents are data gathering tools that companies are using to sell us stuff,” said Edward Lank, a professor at the University of Waterloo’s David R. Cheriton School of Computer Science, in a statement. “People need to reflect a little to see if they are formulating impressions of these agents rather than seeing them as just a piece of technology and trusting them in ways based on these impressions.”

Researchers had 20 subjects interact with Alexa, Google Assistant and Siri and then asked them about the personalities of the virtual agents and to create an avatar for each. Alexa’s sentiment was seen as genuine and caring, while Siri’s was viewed as disingenuous and cunning. Alexa’s individuality was commonly described as neutral and ordinary, while participants considered the individuality of Google — and Siri, especially — more defined and pronounced. 

The researchers also found that participants described varied perceptions of height, age, style and hair when asked to physically describe the virtual agents. Alexa, for instance, was seen as average height or slightly shorter and older than the others. Her hair was described as darker, wavy and worn down.

Asked about privacy implications, Mr. Lank wrote in an email to MediaPost, “Understanding how and why users may trust technologies such as conversational agents allows us to better understand how to train [users], regulate [information] and design [products] in a way that balances empowering users against respect for privacy.” 

Infusing virtual assistants with personality rather than limiting them to robotic responses is expected to drive engagement and usage. Alexa is often called out for her surprising answers to questions about her tastes and opinions.

“We knew we wanted the assistant to have characteristics that were important to us at Amazon as builders,” Amazon’s VP of Alexa experience and Echo devices told Variety last year. “Smart, humble, helpful, sometimes funny.”

In late November, Amazon told app developers it was looking to add feelings like disappointment or excitement to Alexa’s responses. 

DISCUSSION QUESTIONS: Do you agree that the more personable a virtual assistant is, the more likely a user is to share personal data? Should that concern retailers or brands?

Braintrust
"Creators of AI personas will have more power to shape culture than it might seem at the moment."
"...the only reason to make these assistants more “friendly” is to take advantage of the friendship to mine data from the user."
"Voice virtual assistants will define human interaction with technology in the 2020s."

Richard Hernandez
BrainTrust

I don’t know if making Alexa or Siri more personable would make me want to share my data more readily. I still think it’s a little creepy when Amazon sends me email recommendations on things I looked at but didn’t buy…

Art Suriano
Guest
There is no doubt that technology continues to get better every day, and that includes virtual assistants, robots, and anything automated. We’re just not there yet, and it will take time before we arrive at a point where the technology is so amazing that we won’t know whether we’re speaking with a human or a computer. Think back to 20 years ago and look at where we are now with how cell phones, computers, and tablets have all improved immensely. Now try to imagine 20 years from now. By then, I believe we will find that the in-store “robot” will be as common as a kiosk or self-checkout machine. So it’s just a matter of time. As for sharing personal information, we are all aware of how we are currently being tracked with everything we do, whether it be Google, Amazon, Facebook, or so many other sources. Yet most of us allow it and accept it as a part of life. It’s dangerous in some ways, but very few people are resisting. So in time,…
Suresh Chaganti
BrainTrust
Co-Founder and Executive Partner, VectorScient

Privacy concerns around virtual assistants are real. People will probably give out more qualitative information to them than they do today. Besides the creep factor, it is debatable whether there is a potential for these virtual assistants to actually do more harm. Identity theft and related risks are already pervasive with current ways of transacting online.

I see an option to have some kind of premium subscription where businesses promise not to use conversations with the assistant to market other products, or even an option to not store data in their cloud. All this depends on how serious the privacy concerns become.

Doug Garnett
BrainTrust

The fundamental problem here is that Google and Amazon need those interactions in order to gather data. In other words, the only reason to make these assistants more “friendly” is to take advantage of the friendship to mine data from the user.

We are also still living in the early days of these devices, when Star Trek-driven fantasies of talking with the computer also help drive interaction. People don’t yet have a good sense of the limitations, so the novelty can override common sense.

Retailers and brands should take care. There will be blowback when it becomes clear how much people have been manipulated by the companies behind the assistants.

Steve Montgomery
BrainTrust

I am sure I will be in the minority on this, but I wouldn’t share any personal data with any of the virtual assistants. Actually, I wouldn’t and won’t have one in my home. I realize that privacy today is an illusion, but I see no reason to assist in the process.

Evan Snively
BrainTrust

Twenty subjects is a pretty small sample size to draw conclusions from, but it certainly is an interesting premise for an experiment – I wish they had a bigger participant pool!

I don’t think that simply making a virtual assistant “personable” is enough to be successful; the persona certainly needs to be carefully thought out. For instance, the personality for a beauty retailer should probably skew feminine and bubbly, but one for a doctor’s office will find more success instilling a feeling of expertise and confidence, which right now might mean a more to-the-point, male persona.
The unintentional (or unavoidable?) stereotyping of gender roles and the impact that will have on reinforcing those perceptions is something that must also be taken into consideration. Creators of AI personas will have more power to shape culture than it might seem at the moment.

Jeff Weidauer
BrainTrust

The explosion of always-on speakers has happened with little thought to the privacy implications for most people. Adding personality to Alexa and her peers has made the AI much friendlier and more accessible, and it absolutely invites a sense of trust from users. A backlash will arise as use of and comfort with the technology grows, and the interface becomes more conversational. For now, retailers and brands should be mindful of how they utilize these ill-gotten gains.

Ryan Mathews
BrainTrust
I think it depends on a variety of more nuanced factors. If one watches children interact with Alexa, for example, it is clearly a relationship with a “person” rather than an “object.” Younger children tend to treat Alexa as a playmate, not a device. This might also manifest itself at the opposite end of the age spectrum. I could easily foresee a time when Alexa might serve as a “digital companion or caregiving assistant” for the elderly, listening to stories, making occasional comments, and reminding them to take medications. For the rest of us, if you want to believe that you have a “relationship” with Alexa, or Siri, or Google that is so real you feel compelled to overshare — you need a relationship with a therapist or at least to get out more. The only real concern for branders is that this confusion of what is and is not appropriate to “share” with a voice-activated device … I can’t really believe I just typed this … may create some blowback as people in a…
John Karolefski
BrainTrust

I may be wrong, but I believe that virtual assistants like Siri and Alexa are always listening. We don’t have to share information with them. They are silently gathering data about us. That is why some folks refuse to invite them into their homes.

Mohamed Amer
BrainTrust

Voice virtual assistants will define human interaction with technology in the 2020s. While voice assistants will be designed around functional objectives (a car assistant, a child’s cuddly toy, a search or shopping engine, etc.), we will relate to, identify with, and engage with them based on how the technology is materialized and integrated into our daily activities. The more we interact with them, the more they become a part of our identity, how we make sense of the world, and our overall meaning-making.

This will be far-reaching and more revolutionary than any human-technology interface designed to date. It has the potential to bring comfort to young and old, simplify life, or be a trusted friend. Yet with such well-deserved optimism comes a very dark side, ripe for abuse and a world in which our worst instincts are unleashed. There is no going back on this trajectory; technology providers must own up to a greater responsibility for how their technology impacts society, and data usage and privacy must become a top policy priority.

Ralph Jacobson
BrainTrust

Yes, the more realistic the chatbot, the more likely the relevant conversation with the user. Also, I believe shoppers generally have no problem sharing personal data when they see a direct, value-added benefit for sharing. Privacy concerns are far more prevalent in the financial security area than in personal areas.

Morgan Linton
Guest

I do think that as virtual assistants continue to become more personable, people will in turn be more likely to share personal data. That being said, right now there’s still a rather thick line between a bot and a person and it’s going to take more time for people to truly start to feel like the virtual assistant is a real person.

As for retailers and brands being concerned, I think they would actually be excited by the opportunity. If consumers do share more personal data with virtual assistants, this data can be used to better personalize the customer experience which in the end benefits both the customer and the retailer/brand.

Brian Cluster
BrainTrust

Yes, I think for most households, the more comfortable you are with an assistant, whether real or virtual, the more you will share with it. While the adults in the house may be more guarded about sharing information, children with access to the smart speakers will likely be very open with technology that sounds like a human friend.

Companies willing to be responsible and committed to truly serving their customers should offer various levels of privacy and customization in their virtual assistants, so consumers have more control over the technology in their homes and the data that is shared.


Take Our Instant Poll

Do you see more benefits than drawbacks to retail if more personable virtual assistants encourage users to share more information?
