How to Combine Personalization and Privacy in Virtual Assistants
With the rise of AI and rapid consumer adoption of virtual assistants such as Amazon Alexa, Google Assistant, Apple Siri and Samsung Bixby, brands face a new set of privacy challenges. In addition to new consumer privacy laws, companies must determine how to comply with older laws that were enacted before voice technology existed.
Smart speakers in a crowded house
You’ve set up your smart speaker – with or without a screen – at home, where you live with several other people. Everyone talks to the device. But does the device know who is speaking? In many cases, no.
During setup of a smart speaker, users may be asked to create a voice profile. Did everyone in the household provide one? Probably not. And what happens when visitors use your smart speaker?
Voice profiles help the natural language processing software that drives these devices recognize different users. But the device itself is still registered to a single user.
Smartphones, tablets and vehicles are generally thought of as easier to link to an individual than smart speakers, but there is still no guarantee that only one person uses them. Parents monitor children’s devices. Spouses may answer calls and texts for each other. Families may share tablets and vehicles. Virtual assistants may find it difficult to keep up with the device juggling, even with voice profiles.
In this case, software and hardware manufacturers can only go so far in protecting users’ privacy. Device owners must determine who can, and cannot, use their devices, and take steps to protect them with strong passwords and PINs.
Smart watches, ear buds, smart glasses and even clothing bring us closer to a one-to-one relationship between humans and a device. Not many people share them. But it may be easier to lose them. Consider what data may, or may not, be stored on these small devices.
Personalization is an integral part of the conversational and technical voice app design process for brands that want to integrate with their customer relationship management (CRM) systems. Some brands mistakenly rely only on the registered user of a device or voice assistant to "personalize" the interaction. This can result in privacy issues and an inappropriate voice-first user experience (VUX).
Voice technology experts can help brands determine what level of personalization is possible and optimal for an effective and enjoyable voice-first user experience. They can also help verify users through the authentication methods available on virtual assistant platforms. Because those platforms share only a limited amount of personal data, there are real limits on how personal a voice app can be.
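As a rough sketch of the idea above: a voice app back end can decide how personal to be based on what the incoming request actually reveals about the speaker. The field names below mirror the shape of an Alexa Skills Kit request, where `context.System.person` appears only when a recognized voice profile matches; the helper function and the tier names are illustrative assumptions, not a real SDK API.

```python
# Hypothetical sketch: pick a personalization tier from the identity
# signals present in an assistant request payload.

def personalization_level(request: dict) -> str:
    system = request.get("context", {}).get("System", {})
    person = system.get("person")  # present only when a voice profile matched
    user = system.get("user")      # the account the device is registered to
    if person and person.get("personId"):
        return "speaker"    # safe to use speaker-specific CRM data
    if user and user.get("userId"):
        return "account"    # account-level, non-sensitive personalization only
    return "anonymous"      # no identity signal: keep the experience generic

# A request with no matched voice profile falls back to account-level data.
req = {"context": {"System": {"user": {"userId": "amzn1.ask.account.EXAMPLE"}}}}
print(personalization_level(req))  # prints "account"
```

The key design choice is the fallback: when the platform cannot tell who is speaking, the app degrades to generic behavior rather than assuming the registered owner is at the microphone.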
Consumers may trade some personally identifiable information (PII) for something of value from a brand – convenience, a coupon, etc. But brands are ultimately responsible for honoring consumers’ privacy preferences.
The personalization-versus-privacy discussion applies across all technology. Voice assistant technology is on the upswing just as new privacy laws are going into effect. Seek professional help to avoid setbacks.
Consumer-driven Privacy and Laws
When you register your smart speaker or set up your voice assistant, you set the privacy level based on options provided by the manufacturer. You also have the option of keeping the device from listening during private times and of erasing your voice recordings. That has an impact on how much personalization brands can provide.
Regardless of what technology you use, privacy and security are important to both users and brands. Consumer privacy laws go into effect in January 2020 in California, with other U.S. states to follow. Now is a good time to integrate voice technology into your privacy policies and procedures.